Job Description
Req ID: 341053
NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Snowflake Architect to join our team in Bangalore, Karnataka (IN-KA), India (IN).

Snowflake Data Platform - Architect

Role Summary
As a Snowflake Architect, you will lead the design and implementation of enterprise-grade data platforms using Snowflake. You will define architectural standards, guide engineering teams, and collaborate with stakeholders to align data strategies with business goals. Your role will focus on leveraging Snowflake's advanced capabilities to build scalable, secure, and high-performance data solutions that support analytics, reporting, and data science initiatives.
Key Responsibilities

- Define and own the end-to-end architecture of data platforms built on Snowflake, including ingestion, transformation, storage, and consumption layers.
- Design and implement data lakehouse architectures using Snowflake's native features such as Streams, Tasks, Materialized Views, and External Tables.
- Establish best practices for data governance, security, and access control using Role-Based Access Control (RBAC), Secure Views, and Row-Level Security.
- Architect scalable ETL/ELT pipelines using Snowflake, DBT, and Python, integrating with orchestration tools such as Airflow.
- Lead integration of Snowflake with enterprise systems such as data catalogs, data quality frameworks, monitoring tools, and BI platforms.
- Apply working knowledge of AI integrations with Snowflake, including Snowflake Cortex and LLM-powered analytics, to enable intelligent data applications.
- Apply hands-on experience with Snowflake Horizon, including governance capabilities such as object tagging, data lineage, Data Clean Rooms, and AI-powered object insights, for unified compliance, security, and discovery.
- Guide teams in implementing CI/CD pipelines for Snowflake deployments using tools like GitHub Actions, Azure DevOps, or GitLab CI.
- Monitor and optimize Snowflake performance using Query Profile, Resource Monitors, Clustering Keys, and Auto-Suspend/Resume features.
- Collaborate with data analysts, engineers, and business stakeholders to ensure platform reliability and data accessibility.
- Conduct architectural reviews, code audits, and mentoring sessions to ensure adherence to standards and scalability.
- Stay current with Snowflake innovations and advocate for adoption of new features and capabilities.
Skills / Qualifications

- Bachelor's degree in Computer Science, Software Engineering, Information Technology, or a related field required.
- At least 10 years of experience in data development and solutions in highly complex data environments with large data volumes.
- At least 10 years of SQL and PL/SQL experience, with the ability to write ad-hoc and complex queries to perform data analysis.
- At least 7 years of experience developing data pipelines and data warehousing solutions using Python and libraries such as Pandas and NumPy.
- Familiarity with Snowflake's architecture, including virtual warehouses, data storage layers, and compute scaling.
- Hands-on experience with DBT preferred.
- Experience with performance tuning of SQL queries, Spark jobs, and stored procedures.
- An understanding of E-R data models (conceptual, logical, and physical).
- Understanding of advanced data warehouse concepts required.
- Strong analytical skills, including a thorough understanding of how to interpret customer business requirements and translate them into technical designs and solutions.
- Strong verbal and written communication skills; capable of collaborating effectively with a variety of IT and business groups across regions and roles, and of interacting effectively with all levels.
- Strong problem-solving skills; ability to identify where focus is needed and bring clarity to business objectives, requirements, and priorities.
#GenAINTT