
12 Snowflake Architecture Jobs

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

8.0 - 12.0 years

35 - 40 Lacs

Mumbai, Delhi / NCR, Bengaluru

Work from Office

Position Summary: We are seeking an exceptionally talented Snowflake Architect to design, build, and optimize advanced data solutions on the Snowflake platform. In this role, you will lead the architecture and implementation of scalable data systems, ensuring high performance, security, and seamless integration with diverse data sources.

Responsibilities:
- Design and implement end-to-end Snowflake-based solutions to meet complex data processing and analytics needs.
- Optimize data pipelines, storage, and query performance on Snowflake.
- Collaborate with cross-functional teams to define data strategies and align architecture with business goals.
- Ensure best practices for security, governance, and compliance within the Snowflake environment.
- Provide thought leadership on emerging trends in cloud-based data platforms.

Requirements:
- Proven expertise in Snowflake architecture and development, with hands-on experience in Snowpipe, Streams, and Tasks.
- Strong background in data modeling, ETL processes, and SQL optimization.
- Familiarity with cloud platforms (AWS, Azure, or GCP) and integration with Snowflake.
- Excellent problem-solving skills and the ability to work in fast-paced environments.

Location: Ahmedabad, Chennai, Kolkata, Pune, Hyderabad, Remote
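For candidates gauging the hands-on bar, here is a minimal sketch of the Snowpipe + Streams + Tasks pattern the requirements name. All object names are illustrative, not from the posting, and auto-ingest assumes an external stage wired to cloud notifications:

  -- Continuously load JSON files landed in a stage into a raw table
  -- (assumes orders_raw has a single VARIANT column v).
  CREATE OR REPLACE PIPE raw.orders_pipe AUTO_INGEST = TRUE AS
    COPY INTO raw.orders_raw
    FROM @raw.orders_stage
    FILE_FORMAT = (TYPE = 'JSON');

  -- Track rows added to the raw table since the last consumption.
  CREATE OR REPLACE STREAM raw.orders_stream ON TABLE raw.orders_raw;

  -- Merge the change set downstream every 5 minutes, but only when data exists.
  CREATE OR REPLACE TASK raw.load_orders
    WAREHOUSE = etl_wh
    SCHEDULE = '5 MINUTE'
    WHEN SYSTEM$STREAM_HAS_DATA('raw.orders_stream')
  AS
    INSERT INTO analytics.orders
    SELECT v:order_id::NUMBER, v:amount::NUMBER(12,2), CURRENT_TIMESTAMP()
    FROM raw.orders_stream;

  ALTER TASK raw.load_orders RESUME;  -- tasks are created suspended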

Posted 4 days ago

2.0 - 6.0 years

0 Lacs

Hyderabad, Telangana

On-site

As an Enterprise Snowflake L1/L2 AMS Support engineer, your primary responsibilities will include monitoring and supporting Snowflake data warehouse performance, optimizing queries, and overseeing job execution. You will troubleshoot data loading failures, manage access control, and address role-based security issues. You will also carry out patching, software upgrades, and security compliance checks while upholding SLA commitments for query execution and system performance. To excel in this role, you should have 2-5 years of experience with Snowflake architecture, SQL scripting, and query optimization. Familiarity with ETL tools such as Talend, Matillion, and Alteryx for Snowflake integration is beneficial.
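Day-to-day triage in this kind of support role typically leans on Snowflake's ACCOUNT_USAGE views. A hedged example of spotting slow queries and failed loads; the thresholds and time windows are arbitrary choices for illustration:

  -- Queries that ran longer than 5 minutes in the last 24 hours.
  SELECT query_id, user_name, warehouse_name,
         total_elapsed_time / 1000 AS elapsed_seconds
  FROM snowflake.account_usage.query_history
  WHERE start_time > DATEADD('hour', -24, CURRENT_TIMESTAMP())
    AND total_elapsed_time > 300000
  ORDER BY total_elapsed_time DESC;

  -- Recent COPY loads that did not fully succeed, with the first error reported.
  SELECT table_name, file_name, status, first_error_message
  FROM snowflake.account_usage.copy_history
  WHERE last_load_time > DATEADD('day', -1, CURRENT_TIMESTAMP())
    AND status <> 'Loaded';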

Posted 1 week ago

3.0 - 8.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

We are seeking a highly skilled and experienced Snowflake Architect to lead the design, development, and deployment of enterprise-grade cloud data solutions. The ideal candidate has a solid background in data architecture, cloud data platforms, and Snowflake implementation, along with practical experience in end-to-end data pipeline and data warehouse design.

In this role, you will lead the architecture, design, and implementation of scalable Snowflake-based data warehousing solutions, and define data modeling standards, best practices, and governance frameworks. You will collaborate with stakeholders to understand data requirements and translate them into robust architectural solutions; design and optimize ETL/ELT pipelines using tools like Snowpipe, Azure Data Factory, Informatica, or DBT; implement data security, privacy, and role-based access controls within Snowflake; guide development teams on performance tuning, query optimization, and cost management; ensure high availability, fault tolerance, and compliance across data platforms; and mentor developers and junior architects on Snowflake capabilities.

Qualifications and Experience:
- 8+ years of overall experience in data engineering, BI, or data architecture, with a minimum of 3+ years of hands-on Snowflake experience.
- Expertise in Snowflake architecture, data sharing, virtual warehouses, clustering, and performance optimization.
- Strong proficiency in SQL, Python, and cloud data services (e.g., AWS, Azure, or GCP).
- Hands-on experience with ETL/ELT tools like ADF, Informatica, Talend, DBT, or Matillion.
- Good understanding of data lakes, data mesh, and modern data stack principles.
- Experience with CI/CD for data pipelines, DevOps, and data quality frameworks.
- Solid knowledge of data governance, metadata management, and cataloging.

Desired Skills:
- Snowflake certification (e.g., SnowPro Core/Advanced Architect).
- Familiarity with Apache Airflow, Kafka, or event-driven data ingestion.
- Knowledge of data visualization tools such as Power BI, Tableau, or Looker.
- Experience in healthcare, BFSI, or retail domain projects.

This job description is sourced from hirist.tech.
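On the role-based access control point: a minimal, illustrative grant chain for a read-only analyst role. The role, warehouse, schema, and user names are assumptions, not from the posting:

  CREATE ROLE IF NOT EXISTS analyst;

  -- Compute and object access needed for read-only analytics.
  GRANT USAGE ON WAREHOUSE bi_wh TO ROLE analyst;
  GRANT USAGE ON DATABASE analytics TO ROLE analyst;
  GRANT USAGE ON SCHEMA analytics.reporting TO ROLE analyst;
  GRANT SELECT ON ALL TABLES IN SCHEMA analytics.reporting TO ROLE analyst;
  -- Cover tables created later, too.
  GRANT SELECT ON FUTURE TABLES IN SCHEMA analytics.reporting TO ROLE analyst;

  GRANT ROLE analyst TO USER jdoe;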

Posted 2 weeks ago

8.0 - 13.0 years

35 - 45 Lacs

Noida, Pune, Bengaluru

Hybrid

Role & responsibilities:
- Implement data management solutions as a certified Snowflake cloud data warehouse architect.
- Deep understanding of star and snowflake schemas and dimensional modelling.
- Experience in the design and implementation of data pipelines and ETL processes using Snowflake.
- Optimize data models for performance and scalability.
- Collaborate with technical and business stakeholders to define data requirements.
- Ensure data quality and governance best practices are followed.
- Experience with data security and data access controls in Snowflake.
- Expertise in complex SQL, Python scripting, and performance tuning.
- Expertise in Snowflake concepts such as resource monitors, RBAC controls, scalable virtual warehouses, SQL performance tuning, zero-copy cloning, and time travel, including automating them.
- Experience handling semi-structured data (JSON, XML) and columnar PARQUET using the VARIANT attribute in Snowflake.
- Experience re-clustering data in Snowflake with a good understanding of micro-partitions.
- Experience with migration to Snowflake from an on-premises database environment.
- Experience designing and building manual or auto-ingestion data pipelines using Snowpipe.
- SnowSQL experience developing stored procedures and writing queries to analyze and transform data.

Must-have skills: Certified Snowflake Architect, Snowflake architecture, Snowpipe, SnowSQL, SQL, CI/CD, and Python.

Perks and benefits: Competitive compensation package; opportunity to work with industry leaders; collaborative and innovative work environment; professional growth and development opportunities.
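Several of the concepts listed above are short commands in Snowflake SQL. A hedged illustration; the credit quota, time-travel offset, and object names are invented for the example:

  -- Resource monitor: cap monthly credits and suspend the warehouse at the limit.
  CREATE OR REPLACE RESOURCE MONITOR monthly_cap
    WITH CREDIT_QUOTA = 100 FREQUENCY = MONTHLY START_TIMESTAMP = IMMEDIATELY
    TRIGGERS ON 80 PERCENT DO NOTIFY
             ON 100 PERCENT DO SUSPEND;
  ALTER WAREHOUSE etl_wh SET RESOURCE_MONITOR = monthly_cap;

  -- Zero-copy clone: instant, storage-free copy for testing.
  CREATE TABLE analytics.orders_dev CLONE analytics.orders;

  -- Time travel: query the table as it looked one hour ago.
  SELECT * FROM analytics.orders AT(OFFSET => -3600);

  -- Semi-structured data: drill into a VARIANT column with path notation.
  SELECT payload:customer.id::STRING AS customer_id
  FROM raw.events
  WHERE payload:type::STRING = 'purchase';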

Posted 3 weeks ago

10.0 - 15.0 years

15 - 20 Lacs

Pune

Work from Office

Notice Period: Immediate

About the role: We are hiring a Senior Snowflake Data Engineer with 10+ years of experience in cloud data warehousing and deep expertise in the Snowflake platform. The ideal candidate has strong skills in SQL, ETL/ELT, data modeling, and performance tuning, along with a solid understanding of Snowflake architecture, security, and cost optimization.

Roles & Responsibilities:
- Collaborate with data engineers, product owners, and QA teams to translate business needs into efficient Snowflake-based data models and pipelines.
- Design, build, and optimize data solutions leveraging Snowflake features such as virtual warehouses, data sharing, cloning, and time travel.
- Develop and maintain robust ETL/ELT pipelines using tools like Talend, Snowpipe, Streams, Tasks, and Python.
- Ensure optimal performance of SQL queries, warehouse sizing, and cost-efficient design strategies.
- Implement best practices for data quality, security, and governance, including RBAC, network policies, and masking.
- Contribute to code reviews and development standards to ensure high-quality deliverables.
- Support analytics and BI teams with data exploration and visualization using tools like Tableau or Power BI.
- Maintain version control using Git and follow Agile development practices.

Required Skills:
- Snowflake Expertise: Deep knowledge of Snowflake architecture and core features.
- SQL Development: Advanced proficiency in writing and optimizing complex SQL queries.
- ETL/ELT: Hands-on experience with ETL/ELT design using Snowflake tools and scripting (Python).
- Data Modeling: Proficient in dimensional modeling, data vault, and best practices within Snowflake.
- Automation & Scripting: Python or a similar scripting language for data workflows.
- Cloud Integration: Familiarity with Azure and its services integrated with Snowflake.
- BI & Visualization: Exposure to Tableau, Power BI, or similar platforms.
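On warehouse sizing and cost-efficient design: a minimal sketch of the main knobs. The size, suspend timeout, and cluster counts are illustrative, and multi-cluster warehouses require Snowflake's Enterprise edition:

  CREATE OR REPLACE WAREHOUSE etl_wh
    WAREHOUSE_SIZE = 'MEDIUM'      -- scale up for heavier transforms
    AUTO_SUSPEND = 60              -- suspend after 60 idle seconds to stop billing
    AUTO_RESUME = TRUE             -- wake automatically on the next query
    MIN_CLUSTER_COUNT = 1
    MAX_CLUSTER_COUNT = 3          -- scale out under concurrency spikes
    SCALING_POLICY = 'STANDARD';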

Posted 3 weeks ago

5.0 - 7.0 years

11 - 14 Lacs

Mumbai

Work from Office

About the Role: We are looking for an experienced Snowflake Admin to manage and optimize Snowflake cloud data platforms. The ideal candidate has strong expertise in Snowflake architecture, performance tuning, security, and administration. This role requires the ability to troubleshoot issues, automate processes, and collaborate with cross-functional teams.

Key Responsibilities:
- Administer and optimize Snowflake environments for performance and security.
- Manage user roles, permissions, and access controls.
- Implement best practices for database performance tuning and query optimization.
- Monitor system performance and troubleshoot issues proactively.
- Work with data engineering teams to support ETL processes and integrations.
- Automate administrative tasks using SQL and scripting.

Required Skills:
- 5+ years of experience in Snowflake administration.
- Expertise in Snowflake architecture, data sharing, and workload optimization.
- Strong knowledge of SQL and Python/Shell scripting for automation.
- Experience with data security, access management, and governance policies.
- Understanding of cloud environments (AWS/Azure/GCP) and Snowflake integrations.

Contract Duration: 3 Months (C2C)
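A routine admin check in this kind of role is credit consumption by warehouse. A hedged example against the ACCOUNT_USAGE metering view; the 7-day window is an arbitrary choice:

  -- Which warehouses burned the most credits in the last 7 days?
  SELECT warehouse_name,
         ROUND(SUM(credits_used), 2) AS credits_7d
  FROM snowflake.account_usage.warehouse_metering_history
  WHERE start_time > DATEADD('day', -7, CURRENT_TIMESTAMP())
  GROUP BY warehouse_name
  ORDER BY credits_7d DESC;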

Posted 1 month ago

7.0 - 10.0 years

8 - 14 Lacs

Kanpur

Remote

Employment Type: Contract (Remote)

Job Summary: We are looking for a highly skilled Data Engineer / Data Modeler with strong experience in Snowflake, DBT, and GCP to support our data infrastructure and modeling initiatives. The ideal candidate should possess excellent SQL skills, hands-on experience with Erwin Data Modeler, and a strong background in modern data architectures and data modeling techniques.

Key Responsibilities:
- Design and implement scalable data models using Snowflake and Erwin Data Modeler.
- Create, maintain, and enhance data pipelines using DBT and GCP (BigQuery, Cloud Storage, Dataflow).
- Perform reverse engineering on existing systems (e.g., Sailfish/DDMS) using DBeaver or similar tools to understand and rebuild data models.
- Develop efficient SQL queries and stored procedures for data transformation, quality, and validation.
- Collaborate with business analysts and stakeholders to gather data requirements and convert them into physical and logical models.
- Ensure performance tuning, security, and optimization of the Snowflake data warehouse.
- Document metadata, data lineage, and business logic behind data structures and flows.
- Participate in code reviews, enforce coding standards, and provide best practices for data modeling and governance.

Must-Have Skills:
- Snowflake architecture, schema design, and data warehouse experience.
- DBT (Data Build Tool) for data transformation and pipeline development.
- Strong expertise in SQL (query optimization, complex joins, window functions, etc.).
- Hands-on experience with Erwin Data Modeler (logical and physical modeling).
- Experience with GCP (BigQuery, Cloud Composer, Cloud Storage).
- Experience in reverse engineering legacy systems like Sailfish or DDMS using DBeaver.

Good to Have:
- Experience with CI/CD tools and DevOps for data environments.
- Familiarity with data governance, security, and privacy practices.
- Exposure to Agile methodologies and working in distributed teams.
- Knowledge of Python for data engineering tasks and orchestration scripts.

Soft Skills:
- Excellent problem-solving and analytical skills.
- Strong communication and stakeholder management.
- Self-driven with the ability to work independently in a remote setup.
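For the DBT portion: a minimal sketch of an incremental dbt model over a Snowflake source, written in dbt's SQL-plus-Jinja format. The file path, source, and column names are invented for the example:

  -- models/staging/stg_orders.sql (hypothetical dbt model)
  {{ config(materialized='incremental', unique_key='order_id') }}

  select
      order_id,
      customer_id,
      order_total,
      loaded_at
  from {{ source('raw', 'orders') }}

  {% if is_incremental() %}
  -- On incremental runs, only pull rows newer than what the target already holds.
  where loaded_at > (select max(loaded_at) from {{ this }})
  {% endif %}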

Posted 1 month ago

7.0 - 10.0 years

8 - 14 Lacs

Hyderabad

Remote

Employment Type: Contract (Remote)

Job Summary: We are looking for a highly skilled Data Engineer / Data Modeler with strong experience in Snowflake, DBT, and GCP to support our data infrastructure and modeling initiatives. The ideal candidate should possess excellent SQL skills, hands-on experience with Erwin Data Modeler, and a strong background in modern data architectures and data modeling techniques.

Key Responsibilities:
- Design and implement scalable data models using Snowflake and Erwin Data Modeler.
- Create, maintain, and enhance data pipelines using DBT and GCP (BigQuery, Cloud Storage, Dataflow).
- Perform reverse engineering on existing systems (e.g., Sailfish/DDMS) using DBeaver or similar tools to understand and rebuild data models.
- Develop efficient SQL queries and stored procedures for data transformation, quality, and validation.
- Collaborate with business analysts and stakeholders to gather data requirements and convert them into physical and logical models.
- Ensure performance tuning, security, and optimization of the Snowflake data warehouse.
- Document metadata, data lineage, and business logic behind data structures and flows.
- Participate in code reviews, enforce coding standards, and provide best practices for data modeling and governance.

Must-Have Skills:
- Snowflake architecture, schema design, and data warehouse experience.
- DBT (Data Build Tool) for data transformation and pipeline development.
- Strong expertise in SQL (query optimization, complex joins, window functions, etc.).
- Hands-on experience with Erwin Data Modeler (logical and physical modeling).
- Experience with GCP (BigQuery, Cloud Composer, Cloud Storage).
- Experience in reverse engineering legacy systems like Sailfish or DDMS using DBeaver.

Good to Have:
- Experience with CI/CD tools and DevOps for data environments.
- Familiarity with data governance, security, and privacy practices.
- Exposure to Agile methodologies and working in distributed teams.
- Knowledge of Python for data engineering tasks and orchestration scripts.

Soft Skills:
- Excellent problem-solving and analytical skills.
- Strong communication and stakeholder management.
- Self-driven with the ability to work independently in a remote setup.

Posted 1 month ago

4.0 - 8.0 years

18 - 22 Lacs

Hyderabad, Bengaluru

Work from Office

Job Type: C2H (Long Term)

Required Skill Set:

Core Technical Skills:
- Snowflake database design, architecture, and performance tuning.
- Strong experience in Oracle DB and SQL query optimization.
- Expertise in DDL/DML operations, data replication, and failover handling.
- Knowledge of Time Travel, Cloning, and RBAC (Role-Based Access Control).
- Experience with dynamic data masking, secure views, and data governance.

Tools & Reporting:
- Familiarity with OEM, Tuning Advisor, and AWR reports.

Soft Skills:
- Strong communication and coordination with app teams.
- Analytical thinking and problem-solving.
- Ability to work independently or collaboratively.

Additional Experience:
- Previous role as an Apps DBA or similar.
- Exposure to agile methodologies.
- Hands-on with Snowflake admin best practices, load optimization, and secure data sharing.
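On dynamic data masking and secure views: a hedged sketch of both features. The policy, role, table, and column names are assumptions for illustration:

  -- Mask email addresses for everyone except a privileged role.
  CREATE OR REPLACE MASKING POLICY email_mask AS (val STRING) RETURNS STRING ->
    CASE WHEN CURRENT_ROLE() = 'PII_READER' THEN val ELSE '***MASKED***' END;

  ALTER TABLE crm.customers MODIFY COLUMN email SET MASKING POLICY email_mask;

  -- Secure view: hides the view definition and prevents optimizer-level data leaks.
  CREATE OR REPLACE SECURE VIEW crm.customers_v AS
    SELECT customer_id, email, country
    FROM crm.customers
    WHERE is_deleted = FALSE;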

Posted 2 months ago

5 - 10 years

0 - 1 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

JD for Snowflake Admin

Key Responsibilities:
- Administer and manage Snowflake environments, including user roles, access control, and resource monitoring.
- Develop, test, and deploy ELT/ETL pipelines using Snowflake SQL and other tools (e.g., Informatica, DBT, Matillion).
- Monitor query performance and storage utilization; implement performance tuning and optimization strategies.
- Manage and automate tasks such as warehouse scaling, Snowpipe ingestion, and task scheduling.
- Work with semi-structured data formats (JSON, XML, Avro, Parquet) using VARIANT and related functions.
- Set up and manage data sharing, replication, and failover across Snowflake accounts.
- Implement and manage security best practices, including RBAC, masking policies, and object-level permissions.
- Collaborate with Data Engineers, Architects, and BI teams to support analytics use cases.

Required Skills:
- Strong hands-on experience with Snowflake architecture, SQL, and performance tuning.
- Experience with Snowflake features such as Streams, Tasks, Time Travel, Cloning, and External Tables.
- Proficiency in working with SnowSQL and managing CLI-based operations.
- Knowledge of cloud platforms (AWS / Azure / GCP) and integration with Snowflake.
- Experience with data ingestion tools and scripting languages (Python, Shell, etc.).
- Good understanding of CI/CD pipelines and version control (Git).
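On cross-account data sharing and replication: a hedged sketch of both commands. The share, database, and account identifiers are invented, and replication must be enabled for the accounts involved:

  -- Share a table with a consumer account (no data copy involved).
  CREATE OR REPLACE SHARE sales_share;
  GRANT USAGE ON DATABASE analytics TO SHARE sales_share;
  GRANT USAGE ON SCHEMA analytics.public TO SHARE sales_share;
  GRANT SELECT ON TABLE analytics.public.orders TO SHARE sales_share;
  ALTER SHARE sales_share ADD ACCOUNTS = myorg.consumer_acct;

  -- Replicate a database to a secondary account for failover.
  ALTER DATABASE analytics ENABLE REPLICATION TO ACCOUNTS myorg.dr_acct;
  -- Then, on the secondary account:
  -- CREATE DATABASE analytics AS REPLICA OF myorg.primary_acct.analytics;
  -- ALTER DATABASE analytics REFRESH;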

Posted 2 months ago

8 - 10 years

10 - 16 Lacs

Hyderabad

Work from Office

Required Skills:
- 8+ years of overall IT experience, with 4+ years of Snowflake development.
- Strong experience in Azure Data Platform tools, including Azure Data Factory, Azure Synapse Analytics, Azure Data Lake Storage, and Azure Functions.
- Proficient in Snowflake architecture, virtual warehouses, Snowpipe, and Streams & Tasks.
- Solid experience with SQL, Python, and performance tuning techniques.
- Understanding of data warehousing concepts, dimensional modeling, and metadata management.
- Experience integrating Snowflake with Power BI, Tableau, or similar BI tools.
- Familiar with CI/CD tools (Azure DevOps, Git) for pipeline deployments.

Key Responsibilities:
- Design, develop, and optimize Snowflake-based data pipelines and data warehouse solutions.
- Implement data ingestion processes from diverse sources using Azure Data Factory (ADF), Data Lake, and Event Hub.
- Develop scalable ELT/ETL workflows with Snowflake and Azure components.
- Create and optimize SQL scripts, stored procedures, and user-defined functions within Snowflake.
- Develop and maintain data models (dimensional and normalized) supporting business needs.
- Implement role-based access controls, data masking, and governance within Snowflake.
- Monitor performance and troubleshoot issues in Snowflake and Azure data pipelines.
- Work closely with business analysts, data scientists, and engineering teams to ensure data quality and delivery.
- Use DevOps practices for version control, CI/CD, and deployment automation.
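On stored procedures inside Snowflake: a minimal Snowflake Scripting sketch. The procedure, table, and column names are assumptions, not from the posting:

  -- Purge staging rows older than a given number of days; return a summary.
  CREATE OR REPLACE PROCEDURE purge_old_events(days NUMBER)
  RETURNS STRING
  LANGUAGE SQL
  AS
  $$
  DECLARE
    cutoff TIMESTAMP;
  BEGIN
    cutoff := DATEADD('day', -days, CURRENT_TIMESTAMP());
    DELETE FROM staging.events WHERE load_ts < :cutoff;
    RETURN 'Deleted ' || SQLROWCOUNT || ' row(s)';
  END;
  $$;

  CALL purge_old_events(30);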

Posted 2 months ago

6 - 10 years

8 - 18 Lacs

Kolhapur, Hyderabad, Chennai

Work from Office

Relevant Experience: 5+ years
Mandatory Skills: Snowflake architecture, Matillion, SQL, Python, SnowSQL, and experience with any cloud platform
Shift: Night shift (6 PM to 3 AM)
Work Mode: Complete WFO, 5 days a week
Email: anusha@akshayaitsolutions.com
Locations: Hyderabad / Bengaluru / Chennai / Kolhapur

Posted 2 months ago