
197 Snowpipe Jobs - Page 4

Set Up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

5.0 - 7.0 years

5 - 15 Lacs

Chennai

Work from Office

Experience: 5+ years of relevant experience

We are seeking a highly skilled and experienced Snowflake Lead responsible for leading the design, development, and implementation of Snowflake-based data warehousing solutions. You will leverage your deep understanding of ETL and Data Warehousing concepts to build robust and scalable data pipelines. A key aspect of this role involves direct interaction with business users to gather and clarify requirements, ensuring that the delivered solutions meet their analytical needs.

Responsibilities:

Leadership & Delivery: Lead a module or a team of developers in the design, development, and deployment of Snowflake solutions. Take ownership of the end-to-end delivery of Snowflake modules, ensuring adherence to timelines and quality standards. Provide technical guidance and mentorship to team members, fostering a collaborative and high-performing environment. Contribute to project planning, estimation, and risk management activities.

Snowflake Expertise: Utilize in-depth knowledge of Snowflake architecture, features, and best practices to design efficient and scalable data models and ETL/ELT processes. Develop and optimize complex SQL queries and Snowflake scripting for data manipulation and transformation. Implement Snowflake utilities such as SnowSQL, Snowpipe, Tasks, Streams, Time Travel, and Cloning as needed. Ensure data security and implement appropriate access controls within the Snowflake environment. Monitor and optimize the performance of Snowflake queries and data pipelines. Integrate PySpark with Snowflake for data ingestion and processing. Understand and apply PySpark best practices and performance tuning techniques. Experience with Spark architecture and its components (e.g., Spark Core, Spark SQL, DataFrames).

ETL & Data Warehousing: Apply a strong understanding of ETL/ELT concepts, data warehousing principles (including dimensional modeling and star/snowflake schemas), and data integration techniques. Design and develop data pipelines to extract data from various source systems, transform it according to business rules, and load it into Snowflake. Work with both structured and semi-structured data, including JSON and XML. Experience with ETL tools (e.g., Informatica, Talend, PySpark) is a plus, particularly in the context of integrating with Snowflake.

Requirements Gathering & Clarification: Actively participate in requirement gathering sessions with business users and stakeholders. Translate business requirements into clear and concise technical specifications and design documents. Collaborate with business analysts and users to clarify ambiguities and ensure a thorough understanding of data and reporting needs. Validate proposed solutions with users to ensure they meet expectations.

Collaboration & Communication: Work closely with other development teams, data engineers, and business intelligence analysts to ensure seamless integration of Snowflake solutions with other systems. Communicate effectively with both technical and non-technical stakeholders. Provide regular updates on progress and any potential roadblocks.

Best Practices & Continuous Improvement: Adhere to and promote best practices in Snowflake development, data warehousing, and ETL processes. Stay up to date with the latest Snowflake features and industry trends. Identify opportunities for process improvement and optimization.

Qualifications: Bachelor's degree in Computer Science, Information Technology, or a related field. Minimum of 5 years of relevant experience in data warehousing and ETL development, with a significant focus on Snowflake. Strong proficiency in SQL and experience working with large datasets. Solid understanding of data modeling concepts (dimensional modeling, star/snowflake schemas). Experience in designing and developing ETL or ELT pipelines. Proven ability to gather and document business and technical requirements. Excellent communication, interpersonal, and problem-solving skills. Snowflake certifications (e.g., SnowPro Core) are a plus.
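For readers unfamiliar with the Snowpipe, Streams, and Tasks utilities this listing calls out, here is a minimal Snowflake SQL sketch of that continuous-ingestion pattern. All object names (raw_db, orders_stage, raw_orders, orders_curated, transform_wh) are hypothetical placeholders and the landing table is assumed to have a single VARIANT column named v; this is an illustration, not part of the posting.

```sql
-- Snowpipe: auto-load new files from an existing external stage into a landing table.
CREATE OR REPLACE PIPE raw_db.public.orders_pipe
  AUTO_INGEST = TRUE
AS
COPY INTO raw_db.public.raw_orders
FROM @raw_db.public.orders_stage
FILE_FORMAT = (TYPE = 'JSON');

-- Stream: track newly loaded rows (change data capture) on the landing table.
CREATE OR REPLACE STREAM raw_db.public.raw_orders_stream
  ON TABLE raw_db.public.raw_orders;

-- Task: periodically move new rows from the stream into a curated table.
CREATE OR REPLACE TASK raw_db.public.load_orders_curated
  WAREHOUSE = transform_wh
  SCHEDULE = '5 MINUTE'
WHEN SYSTEM$STREAM_HAS_DATA('raw_db.public.raw_orders_stream')
AS
INSERT INTO raw_db.public.orders_curated (order_id, order_ts, payload)
SELECT v:order_id::NUMBER, v:order_ts::TIMESTAMP_NTZ, v
FROM raw_db.public.raw_orders_stream;

ALTER TASK raw_db.public.load_orders_curated RESUME;
```

The task only runs when the stream actually has data, which keeps warehouse usage down between file arrivals.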

Posted 3 weeks ago

Apply

5.0 - 7.0 years

5 - 15 Lacs

Pune

Work from Office

Experience: 5+ years of relevant experience

We are seeking a highly skilled and experienced Snowflake Lead responsible for leading the design, development, and implementation of Snowflake-based data warehousing solutions. You will leverage your deep understanding of ETL and Data Warehousing concepts to build robust and scalable data pipelines. A key aspect of this role involves direct interaction with business users to gather and clarify requirements, ensuring that the delivered solutions meet their analytical needs.

Responsibilities:

Leadership & Delivery: Lead a module or a team of developers in the design, development, and deployment of Snowflake solutions. Take ownership of the end-to-end delivery of Snowflake modules, ensuring adherence to timelines and quality standards. Provide technical guidance and mentorship to team members, fostering a collaborative and high-performing environment. Contribute to project planning, estimation, and risk management activities.

Snowflake Expertise: Utilize in-depth knowledge of Snowflake architecture, features, and best practices to design efficient and scalable data models and ETL/ELT processes. Develop and optimize complex SQL queries and Snowflake scripting for data manipulation and transformation. Implement Snowflake utilities such as SnowSQL, Snowpipe, Tasks, Streams, Time Travel, and Cloning as needed. Ensure data security and implement appropriate access controls within the Snowflake environment. Monitor and optimize the performance of Snowflake queries and data pipelines. Integrate PySpark with Snowflake for data ingestion and processing. Understand and apply PySpark best practices and performance tuning techniques. Experience with Spark architecture and its components (e.g., Spark Core, Spark SQL, DataFrames).

ETL & Data Warehousing: Apply a strong understanding of ETL/ELT concepts, data warehousing principles (including dimensional modeling and star/snowflake schemas), and data integration techniques. Design and develop data pipelines to extract data from various source systems, transform it according to business rules, and load it into Snowflake. Work with both structured and semi-structured data, including JSON and XML. Experience with ETL tools (e.g., Informatica, Talend, PySpark) is a plus, particularly in the context of integrating with Snowflake.

Requirements Gathering & Clarification: Actively participate in requirement gathering sessions with business users and stakeholders. Translate business requirements into clear and concise technical specifications and design documents. Collaborate with business analysts and users to clarify ambiguities and ensure a thorough understanding of data and reporting needs. Validate proposed solutions with users to ensure they meet expectations.

Collaboration & Communication: Work closely with other development teams, data engineers, and business intelligence analysts to ensure seamless integration of Snowflake solutions with other systems. Communicate effectively with both technical and non-technical stakeholders. Provide regular updates on progress and any potential roadblocks.

Best Practices & Continuous Improvement: Adhere to and promote best practices in Snowflake development, data warehousing, and ETL processes. Stay up to date with the latest Snowflake features and industry trends. Identify opportunities for process improvement and optimization.

Qualifications: Bachelor's degree in Computer Science, Information Technology, or a related field. Minimum of 5 years of relevant experience in data warehousing and ETL development, with a significant focus on Snowflake. Strong proficiency in SQL and experience working with large datasets. Solid understanding of data modeling concepts (dimensional modeling, star/snowflake schemas). Experience in designing and developing ETL or ELT pipelines. Proven ability to gather and document business and technical requirements. Excellent communication, interpersonal, and problem-solving skills. Snowflake certifications (e.g., SnowPro Core) are a plus.

Posted 3 weeks ago

Apply

5.0 - 7.0 years

5 - 15 Lacs

Bengaluru

Work from Office

Experience: 5+ years of relevant experience

We are seeking a highly skilled and experienced Snowflake Lead responsible for leading the design, development, and implementation of Snowflake-based data warehousing solutions. You will leverage your deep understanding of ETL and Data Warehousing concepts to build robust and scalable data pipelines. A key aspect of this role involves direct interaction with business users to gather and clarify requirements, ensuring that the delivered solutions meet their analytical needs.

Responsibilities:

Leadership & Delivery: Lead a module or a team of developers in the design, development, and deployment of Snowflake solutions. Take ownership of the end-to-end delivery of Snowflake modules, ensuring adherence to timelines and quality standards. Provide technical guidance and mentorship to team members, fostering a collaborative and high-performing environment. Contribute to project planning, estimation, and risk management activities.

Snowflake Expertise: Utilize in-depth knowledge of Snowflake architecture, features, and best practices to design efficient and scalable data models and ETL/ELT processes. Develop and optimize complex SQL queries and Snowflake scripting for data manipulation and transformation. Implement Snowflake utilities such as SnowSQL, Snowpipe, Tasks, Streams, Time Travel, and Cloning as needed. Ensure data security and implement appropriate access controls within the Snowflake environment. Monitor and optimize the performance of Snowflake queries and data pipelines. Integrate PySpark with Snowflake for data ingestion and processing. Understand and apply PySpark best practices and performance tuning techniques. Experience with Spark architecture and its components (e.g., Spark Core, Spark SQL, DataFrames).

ETL & Data Warehousing: Apply a strong understanding of ETL/ELT concepts, data warehousing principles (including dimensional modeling and star/snowflake schemas), and data integration techniques. Design and develop data pipelines to extract data from various source systems, transform it according to business rules, and load it into Snowflake. Work with both structured and semi-structured data, including JSON and XML. Experience with ETL tools (e.g., Informatica, Talend, PySpark) is a plus, particularly in the context of integrating with Snowflake.

Requirements Gathering & Clarification: Actively participate in requirement gathering sessions with business users and stakeholders. Translate business requirements into clear and concise technical specifications and design documents. Collaborate with business analysts and users to clarify ambiguities and ensure a thorough understanding of data and reporting needs. Validate proposed solutions with users to ensure they meet expectations.

Collaboration & Communication: Work closely with other development teams, data engineers, and business intelligence analysts to ensure seamless integration of Snowflake solutions with other systems. Communicate effectively with both technical and non-technical stakeholders. Provide regular updates on progress and any potential roadblocks.

Best Practices & Continuous Improvement: Adhere to and promote best practices in Snowflake development, data warehousing, and ETL processes. Stay up to date with the latest Snowflake features and industry trends. Identify opportunities for process improvement and optimization.

Qualifications: Bachelor's degree in Computer Science, Information Technology, or a related field. Minimum of 5 years of relevant experience in data warehousing and ETL development, with a significant focus on Snowflake. Strong proficiency in SQL and experience working with large datasets. Solid understanding of data modeling concepts (dimensional modeling, star/snowflake schemas). Experience in designing and developing ETL or ELT pipelines. Proven ability to gather and document business and technical requirements. Excellent communication, interpersonal, and problem-solving skills. Snowflake certifications (e.g., SnowPro Core) are a plus.

Posted 3 weeks ago

Apply

7.0 - 12.0 years

18 - 30 Lacs

Coimbatore

Work from Office

• Must have at least 7+ years of experience in data warehouse, ETL, and BI projects
• Must have at least 5+ years of experience in Snowflake
• Expertise in Snowflake architecture is a must
• Must have at least 3+ years of experience and a strong hold on Python/PySpark
• Must have experience implementing complex stored procedures and standard DWH and ETL concepts
• Proficient in Oracle database, complex PL/SQL, Unix shell scripting, performance tuning, and troubleshooting
• Good to have experience with AWS services and creating DevOps templates for various AWS services
• Experience in using GitHub and Jenkins
• Good communication and analytical skills
• Snowflake certification is desirable
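Since this listing emphasizes complex stored procedures alongside DWH/ETL work, here is a minimal, hedged sketch of a Snowflake Scripting stored procedure. The table analytics.audit_log, its event_ts column, and the retention policy are hypothetical examples, not details from the posting.

```sql
CREATE OR REPLACE PROCEDURE analytics.purge_old_rows(retention_days NUMBER)
RETURNS STRING
LANGUAGE SQL
AS
$$
DECLARE
  deleted_rows NUMBER DEFAULT 0;
BEGIN
  -- Delete audit rows older than the requested retention window.
  DELETE FROM analytics.audit_log
   WHERE event_ts < DATEADD(day, -:retention_days, CURRENT_TIMESTAMP());
  deleted_rows := SQLROWCOUNT;
  RETURN 'Deleted ' || deleted_rows || ' rows older than ' || retention_days || ' days';
END;
$$;

-- Example invocation: keep the last 90 days.
CALL analytics.purge_old_rows(90);
```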

Posted 3 weeks ago

Apply

5.0 - 10.0 years

8 - 18 Lacs

Chennai

Work from Office

Primary: Cloud (AWS - Glue, S3, Lambda, IAM, EC2, RDS, Timestream, etc.) with ETL experience. Secondary: Snowflake knowledge. Location: Chennai.

Posted 3 weeks ago

Apply

5.0 - 10.0 years

9 - 19 Lacs

Pune, Bengaluru, Mumbai (All Areas)

Hybrid

We are hiring Snowflake + ADF Data Engineers; the required details are below.
Required Skill: Snowflake + Azure Data Factory
Experience Range: 5 to 10 years
Job Location: Mumbai, Pune, Bengaluru, Chennai
Work Mode: Hybrid
Interview Date: 23rd Aug 2025
Interview Mode: Virtual
Interested candidates, kindly share your updated resume to gopinath.r@citiustech.com.

Posted 3 weeks ago

Apply

6.0 - 11.0 years

9 - 19 Lacs

Gurugram, Chennai, Bengaluru

Work from Office

Role & Responsibilities:
- Design, build, and maintain scalable data pipelines using DBT and Airflow.
- Develop and optimize SQL queries and data models in Snowflake.
- Implement ETL/ELT workflows, ensuring data quality, performance, and reliability.
- Work with Python for data processing, automation, and integration tasks.
- Handle JSON data structures for data ingestion, transformation, and APIs.
- Leverage AWS services (e.g., S3, Lambda, Glue, Redshift) for cloud-based data solutions.
- Ensure compliance with data security and privacy regulations such as GLBA, PCI-DSS, GDPR, CCPA, and CPRA by implementing proper data encryption, access controls, and data retention policies.
- Collaborate with data analysts, engineers, and business teams to deliver high-quality data products.

Preferred Candidate Profile:
- Strong expertise in SQL, Snowflake, and DBT for data modeling and transformation.
- Proficiency in Python and Airflow for workflow automation.
- Experience working with AWS cloud services.
- Ability to handle JSON data formats and integrate APIs.
- Understanding of data governance, security, and compliance frameworks related to financial and personal data regulations (GLBA, PCI-DSS, GDPR, CCPA, CPRA).
- Strong problem-solving skills and experience in optimizing data pipelines.
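Because this role calls out handling JSON structures in Snowflake, here is a small illustrative sketch of the usual VARIANT + LATERAL FLATTEN pattern. The staging.raw_events table and its payload fields are hypothetical, not taken from the posting.

```sql
-- Hypothetical landing table with a VARIANT payload column.
CREATE OR REPLACE TABLE staging.raw_events (
  payload   VARIANT,
  loaded_at TIMESTAMP_NTZ DEFAULT CURRENT_TIMESTAMP()
);

-- Flatten a nested JSON array of line items into relational columns.
SELECT
    payload:event_id::STRING    AS event_id,
    payload:customer:id::NUMBER AS customer_id,
    item.value:sku::STRING      AS sku,
    item.value:qty::NUMBER      AS quantity
FROM staging.raw_events,
     LATERAL FLATTEN(INPUT => payload:items) item;
```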

Posted 3 weeks ago

Apply

11.0 - 18.0 years

0 Lacs

Chennai, Coimbatore, Bengaluru

Hybrid

Open & Direct Walk-in Drive | Hexaware Technologies - Snowflake & Snowpark Data Engineer/Architect in Chennai, Tamil Nadu, on 23rd Aug (Saturday) 2025 - Snowflake / Snowpark / Snowpipe / SQL & PySpark

Dear Candidate, I hope this email finds you well. We are thrilled to announce an exciting opportunity for talented professionals like yourself to join our team as a Data Engineer/Architect. We are hosting an Open Walk-in Drive in Chennai, Tamil Nadu, on 23rd Aug (Saturday) 2025, and we believe your skills in Snowflake / Snowpark / Snowpipe / SQL & PySpark align perfectly with what we are seeking.

Experience Level: 4 to 18 years

Details of the Walk-in Drive:
Date: 23rd Aug (Saturday) 2025
Experience: 5 to 15 years
Time: 9:30 AM to 4 PM
Point of Contact: Azhagu Kumaran Mohan / +91-9789518386
Venue: Hexaware Technologies, H-5, SIPCOT IT Park, Post, Navalur, Siruseri, Tamil Nadu 603103
Work Location: Open (Bangalore / Pune / Mumbai / Noida / Dehradun / Chennai / Coimbatore)

Key Skills and Experience: As a Data Engineer, we are looking for candidates who possess expertise in Snowflake, Snowpark & Snowpipe, SQL, and PySpark/Spark.

Roles and Responsibilities:
- 4-6 years of total IT experience on any ETL/Snowflake cloud tool
- Minimum 3 years of experience in Snowflake
- Minimum 1 year of experience querying and processing data using Snowpark (Python)
- Strong SQL, with experience using analytical functions, materialized views, and stored procedures
- Experience with data loading features of Snowflake such as Stages, Streams, Tasks, and Snowpipe
- Working knowledge of processing semi-structured data

What to Bring:
1. Updated resume
2. Photo ID, passport-size photo

How to Register: To express your interest and confirm your participation, please reply to this email with your updated resume attached. Walk-ins are also welcome on the day of the event. This is an excellent opportunity to showcase your skills, network with industry professionals, and explore the exciting possibilities that await you at Hexaware Technologies. If you have any questions or require further information, please feel free to reach out at AzhaguK@hexaware.com / +91-9789518386. We look forward to meeting you and exploring the potential of having you as a valuable member of our team.

Posted 3 weeks ago

Apply

5.0 - 10.0 years

0 Lacs

Chennai, Coimbatore, Bengaluru

Hybrid

Open & Direct Walk-in Drive | Hexaware Technologies - Snowflake & Snowpark Data Engineer/Architect in Chennai, Tamil Nadu, on 23rd Aug (Saturday) 2025 - Snowflake / Snowpark / Snowpipe / SQL & PySpark

Dear Candidate, I hope this email finds you well. We are thrilled to announce an exciting opportunity for talented professionals like yourself to join our team as a Data Engineer/Architect. We are hosting an Open Walk-in Drive in Chennai, Tamil Nadu, on 23rd Aug (Saturday) 2025, and we believe your skills in Snowflake / Snowpark / Snowpipe / SQL & PySpark align perfectly with what we are seeking.

Experience Level: 4 to 18 years

Details of the Walk-in Drive:
Date: 23rd Aug (Saturday) 2025
Experience: 5 to 15 years
Time: 9:30 AM to 4 PM
Point of Contact: Azhagu Kumaran Mohan / +91-9789518386
Venue: Hexaware Technologies, H-5, SIPCOT IT Park, Post, Navalur, Siruseri, Tamil Nadu 603103
Work Location: Open (Bangalore / Pune / Mumbai / Noida / Dehradun / Chennai / Coimbatore)

Key Skills and Experience: As a Data Engineer, we are looking for candidates who possess expertise in Snowflake, Snowpark & Snowpipe, SQL, and PySpark/Spark.

Roles and Responsibilities:
- 4-6 years of total IT experience on any ETL/Snowflake cloud tool
- Minimum 3 years of experience in Snowflake
- Minimum 1 year of experience querying and processing data using Snowpark (Python)
- Strong SQL, with experience using analytical functions, materialized views, and stored procedures
- Experience with data loading features of Snowflake such as Stages, Streams, Tasks, and Snowpipe
- Working knowledge of processing semi-structured data

What to Bring:
1. Updated resume
2. Photo ID, passport-size photo

How to Register: To express your interest and confirm your participation, please reply to this email with your updated resume attached. Walk-ins are also welcome on the day of the event. This is an excellent opportunity to showcase your skills, network with industry professionals, and explore the exciting possibilities that await you at Hexaware Technologies. If you have any questions or require further information, please feel free to reach out at AzhaguK@hexaware.com / +91-9789518386. We look forward to meeting you and exploring the potential of having you as a valuable member of our team.

Posted 3 weeks ago

Apply

3.0 - 8.0 years

3 - 7 Lacs

Bengaluru

Work from Office

Educational Requirements: MCA, MSc, Bachelor of Engineering, BBA, BCom, BCS

Service Line: Data & Analytics Unit

Responsibilities:
- Good knowledge of Snowflake architecture: Virtual Warehouses (multi-cluster warehouses, autoscaling); metadata and system objects (query history, grants to users, grants to roles, users); micro-partitions; table clustering and auto-reclustering; materialized views and their benefits; data protection with Time Travel in Snowflake (extremely important); analyzing queries using Query Profile (extremely important - explain plan); cache architecture; named stages; direct loading; Snowpipe; data sharing; Streams; JavaScript procedures and Tasks.
- Strong ability to design and develop workflows in Snowflake in at least one cloud technology (preferably AWS).
- Apply Snowflake programming and ETL experience to write Snowflake SQL and maintain a complex, internally developed reporting system.
- Preferably, knowledge of ETL activities such as data processing from multiple source systems.
- Extensive knowledge of query performance tuning.
- Apply knowledge of BI tools.
- Manage time effectively; accurately estimate effort for tasks and meet agreed-upon deadlines; effectively juggle ad-hoc requests and longer-term projects.

Snowflake performance specialist:
- Familiar with zero-copy cloning and using Time Travel features to clone a table.
- Familiar with reading a Snowflake query profile, understanding what each step does, and identifying performance bottlenecks from the query profile.
- Understanding of when a table needs to be clustered, and choosing the right cluster key as part of table design to help query optimization.
- Working with materialized views and benefit-versus-cost scenarios.
- How Snowflake micro-partitions are maintained and the performance implications with respect to micro-partitions and pruning.
- Horizontal vs. vertical scaling and when to use each; the concept of multi-cluster warehouses and autoscaling.
- Advanced SQL knowledge, including window functions and recursive queries, and the ability to understand and rewrite complex SQL as part of performance optimization.

Additional Responsibilities:
Domain: Data Warehousing, Business Intelligence
Precise Work Location: Bhubaneswar, Bangalore, Hyderabad, Pune

Technical and Professional Requirements:
Mandatory skills: Snowflake
Desired skills: Teradata/Python (not mandatory)
Preferred Skills: Cloud Platform -> Snowflake; Technology -> OpenSystem -> Python
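As an aside for the Time Travel, zero-copy cloning, and clustering topics this listing stresses, here is a short illustrative Snowflake SQL sketch. The table sales.orders, its columns, and the time offsets are hypothetical.

```sql
-- Time Travel: query the table as it existed two hours ago (offset in seconds).
SELECT COUNT(*) FROM sales.orders AT (OFFSET => -7200);

-- Zero-copy clone of yesterday's state for testing, without duplicating storage.
CREATE OR REPLACE TABLE sales.orders_clone
  CLONE sales.orders AT (OFFSET => -86400);

-- Clustering key to improve micro-partition pruning on a large table.
ALTER TABLE sales.orders CLUSTER BY (order_date, region);

-- Check how well the table is clustered on those columns.
SELECT SYSTEM$CLUSTERING_INFORMATION('sales.orders', '(order_date, region)');
```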

Posted 3 weeks ago

Apply

5.0 - 10.0 years

20 - 30 Lacs

Pune

Remote

Role & responsibilities: Snowflake, DBT, AWS, and lead experience.

Posted 3 weeks ago

Apply

9.0 - 12.0 years

5 - 5 Lacs

Thiruvananthapuram

Work from Office

Tech Lead - Azure/Snowflake & AWS Migration

Key Responsibilities:
- Design and develop scalable data pipelines using Snowflake as the primary data platform, integrating with tools like Azure Data Factory, Synapse Analytics, and AWS services.
- Build robust, efficient SQL and Python-based data transformations for cleansing, enrichment, and integration of large-scale datasets.
- Lead migration initiatives from AWS-based data platforms to a Snowflake-centered architecture, including:
  o Rebuilding AWS Glue pipelines in Azure Data Factory or using Snowflake-native ELT approaches.
  o Migrating EMR Spark jobs to Snowflake SQL or Python-based pipelines.
  o Migrating Redshift workloads to Snowflake with schema conversion and performance optimization.
  o Transitioning S3-based data lakes (Hudi, Hive) to Snowflake external tables via ADLS Gen2 or Azure Blob Storage.
  o Redirecting Kinesis/MSK streaming data to Azure Event Hubs, followed by ingestion into Snowflake using Streams & Tasks or Snowpipe.
- Support database migrations from AWS RDS (Aurora PostgreSQL, MySQL, Oracle) to Snowflake, focusing on schema translation, compatibility handling, and data movement at scale.
- Design modern Snowflake lakehouse-style architectures that incorporate raw, staging, and curated zones, with support for time travel, cloning, zero-copy restore, and data sharing.
- Integrate Azure Functions or Logic Apps with Snowflake for orchestration and trigger-based automation.
- Implement security best practices, including Azure Key Vault integration and Snowflake role-based access control, data masking, and network policies.
- Optimize Snowflake performance and costs using clustering, multi-cluster warehouses, materialized views, and result caching.
- Support CI/CD processes for Snowflake pipelines using Git, Azure DevOps or GitHub Actions, and SQL code versioning.
- Maintain well-documented data engineering workflows, architecture diagrams, and technical documentation to support collaboration and long-term platform maintainability.

Required Qualifications:
- 9+ years of data engineering experience, with 3+ years on the Microsoft Azure stack and hands-on Snowflake expertise.
- Proficiency in:
  o Python for scripting and ETL orchestration
  o SQL for complex data transformation and performance tuning in Snowflake
  o Azure Data Factory and Synapse Analytics (SQL Pools)
- Experience in migrating workloads from AWS to Azure/Snowflake, including services such as Glue, EMR, Redshift, Lambda, Kinesis, S3, and MSK.
- Strong understanding of cloud architecture and hybrid data environments across AWS and Azure.
- Hands-on experience with database migration, schema conversion, and tuning in PostgreSQL, MySQL, and Oracle RDS.
- Familiarity with Azure Event Hubs, Logic Apps, and Key Vault.
- Working knowledge of CI/CD, version control (Git), and DevOps principles applied to data engineering workloads.

Preferred Qualifications:
- Extensive experience with Snowflake Streams, Tasks, Snowpipe, external tables, and data sharing.
- Exposure to MSK-to-Event Hubs migration and streaming data integration into Snowflake.
- Familiarity with Terraform or ARM templates for Infrastructure-as-Code (IaC) in Azure environments.
- Certification such as SnowPro Core, Azure Data Engineer Associate, or equivalent.

Required Skills: Azure, AWS Redshift, Athena, Azure Data Lake
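One migration step above mentions exposing lake data to Snowflake via external tables over ADLS Gen2. A hedged sketch of that pattern follows; the storage integration, container URL, path layout, and table names are hypothetical placeholders, and auto-refresh additionally requires a notification integration on Azure.

```sql
CREATE OR REPLACE STAGE landing.azure_lake_stage
  URL = 'azure://examplestorageacct.blob.core.windows.net/datalake/events/'
  STORAGE_INTEGRATION = azure_lake_int
  FILE_FORMAT = (TYPE = 'PARQUET');

-- External table over the staged Parquet files, partitioned by a date parsed
-- from the file path (assumes paths like events/2025-08-01/part-0.parquet).
CREATE OR REPLACE EXTERNAL TABLE landing.events_ext (
  event_date DATE   AS TO_DATE(SPLIT_PART(metadata$filename, '/', 2), 'YYYY-MM-DD'),
  event_id   STRING AS (value:event_id::STRING)
)
PARTITION BY (event_date)
LOCATION = @landing.azure_lake_stage
FILE_FORMAT = (TYPE = 'PARQUET')
AUTO_REFRESH = TRUE;  -- needs an event notification integration on Azure
```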

Posted 4 weeks ago

Apply

8.0 - 12.0 years

0 Lacs

Karnataka

On-site

At PwC, the focus in data and analytics engineering is on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. You play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will concentrate on designing and building data infrastructure and systems to enable efficient data processing and analysis. Your responsibilities include developing and implementing data pipelines, data integration, and data transformation solutions.

As an AWS Architect / Manager at PwC - AC, you will interact with the Offshore Manager/Onsite Business Analyst to understand the requirements and will be responsible for end-to-end implementation of cloud data engineering solutions such as an Enterprise Data Lake and Data Hub in AWS. Strong experience in AWS cloud technology is required, along with planning and organization skills. You will work as a cloud architect/lead on an agile team and provide automated cloud solutions, monitoring the systems routinely to ensure that all business goals are met per the business requirements.

Position Requirements:

Must Have:
- Experience in architecting and delivering highly scalable, distributed, cloud-based enterprise data solutions
- Strong expertise in the end-to-end implementation of cloud data engineering solutions like Enterprise Data Lake and Data Hub in AWS
- Hands-on experience with Snowflake utilities, SnowSQL, Snowpipe, ETL data pipelines, and big data modeling techniques using Python/Java
- Design scalable data architectures with Snowflake, integrating cloud technologies (AWS, Azure, GCP) and ETL/ELT tools such as DBT
- Guide teams in proper data modeling (star, snowflake schemas), transformation, security, and performance optimization
- Experience in loading from disparate data sets and translating complex functional and technical requirements into detailed designs
- Deploying Snowflake features such as data sharing, events, and lake-house patterns
- Experience with data security and data access controls and design
- Understanding of relational as well as NoSQL data stores, methods, and approaches (star and snowflake, dimensional modeling)
- Good knowledge of AWS, Azure, or GCP data storage and management technologies such as S3, Blob/ADLS, and Google Cloud Storage
- Proficient in Lambda and Kappa architectures
- Strong AWS hands-on expertise with a programming background, preferably Python/Scala
- Knowledge of Big Data frameworks and related technologies, with experience in Hadoop and Spark
- Strong experience in AWS compute services like AWS EMR, Glue, and SageMaker, and storage services like S3, Redshift, and DynamoDB
- Experience with AWS streaming services like AWS Kinesis, AWS SQS, and AWS MSK
- Troubleshooting and performance tuning experience in the Spark framework - Spark Core, Spark SQL, and Spark Streaming
- Experience in flow tools like Airflow, NiFi, or Luigi
- Knowledge of application DevOps tools (Git, CI/CD frameworks), with experience in Jenkins or GitLab and rich experience in source code management tools like CodePipeline, CodeBuild, and CodeCommit
- Experience with AWS CloudWatch, AWS CloudTrail, AWS Account Config, and AWS Config Rules
- Understanding of cloud data migration processes, methods, and project lifecycle
- Business/domain knowledge in Financial Services, Healthcare, Consumer Markets, Industrial Products, Telecommunication, Media and Technology, or Deal Advisory, along with technical expertise
- Experience in leading technical teams, guiding and mentoring team members
- Analytical and problem-solving skills
- Communication and presentation skills
- Understanding of data modeling and data architecture

Desired Knowledge/Skills:
- Experience in building stream-processing systems using solutions such as Storm or Spark Streaming
- Experience in Big Data ML toolkits like Mahout, SparkML, or H2O
- Knowledge of Python
- Certification in AWS Architecture desirable
- Worked in offshore/onsite engagements
- Experience in AWS services like Step Functions and Lambda
- Project management skills with consulting experience in complex program delivery

Professional and Educational Background: BE/B.Tech/MCA/M.Sc/M.E/M.Tech/MBA

Minimum Years of Experience Required: Candidates with 8-12 years of hands-on experience

Posted 4 weeks ago

Apply

6.0 - 10.0 years

0 Lacs

Karnataka

On-site

You will be working in a hybrid mode at multiple locations, including Bangalore, Chennai, Gurgaon, Pune, and Kolkata. With at least 6 years of experience in IT, you must possess a Bachelor's and/or Master's degree in Computer Science or an equivalent field. Your expertise should lie in Snowflake security, Snowflake SQL, and the design and implementation of various Snowflake objects. Practical experience with Snowflake utilities such as SnowSQL, Snowpipe, Snowsight, and Snowflake connectors is essential. You should have a deep understanding of Star and Snowflake dimensional modeling and strong knowledge of data management principles. Additionally, familiarity with the Databricks Data & AI platform and Databricks Delta Lake architecture is required. Hands-on experience in SQL and Spark (PySpark), as well as building ETL/data warehouse transformation processes, will be a significant part of your role. Strong verbal and written communication skills are essential, along with analytical and problem-solving abilities. Attention to detail is paramount in your work. The mandatory skills for this position are (Snowflake + ADF + SQL) or (Snowflake + SQL).

Posted 1 month ago

Apply

8.0 - 12.0 years

0 Lacs

Indore, Madhya Pradesh

On-site

You are a highly skilled and experienced ETL Developer with expertise in data ingestion and extraction, sought to join our team. With 8-12 years of experience, you specialize in building and managing scalable ETL pipelines, integrating diverse data sources, and optimizing data workflows specifically for Snowflake. Your role will involve collaborating with cross-functional teams to extract, transform, and load large-scale datasets in a cloud-based data ecosystem, ensuring data quality, consistency, and performance.

Your responsibilities will include designing and implementing processes to extract data from various sources such as on-premise databases, cloud storage (S3, GCS), APIs, and third-party applications. You will ensure seamless data ingestion into Snowflake, utilizing tools like SnowSQL, COPY INTO commands, Snowpipe, and third-party ETL tools (Matillion, Talend, Fivetran). Developing robust solutions for handling data ingestion challenges such as connectivity issues, schema mismatches, and data format inconsistencies will be a key aspect of your role.

Within Snowflake, you will perform complex data transformations using SQL-based ELT methodologies, implement incremental loading strategies, and track data changes using Change Data Capture (CDC) techniques. You will optimize transformation processes for performance and scalability, leveraging Snowflake's native capabilities such as clustering, materialized views, and UDFs.

Designing and maintaining ETL pipelines capable of efficiently processing terabytes of data will be part of your responsibilities. You will optimize ETL jobs for performance, parallelism, and data compression, ensuring error logging, retry mechanisms, and real-time monitoring for robust pipeline operation. Your role will also involve implementing mechanisms for data validation, integrity checks, duplicate handling, and consistency verification.

Collaborating with stakeholders to ensure adherence to data governance standards and compliance requirements will be essential. You will work closely with data engineers, analysts, and business stakeholders to define requirements and deliver high-quality solutions. Documenting data workflows, technical designs, and operational procedures will also be part of your responsibilities.

Your expertise should include 8-12 years of experience in ETL development and data engineering, with significant experience in Snowflake. You should be proficient in tools and technologies such as Snowflake (SnowSQL, COPY INTO, Snowpipe, external tables), ETL tools (Matillion, Talend, Fivetran), cloud storage (S3, GCS, Azure Blob Storage), databases (Oracle, SQL Server, PostgreSQL, MySQL), and APIs (REST, SOAP for data extraction). Strong SQL skills, performance optimization techniques, data transformation expertise, and soft skills such as strong analytical thinking, problem-solving abilities, and excellent communication skills are essential for this role.

Location: Bhilai, Indore
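Since this posting leans on COPY INTO and on handling format mismatches and bad rows during ingestion, here is a hedged sketch of a validation-first load in Snowflake SQL. The stage, file format, and table names are hypothetical.

```sql
CREATE OR REPLACE FILE FORMAT staging.csv_strict
  TYPE = 'CSV'
  SKIP_HEADER = 1
  FIELD_OPTIONALLY_ENCLOSED_BY = '"'
  NULL_IF = ('', 'NULL');

-- Dry run: report parsing errors without loading any data.
COPY INTO staging.customer_raw
FROM @staging.s3_customer_stage
FILE_FORMAT = (FORMAT_NAME = 'staging.csv_strict')
VALIDATION_MODE = RETURN_ERRORS;

-- Real load: skip bad rows but keep going, then inspect what was rejected.
COPY INTO staging.customer_raw
FROM @staging.s3_customer_stage
FILE_FORMAT = (FORMAT_NAME = 'staging.csv_strict')
ON_ERROR = CONTINUE;

SELECT * FROM TABLE(VALIDATE(staging.customer_raw, JOB_ID => '_last'));
```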

Posted 1 month ago

Apply

5.0 - 10.0 years

17 - 27 Lacs

Bengaluru

Work from Office

Job Description: Snowflake Data Engineer
Location: Bengaluru

Snowflake Data Engineer with 5-10 years of experience in data engineering and analytics, including at least 4+ years of hands-on experience designing and developing data pipelines and solutions on the Snowflake Data Cloud platform. Strong proficiency in Python for data processing and automation is essential.

Must Have Skills:
- Strong experience in Snowflake Data Cloud, including data modeling, performance tuning, and advanced features like Time Travel, Snowpipe, and Data Sharing.
- Proficient in Python for data processing, scripting, and utility development.
- Experience in building and optimizing ETL/ELT pipelines using Snowflake and cloud-native tools.
- Strong SQL skills for data transformation, validation, and analytics.
- Working knowledge of AWS services such as S3, Glue, Lambda, and Athena.
- Experience with CI/CD pipelines and version control tools like Git.
- Ability to troubleshoot and optimize data workflows for performance and reliability.

Good to Have Skills:
- SnowPro Core certification or equivalent data engineering certifications.
- Exposure to Apache Spark for distributed data processing.

Domain: Experience in the Telecom domain is preferred, especially with billing systems, CDR processing, and reconciliation workflows.

Role & Responsibilities:
- Design and develop scalable data pipelines and analytics solutions using Snowflake and Python.
- Collaborate with data architects and analysts to understand requirements and translate them into technical solutions.
- Implement data ingestion, transformation, and curation workflows using Snowflake and AWS services.
- Ensure data quality, integrity, and compliance through robust validation and monitoring processes.
- Participate in performance tuning and optimization of Snowflake queries and pipelines.
- Support UAT and production deployments, including troubleshooting and issue resolution.
- Document technical designs, data flows, and operational procedures for internal and client use.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Strong communication skills to interact with technical and business stakeholders.
- Ability to present and defend technical solutions with clarity and confidence.
- Detail-oriented with a passion for building reliable and efficient data systems.

Posted 1 month ago

Apply

3.0 - 8.0 years

0 Lacs

Delhi

On-site

As a Snowflake Solution Architect, you will be responsible for owning and driving the development of Snowflake solutions and products as part of the COE. Your role will involve working with and guiding the team to build solutions using the latest innovations and features launched by Snowflake. Additionally, you will conduct sessions on the latest and upcoming launches of the Snowflake ecosystem and liaise with Snowflake Product and Engineering to stay ahead of new features, innovations, and updates. You will be expected to publish articles and architectures that solve business problems. Furthermore, you will work on accelerators to demonstrate how Snowflake solutions and tools integrate and compare with other platforms such as AWS, Azure Fabric, and Databricks.

In this role, you will lead the post-sales technical strategy and execution for high-priority Snowflake use cases across strategic customer accounts. You will also be responsible for triaging and resolving advanced, long-running customer issues while ensuring timely and clear communication. Developing and maintaining robust internal documentation, knowledge bases, and training materials to scale support efficiency will also be a part of your responsibilities. Additionally, you will support enterprise-scale RFPs focused on Snowflake.

To be successful in this role, you should have at least 8 years of industry experience, including a minimum of 3 years in a Snowflake consulting environment. You should possess experience in implementing and operating Snowflake-centric solutions and proficiency in implementing data security measures, access controls, and design specifically within the Snowflake platform. An understanding of the complete data analytics stack and workflow, from ETL to data platform design to BI and analytics tools, is essential, along with strong skills in databases, data warehouses, and data processing, and extensive hands-on expertise with SQL and SQL analytics. Familiarity with data science concepts and Python is a strong advantage. Knowledge of Snowflake components such as Snowpipe, query parsing and optimization, Snowpark, Snowflake ML, authorization and access control management, metadata management, infrastructure management and auto-scaling, the Snowflake Marketplace for datasets and applications, as well as DevOps and orchestration tools like Airflow, dbt, and Jenkins, is necessary. Snowflake certifications are good to have.

Strong communication and presentation skills are essential in this role, as you will be required to engage with both technical and executive audiences. Moreover, you should be skilled at working collaboratively across engineering, product, and customer success teams.

This position is open in all Xebia office locations, including Pune, Bangalore, Gurugram, Hyderabad, Chennai, Bhopal, and Jaipur. If you meet the above requirements and are excited about this opportunity, please share your details here: https://forms.office.com/e/LNuc2P3RAf

Posted 1 month ago

Apply

5.0 - 10.0 years

6 - 16 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

Responsibilities - a day in the life of an Infoscion:
• As part of the Infosys delivery team, your primary role would be to ensure effective design, development, validation, and support activities, to assure that our clients are satisfied with the high levels of service in the technology domain.
• You will gather the requirements and specifications to understand the client requirements in a detailed manner and translate them into system requirements.
• You will play a key role in the overall estimation of work requirements to provide the right information on project estimations to Technology Leads and Project Managers.
• You would be a key contributor to building efficient programs/systems, and if you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Additional Responsibilities:
• Knowledge of design principles and fundamentals of architecture
• Understanding of performance engineering
• Knowledge of quality processes and estimation techniques
• Basic understanding of the project domain
• Ability to translate functional/non-functional requirements to system requirements
• Ability to design and code complex programs
• Ability to write test cases and scenarios based on the specifications
• Good understanding of SDLC and agile methodologies
• Awareness of the latest technologies and trends
• Logical thinking and problem-solving skills, along with an ability to collaborate

Technical and Professional Requirements:
• Primary skills: Technology -> Data on Cloud - DataStore -> Snowflake
Preferred Skills: Technology -> Data on Cloud - DataStore -> Snowflake

Posted 1 month ago

Apply

3.0 - 5.0 years

3 - 7 Lacs

Gurugram

Work from Office

About your role: An Expert Engineer is a seasoned technology expert who is highly skilled in programming, engineering, and problem-solving. They can deliver value to the business faster and with superlative quality. Their code and designs meet business, technical, non-functional, and operational requirements most of the time without defects and incidents. So, if a relentless focus and drive towards technical and engineering excellence, along with adding value to the business, excites you, this is absolutely a role for you. If technical discussions and whiteboarding with peers excite you, and pair programming and code reviews add fuel to your tank, we are looking for you. Understand system requirements; analyse, design, develop, and test the application systems following the defined standards. The candidate is expected to display professional ethics in his/her approach to work and exhibit a high level of ownership within a demanding working environment.

About you - Essential Skills:
- Excellent software designing, programming, engineering, and problem-solving skills.
- Strong experience working on data ingestion, transformation, and distribution using AWS or Snowflake.
- Exposure to SnowSQL, Snowpipe, role-based access controls, and ETL/ELT tools like NiFi, Matillion, and DBT.
- Hands-on working knowledge of EC2, Lambda, ECS/EKS, DynamoDB, and VPCs.
- Familiar with building data pipelines that leverage the full power and best practices of Snowflake, as well as how to integrate common technologies that work with Snowflake (code CI/CD, monitoring, orchestration, data quality).
- Experience with designing, implementing, and overseeing the integration of data systems and ETL processes through SnapLogic.
- Designing data ingestion and orchestration pipelines using AWS and Control-M.
- Establish strategies for data extraction, ingestion, transformation, automation, and consumption.
- Experience in data lake concepts with structured, semi-structured, and unstructured data.
- Experience in creating a CI/CD process for Snowflake.
- Experience in strategies for data testing, data quality, code quality, and code coverage.
- Ability, willingness, and openness to experiment with, evaluate, and adopt new technologies.
- Passion for technology, problem solving, and teamwork.
- Go-getter, with the ability to navigate across roles, functions, and business units to collaborate and drive agreements and changes from drawing board to live systems.
- Lifelong learner who can bring contemporary practices, technologies, and ways of working to the organization.
- Effective collaborator adept at using all effective modes of communication and collaboration tools.
- Experience delivering on data-related non-functional requirements, such as:
  - Hands-on experience dealing with large volumes of historical data across markets/geographies.
  - Manipulating, processing, and extracting value from large, disconnected datasets.
  - Building water-tight data quality gates on investment management data.
  - Generic handling of standard business scenarios in case of missing data, holidays, out-of-tolerance errors, etc.

Experience and Qualification:
- B.E./B.Tech. or M.C.A. in Computer Science from a reputed university
- Total 7 to 10 years of relevant experience

Personal Characteristics:
- Good interpersonal and communication skills; strong team player.
- Ability to work at a strategic and tactical level.
- Ability to convey strong messages in a polite but firm manner.
- Self-motivation is essential; should demonstrate commitment to high-quality design and development.
- Ability to develop and maintain working relationships with several stakeholders.
- Flexibility and an open attitude to change.
- Problem-solving skills with the ability to think laterally, and to think with a medium-term and long-term perspective.
- Ability to learn and quickly get familiar with a complex business and technology environment.
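For the role-based access controls mentioned in the essential skills above, here is a minimal illustrative grant setup in Snowflake SQL; the role, warehouse, database, schema, and user names are hypothetical.

```sql
CREATE ROLE IF NOT EXISTS analyst_ro;

-- Read-only access to one curated schema, including tables created later.
GRANT USAGE  ON WAREHOUSE analytics_wh                    TO ROLE analyst_ro;
GRANT USAGE  ON DATABASE  analytics                       TO ROLE analyst_ro;
GRANT USAGE  ON SCHEMA    analytics.curated               TO ROLE analyst_ro;
GRANT SELECT ON ALL TABLES    IN SCHEMA analytics.curated TO ROLE analyst_ro;
GRANT SELECT ON FUTURE TABLES IN SCHEMA analytics.curated TO ROLE analyst_ro;

GRANT ROLE analyst_ro TO USER example_analyst;
```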

Posted 1 month ago

Apply

3.0 - 8.0 years

10 - 20 Lacs

Chennai

Work from Office

Key Skills: Snowflake development, DBT (CLI & Cloud), ELT pipeline design, SQL scripting, data modeling, GitHub CI/CD integration, Snowpipe, performance tuning, data governance, troubleshooting, and strong communication skills.

Roles and Responsibilities:
- Design, develop, and maintain scalable data pipelines and ELT workflows using Snowflake SQL and DBT.
- Utilize SnowSQL CLI and Snowpipe for real-time and batch data loading, including the creation of custom functions and stored procedures.
- Implement Snowflake task orchestration and schema modeling, and perform system performance tuning for large-scale data environments.
- Build, deploy, and manage robust data models within Snowflake to support reporting and analytical solutions.
- Leverage DBT (CLI and Cloud) to script and manage complex ELT logic, applying best practices for version control using GitHub.
- Independently design and execute innovative ETL and reporting solutions that align with business and operational goals.
- Conduct issue triaging, pipeline debugging, and optimization to address data quality and processing gaps.
- Ensure technical designs adhere to data governance policies, security standards, and non-functional requirements (e.g., reliability, scalability, performance).
- Provide expert guidance on Snowflake features, optimization, security best practices, and cross-environment data movement strategies.
- Create and maintain comprehensive documentation for database objects, ETL processes, and data workflows.
- Collaborate with DevOps teams to implement CI/CD pipelines involving GitHub, DBT, and Snowflake integrations.
- Troubleshoot post-deployment production issues and deliver timely resolutions.

Experience Requirements:
- 5-8 years of experience in data engineering, with a strong focus on Snowflake and modern data architecture.
- Hands-on experience with Snowflake's architecture, including SnowSQL, Snowpipe, stored procedures, schema design, and workload optimization.
- Extensive experience with DBT (CLI and Cloud), including scripting, transformation logic, and integration with GitHub for version control.
- Successfully built and deployed large-scale ELT pipelines using Snowflake and DBT, optimizing for performance and data quality.
- Proven track record in troubleshooting complex production data issues and resolving them with minimal downtime.
- Experience aligning data engineering practices with data governance and compliance standards.
- Familiarity with CI/CD pipelines in a cloud data environment, including deploying updates to production using GitHub Actions and DBT integrations.
- Strong ability to communicate technical details clearly across teams and stakeholders.

Education: Any Post Graduation, Any Graduation.
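As context for the DBT-on-Snowflake ELT work this listing describes, here is a hedged sketch of a dbt incremental model (SQL plus Jinja). The model path, source name, columns, and clustering choice are hypothetical illustrations, not details from the posting.

```sql
-- models/marts/fct_orders.sql
{{
  config(
    materialized = 'incremental',
    unique_key   = 'order_id',
    cluster_by   = ['order_date']
  )
}}

select
    order_id,
    customer_id,
    order_date,
    amount
from {{ source('raw', 'orders') }}

{% if is_incremental() %}
  -- on incremental runs, only pick up rows newer than what is already loaded
  where order_date > (select max(order_date) from {{ this }})
{% endif %}
```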

Posted 1 month ago

Apply

6.0 - 10.0 years

5 - 8 Lacs

Greater Noida

Work from Office

Job Description:
• Experience implementing Snowflake utilities, SnowSQL, Snowpipe, and big data modeling techniques using Python
• Expertise in deploying Snowflake features such as data sharing, events, and lake-house patterns
• Proficiency in RDBMS, complex SQL, PL/SQL, performance tuning, and troubleshooting
• Provide resolution to an extensive range of complicated data pipeline related problems
• Experience in data migration from RDBMS to the Snowflake cloud data warehouse
• Experience with data security and data access controls and design
• Build processes supporting data transformation, data structures, metadata, dependency, and workload management
• Experience in Snowflake modelling - roles, schemas, databases
• Extensive hands-on expertise with creation of stored procedures and advanced SQL
• Collaborate with data engineers, analysts, and stakeholders to understand data requirements and translate them into DBT models
• Develop and enforce best practices for version control, testing, and documentation of DBT models
• Build and manage data quality checks and validation processes within the DBT pipelines
• Ability to optimize SQL queries for performance and efficiency
• Good to have experience in Azure services such as ADF, Databricks, and data pipeline building
• Excellent analytical and problem-solving skills
• Working experience in an Agile methodology
• Knowledge of DevOps processes (including CI/CD) and Power BI
• Excellent communication skills
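The data sharing feature mentioned above can be illustrated with a short, hedged Snowflake SQL sketch; the share, database, schema, table, and consumer account locator are hypothetical placeholders.

```sql
CREATE SHARE IF NOT EXISTS sales_share;

-- Expose one curated table through the share.
GRANT USAGE  ON DATABASE analytics                TO SHARE sales_share;
GRANT USAGE  ON SCHEMA   analytics.curated        TO SHARE sales_share;
GRANT SELECT ON TABLE    analytics.curated.orders TO SHARE sales_share;

-- Add the consumer account (placeholder account locator).
ALTER SHARE sales_share ADD ACCOUNTS = xy12345;
```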

Posted 1 month ago

Apply

5.0 - 10.0 years

17 - 20 Lacs

Hyderabad, Bengaluru

Work from Office

Job Title: Snowflake Developer
Location: Hyderabad or Bangalore (hybrid working)
Experience: 5+ years

Responsibilities:
- Use data mappings and models provided by the data modeling team to build robust pipelines in Snowflake.
- Design and implement data pipelines with proper 2NF/3NF normalization standards.
- Develop and maintain ETL processes for integrating data from multiple ERP and source systems.
- Create scalable and secure data architecture in Snowflake that supports DQ needs.
- Raise CAB requests through Carrier's change process and manage deployment to production.
- Provide UAT support and transition finalized pipelines to support teams.
- Document all technical artifacts for traceability and handover.
- Collaborate with data modelers, business stakeholders, and governance teams for seamless DQ integration.
- Optimize queries, manage performance tuning, and ensure best practices in data operations.

Requirements:
- Strong hands-on experience with Snowflake.
- Expert-level SQL and experience with data transformation.
- Familiarity with data architecture and normalization techniques (2NF/3NF).
- Experience with cloud-based data platforms and pipeline design.
- Prior experience with AWS data services (e.g., S3, Glue, Lambda, Step Functions) is a strong advantage.
- Experience with ETL tools and working in agile delivery environments.
- Understanding of the Carrier CAB process or similar structured deployment workflows.
- Ability to debug complex issues and optimize pipelines for scalability.
- Strong communication and collaboration skills.

Posted 1 month ago

Apply

6.0 - 10.0 years

15 - 25 Lacs

Hyderabad, Chennai

Hybrid

Key Responsibilities:
- Design, develop, and maintain scalable data pipelines and ETL/ELT processes using Snowflake.
- Write complex SQL queries for data extraction, transformation, and analysis.
- Optimize performance of Snowflake data models and queries.
- Implement data warehouse solutions and integrate data from multiple sources.
- Create and manage Snowflake objects such as tables, views, schemas, stages, and file formats.
- Monitor and manage Snowflake compute resources and storage usage.
- Collaborate with data analysts, engineers, and business teams to understand data requirements.
- Ensure data quality, integrity, and security across all layers.
- Participate in code reviews and follow Snowflake best practices.

Required Skills:
- 7+ years of experience as a Data Engineer or Snowflake Developer.
- Strong hands-on experience with the Snowflake cloud data platform.
- Hands-on experience with Matillion ETL.
- Expert-level knowledge of SQL (joins, subqueries, CTEs, window functions, performance tuning).
- Proficient in data modeling and warehousing concepts (star/snowflake schema, normalization, etc.).
- Experience with ETL tools (e.g., Informatica, Talend, Matillion, or custom scripts).
- Experience with cloud platforms like AWS, Azure, or GCP.
- Familiarity with version control tools (e.g., Git).
- Good understanding of data governance and data security best practices.
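For the ELT workflows described above, a common Snowflake step is a MERGE-based upsert from a staging table into a target model. The sketch below is illustrative only; the staging and dimension tables, columns, and the op = 'D' delete flag are hypothetical.

```sql
MERGE INTO analytics.dim_customer AS tgt
USING staging.customer_updates    AS src
  ON tgt.customer_id = src.customer_id
WHEN MATCHED AND src.op = 'D' THEN DELETE          -- source marked the row deleted
WHEN MATCHED THEN UPDATE SET
    tgt.full_name  = src.full_name,
    tgt.email      = src.email,
    tgt.updated_at = src.changed_at
WHEN NOT MATCHED THEN INSERT (customer_id, full_name, email, updated_at)
    VALUES (src.customer_id, src.full_name, src.email, src.changed_at);
```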

Posted 1 month ago

Apply

5.0 - 10.0 years

20 - 35 Lacs

Hyderabad

Hybrid

Job Title: Snowflake Developer
Experience: 5+ years
Location: Hyderabad (Hybrid)
Job Type: Full-time

About Us: We're seeking an experienced Snowflake Developer to join our team in Pune and Hyderabad. As a Snowflake Developer, you will be responsible for designing, developing, and implementing data warehousing solutions using Snowflake. You will work closely with cross-functional teams to ensure seamless data integration and analytics.

Key Responsibilities:
- Design, develop, and deploy Snowflake-based data warehousing solutions
- Collaborate with stakeholders to understand data requirements and develop data models
- Optimize Snowflake performance, scalability, and security
- Develop and maintain Snowflake SQL scripts, stored procedures, and user-defined functions
- Troubleshoot data integration and analytics issues
- Ensure data quality, integrity, and compliance with organizational standards
- Work with data engineers, analysts, and scientists to ensure seamless data integration and analytics
- Stay up-to-date with Snowflake features and best practices

Requirements:
- 5+ years of experience in Snowflake development and administration
- Strong expertise in Snowflake architecture, data modeling, and SQL
- Experience with data integration tools (e.g., Informatica, Talend, Informatica PowerCenter)
- Proficiency in Snowflake security features and access control
- Strong analytical and problem-solving skills
- Excellent communication and collaboration skills
- Experience working in hybrid or remote teams
- Bachelor's degree in Computer Science, Engineering, or a related field

Nice to Have:
- Experience with cloud platforms (AWS, Azure, GCP)
- Knowledge of data governance and data quality frameworks
- Experience with ETL/ELT tools (e.g., Informatica PowerCenter, Talend, Microsoft SSIS)
- Familiarity with data visualization tools (e.g., Tableau, Power BI)
- Experience working with agile methodologies

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
