
91 Snowpipe Jobs

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

8.0 - 12.0 years

0 Lacs

indore, madhya pradesh

On-site

We are seeking a highly skilled ETL Developer with 8-12 years of experience and deep expertise in data ingestion and extraction. You specialize in building and managing scalable ETL pipelines, integrating diverse data sources, and optimizing data workflows for Snowflake. You will collaborate with cross-functional teams to extract, transform, and load large-scale datasets in a cloud-based data ecosystem, ensuring data quality, consistency, and performance.

Your responsibilities include designing and implementing processes to extract data from sources such as on-premise databases, cloud storage (S3, GCS), APIs, and third-party applications, and ensuring seamless ingestion into Snowflake using tools like SnowSQL, COPY INTO commands, Snowpipe, and third-party ETL tools (Matillion, Talend, Fivetran). You will develop robust solutions for ingestion challenges such as connectivity issues, schema mismatches, and data format inconsistencies. Within Snowflake, you will perform complex transformations using SQL-based ELT methodologies, implement incremental loading strategies, and track data changes using Change Data Capture (CDC) techniques, optimizing transformation processes for performance and scalability with Snowflake's native capabilities such as clustering, materialized views, and UDFs.

You will design and maintain ETL pipelines capable of efficiently processing terabytes of data, optimizing jobs for performance, parallelism, and data compression, with error logging, retry mechanisms, and real-time monitoring for robust operation. You will implement mechanisms for data validation, integrity checks, duplicate handling, and consistency verification, and collaborate with stakeholders to ensure adherence to data governance standards and compliance requirements. Working closely with data engineers, analysts, and business stakeholders, you will define requirements, deliver high-quality solutions, and document data workflows, technical designs, and operational procedures.

Your expertise should include 8-12 years in ETL development and data engineering, with significant Snowflake experience. You should be proficient in Snowflake (SnowSQL, COPY INTO, Snowpipe, external tables), ETL tools (Matillion, Talend, Fivetran), cloud storage (S3, GCS, Azure Blob Storage), databases (Oracle, SQL Server, PostgreSQL, MySQL), and APIs (REST, SOAP) for data extraction. Strong SQL skills, performance optimization techniques, data transformation expertise, and soft skills such as analytical thinking, problem-solving, and excellent communication are essential. Location: Bhilai, Indore.
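
For context on the loading tools this posting names, here is a minimal Snowflake SQL sketch of bulk loading with COPY INTO plus continuous ingestion with Snowpipe; the stage, table, and bucket names are hypothetical, and credentials/storage integration are omitted:

```sql
-- Hypothetical external stage over a landing bucket (a STORAGE INTEGRATION
-- or credentials would normally be attached here).
CREATE OR REPLACE STAGE raw_stage
  URL = 's3://example-bucket/landing/'
  FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);

-- One-off bulk load of matching files from the stage.
COPY INTO analytics.raw_orders
  FROM @raw_stage
  PATTERN = '.*orders.*[.]csv'
  ON_ERROR = 'CONTINUE';   -- log bad rows, keep loading

-- Continuous ingestion: Snowpipe runs the same COPY whenever the cloud
-- provider notifies Snowflake of new files (requires S3 event notifications).
CREATE OR REPLACE PIPE analytics.orders_pipe
  AUTO_INGEST = TRUE
AS
  COPY INTO analytics.raw_orders FROM @raw_stage;
```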

Posted 2 days ago

Apply

5.0 - 10.0 years

17 - 27 Lacs

Bengaluru

Work from Office

Job Description: Snowflake Data Engineer
Location: Bengaluru

Snowflake Data Engineer with 5-10 years of experience in data engineering and analytics, including at least 4 years of hands-on experience designing and developing data pipelines and solutions on the Snowflake Data Cloud platform. Strong proficiency in Python for data processing and automation is essential.

Must-Have Skills: Strong experience with Snowflake Data Cloud, including data modeling, performance tuning, and advanced features like Time Travel, Snowpipe, and Data Sharing. Proficiency in Python for data processing, scripting, and utility development. Experience building and optimizing ETL/ELT pipelines using Snowflake and cloud-native tools. Strong SQL skills for data transformation, validation, and analytics. Working knowledge of AWS services such as S3, Glue, Lambda, and Athena. Experience with CI/CD pipelines and version control tools like Git. Ability to troubleshoot and optimize data workflows for performance and reliability.

Good-to-Have Skills: SnowPro Core certification or equivalent data engineering certifications. Exposure to Apache Spark for distributed data processing.

Domain: Experience in the Telecom domain is preferred, especially with billing systems, CDR processing, and reconciliation workflows.

Role & Responsibilities: Design and develop scalable data pipelines and analytics solutions using Snowflake and Python. Collaborate with data architects and analysts to understand requirements and translate them into technical solutions. Implement data ingestion, transformation, and curation workflows using Snowflake and AWS services. Ensure data quality, integrity, and compliance through robust validation and monitoring processes. Participate in performance tuning and optimization of Snowflake queries and pipelines. Support UAT and production deployments, including troubleshooting and issue resolution. Document technical designs, data flows, and operational procedures for internal and client use.

Qualifications: Bachelor's degree in Computer Science, Information Technology, or a related field. Strong communication skills to interact with technical and business stakeholders. Ability to present and defend technical solutions with clarity and confidence. Detail-oriented with a passion for building reliable and efficient data systems.
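
As a quick illustration of the Time Travel feature listed in the must-have skills, a small sketch; the table names (telecom CDR events, echoing the domain note above) and the statement-ID placeholder are hypothetical:

```sql
-- Query the table as it was 30 minutes ago (Time Travel, OFFSET in seconds).
SELECT COUNT(*) FROM billing.cdr_events AT(OFFSET => -60 * 30);

-- Zero-copy clone of the table as it was just before a bad statement ran;
-- '<query_id>' is a placeholder for the offending statement's ID.
CREATE TABLE billing.cdr_events_restored
  CLONE billing.cdr_events
  BEFORE(STATEMENT => '<query_id>');
```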

Posted 3 days ago

Apply

3.0 - 8.0 years

0 Lacs

delhi

On-site

As a Snowflake Solution Architect, you will own and drive the development of Snowflake solutions and products as part of the COE. You will work with and guide the team to build solutions using the latest innovations and features launched by Snowflake, conduct sessions on the latest and upcoming launches of the Snowflake ecosystem, and liaise with Snowflake Product and Engineering to stay ahead of new features, innovations, and updates. You will publish articles and reference architectures that solve real business problems, and build accelerators demonstrating how Snowflake solutions and tools integrate and compare with other platforms such as AWS, Azure Fabric, and Databricks.

In this role, you will lead the post-sales technical strategy and execution for high-priority Snowflake use cases across strategic customer accounts. You will triage and resolve advanced, long-running customer issues while ensuring timely and clear communication, develop and maintain robust internal documentation, knowledge bases, and training materials to scale support efficiency, and support enterprise-scale RFPs focused on Snowflake.

To succeed in this role, you should have at least 8 years of industry experience, including a minimum of 3 years in a Snowflake consulting environment. You should have experience implementing and operating Snowflake-centric solutions, and proficiency in implementing data security measures, access controls, and design within the Snowflake platform. An understanding of the complete data analytics stack and workflow, from ETL to data platform design to BI and analytics tools, is essential. Strong skills in databases, data warehouses, and data processing are required, along with extensive hands-on expertise in SQL and SQL analytics; familiarity with data science concepts and Python is a strong advantage. You should know Snowflake components such as Snowpipe, query parsing and optimization, Snowpark, Snowflake ML, authorization and access control management, metadata management, infrastructure management and auto-scaling, and the Snowflake Marketplace for datasets and applications, as well as DevOps and orchestration tools like Airflow, dbt, and Jenkins. Snowflake certifications are good to have. Strong communication and presentation skills are essential, as you will engage with both technical and executive audiences, and you should work collaboratively across engineering, product, and customer success teams.

This position is open in all Xebia office locations, including Pune, Bangalore, Gurugram, Hyderabad, Chennai, Bhopal, and Jaipur. If you meet the above requirements and are excited about this opportunity, please share your details here: [Apply Now](https://forms.office.com/e/LNuc2P3RAf)
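
Since the role emphasizes access control design within Snowflake, here is a minimal role-based access sketch; the role, schema, and user names are hypothetical:

```sql
-- Read-only analyst role scoped to one reporting schema.
CREATE ROLE IF NOT EXISTS analyst_ro;
GRANT USAGE ON DATABASE analytics TO ROLE analyst_ro;
GRANT USAGE ON SCHEMA analytics.reporting TO ROLE analyst_ro;
GRANT SELECT ON ALL TABLES IN SCHEMA analytics.reporting TO ROLE analyst_ro;
-- Future grants keep newly created tables covered without re-granting.
GRANT SELECT ON FUTURE TABLES IN SCHEMA analytics.reporting TO ROLE analyst_ro;
GRANT ROLE analyst_ro TO USER example_analyst;
```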

Posted 5 days ago

Apply

5.0 - 10.0 years

6 - 16 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

Responsibilities: A day in the life of an Infoscion
• As part of the Infosys delivery team, your primary role would be to ensure effective Design, Development, Validation and Support activities, to assure that our clients are satisfied with the high levels of service in the technology domain.
• You will gather the requirements and specifications to understand the client requirements in a detailed manner and translate the same into system requirements.
• You will play a key role in the overall estimation of work requirements to provide the right information on project estimations to Technology Leads and Project Managers.
• You would be a key contributor to building efficient programs/systems. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Additional Responsibilities:
• Knowledge of design principles and fundamentals of architecture
• Understanding of performance engineering
• Knowledge of quality processes and estimation techniques
• Basic understanding of project domain
• Ability to translate functional/nonfunctional requirements to systems requirements
• Ability to design and code complex programs
• Ability to write test cases and scenarios based on the specifications
• Good understanding of SDLC and agile methodologies
• Awareness of latest technologies and trends
• Logical thinking and problem-solving skills along with an ability to collaborate

Technical and Professional Requirements:
• Primary skills: Technology->Data on Cloud-DataStore->Snowflake

Preferred Skills: Technology->Data on Cloud-DataStore->Snowflake

Posted 1 week ago

Apply

3.0 - 5.0 years

3 - 7 Lacs

Gurugram

Work from Office

About your role
The Expert Engineer is a seasoned technology expert, highly skilled in programming, engineering, and problem solving, who can deliver value to the business faster and with superlative quality. Their code and designs meet business, technical, non-functional, and operational requirements most of the time without defects or incidents. If a relentless focus on technical and engineering excellence, together with adding value to the business, excites you, this is absolutely the role for you. If technical discussions and whiteboarding with peers excite you, and pair programming and code reviews add fuel to your tank, we are looking for you. You will understand system requirements, then analyse, design, develop, and test application systems following the defined standards. You are expected to display professional ethics in your approach to work and to exhibit a high level of ownership within a demanding working environment.

About you
Essential Skills:
• Excellent software designing, programming, engineering, and problem-solving skills.
• Strong experience working on data ingestion, transformation, and distribution using AWS or Snowflake.
• Exposure to SnowSQL, Snowpipe, role-based access controls, and ETL/ELT tools like NiFi, Matillion, and DBT.
• Hands-on working knowledge of EC2, Lambda, ECS/EKS, DynamoDB, and VPCs.
• Familiarity with building data pipelines that leverage the full power and best practices of Snowflake, and with integrating common technologies that work with Snowflake (code CI/CD, monitoring, orchestration, data quality).
• Experience designing, implementing, and overseeing the integration of data systems and ETL processes through SnapLogic.
• Designing data ingestion and orchestration pipelines using AWS and Control-M.
• Establishing strategies for data extraction, ingestion, transformation, automation, and consumption.
• Experience with data lake concepts covering structured, semi-structured, and unstructured data.
• Experience creating CI/CD processes for Snowflake.
• Experience with strategies for data testing, data quality, code quality, and code coverage.
• Ability, willingness, and openness to experiment with, evaluate, and adopt new technologies.
• Passion for technology, problem solving, and teamwork.
• Go-getter; able to navigate across roles, functions, and business units to collaborate and drive agreements and changes from drawing board to live systems.
• Lifelong learner who can bring contemporary practices, technologies, and ways of working to the organization.
• Effective collaborator, adept at using all effective modes of communication and collaboration tools.
• Experience delivering on data-related non-functional requirements, such as: hands-on experience dealing with large volumes of historical data across markets/geographies; manipulating, processing, and extracting value from large, disconnected datasets; building water-tight data quality gates on investment management data (a sketch follows this listing); and generic handling of standard business scenarios such as missing data, holidays, and out-of-tolerance errors.

Experience and Qualification:
• B.E./B.Tech. or M.C.A. in Computer Science from a reputed university
• Total 7 to 10 years of relevant experience

Personal Characteristics:
• Good interpersonal and communication skills; strong team player.
• Ability to work at a strategic and tactical level.
• Ability to convey strong messages in a polite but firm manner.
• Self-motivated, with a demonstrated commitment to high-quality design and development.
• Ability to develop and maintain working relationships with several stakeholders.
• Flexibility and an open attitude to change.
• Problem-solving skills with the ability to think laterally, and with a medium- and long-term perspective.
• Ability to learn and quickly get familiar with a complex business and technology environment.
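
One possible reading of the "data quality gates" requirement above, sketched as plain Snowflake SQL with hypothetical staging tables; a surrounding job (CI step, task, or dbt test) would fail the pipeline whenever this query returns rows:

```sql
-- Each branch returns a row only when its check fails.
SELECT 'null_position_id' AS failed_check, COUNT(*) AS failed_rows
FROM stg.positions
WHERE position_id IS NULL
HAVING COUNT(*) > 0
UNION ALL
-- Out-of-tolerance example: prices older than the allowed staleness window.
SELECT 'stale_price_date', COUNT(*)
FROM stg.prices
WHERE price_date < DATEADD(day, -5, CURRENT_DATE())
HAVING COUNT(*) > 0;
```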

Posted 1 week ago

Apply

3.0 - 8.0 years

10 - 20 Lacs

Chennai

Work from Office

Key Skills: Snowflake development, DBT (CLI & Cloud), ELT pipeline design, SQL scripting, data modeling, GitHub CI/CD integration, Snowpipe, performance tuning, data governance, troubleshooting, and strong communication skills.

Roles and Responsibilities: Design, develop, and maintain scalable data pipelines and ELT workflows using Snowflake SQL and DBT. Utilize SnowSQL CLI and Snowpipe for real-time and batch data loading, including the creation of custom functions and stored procedures. Implement Snowflake task orchestration and schema modeling, and perform system performance tuning for large-scale data environments. Build, deploy, and manage robust data models within Snowflake to support reporting and analytical solutions. Leverage DBT (CLI and Cloud) to script and manage complex ELT logic, applying best practices for version control using GitHub. Independently design and execute innovative ETL and reporting solutions that align with business and operational goals. Conduct issue triaging, pipeline debugging, and optimization to address data quality and processing gaps. Ensure technical designs adhere to data governance policies, security standards, and non-functional requirements (e.g., reliability, scalability, performance). Provide expert guidance on Snowflake features, optimization, security best practices, and cross-environment data movement strategies. Create and maintain comprehensive documentation for database objects, ETL processes, and data workflows. Collaborate with DevOps teams to implement CI/CD pipelines involving GitHub, DBT, and Snowflake integrations. Troubleshoot post-deployment production issues and deliver timely resolutions.

Experience Requirements: 5-8 years of experience in data engineering, with a strong focus on Snowflake and modern data architecture. Hands-on experience with Snowflake's architecture, including SnowSQL, Snowpipe, stored procedures, schema design, and workload optimization. Extensive experience with DBT (CLI and Cloud), including scripting, transformation logic, and integration with GitHub for version control. Successfully built and deployed large-scale ELT pipelines using Snowflake and DBT, optimizing for performance and data quality. Proven track record in troubleshooting complex production data issues and resolving them with minimal downtime. Experience aligning data engineering practices with data governance and compliance standards. Familiarity with CI/CD pipelines in a cloud data environment, including deploying updates to production using GitHub Actions and DBT integrations. Strong ability to communicate technical details clearly across teams and stakeholders.

Education: Any Post Graduation, Any Graduation.
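
For the task-orchestration side of this role, a minimal incremental-load sketch using a stream and a scheduled task; all object names are hypothetical:

```sql
-- Capture row-level changes on the staging table.
CREATE OR REPLACE STREAM stg_orders_stream ON TABLE stg.orders;

-- A task that merges captured changes every 15 minutes, but only
-- when the stream actually has data.
CREATE OR REPLACE TASK merge_orders_task
  WAREHOUSE = transform_wh
  SCHEDULE = '15 MINUTE'
WHEN SYSTEM$STREAM_HAS_DATA('STG_ORDERS_STREAM')
AS
  MERGE INTO dm.orders AS t
  USING stg_orders_stream AS s
    ON t.order_id = s.order_id
  WHEN MATCHED THEN UPDATE SET t.status = s.status, t.updated_at = s.updated_at
  WHEN NOT MATCHED THEN
    INSERT (order_id, status, updated_at)
    VALUES (s.order_id, s.status, s.updated_at);

-- Tasks are created suspended; resume to start the schedule.
ALTER TASK merge_orders_task RESUME;
```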

Posted 1 week ago

Apply

6.0 - 10.0 years

5 - 8 Lacs

Greater Noida

Work from Office

Job Description:
• Experience implementing Snowflake utilities, SnowSQL, Snowpipe, and big data modeling techniques using Python
• Expertise in deploying Snowflake features such as data sharing, events, and lake-house patterns
• Proficiency in RDBMS, complex SQL, PL/SQL, performance tuning, and troubleshooting
• Provide resolution to an extensive range of complicated data-pipeline-related problems
• Experience in data migration from RDBMS to the Snowflake cloud data warehouse
• Experience with data security, data access controls, and design
• Build processes supporting data transformation, data structures, metadata, dependency, and workload management
• Experience in Snowflake modelling: roles, schemas, databases
• Extensive hands-on expertise with creation of stored procedures and advanced SQL
• Collaborate with data engineers, analysts, and stakeholders to understand data requirements and translate them into DBT models
• Develop and enforce best practices for version control, testing, and documentation of DBT models
• Build and manage data quality checks and validation processes within the DBT pipelines
• Ability to optimize SQL queries for performance and efficiency
• Good to have: experience in Azure services such as ADF and Databricks, and data pipeline building
• Excellent analytical and problem-solving skills
• Working experience in an Agile methodology
• Knowledge of DevOps processes (including CI/CD) and Power BI
• Excellent communication skills
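
As an example of the stored-procedure work mentioned above, a small Snowflake Scripting procedure; the table and procedure names are illustrative, not from the posting:

```sql
-- Truncate-and-reload pattern for a small dimension table.
CREATE OR REPLACE PROCEDURE reload_dim_customer()
RETURNS STRING
LANGUAGE SQL
AS
$$
BEGIN
  TRUNCATE TABLE dw.dim_customer;
  INSERT INTO dw.dim_customer (customer_id, name, region)
    SELECT customer_id, name, region FROM stg.customer;
  RETURN 'dim_customer reloaded';
END;
$$;

CALL reload_dim_customer();
```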

Posted 1 week ago

Apply

5.0 - 10.0 years

17 - 20 Lacs

Hyderabad, Bengaluru

Work from Office

Job Title: Snowflake Developer
Location: Hyderabad or Bangalore (Hybrid Working)
Experience: 5+ Years

Responsibilities: Use data mappings and models provided by the data modeling team to build robust pipelines in Snowflake. Design and implement data pipelines with proper 2NF/3NF normalization standards. Develop and maintain ETL processes for integrating data from multiple ERP and source systems. Create scalable and secure data architecture in Snowflake that supports DQ needs. Raise CAB requests through Carrier's change process and manage deployment to production. Provide UAT support and transition finalized pipelines to support teams. Document all technical artifacts for traceability and handover. Collaborate with data modelers, business stakeholders, and governance teams for seamless DQ integration. Optimize queries, manage performance tuning, and ensure best practices in data operations.

Requirements: Strong hands-on experience with Snowflake. Expert-level SQL and experience with data transformation. Familiarity with data architecture and normalization techniques (2NF/3NF). Experience with cloud-based data platforms and pipeline design. Prior experience with AWS data services (e.g., S3, Glue, Lambda, Step Functions) is a strong advantage. Experience with ETL tools and working in agile delivery environments. Understanding of the Carrier CAB process or similar structured deployment workflows. Ability to debug complex issues and optimize pipelines for scalability. Strong communication and collaboration skills.

Posted 1 week ago

Apply

6.0 - 10.0 years

15 - 25 Lacs

Hyderabad, Chennai

Hybrid

Key Responsibilities: Design, develop, and maintain scalable data pipelines and ETL/ELT processes using Snowflake. Write complex SQL queries for data extraction, transformation, and analysis. Optimize performance of Snowflake data models and queries. Implement data warehouse solutions and integrate data from multiple sources. Create and manage Snowflake objects such as tables, views, schemas, stages, and file formats. Monitor and manage Snowflake compute resources and storage usage. Collaborate with data analysts, engineers, and business teams to understand data requirements. Ensure data quality, integrity, and security across all layers. Participate in code reviews and follow Snowflake best practices.

Required Skills: 7+ years of experience as a Data Engineer or Snowflake Developer. Strong hands-on experience with the Snowflake cloud data platform. Hands-on experience with Matillion ETL. Expert-level knowledge of SQL (joins, subqueries, CTEs, window functions, performance tuning). Proficient in data modeling and warehousing concepts (star/snowflake schema, normalization, etc.). Experience with ETL tools (e.g., Informatica, Talend, Matillion, or custom scripts). Experience with cloud platforms like AWS, Azure, or GCP. Familiarity with version control tools (e.g., Git). Good understanding of data governance and data security best practices.
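
To make the object types above concrete, a short sketch creating a file format and stage, then inspecting staged data before loading; all names are hypothetical:

```sql
-- Reusable JSON file format and an internal stage that uses it.
CREATE OR REPLACE FILE FORMAT ff_json TYPE = JSON STRIP_OUTER_ARRAY = TRUE;
CREATE OR REPLACE STAGE int_stage FILE_FORMAT = ff_json;

-- List staged files, then peek at fields before running COPY INTO.
LIST @int_stage;
SELECT $1:order_id::NUMBER AS order_id,
       $1:amount::FLOAT    AS amount
FROM @int_stage (FILE_FORMAT => 'ff_json');
```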

Posted 1 week ago

Apply

5.0 - 10.0 years

20 - 35 Lacs

Hyderabad

Hybrid

Job Title: Snowflake Developer
Experience: 5+ years
Location: Hyderabad (Hybrid)
Job Type: Full-time

About Us: We're seeking an experienced Snowflake Developer to join our team in Pune and Hyderabad. As a Snowflake Developer, you will be responsible for designing, developing, and implementing data warehousing solutions using Snowflake. You will work closely with cross-functional teams to ensure seamless data integration and analytics.

Key Responsibilities: Design, develop, and deploy Snowflake-based data warehousing solutions. Collaborate with stakeholders to understand data requirements and develop data models. Optimize Snowflake performance, scalability, and security. Develop and maintain Snowflake SQL scripts, stored procedures, and user-defined functions. Troubleshoot data integration and analytics issues. Ensure data quality, integrity, and compliance with organizational standards. Work with data engineers, analysts, and scientists to ensure seamless data integration and analytics. Stay up-to-date with Snowflake features and best practices.

Requirements: 5+ years of experience in Snowflake development and administration. Strong expertise in Snowflake architecture, data modeling, and SQL. Experience with data integration tools (e.g., Informatica PowerCenter, Talend). Proficiency in Snowflake security features and access control. Strong analytical and problem-solving skills. Excellent communication and collaboration skills. Experience working in hybrid or remote teams. Bachelor's degree in Computer Science, Engineering, or a related field.

Nice to Have: Experience with cloud platforms (AWS, Azure, GCP). Knowledge of data governance and data quality frameworks. Experience with ETL/ELT tools (e.g., Informatica PowerCenter, Talend, Microsoft SSIS). Familiarity with data visualization tools (e.g., Tableau, Power BI). Experience working with agile methodologies.

Posted 1 week ago

Apply

5.0 - 10.0 years

10 - 20 Lacs

Bengaluru

Remote

Snowflake Developer | 5+ years | Bangalore/Hyderabad (Hybrid) | 1-year contract (extendable). Required candidate profile: AWS data services (e.g., S3, Glue, Lambda; AWS Glue is a must), Snowpipe, at least 2-3 core projects in Snowflake development (SQL coding must be strong), and NumPy or Pandas.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

5 - 9 Lacs

Pune

Work from Office

Snowflake Data Engineer

Overall experience: 5+ years in Snowflake and Python, including 5+ years in data preparation and BI projects, understanding business requirements in a BI context and transforming raw data into meaningful data using Snowflake and Python.

Responsibilities: Designing and creating data models that define the structure and relationships of various data elements within the organization, including conceptual, logical, and physical data models, which help ensure data accuracy, consistency, and integrity. Designing data integration solutions that allow different systems and applications to share and exchange data seamlessly; this may involve selecting appropriate integration technologies, developing ETL (Extract, Transform, Load) processes, and ensuring data quality during integration. Create and maintain optimal data pipeline architecture.

Skills: Good knowledge of cloud platforms like AWS/Azure/GCP. Good hands-on knowledge of Snowflake is a must. Experience with various data ingestion methods (Snowpipe and others), Time Travel, Data Sharing, and other Snowflake capabilities. Good knowledge of Python/PySpark, including advanced features of Python. Support business development efforts (proposals and client presentations). Ability to thrive in a fast-paced, dynamic, client-facing role where delivering solid work products to exceed high expectations is a measure of success. Excellent leadership and interpersonal skills. Eager to contribute to a team-oriented environment. Strong prioritization and multi-tasking skills with a track record of meeting deadlines. Ability to be creative and analytical in a problem-solving environment. Effective verbal and written communication skills. Adaptable to new environments, people, technologies, and processes. Ability to manage ambiguity and solve undefined problems.
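
For the Data Sharing capability mentioned above, a minimal secure-share sketch; the share, database, and consumer account names are hypothetical:

```sql
-- Publish one table to a consumer account via a secure share.
CREATE OR REPLACE SHARE sales_share;
GRANT USAGE ON DATABASE analytics TO SHARE sales_share;
GRANT USAGE ON SCHEMA analytics.public TO SHARE sales_share;
GRANT SELECT ON TABLE analytics.public.daily_sales TO SHARE sales_share;
-- 'org1.consumer1' stands in for the consumer's account identifier.
ALTER SHARE sales_share ADD ACCOUNTS = org1.consumer1;
```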

Posted 2 weeks ago

Apply

2.0 - 7.0 years

5 - 15 Lacs

Mumbai, Mumbai (All Areas)

Work from Office

We are seeking a highly skilled Data Engineer with a strong background in Snowflake and Azure Data Factory (ADF) and solid experience in Python and SQL. The ideal candidate will play a critical role in designing and building robust, scalable data pipelines, enabling modern cloud-based data platforms including data warehouses and data lakes.

Key Responsibilities: Design, develop, and maintain scalable ETL/ELT pipelines using Snowflake, ADF, and Python to support data warehouse and data lake architectures. Build and automate data ingestion pipelines from various structured and semi-structured sources (APIs, flat files, cloud storage, databases) into Snowflake-based data lakes and data warehouses. Perform full-cycle data migration from on-premise and cloud databases (e.g., Oracle, SQL Server, Redshift, MySQL) to Snowflake. Optimize Snowflake workloads: schema design, clustering, partitioning, materialized views, and query performance tuning. Develop and orchestrate data workflows using Azure Data Factory pipelines, triggers, and dataflows. Implement data quality checks, validation processes, and monitoring mechanisms for production pipelines. Collaborate with cross-functional teams including Data Scientists, Analysts, and DevOps to support diverse data needs. Ensure data integrity, security, and governance throughout the data lifecycle. Maintain comprehensive documentation on pipeline design, schema changes, and architectural decisions.

Required Skills & Qualifications: Bachelor's degree in Computer Science, Information Technology, or a related field. 2+ years of hands-on experience with Snowflake, including Snowflake SQL, SnowSQL, Snowpipe, Streams, Tasks, and performance optimization. 1+ year of experience with Azure Data Factory (ADF): pipeline design, linked services, datasets, triggers, and integration runtime. Strong Python skills for scripting, automation, and data manipulation. Advanced SQL skills: the ability to write efficient, complex queries, procedures, and analytical expressions. Experience designing and implementing data lakes and data warehouses on cloud platforms. Familiarity with Azure cloud services, including Azure Data Lake Storage (ADLS), Blob Storage, Azure SQL, and Azure DevOps. Experience with orchestration tools such as Airflow, DBT, or Prefect is a plus. Understanding of data modeling, data warehousing principles, and ETL/ELT best practices. Experience in building scalable data architectures for analytics and business intelligence use cases.

Preferred Qualifications (Nice to Have): Experience with CI/CD pipelines for data engineering (e.g., Azure DevOps, GitHub Actions). Familiarity with Delta Lake, Parquet, or other big data formats. Knowledge of data security and governance tools like Purview or Informatica.
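
For the workload-optimization duties listed above (clustering, materialized views, query tuning), a brief sketch with assumed table names; note that materialized views require Snowflake Enterprise edition:

```sql
-- Cluster a large fact table on its common filter column, then check
-- how well micro-partitions line up with that key.
ALTER TABLE dw.fact_sales CLUSTER BY (sale_date);
SELECT SYSTEM$CLUSTERING_INFORMATION('dw.fact_sales', '(sale_date)');

-- Pre-aggregate a hot query path (Enterprise edition feature).
CREATE OR REPLACE MATERIALIZED VIEW dw.mv_sales_by_day AS
  SELECT sale_date, SUM(amount) AS total_amount
  FROM dw.fact_sales
  GROUP BY sale_date;
```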

Posted 2 weeks ago

Apply

6.0 - 11.0 years

10 - 20 Lacs

Noida, Hyderabad, Pune

Work from Office

Minimum 6 years of experience in Snowflake development, including data modeling, performance tuning, and ELT pipelines. Strong proficiency in writing complex SQL queries, Snowflake procedures, and working with cloud data platforms.

Posted 2 weeks ago

Apply

6.0 - 8.0 years

1 - 4 Lacs

Chennai

Hybrid

Job Title: Snowflake Developer
Experience: 6-8 Years
Location: Chennai (Hybrid)

Job Description: 3+ years of experience as a Snowflake Developer or Data Engineer. Strong knowledge of SQL, SnowSQL, and Snowflake schema design. Experience with ETL tools and data pipeline automation. Basic understanding of US healthcare data (claims, eligibility, providers, payers). Experience working with large-scale datasets and cloud platforms (AWS, Azure, GCP). Familiarity with data governance, security, and compliance (HIPAA, HITECH).

Posted 2 weeks ago

Apply

5.0 - 10.0 years

6 - 16 Lacs

Pune, Chennai, Bengaluru

Work from Office

Responsibilities: A day in the life of an Infoscion
• As part of the Infosys delivery team, your primary role would be to ensure effective Design, Development, Validation and Support activities, to assure that our clients are satisfied with the high levels of service in the technology domain.
• You will gather the requirements and specifications to understand the client requirements in a detailed manner and translate the same into system requirements.
• You will play a key role in the overall estimation of work requirements to provide the right information on project estimations to Technology Leads and Project Managers.
• You would be a key contributor to building efficient programs/systems. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Additional Responsibilities:
• Knowledge of design principles and fundamentals of architecture
• Understanding of performance engineering
• Knowledge of quality processes and estimation techniques
• Basic understanding of project domain
• Ability to translate functional/nonfunctional requirements to systems requirements
• Ability to design and code complex programs
• Ability to write test cases and scenarios based on the specifications
• Good understanding of SDLC and agile methodologies
• Awareness of latest technologies and trends
• Logical thinking and problem-solving skills along with an ability to collaborate

Technical and Professional Requirements:
• Primary skills: Technology->Data on Cloud-DataStore->Snowflake

Preferred Skills: Technology->Data on Cloud-DataStore->Snowflake

Posted 2 weeks ago

Apply

4.0 - 8.0 years

0 - 1 Lacs

Gurugram, Bengaluru

Hybrid

Job Responsibilities:
Experience: 4-9 years. Snowflake SnowPro certified professionals are first priority (mandatory). At least 4 years of experience, with 4+ years on the Snowflake Data Cloud.

Must-Have Skills:
• Snowflake Cloud Platform: strong hands-on experience
• ETL/ELT tools: experience with one or more of Azure Data Factory, AWS Glue, Informatica, Talend, Qlik Replicate
• Workflow orchestration: proficiency with tools like Apache Airflow, Control-M, Tidal Automation
• Programming: advanced SQL; Python (including working with dataframes using Pandas, PySpark, or Snowpark)
• Data engineering concepts: strong knowledge of data pipelines, data wrangling, and optimization

Good-to-Have Skills:
• SQL scripting and procedural logic
• Data modeling tools (e.g., Erwin, dbt)
• Integration tools like Fivetran and Stitch

Please share your updated resume at poonampal@kpmg.com

Posted 2 weeks ago

Apply

2.0 - 5.0 years

7 - 17 Lacs

Hyderabad

Work from Office

Key Responsibilities: Design and implement scalable data models using Snowflake to support business intelligence and analytics solutions. Implement ETL/ELT solutions that involve complex business transformations. Handle end-to-end data warehousing solutions. Migrate data from legacy systems to Snowflake. Write complex SQL queries for extracting, transforming, and loading data, ensuring high performance and accuracy. Optimize SnowSQL queries for better processing speeds. Integrate Snowflake with third-party applications. Use any ETL/ELT technology. Implement data security policies, including user access control and data masking, to maintain compliance with organizational standards. Document solutions and data flows.

Skills & Qualifications: 2+ years of experience in data engineering, with a focus on Snowflake. Proficient in SQL and Snowflake-specific SQL functions. Experience with ETL/ELT tools and cloud data integrations.

Technical Skills: Strong understanding of Snowflake architecture, features, and best practices. Experience using Snowpark, Snowpipe, and Streamlit; experience using Dynamic Tables is good to have. Familiarity with cloud platforms (AWS, Azure, or GCP) and other cloud-based data technologies. Experience with data modeling concepts like star schema, snowflake schema, and data partitioning. Experience with Snowflake's Time Travel, Streams, and Tasks features. Experience in data pipeline orchestration. Knowledge of Python or Java for scripting and automation. Knowledge of Snowflake pipelines is good to have. Knowledge of data governance practices, including security, compliance, and data lineage.
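
Two of the features called out above, Dynamic Tables and data masking, sketched with hypothetical objects (the PII_ADMIN role is an assumed name):

```sql
-- Dynamic table: Snowflake keeps the aggregate fresh within the target lag.
CREATE OR REPLACE DYNAMIC TABLE dm.daily_sales
  TARGET_LAG = '30 minutes'
  WAREHOUSE = transform_wh
AS
  SELECT order_date, SUM(amount) AS total_amount
  FROM raw.orders
  GROUP BY order_date;

-- Masking policy: only the assumed PII_ADMIN role sees raw email addresses.
CREATE OR REPLACE MASKING POLICY mask_email AS (val STRING) RETURNS STRING ->
  CASE WHEN CURRENT_ROLE() IN ('PII_ADMIN') THEN val ELSE '***MASKED***' END;
ALTER TABLE raw.customers MODIFY COLUMN email SET MASKING POLICY mask_email;
```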

Posted 2 weeks ago

Apply

6.0 - 11.0 years

20 - 25 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

DBT: designing and developing technical architecture, data pipelines, and performance scaling, using tools to integrate Talend data. Ensure data quality in a big data environment. Very strong on PL/SQL: queries, procedures, JOINs. Snowflake SQL: writing SQL queries against Snowflake and developing scripts in Unix, Python, etc., to perform Extract, Load, and Transform operations. Talend knowledge and hands-on experience are good to have; candidates who have worked in production support would be preferred. Hands-on experience with Snowflake utilities such as SnowSQL, Snowpipe, Python, Tasks, Streams, Time Travel, Optimizer, Metadata Manager, data sharing, and stored procedures. Perform data analysis, troubleshoot data issues, and provide technical support to end-users. Develop and maintain data warehouse and ETL processes, ensuring data quality and integrity. Complex problem-solving capability and a continuous-improvement approach. Talend/Snowflake certification is desirable. Excellent SQL coding, communication, and documentation skills. Familiar with the Agile delivery process. Must be analytical, creative, and self-motivated, and work effectively within a global team environment.

Posted 2 weeks ago

Apply

3.0 - 8.0 years

0 Lacs

chennai, tamil nadu

On-site

We are seeking a highly skilled and experienced Snowflake Architect to take charge of designing, developing, and deploying enterprise-grade cloud data solutions. As the ideal candidate, you should possess a robust background in data architecture, cloud data platforms, and Snowflake implementation, with hands-on experience in end-to-end data pipeline and data warehouse design.

Your responsibilities will include leading the architecture, design, and implementation of scalable Snowflake-based data warehousing solutions; defining data modeling standards, best practices, and governance frameworks; and designing and optimizing ETL/ELT pipelines using tools such as Snowpipe, Azure Data Factory, Informatica, or DBT. You will collaborate with stakeholders to understand data requirements and translate them into robust architectural solutions, implement data security, privacy, and role-based access controls within Snowflake, and guide development teams on performance tuning, query optimization, and cost management in Snowflake. Ensuring high availability, fault tolerance, and compliance across data platforms also falls under your purview, as does mentoring developers and junior architects on Snowflake capabilities.

Skills & Experience: We are looking for candidates with at least 8 years of overall experience in data engineering, BI, or data architecture, and a minimum of 3 years of hands-on Snowflake experience. Expertise in Snowflake architecture, data sharing, virtual warehouses, clustering, and performance optimization is highly desirable. Strong proficiency in SQL, Python, and cloud data services (e.g., AWS, Azure, or GCP) is required, along with hands-on experience with ETL/ELT tools like ADF, Informatica, Talend, DBT, or Matillion. A good understanding of data lakes, data mesh, and modern data stack principles is preferred. Experience with CI/CD for data pipelines, DevOps, and data quality frameworks is a plus, and solid knowledge of data governance, metadata management, and cataloging is beneficial.

Preferred qualifications include a Snowflake certification (e.g., SnowPro Core/Advanced Architect), familiarity with Apache Airflow, Kafka, or event-driven data ingestion, knowledge of data visualization tools such as Power BI, Tableau, or Looker, and experience in healthcare, BFSI, or retail domain projects. If you meet these requirements and are ready to take on a challenging and rewarding role as a Snowflake Architect, we encourage you to apply.
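
On the cost-management point above, one common guardrail is a resource monitor; a sketch with assumed monitor and warehouse names (creating one requires the ACCOUNTADMIN role):

```sql
-- Notify at 90% and suspend the warehouse at 100% of the monthly quota.
CREATE OR REPLACE RESOURCE MONITOR rm_analytics
  WITH CREDIT_QUOTA = 100
  FREQUENCY = MONTHLY
  START_TIMESTAMP = IMMEDIATELY
  TRIGGERS ON 90 PERCENT DO NOTIFY
           ON 100 PERCENT DO SUSPEND;

ALTER WAREHOUSE analytics_wh SET RESOURCE_MONITOR = rm_analytics;
```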

Posted 2 weeks ago

Apply

5.0 - 8.0 years

10 - 20 Lacs

Hyderabad

Work from Office

7+ years of experience as a Data Engineer or Snowflake Developer. Expert-level knowledge of SQL (joins, subqueries, CTEs). Experience with ETL tools (e.g., Informatica, Talend, Matillion). Experience with cloud platforms like AWS, Azure, or GCP.

Posted 3 weeks ago

Apply

9.0 - 14.0 years

30 - 37 Lacs

Hyderabad

Hybrid

SQL & Database Management: Deep knowledge of relational databases (SQL/PostgreSQL), cloud-hosted data platforms (AWS, Azure, GCP), and data warehouses like Snowflake. ETL/ELT Tools: Experience with SnapLogic, StreamSets, or DBT for building and maintaining data pipelines, and extensive experience with ETL tools and data pipelines. Data Modeling & Optimization: Strong understanding of data modeling, OLAP systems, query optimization, and performance tuning. Cloud & Security: Familiarity with cloud platforms and SQL security techniques (e.g., data encryption, TDE). Data Warehousing: Experience managing large datasets and data marts, and optimizing databases for performance. Agile & CI/CD: Knowledge of Agile methodologies and CI/CD automation tools. Important: the candidate should have a strong data engineering background with hands-on experience in handling large volumes of data, data pipelines, and cloud-based data systems, and the profile should reflect the same.

Posted 3 weeks ago

Apply

10.0 - 14.0 years

0 Lacs

punjab

On-site

About Us
We are a global climate technologies company engineered for sustainability. We create sustainable and efficient residential, commercial and industrial spaces through HVACR technologies. We protect temperature-sensitive goods throughout the cold chain. And we bring comfort to people globally. Best-in-class engineering, design and manufacturing combined with category-leading brands in compression, controls, software and monitoring solutions result in next-generation climate technology that is built for the needs of the world ahead. Whether you are a professional looking for a career change, an undergraduate student exploring your first opportunity, or a recent graduate with an advanced degree, we have opportunities that will allow you to innovate, be challenged and make an impact. Join our team and start your journey today!

Job Description / Responsibilities:
We are looking for Data Engineers. The candidate must have a minimum of ten (10) years of experience in a Data Engineer role, including the following tools/technologies:
• Experience with relational (SQL) databases.
• Experience with data warehouses like Oracle, SQL & Snowflake.
• Technical expertise in data modeling, data mining and segmentation techniques.
• Experience building new and troubleshooting existing data pipelines using tools like Pentaho Data Integration (PDI), ADF (Azure Data Factory), Snowpipe, Fivetran, and DBT.
• Experience with batch and real-time data ingestion and processing frameworks.
• Experience with languages like Python, Java, etc.
• Knowledge of additional cloud-based analytics solutions, along with Kafka, Spark and Scala, is a plus.
• Develops code and solutions that transfer/transform data across various systems.
• Maintains deep technical knowledge of various tools in the data warehouse, data hub, and analytical tools.
• Ensures data is transformed and stored in efficient methods for retrieval and use.
• Maintains data systems to ensure optimal performance.
• Develops a deep understanding of underlying business systems involved with analytical systems.
• Follows standard software development lifecycle, code control, code standards and process standards.
• Maintains and develops technical knowledge by self-training on current toolsets and computing environments, participates in educational opportunities, maintains professional networks, and participates in professional organizations related to their tech skills.

Systems Analysis:
• Works with key stakeholders to understand business needs and capture functional and technical requirements.
• Offers ideas that simplify the design and complexity of solutions delivered.
• Effectively communicates any expectations required of stakeholders or other resources during solution delivery.
• Develops and executes test plans to ensure successful rollout of solutions, including accuracy and quality of data.

Service Management:
• Effectively communicates to leaders and stakeholders any obstacles that occur during solution delivery.
• Defines and manages promised delivery dates.
• Proactively researches, analyzes, and predicts operational issues, informing leadership where appropriate.
• Offers viable options to solve unexpected/unknown issues that occur during solution development and delivery.

Education / Job-Related Technical Skills:
• Bachelor's Degree in Computer Science/Information Technology or equivalent.
• Ability to effectively communicate with others at all levels of the Company, both verbally and in writing.
• Demonstrates a courteous, tactful, and professional approach with employees and others.
• Ability to work in a large, global corporate structure.

Our Commitment to Our People
Across the globe, we are united by a singular Purpose: Sustainability is no small ambition. That's why everything we do is geared toward a sustainable future, for our generation and all those to come. Through groundbreaking innovations, HVACR technology and cold chain solutions, we are reducing carbon emissions and improving energy efficiency in spaces of all sizes, from residential to commercial to industrial. Our employees are our greatest strength. We believe that our culture of passion, openness, and collaboration empowers us to work toward the same goal: to make the world a better place. We invest in the end-to-end development of our people, beginning at onboarding and through senior leadership, so they can thrive personally and professionally. Flexible and competitive benefits plans offer the right options to meet your individual and family needs. We provide employees with flexible time off plans, including paid parental leave (maternal and paternal), vacation and holiday leave. Together, we have the opportunity and the power to continue to revolutionize the technology behind air conditioning, heating and refrigeration, and cultivate a better future. Learn more about us and how you can join our team!

Our Commitment to Diversity, Equity & Inclusion
At Copeland, we believe having a diverse, equitable and inclusive environment is critical to our success. We are committed to creating a culture where every employee feels welcomed, heard, respected, and valued for their experiences, ideas, perspectives and expertise. Ultimately, our diverse and inclusive culture is the key to driving industry-leading innovation, better serving our customers and making a positive impact in the communities where we live.

Equal Opportunity Employer

Posted 3 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

hyderabad, telangana

On-site

As a candidate for this role, you should possess knowledge and experience in migration projects, with a certification in Snowflake. Proficiency in Snowflake SnowSQL and Snowpipe is essential for this position. Additionally, you should have hands-on experience working with semi-structured XML and JSON data within the Snowflake environment. Familiarity with Snowflake components like Stages, Streams, Tasks, and External Tables is a requirement for this role. While working knowledge of Python scripting would be beneficial, experience in database design, database modeling, and schema creation is crucial for success in this position.
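
For the semi-structured JSON handling this posting asks for, a small sketch using a VARIANT column and LATERAL FLATTEN; the table and field names are hypothetical:

```sql
-- Land raw JSON in a VARIANT column, then project and flatten nested items.
CREATE OR REPLACE TABLE raw.events (payload VARIANT);

SELECT
  payload:eventId::STRING   AS event_id,
  payload:user.name::STRING AS user_name,
  f.value:sku::STRING       AS sku
FROM raw.events,
     LATERAL FLATTEN(input => payload:items) AS f;
```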

Posted 3 weeks ago

Apply

6.0 - 11.0 years

7 - 17 Lacs

Gurugram

Work from Office

We rely heavily on BigQuery/Snowflake, Airflow, Stitch/Fivetran, dbt, and Tableau/Looker for our business intelligence, and embrace AWS with some GCP. As a Data Engineer, you will develop end-to-end ETL/ELT pipelines.

Posted 3 weeks ago

Apply