
1123 Snowflake Jobs - Page 25

JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

7.0 - 12.0 years

5 - 9 Lacs

Bengaluru

Work from Office


Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: NA
Minimum experience: 7.5 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. You will collaborate with teams to ensure successful project delivery and implementation.

Roles & Responsibilities:
- Act as a subject matter expert (SME)
- Collaborate with and manage the team to perform
- Take responsibility for team decisions
- Engage with multiple teams and contribute to key decisions
- Provide solutions to problems for your immediate team and across multiple teams
- Lead the development and implementation of new applications
- Conduct code reviews and provide technical guidance to team members
- Stay current on industry trends and best practices to enhance application development processes

Professional & Technical Skills:
- Must-have: proficiency in Snowflake Data Warehouse
- Strong understanding of data warehousing concepts
- Experience with ETL processes and data modeling
- Knowledge of SQL and database management systems
- Hands-on experience with cloud-based data warehousing solutions

Additional Information:
- The candidate should have a minimum of 7.5 years of experience with Snowflake Data Warehouse
- This position is based at our Bengaluru office
- 15 years of full-time education is required

Posted 2 weeks ago

Apply

7.0 - 12.0 years

10 - 14 Lacs

Pune

Work from Office


Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: NA
Minimum experience: 7.5 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. You will be responsible for overseeing the application development process and ensuring successful project delivery.

Roles & Responsibilities:
- Act as a subject matter expert (SME)
- Collaborate with and manage the team to perform
- Take responsibility for team decisions
- Engage with multiple teams and contribute to key decisions
- Provide solutions to problems for your immediate team and across multiple teams
- Lead the application development process
- Ensure successful project delivery

Professional & Technical Skills:
- Must-have: proficiency in Snowflake Data Warehouse
- Strong understanding of data warehousing concepts
- Experience designing and implementing data warehouse solutions
- Knowledge of ETL processes and data modeling
- Hands-on experience with SQL and database management

Additional Information:
- The candidate should have a minimum of 7.5 years of experience with Snowflake Data Warehouse
- This position is based at our Pune office
- 15 years of full-time education is required

Posted 2 weeks ago

Apply

15.0 - 20.0 years

4 - 8 Lacs

Pune

Work from Office


Project Role: Data Engineer
Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems.
Must-have skills: Snowflake Data Warehouse, Core Banking, PySpark
Good-to-have skills: AWS BigData
Minimum experience: 5 years
Educational Qualification: 15 years of full-time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs, while also troubleshooting any issues that arise in the data flow and processing stages.

Roles & Responsibilities:
- Act as a subject matter expert (SME)
- Collaborate with and manage the team to perform
- Take responsibility for team decisions
- Engage with multiple teams and contribute to key decisions
- Provide solutions to problems for your immediate team and across multiple teams
- Mentor junior team members to enhance their skills and knowledge in data engineering
- Continuously evaluate and improve data processes to enhance efficiency and effectiveness

Professional & Technical Skills:
- Must-have: proficiency in Snowflake Data Warehouse, Core Banking, and PySpark
- Good-to-have: experience with AWS BigData
- Strong understanding of data modeling and database design principles
- Experience with data integration tools and ETL processes
- Familiarity with data governance and data quality frameworks

Additional Information:
- The candidate should have a minimum of 5 years of experience with Snowflake Data Warehouse
- This position is based in Pune
- 15 years of full-time education is required

Posted 2 weeks ago

Apply

15.0 - 20.0 years

4 - 8 Lacs

Gurugram

Work from Office


Project Role: Data Engineer
Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: NA
Minimum experience: 5 years
Educational Qualification: 15 years of full-time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to effectively migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver solutions that meet business needs, while also troubleshooting and optimizing existing data workflows to enhance performance and reliability.

Roles & Responsibilities:
- Act as a subject matter expert (SME); collaborate with and manage the team to perform
- Take responsibility for team decisions
- Engage with multiple teams and contribute to key decisions
- Provide solutions to problems for your immediate team and across multiple teams
- Develop and implement best practices for data management and governance
- Monitor and optimize data pipelines to ensure efficiency and reliability

Professional & Technical Skills:
- Must-have: proficiency in Snowflake Data Warehouse
- Strong understanding of data modeling and database design principles
- Experience with ETL tools and data integration techniques
- Familiarity with cloud data warehousing solutions and architecture
- Knowledge of SQL and data querying languages

Additional Information:
- The candidate should have a minimum of 5 years of experience with Snowflake Data Warehouse
- This position is based at our Gurugram office
- 15 years of full-time education is required

Posted 2 weeks ago

Apply

15.0 - 20.0 years

4 - 8 Lacs

Pune

Work from Office


Project Role: Data Engineer
Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems.
Must-have skills: Snowflake Data Warehouse, Core Banking, PySpark
Good-to-have skills: AWS BigData
Minimum experience: 7.5 years
Educational Qualification: 15 years of full-time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs. Your role will also include troubleshooting data issues and optimizing data workflows to enhance performance and reliability.

Roles & Responsibilities:
- Act as a subject matter expert (SME)
- Collaborate with and manage the team to perform
- Take responsibility for team decisions
- Engage with multiple teams and contribute to key decisions
- Provide solutions to problems for your immediate team and across multiple teams
- Mentor junior team members to enhance their skills and knowledge
- Continuously evaluate and improve data processes to ensure efficiency and effectiveness

Professional & Technical Skills:
- Must-have: proficiency in Snowflake Data Warehouse, Core Banking, and PySpark
- Good-to-have: experience with AWS BigData
- Strong understanding of data modeling and database design principles
- Experience with data integration tools and ETL processes
- Familiarity with cloud-based data solutions and architectures

Additional Information:
- The candidate should have a minimum of 7.5 years of experience with Snowflake Data Warehouse
- This position is based in Pune
- 15 years of full-time education is required

Posted 2 weeks ago

Apply

15.0 - 20.0 years

4 - 8 Lacs

Pune

Work from Office


Project Role: Data Engineer
Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems.
Must-have skills: Snowflake Data Warehouse, Core Banking, PySpark
Good-to-have skills: AWS BigData
Minimum experience: 7.5 years
Educational Qualification: 15 years of full-time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs, while also troubleshooting any issues that arise in the data processing workflow. Your role will be pivotal in enhancing the efficiency and reliability of data operations within the organization.

Roles & Responsibilities:
- Act as a subject matter expert (SME)
- Collaborate with and manage the team to perform
- Take responsibility for team decisions
- Engage with multiple teams and contribute to key decisions
- Provide solutions to problems for your immediate team and across multiple teams
- Mentor junior team members to enhance their skills and knowledge in data engineering
- Continuously evaluate and improve data processing workflows to optimize performance

Professional & Technical Skills:
- Must-have: proficiency in Snowflake Data Warehouse, Core Banking, and PySpark
- Good-to-have: experience with AWS BigData
- Strong understanding of data modeling and database design principles
- Experience with data integration tools and ETL processes
- Familiarity with cloud-based data solutions and architectures

Additional Information:
- The candidate should have a minimum of 7.5 years of experience with Snowflake Data Warehouse
- This position is based in Pune
- 15 years of full-time education is required

Posted 2 weeks ago

Apply

2.0 - 5.0 years

4 - 8 Lacs

Pune

Work from Office


As a Data Engineer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include:
- Implementing and validating predictive models, and creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques
- Designing and implementing enterprise search applications such as Elasticsearch and Splunk for client requirements
- Working in an Agile, collaborative environment, partnering with scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours
- Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results

Required education: Bachelor's degree
Preferred education: Master's degree

Required technical and professional expertise:
- Good hands-on experience with DBT is required; ETL (DataStage) and Snowflake are preferred
- Ability to use programming languages like Java, Python, Scala, etc. to build pipelines that extract and transform data from a repository to a data consumer
- Ability to use Extract, Transform, and Load (ETL) tools and/or data integration or federation tools to prepare and transform data as needed
- Ability to use leading-edge tools such as Linux, SQL, Python, Spark, Hadoop, and Java

Preferred technical and professional experience:
- You thrive on teamwork and have excellent verbal and written communication skills
- Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions
- Ability to communicate results to technical and non-technical audiences

Posted 2 weeks ago

Apply

10.0 - 14.0 years

12 - 16 Lacs

Mumbai, Delhi / NCR, Bengaluru

Work from Office


Primary Skills:
- Minimum 10+ years of overall IT experience, with strong recent hands-on experience with Workato
- Proven track record of designing and implementing integrations using Workato's low-code platform
- Good-to-have: experience with Salesforce (SFDC), Snowflake, and Oracle applications
- Strong understanding of REST APIs, webhooks, and workflow automation
- Ability to develop, test, and deploy recipes, handle error processing, and build scalable integrations

Boomi Integration Architect - Key Responsibilities:
- Design and architect integration solutions using Boomi AtomSphere
- Strong hands-on experience with Boomi B2B/EDI (X12, RN) integrations
- Proficiency in Boomi API management, including development and support
- Guide development teams and ensure adherence to best practices, performance, and security standards
- Create and manage reusable integration patterns, components, and frameworks
- Lead the end-to-end integration lifecycle, from design to deployment and maintenance
- Ensure data integrity, compliance, and high availability across hybrid cloud environments
- Collaborate with enterprise architects and provide technical mentorship to Boomi teams
- Address complex integration challenges and serve as a technical escalation point

Nice to Have (both roles):
- Understanding of CI/CD processes
- Experience with cloud infrastructure environments
- Familiarity with data governance and privacy regulations

Share your resume with the following details: current CTC, expected CTC, total experience, relevant Workato/Boomi experience, and preferred location.
Location: PAN India (remote or hybrid options based on project need) - Delhi NCR, Bangalore, Chennai, Pune, Kolkata, Ahmedabad, Mumbai, Hyderabad
Contact: +91 9032956160

Posted 2 weeks ago

Apply

4.0 - 6.0 years

7 - 14 Lacs

Udaipur, Kolkata, Jaipur

Hybrid


Senior Data Engineer

Kadel Labs is a leading IT services company delivering top-quality technology solutions since 2017, focused on enhancing business operations and productivity through tailored, scalable, and future-ready solutions. With deep domain expertise and a commitment to innovation, we help businesses stay ahead of technological trends. As a CMMI Level 3 and ISO 27001:2022 certified company, we ensure best-in-class process maturity and information security, enabling organizations to achieve their digital transformation goals with confidence and efficiency.

Role: Senior Data Engineer
Experience: 4-6 years
Location: Udaipur, Jaipur, Kolkata

Job Description: We are looking for a highly skilled and experienced Data Engineer with 4-6 years of hands-on experience in designing and implementing robust, scalable data pipelines and infrastructure. The ideal candidate will be proficient in SQL and Python and have a strong understanding of modern data engineering practices. You will play a key role in building and optimizing data systems, enabling data accessibility and analytics across the organization, and collaborating closely with cross-functional teams including Data Science, Product, and Engineering.

Key Responsibilities:
- Design, develop, and maintain scalable ETL/ELT data pipelines using SQL and Python
- Collaborate with data analysts, data scientists, and product teams to understand data needs
- Optimize queries and data models for performance and reliability
- Integrate data from various sources, including APIs, internal databases, and third-party systems
- Monitor and troubleshoot data pipelines to ensure data quality and integrity
- Document processes, data flows, and system architecture
- Participate in code reviews and contribute to a culture of continuous improvement

Required Skills:
- 4-6 years of experience in data engineering, data architecture, or backend development with a focus on data
- Strong command of SQL for data transformation and performance tuning
- Experience with Python (e.g., pandas, Spark, ADF)
- Solid understanding of ETL/ELT processes and data pipeline orchestration
- Proficiency with RDBMS (e.g., PostgreSQL, MySQL, SQL Server)
- Experience with data warehousing solutions (e.g., Snowflake, Redshift, BigQuery)
- Familiarity with version control (Git), CI/CD workflows, and containerized environments (Docker, Kubernetes)
- Basic programming skills
- Excellent problem-solving skills and a passion for clean, efficient data systems

Preferred Skills:
- Experience with cloud platforms (AWS, Azure, GCP) and services like S3, Glue, Dataflow, etc.
- Exposure to enterprise solutions (e.g., Databricks, Synapse)
- Knowledge of big data technologies (e.g., Spark, Kafka, Hadoop)
- Background in real-time data streaming and event-driven architectures
- Understanding of data governance, security, and compliance best practices
- Prior experience working in an agile development environment

Educational Qualifications: Bachelor's degree in Computer Science, Information Technology, or a related field.

Visit us: https://kadellabs.com/ | https://in.linkedin.com/company/kadel-labs | https://www.glassdoor.co.in/Overview/Working-at-Kadel-Labs-EI_IE4991279.11,21.htm

Posted 2 weeks ago

Apply

3.0 - 5.0 years

8 - 15 Lacs

Hyderabad

Work from Office


We are looking for an experienced and results-driven Data Engineer to join our growing Data Engineering team. The ideal candidate will be proficient in building scalable, high-performance data transformation pipelines using Snowflake and dbt, and able to work effectively in a consulting setup. In this role, you will be instrumental in ingesting, transforming, and delivering high-quality data to enable data-driven decision-making across the client's organization.

Key Responsibilities:
1. Design and implement scalable ELT pipelines using dbt on Snowflake, following industry-accepted best practices.
2. Build ingestion pipelines from various sources, including relational databases, APIs, cloud storage, and flat files, into Snowflake.
3. Implement data modelling and transformation logic to support a layered architecture (e.g., staging, intermediate, and mart layers, or a medallion architecture) to enable reliable and reusable data assets.
4. Leverage orchestration tools (e.g., Airflow, dbt Cloud, or Azure Data Factory) to schedule and monitor data workflows.
5. Apply dbt best practices: modular SQL development, testing, documentation, and version control.
6. Perform performance optimization in dbt/Snowflake through clustering, query profiling, materialization, partitioning, and efficient SQL design.
7. Apply CI/CD and Git-based workflows for version-controlled deployments.
8. Contribute to the growing internal knowledge base of dbt macros, conventions, and testing frameworks.
9. Collaborate with stakeholders such as data analysts, data scientists, and data architects to understand requirements and deliver clean, validated datasets.
10. Write well-documented, maintainable code, using Git for version control and CI/CD processes.
11. Participate in Agile ceremonies, including sprint planning, stand-ups, and retrospectives.
12. Support consulting engagements through clear documentation, demos, and delivery of client-ready solutions.

Required Qualifications:
- 3 to 5 years of experience in data engineering roles, with 2+ years of hands-on experience in Snowflake and dbt
- Experience building and deploying dbt models in a production environment
- Expert-level SQL and a strong understanding of ELT principles
- Strong understanding of ELT patterns and data modelling (Kimball/dimensional preferred)
- Familiarity with data quality and validation techniques: dbt tests, dbt docs, etc.
- Experience with Git, CI/CD, and deployment workflows in a team setting
- Familiarity with orchestrating workflows using tools like dbt Cloud, Airflow, or Azure Data Factory

Core Competencies:
- Data engineering and ELT development: building robust, modular data pipelines using dbt; writing efficient SQL for data transformation and performance tuning in Snowflake; managing environments, sources, and deployment pipelines in dbt.
- Cloud data platform expertise: strong proficiency with Snowflake (warehouse sizing, query profiling, data loading, and performance optimization); experience with cloud storage (Azure Data Lake, AWS S3, or GCS) for ingestion and external stages.

Technical Toolset:
- Languages & frameworks: Python for data transformation, notebook development, and automation; strong grasp of SQL for querying and performance tuning.

Best Practices and Standards:
- Knowledge of modern data architecture concepts, including layered architecture (e.g., staging, intermediate, marts, medallion architecture).
- Familiarity with data quality, unit testing (dbt tests), and documentation (dbt docs).

Security & Governance:
- Understanding of access control within Snowflake (RBAC), role hierarchies, and secure data handling.
- Familiarity with data privacy policies (GDPR basics) and encryption at rest/in transit.

Deployment & Monitoring:
- Version control using Git and experience with CI/CD practices in a data context.
- Monitoring and logging of pipeline executions, with alerting on failures.

Soft Skills:
- Ability to present solutions and handle client demos/discussions.
- Work closely with onshore and offshore teams of analysts, data scientists, and architects.
- Ability to document pipelines and transformations clearly.
- Basic Agile/Scrum familiarity: working in sprints and logging tasks.
- Comfort with ambiguity, competing priorities, and a fast-changing client environment.

Education:
- Bachelor's or master's degree in computer science, data engineering, or a related field.
- Certifications such as Snowflake SnowPro or dbt Certified Developer (Data Engineering) are a plus.
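The dbt-on-Snowflake workflow this listing describes can be illustrated with a minimal sketch. All model, table, and column names below (stg_orders, stg_line_items, fct_orders, order_id) are assumptions for illustration, not part of the listing; the config block shows how the materialization and clustering choices from point 6 are typically declared in a dbt model.

    -- models/marts/fct_orders.sql: a hypothetical incremental dbt model on Snowflake
    {{ config(
        materialized='incremental',
        unique_key='order_id',
        cluster_by=['order_date']    -- Snowflake clustering key for partition pruning
    ) }}

    select
        o.order_id,
        o.customer_id,
        o.order_date,
        sum(li.amount) as order_total
    from {{ ref('stg_orders') }} o
    join {{ ref('stg_line_items') }} li on li.order_id = o.order_id
    {% if is_incremental() %}
      -- on incremental runs, only process rows newer than the current target
      where o.order_date > (select max(order_date) from {{ this }})
    {% endif %}
    group by 1, 2, 3

A matching schema test (unique and not_null on order_id in the model's YAML file) would cover the dbt-tests requirement mentioned above.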

Posted 2 weeks ago

Apply

7.0 - 10.0 years

15 - 20 Lacs

Chennai

Work from Office


Job Title: Data Architect / Engagement Lead
Location: Chennai
Reports To: CEO

About the Company: Ignitho Inc. is a leading AI and data engineering company with a global presence, including offices in the US, UK, India, and Costa Rica. Visit our website to learn more about our work and culture: www.ignitho.com. Ignitho is a portfolio company of Nuivio Ventures Inc., a venture builder dedicated to developing Enterprise AI product companies across various domains, including AI, Data Engineering, and IoT. Learn more about Nuivio at: www.nuivio.com.

Job Summary: As the Data Architect and Engagement Lead, you will define the data architecture strategy and lead client engagements, ensuring alignment between data solutions and business goals. This dual role blends technical leadership with client-facing responsibilities.

Key Responsibilities:
- Design scalable data architectures, including storage, processing, and integration layers.
- Lead technical discovery and requirements-gathering sessions with clients.
- Provide architectural oversight for data and AI solutions.
- Act as a liaison between technical teams and business stakeholders.
- Define data governance, security, and compliance standards.

Required Qualifications:
- Bachelor's or Master's in Computer Science, Information Systems, or similar.
- 7+ years of experience in data architecture, with client-facing experience.
- Deep knowledge of data modelling, cloud data platforms (Snowflake, BigQuery, Redshift, Azure), and orchestration tools.
- Excellent communication, stakeholder management, and technical leadership skills.
- Familiarity with AI/ML systems and their data requirements is a strong plus.

Posted 2 weeks ago

Apply

3.0 - 6.0 years

0 - 0 Lacs

Hyderabad

Work from Office


Snowflake Developer
Job Location: Hyderabad

Description: We are seeking a talented Snowflake ETL/ELT engineer to join our growing Data Engineering team. The ideal candidate will have extensive experience designing, building, and maintaining scalable data integration solutions in Snowflake.

Responsibilities:
- Design, develop, and implement data integration solutions using Snowflake's ELT features
- Load and transform large data volumes from a variety of sources into Snowflake
- Optimize data integration processes for performance and efficiency
- Collaborate with other teams, such as Data Analytics and Business Intelligence, to ensure that data integrated into the data warehouse meets their needs
- Create and maintain technical documentation for ETL/ELT processes and data structures
- Stay current with emerging trends and technologies related to Snowflake, ETL, and ELT

Requirements:
- 3-6 years of experience in data integration and ETL/ELT development
- Extensive experience with Snowflake, including its ELT features
- Experience with advanced Snowflake features, including AI/ML
- Strong proficiency in SQL, Python, and data transformation techniques
- Experience with cloud-based data warehousing and data integration tools
- Knowledge of data warehousing design principles and best practices
- Excellent communication and collaboration skills

If you have a passion for data engineering and a proven track record of success with Snowflake, ETL, and ELT, we want to hear from you! Please share the following details: CTC, expected CTC, notice period, relevant experience in Snowflake development, current location, and willingness to work from the Hyderabad office (Y/N).
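As a rough illustration of the ELT pattern this listing centres on: raw files are bulk-loaded with COPY INTO and then transformed inside Snowflake with plain SQL. The stage, bucket, table, and column names below are assumptions for the sketch, and a real external stage would also need credentials or a storage integration.

    -- Hypothetical landing stage over cloud storage
    CREATE OR REPLACE STAGE raw_stage
      URL = 's3://example-bucket/landing/'
      FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);

    -- E/L: bulk-load raw files into a landing table
    COPY INTO raw.orders_landing
      FROM @raw_stage/orders/
      ON_ERROR = 'ABORT_STATEMENT';

    -- T: transform inside Snowflake (the ELT step)
    INSERT INTO analytics.orders_clean
    SELECT order_id,
           TRY_TO_DATE(order_date_raw) AS order_date,
           UPPER(TRIM(country_code))   AS country_code
    FROM raw.orders_landing
    WHERE order_id IS NOT NULL;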

Posted 2 weeks ago

Apply

5.0 - 8.0 years

15 - 25 Lacs

Noida, Hyderabad, Bengaluru

Hybrid


We at EMIDS are hiring an Azure DBA. Please find the details below and share your interest at aarati.pardhi@emids.com

Location: Bangalore / Hyderabad / Noida (hybrid)
Experience: 5+ years

Job Description: MS SQL Server 2016/2019/2022, Azure DB, database administration, Azure DevOps, security & compliance. Exposure to modern data platforms is a plus (Databricks, Foundry, Snowflake, Fabric, etc.).

Experience with:

Database installation, configuration, and version upgrades
- Install, configure, and upgrade SQL Server instances across different environments (development, testing, and production).
- Set up and configure database storage, memory allocation, and system parameters.
- Implement best practices for database standardization and configuration management.

Database performance monitoring & optimization
- Continuously monitor database health, performance, and resource utilization using SQL Server tools like SQL Server Profiler, Performance Monitor, and Dynamic Management Views (DMVs).
- Optimize SQL queries, indexes, and stored procedures to improve response times.
- Identify and resolve slow-running queries, blocking, and deadlocks.
- Implement indexing strategies, partitioning, and caching techniques for efficiency.

Backup & recovery management
- Develop and maintain disaster recovery (DR) and high availability (HA) plans, including log shipping, database mirroring, Always On availability groups, and failover clustering.
- Regularly test backups and restoration procedures to ensure data integrity.
- Implement retention policies and ensure compliance with data recovery objectives (RTO/RPO).

Security & compliance management
- Implement user authentication and authorization through role-based access control (RBAC).
- Apply Transparent Data Encryption (TDE), row-level security, and auditing policies to ensure data protection.
- Conduct vulnerability assessments, patch management, and compliance reporting (HIPAA, GDPR, SOX).
- Monitor and mitigate unauthorized access and security threats.

Database maintenance & automation
- Automate routine database maintenance tasks using SQL Agent jobs, PowerShell, and scripts.
- Perform index maintenance (rebuilding and reorganizing), statistics updates, and consistency checks.
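For the maintenance-automation duties above, a nightly SQL Agent job typically wraps T-SQL along these lines. This is a minimal sketch: the database and table names are assumptions, and ONLINE index rebuilds require a SQL Server edition that supports them.

    -- Hypothetical nightly maintenance batch (scheduled via a SQL Agent job)
    USE SalesDb;
    GO
    -- Rebuild fragmented indexes on a heavily updated table
    ALTER INDEX ALL ON dbo.Orders REBUILD WITH (ONLINE = ON);
    -- Refresh optimizer statistics so query plans stay accurate
    UPDATE STATISTICS dbo.Orders WITH FULLSCAN;
    -- Consistency check for early corruption detection
    DBCC CHECKDB (SalesDb) WITH NO_INFOMSGS;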

Posted 2 weeks ago

Apply

4.0 - 8.0 years

5 - 9 Lacs

Hyderabad, Bengaluru

Work from Office


What's in it for you?
- Pay above market standards
- The role is contract-based, with project timelines from 2 to 12 months, or freelancing
- Be part of an elite community of professionals who can solve complex AI challenges
- Work location could be: remote (highly likely), onsite at a client location, or Deccan AI's office in Hyderabad or Bangalore

Responsibilities:
- Design and architect enterprise-scale data platforms, integrating diverse data sources and tools
- Develop real-time and batch data pipelines to support analytics and machine learning
- Define and enforce data governance strategies to ensure security, integrity, and compliance, and optimize data pipelines for high performance, scalability, and cost efficiency in cloud environments
- Implement solutions for real-time streaming data (Kafka, AWS Kinesis, Apache Flink) and adopt DevOps/DataOps best practices

Required Skills:
- Strong experience in designing scalable, distributed data systems and programming (Python, Scala, Java), with expertise in Apache Spark, Hadoop, Flink, Kafka, and cloud platforms (AWS, Azure, GCP)
- Proficiency in data modeling, governance, warehousing (Snowflake, Redshift, BigQuery), and security/compliance standards (GDPR, HIPAA)
- Hands-on experience with CI/CD (Terraform, CloudFormation, Airflow, Kubernetes) and data infrastructure optimization (Prometheus, Grafana)

Nice to Have:
- Experience with graph databases, machine learning pipeline integration, real-time analytics, and IoT solutions
- Contributions to open-source data engineering communities

What are the next steps? Register on our Soul AI website.

Posted 2 weeks ago

Apply

4.0 - 8.0 years

13 - 17 Lacs

Hyderabad, Bengaluru

Work from Office


Responsibilities:
- Design and architect enterprise-scale data platforms, integrating diverse data sources and tools
- Develop real-time and batch data pipelines to support analytics and machine learning
- Define and enforce data governance strategies to ensure security, integrity, and compliance, and optimize data pipelines for high performance, scalability, and cost efficiency in cloud environments
- Implement solutions for real-time streaming data (Kafka, AWS Kinesis, Apache Flink) and adopt DevOps/DataOps best practices

Required Skills:
- Strong experience in designing scalable, distributed data systems and programming (Python, Scala, Java), with expertise in Apache Spark, Hadoop, Flink, Kafka, and cloud platforms (AWS, Azure, GCP)
- Proficiency in data modeling, governance, warehousing (Snowflake, Redshift, BigQuery), and security/compliance standards (GDPR, HIPAA)
- Hands-on experience with CI/CD (Terraform, CloudFormation, Airflow, Kubernetes) and data infrastructure optimization (Prometheus, Grafana)

Nice to Have:
- Experience with graph databases, machine learning pipeline integration, real-time analytics, and IoT solutions
- Contributions to open-source data engineering communities

Posted 2 weeks ago

Apply

8.0 - 13.0 years

16 - 30 Lacs

Indore, Hyderabad, Pune

Work from Office


Dear candidate,

We have a job opening for an ETL Developer (ETL + SSIS + Snowflake) with one of our clients. If the JD suits your profile, please share the details below along with your updated resume to shaswati.m@bct-consulting.com

JD:
Skill 1 - ETL/SSIS
Skill 2 - MS SQL Server
Skill 3 - Snowflake

Key Responsibilities:
- Lead the end-to-end migration of databases, stored procedures, views, and ETL pipelines from SQL Server to Snowflake.
- Analyze existing SQL Server schemas, stored procedures, and SSIS packages to design equivalent Snowflake solutions.
- Re-engineer and optimize ETL workflows using SSIS and Snowflake-native tools (e.g., Snowpipe, Streams, Tasks).
- Collaborate with data architects, analysts, and business stakeholders to ensure data integrity and performance.
- Develop and maintain documentation for migration processes, data mappings, and transformation logic.
- Monitor and troubleshoot data pipelines and performance issues post-migration.
- Ensure compliance with data governance and security standards during migration.

Tools & Technology: SQL Server, SSIS, SQL, Snowflake

Required Skills & Qualifications:
- 8-10 years of experience with Microsoft SQL Server, including T-SQL and performance tuning.
- Good hands-on experience with SSIS for ETL development and deployment.
- Decent experience with Snowflake, including SnowSQL, data modeling, and performance optimization.
- Strong understanding of data warehousing concepts and cloud data architecture.
- Experience with version control systems (e.g., Git) and CI/CD pipelines for data workflows.
- Excellent problem-solving and communication skills.
- Knowledge of Azure Data Factory is good to have.
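The Snowflake-native tools this listing names (Snowpipe, Streams, Tasks) typically replace an SSIS incremental load roughly as follows. All object names and the merge logic are assumptions for the sketch:

    -- Continuous ingestion: Snowpipe loads new files as they land in the stage
    CREATE OR REPLACE PIPE orders_pipe AUTO_INGEST = TRUE AS
      COPY INTO raw.orders
      FROM @raw_stage/orders/
      FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);

    -- Change tracking: a stream records new rows arriving in the raw table
    CREATE OR REPLACE STREAM orders_stream ON TABLE raw.orders;

    -- Scheduled task: merge pending changes into the curated table
    CREATE OR REPLACE TASK merge_orders
      WAREHOUSE = etl_wh
      SCHEDULE  = '5 MINUTE'
      WHEN SYSTEM$STREAM_HAS_DATA('orders_stream')
    AS
      MERGE INTO curated.orders t
      USING orders_stream s ON t.order_id = s.order_id
      WHEN MATCHED THEN UPDATE SET t.amount = s.amount
      WHEN NOT MATCHED THEN INSERT (order_id, amount)
                            VALUES (s.order_id, s.amount);

    ALTER TASK merge_orders RESUME;  -- tasks are created in a suspended state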

Posted 2 weeks ago

Apply

5.0 - 8.0 years

12 - 16 Lacs

Hyderabad, Pune, Bengaluru

Work from Office


Role & responsibilities:
1. Mastery of SQL, especially within cloud-based data warehouses like Snowflake; experience on Snowflake with data architecture, design, analytics, and development.
2. Detailed knowledge and hands-on working experience with Snowpipe, SnowProc, and SnowSQL.
3. Technical lead with a strong development background, including 2-3 years of rich hands-on development experience in Snowflake.
4. Experience designing highly scalable ETL/ELT processes with complex data transformations and data formats, including error handling and monitoring; good working knowledge of an ETL/ELT tool (for transformation, DBT experience is good to have).
5. Analysis, design, and development of traditional data warehouse and business intelligence solutions; work with customers to understand and execute their requirements.
6. Working knowledge of software engineering best practices; should be willing to work on implementation and support projects, and be flexible about onsite and offshore travel.
7. Collaborate with other team members to ensure proper delivery of requirements; ability to think strategically about the broader market and influence company direction.
8. Good communication skills, team player, and good analytical skills; Snowflake certification is preferable.

Contact: Soniya, soniya05.mississippiconsultants@gmail.com. We are a recruitment firm based in Pune with various clients globally.
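Point 2 above mentions stored procedures alongside Snowpipe and SnowSQL; in Snowflake these can be written in SQL Scripting. A minimal sketch, with all object names assumed for illustration:

    -- Hypothetical refresh procedure in Snowflake SQL Scripting
    CREATE OR REPLACE PROCEDURE analytics.refresh_daily_sales()
    RETURNS STRING
    LANGUAGE SQL
    AS
    $$
    BEGIN
      TRUNCATE TABLE analytics.daily_sales;
      INSERT INTO analytics.daily_sales
        SELECT order_date, SUM(amount) AS total_amount
        FROM curated.orders
        GROUP BY order_date;
      RETURN 'daily_sales refreshed';
    END;
    $$;

    CALL analytics.refresh_daily_sales();  -- e.g., from SnowSQL or a scheduled task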

Posted 2 weeks ago

Apply

5.0 - 9.0 years

20 - 30 Lacs

Pune

Hybrid


Job Summary: We are looking for a highly skilled AWS Data Engineer with over 5 years of experience in designing, developing, and maintaining scalable data pipelines on AWS. The ideal candidate will be proficient in data engineering best practices and cloud-native technologies, with hands-on experience building ETL/ELT pipelines, working with large datasets, and optimizing data architecture for analytics and business intelligence.

Key Responsibilities:
- Design, build, and maintain scalable, robust data pipelines and ETL processes using AWS services (e.g., Glue, Lambda, EMR, Redshift, S3, Athena).
- Collaborate with data analysts, data scientists, and stakeholders to understand data requirements and deliver high-quality solutions.
- Implement data lake and data warehouse architectures, ensuring data governance, data quality, and compliance.
- Optimize data pipelines for performance, reliability, scalability, and cost.
- Automate data ingestion and transformation workflows using Python, PySpark, or Scala.
- Manage and monitor data infrastructure, including logging, error handling, alerting, and performance metrics.
- Leverage infrastructure-as-code tools like Terraform or AWS CloudFormation for infrastructure deployment.
- Ensure security best practices are implemented for data access and storage (IAM, KMS, encryption, etc.).
- Document data processes, architectures, and standards.

Required Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- Minimum 5 years of experience as a Data Engineer with a focus on AWS cloud services.
- Strong experience building ETL/ELT pipelines using AWS Glue, EMR, Lambda, and Step Functions.
- Proficiency in SQL, Python, PySpark, and data modeling techniques.
- Experience working with data lakes (S3) and data warehouses (Redshift, Snowflake, etc.).
- Experience with Athena, Kinesis, Kafka, or similar streaming data tools is a plus.
- Familiarity with DevOps and CI/CD processes, using tools like Git, Jenkins, or GitHub Actions.
- Understanding of data privacy, governance, and compliance standards such as GDPR, HIPAA, etc.
- Strong problem-solving and analytical skills, with the ability to work in a fast-paced environment.

Posted 2 weeks ago

Apply

7.0 - 12.0 years

20 - 22 Lacs

Bengaluru

Remote


Collaborate with senior stakeholders to gather requirements, address constraints, and craft adaptable data architectures. Convert business needs into blueprints, guide agile teams, maintain quality data pipelines, and drive continuous improvement.

Required candidate profile: 7+ years in data roles (Data Architect/Engineer). Skilled in modelling (including Data Vault 2.0), Snowflake, SQL/Python, ETL/ELT, CI/CD, data mesh, governance, and APIs. Agile, with strong stakeholder and communication skills.

Perks and benefits: as per industry standards.

Posted 2 weeks ago

Apply

4.0 - 6.0 years

0 - 3 Lacs

Chennai

Work from Office


Data Engineer (Azure ADF, Blob Storage, Snowflake, SQL, Data Analysis)
- Good to have: certification in Azure and Snowflake
- Experience with other Azure services such as Azure Synapse Analytics, Databricks, or Logic Apps
- Familiarity with CI/CD pipelines for data workflows (Azure Repos, GitHub)
- Understanding of data governance and compliance standards
- Knowledge of data visualization tools such as Power BI, Tableau, or similar platforms
- Domain knowledge in Supply Chain Management (SCM)

Posted 2 weeks ago

Apply

7.0 - 12.0 years

15 - 25 Lacs

Hyderabad, Bengaluru

Hybrid


Position/Role: Data Engineer
Experience: 7-10 years
Notice period: Immediate to 30 days
Location: Bangalore/Hyderabad

Job Overview:
Primary Skills: Snowflake (4+ years of experience), Python, PySpark, SQL stored procedures
Secondary Skills: AWS services (e.g., S3, IAM, Glue, Lambda)

Kindly share the following details: updated CV, relevant skills, total experience, current company, current CTC, expected CTC, notice period, current location, and preferred location.

Posted 2 weeks ago

Apply

8.0 - 13.0 years

5 - 15 Lacs

Bengaluru

Work from Office


SUMMARY

Job Role: Snowflake Data Engineering Professional
Location: Bangalore
Experience: The ideal candidate should possess at least 8 years of experience in Snowflake with a focus on data engineering.
Primary Skills: Proficiency in Snowflake, DBT, and AWS.
Good-to-have Skills: Familiarity with Fivetran (HVR) and Python.

Responsibilities:
- Design, develop, and maintain data pipelines using Snowflake, DBT, and AWS.
- Collaborate with cross-functional teams to understand data requirements and deliver solutions.
- Optimize and troubleshoot existing data workflows to ensure efficiency and reliability.
- Implement best practices for data management and governance.
- Stay updated on the latest industry trends and technologies to continuously improve the data infrastructure.

Required Skills:
- Strong experience in data modeling, ETL processes, and data warehousing.
- Strong problem-solving skills and attention to detail.
- Excellent communication and teamwork abilities.

Preferred Skills:
- Knowledge of Fivetran (HVR) and Python.
- Familiarity with data integration tools and techniques.
- Ability to work in a fast-paced, agile environment.

Requirements:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 8 years of relevant experience in Snowflake with data engineering.
- Proficiency in Snowflake, DBT, and AWS.

Posted 2 weeks ago

Apply

7.0 - 8.0 years

27 - 42 Lacs

Chennai

Work from Office


Role: Dataiku Developer
Location: PAN India
Notice: Immediate to 60 days

Job Description:
- Hands-on experience in Dataiku along with good knowledge of SQL.
- Able to understand and translate business needs into data models supporting long-term solutions.
- Must have experience converting SQL/Alteryx scripts in Dataiku.
- Develop and design efficient Dataiku workflows by applying best practices.
- Design and execute flowzone development plans for existing Alteryx flows/SQL scripts.
- Experience working with large datasets and managing huge volumes of data from multiple data sources; able to decompose high-level information into details for downstream reporting and ad-hoc analysis.
- Ability to understand the business and design/develop KPIs for business decision-making.
- Experience with data gathering, exploration, transformation, analysis, and mining, ensuring data quality and producing analytics-ready datasets.
- Good knowledge of metadata management and data modeling.
- Strong analytical and problem-solving skills.

Role & Responsibilities:
- Hands-on work with Dataiku and SQL/Python functions and programs.
- Optimize existing flowzones to save time and improve process efficiency.
- Efficiently use global variables and procedures to simplify the coding process.
- Write complex queries on multiple tables to create a unified view of the data.
- Convert existing Alteryx workflows/SQL scripts in Dataiku.
- Prepare technical specifications and documentation to support BI reports.

Skills & Experience:
- 3+ years of experience in SQL.
- Hands-on experience working with Dataiku; must hold the Core Designer certification.
- Knowledge of databases, good knowledge of statistical modeling, and the ability to develop complex models.
- Good data analysis skills for identifying trends, patterns, and other data anomalies.
- Ability to write macros, advanced queries, models, functions, and formulae.
- Collaborate with client leads to coordinate the data extraction process and discuss results with engagement teams.
- Experience with a Snowflake environment would be advantageous.
- Should be able to bring new ideas and innovative solutions to our clients.

Posted 2 weeks ago

Apply

4.0 - 7.0 years

6 - 15 Lacs

Kolkata, Gurugram, Chennai

Hybrid


Sr Data Engineer (SQL, Python, Snowflake, AWS)
Experience: 4 to 7 years

We are looking for a Data Engineer with expertise in SQL and Python, along with foundational knowledge of AWS Glue ETL. The ideal candidate should have experience with access roles, policies, and RBAC/PBAC implementation, especially within Snowflake. Experience working in agile product teams is essential, and certifications in AWS Cloud and Snowflake engineering are highly preferred.

Roles and Responsibilities:
- Write efficient SQL queries to process and manipulate large datasets.
- Develop and optimize Python-based data processing scripts and workflows.
- Design, develop, and maintain scalable ETL pipelines using AWS Glue.
- Participate in agile development processes, including sprint planning, stand-ups, and retrospectives.
- Work on cloud-based data solutions, leveraging AWS and Snowflake best practices.
- Apply access control mechanisms, including RBAC (role-based access control) and PBAC (policy-based access control), within Snowflake.
- Configure and monitor Snowflake access roles, policies, and user permissions.

Required Qualifications:
- 3-5 years of experience in data engineering or a related field.
- Strong proficiency in SQL and Python.
- Basic understanding of AWS Glue ETL and its functionalities.
- Experience working in an agile product development environment.
- Basic knowledge of access roles, policies, RBAC, PBAC, and Snowflake access role management.
- Strong understanding of cloud-based data solutions (AWS, Snowflake).
- Excellent problem-solving and analytical skills.

Preferred Qualifications:
- AWS Cloud certification (AWS Certified Data Analytics, AWS Certified Solutions Architect, etc.).
- Snowflake engineering certification.
- Experience with big data processing frameworks (Spark, Hadoop).
- Knowledge of data governance and security best practices.
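The RBAC/PBAC work this listing describes maps onto Snowflake roughly as follows: roles and grants for role-based control, and a row access policy for policy-based control. All names below (analyst_role, jane_doe, the mapping table) are assumptions for the sketch:

    -- RBAC: a role with read access to a schema
    CREATE ROLE IF NOT EXISTS analyst_role;
    GRANT USAGE  ON WAREHOUSE analytics_wh    TO ROLE analyst_role;
    GRANT USAGE  ON DATABASE  analytics       TO ROLE analyst_role;
    GRANT USAGE  ON SCHEMA    analytics.marts TO ROLE analyst_role;
    GRANT SELECT ON ALL TABLES IN SCHEMA analytics.marts TO ROLE analyst_role;
    GRANT ROLE analyst_role TO USER jane_doe;

    -- PBAC: a row access policy restricting rows via a user-to-region mapping
    CREATE OR REPLACE ROW ACCESS POLICY region_policy
      AS (region STRING) RETURNS BOOLEAN ->
        CURRENT_ROLE() = 'ADMIN_ROLE'
        OR EXISTS (SELECT 1 FROM security.user_region_map m
                   WHERE m.user_name = CURRENT_USER() AND m.region = region);

    ALTER TABLE analytics.marts.orders
      ADD ROW ACCESS POLICY region_policy ON (region);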

Posted 2 weeks ago

Apply

6.0 - 8.0 years

10 - 12 Lacs

Hyderabad

Work from Office


Seeking an ETL Developer with expertise in Informatica IICS/PowerCenter, strong SQL skills, Snowflake integration, and cloud applications. You will design high-quality ETL pipelines, ensure performance, mentor junior developers, and collaborate across teams.

Posted 2 weeks ago

Apply

Exploring Snowflake Jobs in India

Snowflake has become one of the most sought-after skills in the tech industry, with a growing demand for professionals who are proficient in handling data warehousing and analytics using this cloud-based platform. In India, the job market for Snowflake roles is flourishing, offering numerous opportunities for job seekers with the right skill set.

Top Hiring Locations in India

  1. Bangalore
  2. Hyderabad
  3. Pune
  4. Mumbai
  5. Chennai

These cities are known for their thriving tech industries and have a high demand for Snowflake professionals.

Average Salary Range

The average salary range for Snowflake professionals in India varies by experience level:

  • Entry-level: INR 6-8 lakhs per annum
  • Mid-level: INR 10-15 lakhs per annum
  • Experienced: INR 18-25 lakhs per annum

Career Path

A typical career path in Snowflake may progress through roles such as:

  1. Junior Snowflake Developer
  2. Snowflake Developer
  3. Senior Snowflake Developer
  4. Snowflake Architect
  5. Snowflake Consultant
  6. Snowflake Administrator

Related Skills

In addition to expertise in Snowflake, professionals in this field are often expected to have knowledge of:

  • SQL
  • Data warehousing concepts
  • ETL tools
  • Cloud platforms (AWS, Azure, GCP)
  • Database management

Interview Questions

  • What is Snowflake and how does it differ from traditional data warehousing solutions? (basic)
  • Explain how Snowflake handles data storage and compute resources in the cloud. (medium)
  • How do you optimize query performance in Snowflake? (medium)
  • Can you explain how data sharing works in Snowflake? (medium)
  • What are the different stages in the Snowflake architecture? (advanced)
  • How do you handle data encryption in Snowflake? (medium)
  • Describe a challenging project you worked on using Snowflake and how you overcame obstacles. (advanced)
  • How does Snowflake ensure data security and compliance? (medium)
  • What are the benefits of using Snowflake over traditional data warehouses? (basic)
  • Explain the concept of virtual warehouses in Snowflake. (medium; see the SQL sketch after this list)
  • How do you monitor and troubleshoot performance issues in Snowflake? (medium)
  • Can you discuss your experience with Snowflake's semi-structured data handling capabilities? (advanced)
  • What are Snowflake's data loading options and best practices? (medium)
  • How do you manage access control and permissions in Snowflake? (medium)
  • Describe a scenario where you had to optimize a Snowflake data pipeline for efficiency. (advanced)
  • How do you handle versioning and change management in Snowflake? (medium)
  • What are the limitations of Snowflake and how would you work around them? (advanced)
  • Explain how Snowflake supports semi-structured data formats like JSON and XML. (medium)
  • What are the considerations for scaling Snowflake for large datasets and high concurrency? (advanced)
  • How do you approach data modeling in Snowflake compared to traditional databases? (medium)
  • Discuss your experience with Snowflake's time travel and data retention features. (medium; see the SQL sketch after this list)
  • How would you migrate an on-premise data warehouse to Snowflake in a production environment? (advanced)
  • What are the best practices for data governance and metadata management in Snowflake? (medium)
  • How do you ensure data quality and integrity in Snowflake pipelines? (medium)
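Two of the questions above, on virtual warehouses and time travel, lend themselves to a quick worked example. A minimal sketch, with the warehouse and table names assumed for illustration:

    -- Virtual warehouse: compute that scales and suspends independently of storage
    CREATE WAREHOUSE reporting_wh
      WAREHOUSE_SIZE = 'SMALL'
      AUTO_SUSPEND   = 60     -- seconds idle before suspending (stops credit burn)
      AUTO_RESUME    = TRUE;

    -- Time travel: query a table as it existed one hour ago
    SELECT * FROM analytics.orders AT (OFFSET => -3600);

    -- Or recover a dropped table within the retention window
    UNDROP TABLE analytics.orders;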

Closing Remark

As you explore opportunities in the Snowflake job market in India, remember to showcase your expertise in handling data analytics and warehousing using this powerful platform. Prepare thoroughly for interviews, demonstrate your skills confidently, and keep abreast of the latest developments in Snowflake to stay competitive in the tech industry. Good luck with your job search!
