
193 dbt Jobs - Page 6

Set up a Job Alert
JobPe aggregates listings so they are easy to browse in one place; applications are submitted directly on the original job portal.

5 - 9 years

20 - 25 Lacs

Pune, Gurugram, Bengaluru

Hybrid


We're looking for a motivated and detail-oriented Senior Snowflake Developer with strong SQL querying skills and a willingness to learn and grow with our team. As a Senior Snowflake Developer, you will play a key role in developing and maintaining our Snowflake data platform, working closely with our data engineering and analytics teams.

Responsibilities:
- Write optimized and performant ingestion pipelines
- Manage a team of junior software developers
- Write efficient and scalable SQL queries to support data analytics and reporting
- Collaborate with data engineers, architects, and analysts to design and implement data pipelines and workflows
- Troubleshoot and resolve data-related issues and errors
- Conduct code reviews and contribute to the improvement of our Snowflake development standards
- Stay up to date with the latest Snowflake features and best practices

Requirements:
- 5+ years of experience with Snowflake
- Strong SQL querying skills, including data modeling, data warehousing, and ETL/ELT design
- Advanced understanding of data engineering principles and practices
- Familiarity with Informatica Intelligent Cloud Services (IICS) or similar data integration tools is a plus
- Excellent problem-solving skills, attention to detail, and an analytical mindset
- Strong communication and collaboration skills, with the ability to work effectively with cross-functional teams

Nice to Have:
- Experience using Snowflake Streamlit and Cortex
- Knowledge of data governance, data quality, and data security best practices
- Familiarity with Agile development methodologies and version control systems like Git
- Certification in Snowflake or a related data platform is a plus
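
For illustration, below is a minimal Python sketch of the kind of reporting query work this role describes, using the snowflake-connector-python package. The account credentials, schema, and table names are placeholders, not details from the posting.

```python
# Minimal sketch: running an analytics query against Snowflake from Python.
# Connection parameters and table names below are hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",        # hypothetical account identifier
    user="my_user",
    password="my_password",
    warehouse="ANALYTICS_WH",
    database="ANALYTICS_DB",
    schema="PUBLIC",
)

try:
    cur = conn.cursor()
    # Example of the kind of reporting query the role mentions:
    # daily order counts aggregated for downstream dashboards.
    cur.execute(
        """
        SELECT order_date, COUNT(*) AS orders
        FROM raw.orders
        GROUP BY order_date
        ORDER BY order_date
        """
    )
    for order_date, orders in cur.fetchall():
        print(order_date, orders)
finally:
    conn.close()
```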

Posted 1 month ago

Apply

5 - 7 years

15 - 25 Lacs

Pune, Mumbai (All Areas)

Hybrid


DUTIES AND RESPONSIBILITIES:
- Build ETL (extract, transform, and load) jobs using Fivetran and dbt for our internal projects and for customers on platforms such as Azure, Salesforce, and AWS
- Monitor active ETL jobs in production
- Build out data lineage artifacts to ensure all current and future systems are properly documented
- Assist with design/mapping documentation so that development is clear and testable for QA and UAT purposes
- Assess current and future data transformation needs to recommend, develop, and train on new data integration tool technologies
- Discover efficiencies in shared data processes and batch schedules to eliminate redundancy and ensure smooth operations
- Assist the Data Quality Analyst in implementing checks and balances across all jobs to ensure data quality throughout the environment for current and future batch jobs
- Hands-on development and implementation of large-scale data warehouses, Business Intelligence and MDM solutions, including Data Lakes/Data Vaults

SUPERVISORY RESPONSIBILITIES: This job has no supervisory responsibilities.

QUALIFICATIONS:
- Bachelor's degree in Computer Science, Math, Software Engineering, Computer Engineering, or a related field AND 6+ years of experience in business analytics, data science, software development, data modeling, or data engineering
- 3-5 years of experience with strong SQL query/development skills
- Ability to develop ETL routines that manipulate and transfer large volumes of data and perform quality checks
- Hands-on experience with ETL tools (e.g., Informatica, Talend, dbt, Azure Data Factory)
- Experience working in the healthcare industry with PHI/PII
- Creative, lateral, and critical thinker; excellent communicator with well-developed interpersonal skills
- Good at prioritizing tasks and time management; able to describe, create, and implement new solutions
- Experience with related or complementary open-source platforms and languages (e.g., Java, Linux, Apache, Perl/Python/PHP, Chef)
- Knowledge of or hands-on experience with BI tools and reporting software (e.g., Cognos, Power BI, Tableau)
- Familiarity with the big data stack (e.g., Snowflake/Snowpark, Spark, MapReduce, Hadoop, Sqoop, Pig, HBase, Hive, Flume)
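
As a rough illustration of the dbt portion of this workflow, here is a minimal Python sketch that runs dbt models and tests as a scheduled step after a Fivetran sync. The project directory and model selector are hypothetical, not taken from the posting.

```python
# Minimal sketch: invoking the dbt CLI from a scheduled ETL script.
# The project path and model selector are hypothetical placeholders.
import subprocess
import sys

def run(cmd: list[str]) -> None:
    """Run a dbt CLI command and exit with its code if it fails."""
    print("running:", " ".join(cmd))
    result = subprocess.run(cmd, check=False)
    if result.returncode != 0:
        sys.exit(result.returncode)

if __name__ == "__main__":
    project_dir = "/opt/analytics/dbt_project"   # hypothetical path
    # Build the transformed models, then run the tests defined alongside them.
    run(["dbt", "run", "--project-dir", project_dir, "--select", "staging+"])
    run(["dbt", "test", "--project-dir", project_dir, "--select", "staging+"])
```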

Posted 1 month ago

Apply

3 - 8 years

6 - 15 Lacs

Hyderabad

Work from Office


Novastrid is hiring an experienced Data Engineer for a leading Tier-1 company in Hyderabad. If you're passionate about building robust, scalable data systems and working with cutting-edge big data technologies, this is your opportunity to work with one of the best in the industry.

Role & responsibilities:
- Design and implement scalable, high-performance batch and real-time data pipelines using Apache Spark, Kafka, Java, and SQL
- Build and maintain ETL/ELT frameworks handling structured, semi-structured, and unstructured data
- Work on streaming data solutions using Spark Structured Streaming and Kafka
- Develop and optimize data models and implement data warehousing solutions on AWS/Azure/GCP
- Automate and orchestrate workflows using Apache Airflow, DBT, or equivalent tools
- Collaborate with cross-functional teams (Data Science, Product, Engineering)
- Monitor, troubleshoot, and ensure reliability of data systems
- Follow best practices in data governance, security, and cloud cost optimization

Preferred candidate profile:
- 3 to 8 years of hands-on experience in Data Engineering / Big Data development
- Strong expertise in Apache Spark, Kafka, Java (production-grade experience), and advanced SQL; Python/Scala optional but a plus
- Experience with cloud platforms (AWS/Azure/GCP)
- Familiarity with Git, CI/CD pipelines, and modern data ops practices

Good to have:
- Experience with NoSQL (MongoDB, Cassandra, DynamoDB)
- Exposure to Docker and Kubernetes
- Domain experience in Banking / FinTech / Financial Services

Educational qualifications: Bachelor's or Master's degree in Computer Science, Information Systems, Data Engineering, or a related field
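
For context, here is a minimal Python (PySpark) sketch of the Spark Structured Streaming plus Kafka pattern the responsibilities mention. The broker address, topic, and checkpoint path are illustrative placeholders, not details from the posting.

```python
# Minimal sketch: a Spark Structured Streaming job reading events from Kafka.
# Broker address, topic, and checkpoint path are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("kafka-stream-example").getOrCreate()

events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "orders")
    .option("startingOffsets", "latest")
    .load()
)

# Kafka messages arrive as binary key/value pairs; cast the value to a string
# before any downstream parsing or aggregation.
parsed = events.select(col("value").cast("string").alias("payload"))

query = (
    parsed.writeStream
    .format("console")          # in practice this would be a Delta or warehouse sink
    .option("checkpointLocation", "/tmp/checkpoints/orders")
    .start()
)
query.awaitTermination()
```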

Posted 1 month ago

Apply

4 - 6 years

12 - 15 Lacs

Hyderabad

Remote


Job Summary We are looking for a Data Modeler to design and optimize data models supporting automotive industry analytics and reporting. The ideal candidate will work with SAP ECC as a primary data source, leveraging Databricks and Azure Cloud to design scalable and efficient data architectures. This role involves developing logical and physical data models, ensuring data consistency, and collaborating with data engineers, business analysts, and domain experts to enable high-quality analytics solutions. Key Responsibilities: 1. Data Modeling & Architecture: Design and maintain conceptual, logical, and physical data models for structured and unstructured data. 2. SAP ECC Data Integration: Define data structures for extracting, transforming, and integrating SAP ECC data into Azure Databricks. 3. Automotive Domain Modeling: Develop and optimize industry-specific data models covering customer, vehicle, material, and location data. 4. Databricks & Delta Lake Optimization: Design efficient data models for Delta Lake storage and Databricks processing. 5. Performance Tuning: Optimize data structures, indexing, and partitioning strategies for performance and scalability. 6. Metadata & Data Governance: Implement data standards, data lineage tracking, and governance frameworks to maintain data integrity and compliance. 7. Collaboration: Work closely with business stakeholders, data engineers, and data analysts to align models with business needs. 8. Documentation: Create and maintain data dictionaries, entity-relationship diagrams (ERDs), and transformation logic documentation. Skills & Qualifications Data Modeling Expertise: Strong experience in dimensional modeling, 3NF, and hybrid modeling approaches. Automotive Industry Knowledge: Understanding of customer, vehicle, material, and dealership data models. SAP ECC Data Structures: Hands-on experience with SAP ECC tables, business objects, and extraction processes. Azure & Databricks Proficiency: Experience working with Azure Data Lake, Databricks, and Delta Lake for large-scale data processing. SQL & Database Management: Strong skills in SQL, T-SQL, or PL/SQL, with a focus on query optimization and indexing. ETL & Data Integration: Experience collaborating with data engineering teams on data transformation and ingestion processes. Data Governance & Quality: Understanding of data governance principles, lineage tracking, and master data management (MDM). Strong Documentation Skills: Ability to create ER diagrams, data dictionaries, and transformation rules. Preferred Qualifications Experience with data modeling tools such as Erwin, Lucidchart, or DBT. Knowledge of Databricks Unity Catalog and Azure Synapse Analytics. Familiarity with Kafka/Event Hub for real-time data streaming. Exposure to Power BI/Tableau for data visualization and reporting.
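
As an illustration of the Delta Lake optimization work listed above (partitioning a modelled dimension for efficient Databricks processing), here is a minimal PySpark sketch. The mount paths, table name, and columns are hypothetical, not from the posting.

```python
# Minimal sketch: persisting a modelled dimension as a partitioned Delta table
# on Databricks. Paths and column names are hypothetical placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("delta-model-example").getOrCreate()

vehicles = spark.read.parquet("/mnt/raw/sap_ecc/vehicles")  # hypothetical source extract

(
    vehicles
    .dropDuplicates(["vehicle_id"])            # enforce one row per business key
    .write
    .format("delta")
    .mode("overwrite")
    .partitionBy("model_year")                 # partitioning strategy for pruning
    .save("/mnt/curated/dim_vehicle")
)
```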

Posted 1 month ago

Apply

0 - 2 years

3 - 8 Lacs

Lucknow

Hybrid


Develop and maintain scalable data pipelines. Collaborate with data scientists and analysts to support business needs. Work with cloud platforms like AWS, Azure, or Google Cloud. Work effectively with cross-functional teams. Data modelling.

Posted 1 month ago

Apply

4 - 9 years

0 - 0 Lacs

Bengaluru

Remote


Hi, Synergy Technologies is a leader in technology services and consulting. We enable clients across the world to create and execute strategies. We help our clients find the right problems to solve, and to solve them effectively. We bring our expertise and innovation to every project we undertake.

Position: Business Intelligence Developer
Duration: Contract to full time
Location: Remote

Job Description:

Business Intelligence Developer Opportunity
Our mission is clear: to enhance the safety and well-being of workers across the globe. As a trailblazer in software solutions, we empower businesses and their suppliers with a platform that champions safety, sustainability, and risk management within supply chains. Join our close-knit team of Data Systems and Internal Business Intelligence experts, where you can live out our core values daily and contribute to impactful projects that further the company's vision.

About the Role
As a Business Intelligence Developer, you will play a critical role in developing impactful business intelligence solutions that empower internal teams with data-driven insights for strategic decision-making. Working closely with business analysts, data engineers, and stakeholders, you'll design and build data models, interactive reports, and dashboards to transform complex data into clear, actionable insights. Your efforts will ensure data quality, accuracy, and governance while enhancing accessibility for business users.

Key Responsibilities:
- Develop BI Solutions: Design, develop, and implement data models, dashboards, and reports using Power BI to support data-driven initiatives.
- Data Modeling & Integration: Collaborate with data engineers and analysts to create optimized data models that aggregate data from multiple sources, ensuring scalability and alignment with business needs.
- Enhance Data Accuracy: Continuously improve data accuracy, standardize key metrics, and refine reporting processes to drive operational efficiency.
- Ensure Data Governance: Adhere to the company's data governance policies, ensuring that all BI solutions comply with data security standards, especially for sensitive information.
- Optimize BI Performance: Monitor BI solutions to ensure performance and reliable data access, implementing enhancements as needed.
- Documentation & User Support: Maintain comprehensive documentation of dashboards, data models, and processes; provide end-user training to maximize tool effectiveness.
- Adapt and Innovate: Stay informed on BI best practices and emerging technologies to proactively enhance BI capabilities.

Qualifications:
- Education: Bachelor's degree in Data Science, Business Analytics, Computer Science, or a related field.
- Experience: Minimum of 5 years in business intelligence development, including data modeling, reporting, and dashboard creation.
- Power BI Expertise: Strong experience with Power BI, including advanced DAX calculations, data modeling, and creating visually engaging, actionable dashboards.
- dbt Labs Cloud IDE: At least 1 year of hands-on experience with dbt Labs Cloud IDE is required.
- Technical Skills: Proficiency in SQL and modern cloud-based data warehousing concepts, with experience in Snowflake, SQL Server, or Redshift.
- Cloud and ERP/CRM Proficiency: Familiarity with platforms such as NetSuite, Salesforce, Fivetran, and API integrations; experience with SaaS systems like Zuora Billing, ChurnZero, Marketo, and Qualtrics is a plus.
- Communication Skills: Ability to translate technical insights into business-friendly language.

Preferred Skills:
- Certifications: Power BI, Snowflake, or similar BI tools.
- Portfolio: Ability to provide redacted samples of Power BI dashboards.
- SaaS Experience: Background in SaaS organizations is beneficial.

Posted 1 month ago

Apply

10 - 18 years

20 - 35 Lacs

Hyderabad

Hybrid


Job Summary: We are looking for an experienced and highly skilled Senior Python Developer with strong hands-on expertise in Snowflake to join our growing data engineering team. The ideal candidate will have a solid background in building scalable data pipelines, data modeling, and integrating Python-based solutions with Snowflake. Roles and Responsibilities: Design, develop, and maintain scalable and efficient data pipelines using Python and Snowflake. Collaborate with data architects and analysts to understand data requirements and translate them into technical solutions. Write complex SQL queries and stored procedures in Snowflake. Optimize Snowflake performance using best practices for data modeling, partitioning, and caching. Develop and deploy Python-based ETL/ELT processes. Integrate Snowflake with other data sources, APIs, or BI tools. Implement and maintain CI/CD pipelines for data solutions. Ensure data quality, governance, and security standards are maintained. Required Skills and Qualifications: Strong programming skills in Python with a focus on data processing and automation. Hands-on experience with Snowflake including SnowSQL, Snowpipe, data sharing, and performance tuning. Proficiency in SQL and working with large, complex datasets. Experience in designing and implementing ETL/ELT pipelines. Strong understanding of data warehousing concepts and data modeling (star/snowflake schema). Familiarity with cloud platforms such as AWS , Azure , or GCP . Experience with version control (e.g., Git) and CI/CD tools. Excellent problem-solving skills and attention to detail. Preferred Qualifications: Experience with Apache Airflow , DBT , or other workflow orchestration tools. Knowledge of data security and compliance standards . Experience integrating Snowflake with BI tools (Tableau, Power BI, etc.). Certification in Snowflake or relevant cloud platforms is a plus.
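
For context, here is one possible Python ELT step of the kind this role describes, bulk-loading a DataFrame into Snowflake with the connector's pandas helper. The credentials and target table (assumed to already exist) are hypothetical placeholders; this is a sketch of one approach, not the team's actual pipeline.

```python
# Minimal sketch: a small Python ELT step loading a pandas DataFrame into Snowflake.
# Credentials and the target table (assumed to already exist) are placeholders.
import pandas as pd
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas

df = pd.read_csv("daily_sales.csv")            # hypothetical extracted file

conn = snowflake.connector.connect(
    account="my_account",
    user="my_user",
    password="my_password",
    warehouse="LOAD_WH",
    database="ANALYTICS_DB",
    schema="STAGING",
)

try:
    # write_pandas bulk-loads the DataFrame via an internal stage and COPY INTO.
    success, n_chunks, n_rows, _ = write_pandas(conn, df, table_name="DAILY_SALES")
    print(f"loaded={success} chunks={n_chunks} rows={n_rows}")
finally:
    conn.close()
```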

Posted 1 month ago

Apply

8 - 12 years

20 - 35 Lacs

Pune, Chennai, Bengaluru

Hybrid


Role: Snowflake Developer
Experience: 8 - 12 years
- Expert in Python, Snowflake, SQL, and GitHub
- Experience in Dagster or Airflow is a must
- Should be able to grasp the landscape quickly to test and approve merge requests from Data Engineers
- Data modelling and architecture-level knowledge is needed
- Should be able to establish connectivity from different source systems (such as SAP and Beeline) to the existing setup and take ownership of it

Posted 1 month ago

Apply

7 - 10 years

17 - 25 Lacs

Pune, Chennai, Bengaluru

Work from Office


Data Engineer - DBT, Snowflake, Looker
Location: Remote
Experience: 7-10 years

About the Role: We are looking for an experienced Data Engineer to design and build scalable data pipelines and enable powerful business insights. You'll work with modern data stack tools like DBT, Snowflake, and Looker to empower data-driven decisions.

Key Responsibilities:
- Design and maintain scalable data pipelines (DBT, Snowflake)
- Perform data transformation, cleansing, and enrichment
- Integrate data from multiple sources into a data warehouse/data lake
- Support reporting, analytics, and BI with Looker or similar tools
- Optimize performance and troubleshoot data workflows
- Document processes and ensure data quality

Skills Required:
- DBT, Snowflake, Looker (or similar tools)
- Strong SQL and Python (or similar scripting)
- Data modeling, schema design, database optimization
- Problem-solving and business requirement translation
- Excellent communication and cross-functional collaboration

Drop your resume at: bhavikas@overturerede.com | Contact: 7428694900
We're hiring! Don't miss the chance to work with cutting-edge data platforms and make an impact. Reach out now!

Posted 1 month ago

Apply

4 - 8 years

0 - 1 Lacs

Mohali

Work from Office


Job Title: Snowflake Developer (4+ years' experience)
Location: F, 384, Sector 91 Rd, Phase 8B, Industrial Area, Sector 91, Sahibzada Ajit Singh Nagar, Punjab 160055
Job Type: Full-time (in-house)

Job Overview: We are looking for an experienced Snowflake Developer with 4+ years of hands-on experience in Snowflake Data Warehouse and related tools. You will be responsible for building, managing, and optimizing Snowflake data pipelines, assisting in data integration, and contributing to the overall data architecture. The ideal candidate should have a strong understanding of data modeling and ETL processes, and experience working with cloud-based data platforms.

Responsibilities:
- Design, develop, and maintain Snowflake Data Warehouses.
- Create and manage Snowflake schemas, tables, views, and materialized views.
- Implement ETL processes to integrate data from various sources into Snowflake.
- Optimize query performance and data storage in Snowflake.
- Work with stakeholders to define data requirements and provide technical solutions.
- Collaborate with Data Engineers, Data Scientists, and Analysts to build efficient data pipelines.
- Monitor and troubleshoot performance issues in Snowflake environments.
- Automate repetitive data processes and report generation tasks.
- Ensure data integrity, security, and compliance with data governance policies.
- Assist in data migration and platform upgrades.

Required Skills:
- 4+ years of experience working with Snowflake Data Warehouse.
- Proficient in SQL, SnowSQL, and ETL processes.
- Strong experience in data modeling and schema design in Snowflake.
- Experience with cloud platforms (AWS, Azure, or GCP).
- Familiarity with data pipelines, data lakes, and data integration tools.
- Experience in query optimization and performance tuning in Snowflake.
- Understanding of data governance and best practices.
- Strong knowledge of data security and privacy policies in a cloud environment.
- Experience using tools like dbt, Airflow, or similar orchestration tools is a plus.

Salary: No bar for deserving candidates.
Location: Mohali, Punjab (work from office)
Shift: Night shift

Other Benefits:
- 5-day work week
- US-based work culture and environment
- Indoor and outdoor events
- Paid leaves
- Health insurance
- Employee engagement activities such as month-end and festival celebrations, team outings, and birthday celebrations
- Gaming and sports area

Please comment/DM to know more. You may also e-mail your resume to me at priyankaaggarwal@sourcemash.com

Posted 1 month ago

Apply

2 - 3 years

10 - 12 Lacs

Gurgaon

Work from Office


Role & responsibilities:
- Develop and maintain ETL pipelines using Python & SQL.
- Work with Airflow for workflow orchestration.
- Implement data transformations using dbt.
- Optimize database performance and manage data warehousing.
- Collaborate on scalable data solutions using C#/Java.

Preferred candidate profile:
- 2-3 years of experience in data engineering.
- Strong expertise in SQL, Python, and ETL processes.
- Hands-on experience with Airflow, dbt, and modern data platforms.
- Knowledge of cloud data services (AWS/GCP/Azure) is a plus.
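
As an illustration of the Airflow-plus-dbt orchestration pattern this role describes, here is a minimal Airflow 2.x-style DAG sketch in Python. The schedule, dbt project path, and extract logic are hypothetical placeholders, not details from the posting.

```python
# Minimal sketch: an Airflow DAG that extracts with Python and then runs dbt models.
# The schedule, dbt project path, and extract logic are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.operators.bash import BashOperator

def extract_orders() -> None:
    # Placeholder for the extract step (e.g., pulling from an API into staging).
    print("extracting orders into the staging area")

with DAG(
    dag_id="daily_orders_pipeline",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_orders", python_callable=extract_orders)
    transform = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/airflow/dbt --select orders",
    )
    extract >> transform
```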

Posted 2 months ago

Apply

10 - 19 years

15 - 25 Lacs

Bengaluru

Remote


We are seeking a skilled Architect with expertise in Azure DevOps automation, focusing on Snowflake and DBT (Data Build Tool). The ideal candidate will be responsible for designing, implementing, and maintaining automated workflows and pipelines to support efficient and scalable data platform solutions. This role requires strong technical expertise in DevOps practices, automation, and cloud-based data technologies.

Key Responsibilities:
- Design and implement Azure DevOps pipelines and workflows to automate Snowflake and DBT processes.
- Develop and maintain CI/CD pipelines for data transformation, integration, and deployment.
- Collaborate with data engineers, analysts, and stakeholders to understand requirements and deliver efficient solutions.
- Ensure the scalability, reliability, and security of automated processes and workflows.
- Monitor and troubleshoot pipeline performance, identifying and resolving bottlenecks or issues.
- Develop and maintain technical documentation for workflows, best practices, and configurations.
- Stay updated with industry trends and emerging tools to enhance automation capabilities.

Required Skills and Qualifications:
- Proven experience as an Architect or Senior Engineer specializing in Azure DevOps automation.
- In-depth knowledge of Snowflake architecture and its integrations.
- Hands-on experience with DBT for data transformation and modeling.
- Proficiency in scripting languages (Python, PowerShell, etc.) for automation.
- Strong understanding of CI/CD principles and best practices.
- Experience with version control systems like Git.
- Familiarity with cloud-based data platforms and services.
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration abilities.

Preferred Qualifications:
- Certifications in Azure or Snowflake are a plus.
- Experience with other data tools and platforms is advantageous.

Posted 2 months ago

Apply

5 - 10 years

0 - 3 Lacs

Chennai, Bengaluru, Noida

Hybrid


We are hosting an Open Walk-in Drive in Pune on 5th April [Saturday] 2025.

Details of the Walk-in Drive:
Date: 5th April [Saturday] 2025
Experience: 5 to 10 years
Time: 9:30 AM to 4:00 PM
Point of Contact: Aishwarya G / aishwaryag5@hexaware.com
Venue: Hexaware Technologies Ltd, Phase 3, Hinjewadi Rajiv Gandhi Infotech Park, Hinjewadi, Pune, Pimpri-Chinchwad, Maharashtra 411057

Key Skills and Experience:
- Must have 5 - 10 years of experience in Data warehouse, ETL, and BI projects
- Must have at least 4+ years of experience in Snowflake
- Expertise in Snowflake architecture is a must
- Must have at least 3+ years of experience and a strong hold on Python/PySpark
- Must have experience implementing complex stored procedures and standard DWH and ETL concepts
- Proficient in Oracle database, complex PL/SQL, Unix shell scripting, performance tuning, and troubleshooting
- Good to have experience with AWS services and creating DevOps templates for various AWS services
- Experience in using GitHub and Jenkins
- Good communication and analytical skills
- Snowflake certification is desirable

What to Bring:
- Updated resume
- Photo ID and a passport-size photo
- Mention "Aishwarya G" at the top of your resume.

How to Register: To express your interest and confirm your participation, please reply to this email with your updated resume attached. Walk-ins are also welcome on the day of the event. This is an excellent opportunity to showcase your skills, network with industry professionals, and explore the exciting possibilities that await you at Hexaware Technologies. If you have any questions or require further information, please feel free to reach out to me at aishwaryag5@hexaware.com. We look forward to meeting you and exploring the potential of having you as a valuable member of our team.

Note: Candidates with less than 4 years of total experience will not be screen-selected to attend the interview.

Posted 2 months ago

Apply

5 - 10 years

0 - 3 Lacs

Pune, Coimbatore, Mumbai (All Areas)

Hybrid


We are hosting an Open Walk-in Drive in Pune on 5th April [Saturday] 2025.

Details of the Walk-in Drive:
Date: 5th April [Saturday] 2025
Experience: 5 to 10 years
Time: 9:30 AM to 4:00 PM
Point of Contact: Aishwarya G / aishwaryag5@hexaware.com
Venue: Hexaware Technologies Ltd, Phase 3, Hinjewadi Rajiv Gandhi Infotech Park, Hinjewadi, Pune, Pimpri-Chinchwad, Maharashtra 411057

Key Skills and Experience:
- Must have 5 - 10 years of experience in Data warehouse, ETL, and BI projects
- Must have at least 4+ years of experience in Snowflake
- Expertise in Snowflake architecture is a must
- Must have at least 3+ years of experience and a strong hold on Python/PySpark
- Must have experience implementing complex stored procedures and standard DWH and ETL concepts
- Proficient in Oracle database, complex PL/SQL, Unix shell scripting, performance tuning, and troubleshooting
- Good to have experience with AWS services and creating DevOps templates for various AWS services
- Experience in using GitHub and Jenkins
- Good communication and analytical skills
- Snowflake certification is desirable

What to Bring:
- Updated resume
- Photo ID and a passport-size photo
- Mention "Aishwarya G" at the top of your resume.

How to Register: To express your interest and confirm your participation, please reply to this email with your updated resume attached. Walk-ins are also welcome on the day of the event. This is an excellent opportunity to showcase your skills, network with industry professionals, and explore the exciting possibilities that await you at Hexaware Technologies. If you have any questions or require further information, please feel free to reach out to me at aishwaryag5@hexaware.com. We look forward to meeting you and exploring the potential of having you as a valuable member of our team.

Note: Candidates with less than 4 years of total experience will not be screen-selected to attend the interview.

Posted 2 months ago

Apply

2 - 6 years

9 - 12 Lacs

Pune

Work from Office


We are seeking a Data Warehouse Developer to design, develop, and deploy ETL pipelines that meet client requirements. Proficiency in SQL, ETL tools, and cloud platforms is required. Strong analytical, problem-solving, and communication skills are essential.

Posted 2 months ago

Apply

4 - 9 years

7 - 12 Lacs

Chennai, Bengaluru, Hyderabad

Work from Office


Job Summary: We're looking for an experienced professional with strong expertise in Snowflake (AWS), Airflow, DBT, Python, and SQL to develop and optimize scalable data solutions. The ideal candidate will have a deep understanding of data warehousing, ETL/ELT pipelines, cloud platforms, and analytics reporting. This role requires hands-on experience in building, managing, and optimizing data pipelines while ensuring data integrity, security, and compliance.

Key Responsibilities:
- Design, develop, and optimize Snowflake-based data solutions.
- Implement and manage ETL/ELT workflows using DBT, Airflow, Informatica, Pentaho, or Fivetran.
- Write and optimize SQL queries for efficient data retrieval and transformation.
- Work with AWS cloud services (Lambda, S3, SNS/SQS, EC2) for data automation and integration.
- Develop and maintain data pipelines to support analytics and reporting needs.
- Ensure data quality, transformation, normalization, and aggregation as per business requirements.
- Perform query performance tuning and troubleshooting in production environments.
- Support CI/CD deployments, change management, and root cause analysis (RCA).
- Develop functional business metrics across domains such as finance, retail, and telecom.
- Collaborate with cross-functional teams to ensure data security, compliance, and governance.

Qualifications & Skills:

Mandatory Skills:
- Snowflake (AWS): 4+ years of experience in advanced SQL and Snowflake development.
- Airflow: Experience in workflow orchestration and scheduling.
- DBT (Data Build Tool): Hands-on expertise in data transformation and modeling.
- Python: 3+ years of experience in advanced scripting and automation.
- SQL: Strong query optimization and data processing skills.

Technical Skills:
- Data Warehousing: 4+ years of experience in data modeling, star schema, normalization/denormalization.
- ETL/ELT Development: 3+ years of experience in DBT, Informatica, Pentaho, or Fivetran.
- Cloud Platforms: 3+ years of hands-on experience with AWS or any cloud environment.
- Data Analytics & Reporting: 4+ years of experience in data profiling, metric development, and performance tuning.

Soft Skills:
- Strong written and verbal communication skills for stakeholder collaboration.
- Ability to work in a team environment and support cross-functional projects.
- Experience working in enterprise environments, following best practices for CI/CD, security, change management, RCA, and on-call rotations.

Preferred Qualifications:
- Technical certifications in AWS / Snowflake.
- 4+ years of experience in ETL/ELT tools (DBT, Informatica, Fivetran).
- 4+ years of experience in industry-specific metric development (finance, retail, telecom).
- Team leadership experience and exposure to large-scale data support environments.

Location: Hyderabad, Bengaluru, Chennai, Pune, Kolkata

Posted 2 months ago

Apply

4 - 9 years

6 - 11 Lacs

Chennai, Bengaluru, Hyderabad

Work from Office


Job Summary: We're looking for an experienced professional with strong expertise in Snowflake (AWS), Airflow, DBT, Python, and SQL to develop and optimize scalable data solutions. The ideal candidate will have a deep understanding of data warehousing, ETL/ELT pipelines, cloud platforms, and analytics reporting. This role requires hands-on experience in building, managing, and optimizing data pipelines while ensuring data integrity, security, and compliance.

Key Responsibilities:
- Design, develop, and optimize Snowflake-based data solutions.
- Implement and manage ETL/ELT workflows using DBT, Airflow, Informatica, Pentaho, or Fivetran.
- Write and optimize SQL queries for efficient data retrieval and transformation.
- Work with AWS cloud services (Lambda, S3, SNS/SQS, EC2) for data automation and integration.
- Develop and maintain data pipelines to support analytics and reporting needs.
- Ensure data quality, transformation, normalization, and aggregation as per business requirements.
- Perform query performance tuning and troubleshooting in production environments.
- Support CI/CD deployments, change management, and root cause analysis (RCA).
- Develop functional business metrics across domains such as finance, retail, and telecom.
- Collaborate with cross-functional teams to ensure data security, compliance, and governance.

Qualifications & Skills:

Mandatory Skills:
- Snowflake (AWS): 4+ years of experience in advanced SQL and Snowflake development.
- Airflow: Experience in workflow orchestration and scheduling.
- DBT (Data Build Tool): Hands-on expertise in data transformation and modeling.
- Python: 3+ years of experience in advanced scripting and automation.
- SQL: Strong query optimization and data processing skills.

Technical Skills:
- Data Warehousing: 4+ years of experience in data modeling, star schema, normalization/denormalization.
- ETL/ELT Development: 3+ years of experience in DBT, Informatica, Pentaho, or Fivetran.
- Cloud Platforms: 3+ years of hands-on experience with AWS or any cloud environment.
- Data Analytics & Reporting: 4+ years of experience in data profiling, metric development, and performance tuning.

Soft Skills:
- Strong written and verbal communication skills for stakeholder collaboration.
- Ability to work in a team environment and support cross-functional projects.
- Experience working in enterprise environments, following best practices for CI/CD, security, change management, RCA, and on-call rotations.

Preferred Qualifications:
- Technical certifications in AWS / Snowflake.
- 4+ years of experience in ETL/ELT tools (DBT, Informatica, Fivetran).
- 4+ years of experience in industry-specific metric development (finance, retail, telecom).
- Team leadership experience and exposure to large-scale data support environments.

Location: Hyderabad, Bengaluru, Chennai, Pune, Kolkata

Posted 2 months ago

Apply

5 - 8 years

13 - 19 Lacs

Pune

Work from Office


CANDIDATE SHOULD BE AVAILABLE FOR A FACE-TO-FACE INTERVIEW ON 12TH APRIL AND SHOULD BE READY TO WORK FROM OFFICE 5 DAYS A WEEK.

A Data Engineer is responsible for designing, building, and maintaining robust data pipelines and architectures that enable the collection, transformation, and storage of large datasets; ensuring data quality and reliability; supporting data-driven decision-making; and facilitating the integration of various data sources into centralized systems.

Responsibilities:
- Develop and manage data pipelines for extracting, loading, and transforming data from multiple sources.
- Work with open-source and cloud-based databases (e.g., PostgreSQL, Snowflake, BigQuery, Redshift).
- Automate database operations and ETL tasks using programming languages such as Python and frameworks like Spark.
- Implement CI/CD practices and version control to streamline deployments.
- Ensure efficient and reliable orchestration using tools like Apache Airflow, Prefect, or Dagster.
- Experience working on API integration and real-time streaming.

Tech stack:
- Databases: preferably PostgreSQL and MongoDB, or any RDBMS/NoSQL databases
- Programming languages: Python, Spark, SQL, DBT
- Orchestration: Apache Airflow, Prefect, NiFi
- Cloud: AWS (e.g., S3, Redshift, Glue), Google Cloud Platform (e.g., BigQuery, Cloud Composer)
- Streaming: Apache Kafka / Google Cloud Pub/Sub
- DevOps: Docker/Kubernetes, Git
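
For illustration, here is a minimal Prefect 2-style flow in Python showing the extract-transform-load orchestration pattern listed above. The data, connection targets, and task bodies are hypothetical placeholders, not details from the posting.

```python
# Minimal sketch: a Prefect 2-style flow orchestrating an extract-transform-load step.
# The data and target systems are hypothetical placeholders.
from prefect import flow, task

@task
def extract() -> list[dict]:
    # Placeholder extract: in practice this might query PostgreSQL or an API.
    return [{"order_id": 1, "amount": 120.0}, {"order_id": 2, "amount": 75.5}]

@task
def transform(rows: list[dict]) -> list[dict]:
    # Simple cleaning step: drop non-positive amounts.
    return [r for r in rows if r["amount"] > 0]

@task
def load(rows: list[dict]) -> None:
    # Placeholder load: in practice this would write to Snowflake/BigQuery/Redshift.
    print(f"loading {len(rows)} rows")

@flow
def daily_orders_pipeline() -> None:
    rows = extract()
    clean = transform(rows)
    load(clean)

if __name__ == "__main__":
    daily_orders_pipeline()
```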

Posted 2 months ago

Apply

5 - 10 years

20 - 35 Lacs

Chennai

Hybrid


Responsibilities:
- Design, build, and test end-to-end data pipelines for data ingestion, integration, and curation.
- Implement automation for data workflows and testing, with data quality checks, to enhance efficiency and reliability.
- Continuously optimize data pipelines to support diverse workloads and business requirements, improving performance over time.
- Create, maintain, and scale our cloud-based data platform, ensuring high availability and scalability.
- Address and minimize technical debt to ensure a streamlined and efficient codebase.
- Develop reusable frameworks and components to streamline data engineering processes and boost development efficiency.
- Lead key data engineering projects from inception to successful delivery, taking ownership and ensuring completion with minimal oversight.
- Collaborate closely with analysts, business partners, and other stakeholders to understand data requirements and deliver high-quality solutions.
- Document data processes, workflows, and systems comprehensively to maintain clarity and transparency.

What You Bring to the Table:
- Experience with data processing tools such as Snowflake, DBT, Databricks, Azure (ADF and Fabric), and GCP BigQuery.
- Expertise in database optimization (partitioning, group and sort keys, indexes, query optimization).
- Hands-on coding experience with languages like Python, PySpark, and SQL to access, extract, manipulate, and summarize data.
- Programming and/or scripting experience: Python, PySpark.
- Experience in automation and testing of data workflows, preferably with Azure ADF.
- Familiarity with a broad base of analytical methods like data modeling, variable-based transformation & summarization, and algorithmic development.

What's Needed - Basic Qualifications:
- 6+ years of hands-on experience designing solutions on cloud computing platforms like Snowflake, Microsoft Azure, or Google Cloud.
- 8+ years of experience with cloud-based databases, preferably Snowflake or Google BigQuery.
- 7+ years of experience developing scalable solutions using Python or PySpark.
- 8+ years of working experience writing complex SQL.
- 8+ years of working in Agile environments.
- 5+ years of experience in CI/CD.

Note: Immediate joiners or candidates serving notice period until month end.
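
As an illustration of the automated data quality checks mentioned above, here is a minimal Python sketch of the kind of pre-publish validation a pipeline might run. The column names and rules are hypothetical placeholders, not the employer's actual checks.

```python
# Minimal sketch: lightweight data quality checks run before a pipeline publishes data.
# Column names and rules are hypothetical placeholders.
import pandas as pd

def check_quality(df: pd.DataFrame) -> list[str]:
    """Return a list of human-readable data quality failures (empty list = pass)."""
    failures = []
    if df.empty:
        failures.append("dataset is empty")
    if df["customer_id"].isnull().any():
        failures.append("null customer_id values found")
    if df.duplicated(subset=["order_id"]).any():
        failures.append("duplicate order_id values found")
    if (df["amount"] < 0).any():
        failures.append("negative amounts found")
    return failures

if __name__ == "__main__":
    sample = pd.DataFrame(
        {"order_id": [1, 2, 2], "customer_id": [10, None, 12], "amount": [99.0, 15.0, -5.0]}
    )
    problems = check_quality(sample)
    print("quality check passed" if not problems else f"failed: {problems}")
```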

Posted 2 months ago

Apply

9 - 14 years

30 - 40 Lacs

Chennai, Pune, Bengaluru

Work from Office


Position: Integration Architect (DBT + Snowflake)
Location: Pune/Chennai/Nagpur/Bengaluru

Purpose of the Position: As a Senior Data Integration Developer/Architect (DBT), this role seeks candidates passionate about specialized skills in Snowflake technology and features. You will be instrumental in assisting our clients by developing models that facilitate their advancement in utilizing Snowflake effectively.

Key Result Areas and Activities:
1. Expertise and Knowledge Sharing: Develop and share expertise in DBT & Snowflake data modelling and development. Actively mine and disseminate organizational experience and expertise across teams and clients.
2. Support and Collaboration: Support Cloud and Data Engineering COE initiatives. Collaborate with management to understand and align with company objectives.
3. Real-Time Data and Performance: Ensure DBT solutions are correctly built for collecting real-time data. Perform and deliver effectively in large and complex environments.
4. Pipeline and Architecture Design: Design, build, test, and maintain Snowflake architectures and data pipelines.
5. Compliance and Security: Ensure compliance with data governance and security policies.

Must have:
- Expertise in Snowflake architecture, understanding of models, and integration of cloud platforms with Snowflake
- ETL development, SQL scripting, and working knowledge of stored procedures
- Proficiency in designing and maintaining data warehouses and data marts
- Strong skills in ETL processes and tools (e.g., Informatica, Talend, SnapLogic)
- Strong problem-solving skills and the ability to work effectively in a collaborative team environment
- Experience working on Data Warehouse/ETL projects
- 4+ years of Snowflake ETL experience and 3+ years of DBT experience, or equivalent
- Experience with cloud data platforms (e.g., AWS, Azure)

Posted 2 months ago

Apply

2 years

0 Lacs

Chennai

Work from Office


Job Description: Data Engineer Position Details Position Title: Data Engineer Department: Data Engineering Location: Chennai Employment Type: Full-Time About the Role We are seeking a highly skilled and motivated Data Engineer with expertise in Snowflake to join our dynamic team. In this role, you will design, build, and optimize scalable data pipelines and cloud-based data infrastructure to ensure efficient data flow across systems. You will collaborate closely with data scientists, analysts, and business stakeholders to provide clean, accessible, and high-quality data for analytics and decision-making. The ideal candidate is passionate about cloud data platforms, data modeling, and performance optimization , with hands-on experience in Snowflake and modern data engineering tools . Key Responsibilities 1. Data Pipeline Development & Optimization Design, develop, and maintain scalable ETL/ELT data pipelines using Snowflake, dbt, and Apache Airflow . Optimize Snowflake query performance, warehouse sizing, and cost efficiency . Automate data workflows to ensure seamless integration between structured and unstructured data sources. 2. Data Architecture & Integration Design and implement data models and schemas optimized for analytics and operational workloads. Manage Snowflake multi-cluster warehouses, role-based access controls (RBAC), and security best practices . Integrate data from multiple sources, including APIs, relational databases, NoSQL databases, and third-party services . 3. Infrastructure & Performance Management Monitor and optimize Snowflake storage, query execution plans, and resource utilization . Implement data governance, security policies, and compliance within Snowflake. Troubleshoot and resolve performance bottlenecks in data pipelines and cloud storage solutions . 4. Collaboration & Continuous Improvement Work with cross-functional teams to define data requirements and ensure scalable solutions . Document technical designs, architecture, and processes for data pipelines and Snowflake implementations . Stay updated with the latest advancements in cloud data engineering and Snowflake best practices . Qualifications Education & Experience Bachelors or Master’s degree in Computer Science, Information Technology, Engineering , or a related field. 2+ years of experience in data engineering , with a strong focus on cloud-based data platforms . Proven expertise in Snowflake , including performance tuning, cost management, and data sharing capabilities . Experience working with cloud platforms (AWS, GCP, or Azure) and distributed computing frameworks (Spark, Hadoop, etc.) . Technical Skills Strong SQL skills for query optimization and data modeling in Snowflake. Experience with ETL tools such as Apache Airflow, dbt, Talend, Informatica, or Matillion . Proficiency in Python, Scala, or Java for data processing and automation . Familiarity with Kafka, Kinesis, or other streaming data solutions . Understanding of data warehousing concepts, partitioning, and indexing strategies . Preferred Qualifications SnowPro Certification or an equivalent cloud data engineering certification . Experience with containerization (Docker, Kubernetes) and CI/CD for data workflows . Knowledge of machine learning pipelines and MLOps . Benefits Competitive salary and performance-based bonuses . Health insurance . Flexible working hours and remote work options . Professional development opportunities , including Snowflake training, certifications, and conferences . 
Collaborative and inclusive work environment . How to Apply Follow these steps to apply for the Data Engineer position: 1. Submit Your Resume/CV Ensure your resume is updated and highlights your relevant skills, experience, and achievements in Data Engineering. 2. Write a Cover Letter (Optional but Recommended) Your cover letter should include: Why you are interested in the role. Your relevant experience and achievements in data engineering . How your skills align with the job requirements . 3. Provide Supporting Documents (Optional but Recommended) Links to GitHub repositories, research papers, or portfolio projects showcasing your work in data engineering. If you don’t have links, you can attach files (e.g., PDFs) of your projects or research papers. 4. Send Your Application Email your resume, cover letter, and supporting documents to: krishnamoorthi.somasundaram@nulogic.io

Posted 2 months ago

Apply

2 - 6 years

9 - 19 Lacs

Pune, Bengaluru, Noida

Work from Office


Requirement 1
Location: Bangalore/Pune/Noida (contract)
Experience: 3 to 6 years

Job Description - GCP Data Engineer
Skills required: big data workflows (ETL/ELT), hands-on Python, hands-on SQL, any cloud (GCP & BigQuery preferred), Airflow (good knowledge of Airflow features, operators, scheduling, etc.)
Skills that would add advantage: DBT, Kafka
Experience level: 4 - 5 years
NOTE: The candidate will have a coding test (Python and SQL) in the interview process, conducted through a shared coding pad; the panel will set it at run-time.

Requirement 2
DBT Data Engineer
Job location: Bangalore/Pune/Noida
Experience: 2 to 4 years
The Data Engineer that will fit has at least 2 years of hands-on experience in this role, most of it in e-commerce companies. We expect them to have been focusing on building data pipelines: ETL/ELT processes, modelling data in dbt to clean and process it, connecting to APIs to pull and push data to peripheral applications, working with Python, and automating/optimising data infrastructures. They will work as part of a team of 6: senior and junior data engineers, and analysts. Our infrastructure is built on Google Cloud Platform: we use Python, data streaming, Google BigQuery, dbt, Prefect, and Tableau. We are introducing AI features and predictive functionality in this environment.

Tasks:
- Design, develop, and maintain scalable data pipelines, ETL processes, and data integration workflows using Google BigQuery, dbt, Prefect, and Tableau
- Create and implement data models and machine learning algorithms for efficient data transformation
- Identify and troubleshoot data-related issues, proposing and implementing effective solutions
- Connect to peripheral applications (Exponea/Bloomreach, Salesforce, Odoo, Magento) to send and receive data

Requirements:
- Minimum of 2 years of experience as a data engineer, handling e-commerce data projects and building data pipelines
- Minimum a Bachelor's degree, ideally in Engineering or Science
- Strong proficiency in Python and SQL
- Proven experience in building ETL/ELT processes
- Proven experience with data modelling, ideally in dbt
- Proven experience with Google BigQuery
- Ability to understand the broader technical infrastructure and the role of the data platform within it
- Familiarity with Bloomreach Exponea, Salesforce, Tableau, Magento, and Odoo will be advantageous

Requirement 3
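
For context, here is a minimal Python sketch of querying Google BigQuery, the kind of GCP work both requirements describe. The project, dataset, and table names are hypothetical placeholders, and authentication is assumed to come from application-default credentials.

```python
# Minimal sketch: running an aggregation query on Google BigQuery from Python.
# Project, dataset, and table names are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-ecommerce-project")

sql = """
    SELECT customer_id, SUM(order_value) AS lifetime_value
    FROM `my-ecommerce-project.analytics.orders`
    GROUP BY customer_id
    ORDER BY lifetime_value DESC
    LIMIT 10
"""

for row in client.query(sql).result():
    print(row["customer_id"], row["lifetime_value"])
```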

Posted 2 months ago

Apply

7 - 12 years

15 - 25 Lacs

Delhi NCR, Bengaluru, Hyderabad

Hybrid


Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose, the relentless pursuit of a world that works better for people, we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.

Inviting applications for the role of Principal Consultant - Sr. Snowflake Data Engineer (Snowflake + Python + Cloud)! In this role, the Sr. Snowflake Data Engineer is responsible for providing technical direction and leading a group of one or more developers to address a goal.

Job Description:
- Experience in the IT industry
- Working experience with building productionized data ingestion and processing data pipelines in Snowflake
- Strong understanding of Snowflake architecture
- Fully well-versed with data warehousing concepts
- Expertise and excellent understanding of Snowflake features and integration of Snowflake with other data processing tools
- Able to create data pipelines for ETL/ELT
- Good to have DBT experience
- Excellent presentation and communication skills, both written and verbal
- Ability to problem-solve and architect in an environment with unclear requirements
- Able to create high-level and low-level design documents based on requirements
- Hands-on experience in configuration, troubleshooting, testing, and managing data platforms, on premises or in the cloud
- Awareness of data visualisation tools and methodologies
- Work independently on business problems and generate meaningful insights
- Good to have some experience/knowledge of Snowpark, Streamlit, or GenAI, but not mandatory
- Should have experience implementing Snowflake best practices
- Snowflake SnowPro Core Certification will be an added advantage

Roles and Responsibilities:
- Requirement gathering, creating design documents, providing solutions to customers, working with offshore teams, etc.
- Writing SQL queries against Snowflake and developing scripts to extract, load, and transform data.
- Hands-on experience with Snowflake utilities such as SnowSQL, bulk copy, Snowpipe, Tasks, Streams, Time Travel, cloning, the optimizer, Metadata Manager, data sharing, stored procedures and UDFs, and Snowsight.
- Experience with the Snowflake cloud data warehouse and AWS S3 buckets or Azure Blob Storage containers for integrating data from multiple source systems.
- Should have some experience with AWS services (S3, Glue, Lambda) or Azure services (Blob Storage, ADLS Gen2, ADF).
- Should have good experience in Python/PySpark integration with Snowflake and cloud (AWS/Azure), with the ability to leverage cloud services for data processing and storage.
- Proficiency in the Python programming language, including knowledge of data types, variables, functions, loops, conditionals, and other Python-specific concepts.
- Knowledge of ETL (extract, transform, load) processes and tools, and the ability to design and develop efficient ETL jobs using Python and PySpark.
- Should have some experience with Snowflake RBAC and data security.
- Should have good experience implementing CDC or SCD Type 2.
- In-depth understanding of data warehouse and ETL concepts and data modelling.
- Experience in requirement gathering, analysis, design, development, and deployment.
- Should have experience building data ingestion pipelines, and optimizing and tuning data pipelines for performance and scalability.
- Able to communicate with clients and lead a team.
- Proficiency in working with Airflow or other workflow management tools for scheduling and managing ETL jobs.
- Good to have experience in deployment using CI/CD tools and experience with repositories like Azure Repos, GitHub, etc.

Qualifications we seek in you!
Minimum qualifications: B.E./Master's in Computer Science, Information Technology, or Computer Engineering, or any equivalent degree, with good IT experience and relevant experience as a Senior Snowflake Data Engineer.
Skill matrix: Snowflake, Python/PySpark, AWS/Azure, ETL concepts, data modeling & data warehousing concepts.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values diversity and inclusion, respect and integrity, customer focus, and innovation. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
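
As an illustration of the SCD Type 2 pattern referenced above, here is a minimal Python sketch that applies one common two-step approach in Snowflake. The table and column names are hypothetical, and this is an illustrative pattern rather than the employer's actual implementation.

```python
# Minimal sketch: one common way to apply an SCD Type 2 update in Snowflake from Python.
# Table and column names are hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="my_password",
    warehouse="ETL_WH", database="DW", schema="DIM",
)

EXPIRE_CHANGED = """
    MERGE INTO dim.customer_dim d
    USING stg.customer_updates s
      ON d.customer_id = s.customer_id AND d.is_current = TRUE
    WHEN MATCHED AND (d.email <> s.email OR d.city <> s.city) THEN UPDATE SET
      is_current = FALSE,
      valid_to = CURRENT_TIMESTAMP()
"""

INSERT_NEW_VERSIONS = """
    INSERT INTO dim.customer_dim (customer_id, email, city, valid_from, valid_to, is_current)
    SELECT s.customer_id, s.email, s.city, CURRENT_TIMESTAMP(), NULL, TRUE
    FROM stg.customer_updates s
    LEFT JOIN dim.customer_dim d
      ON d.customer_id = s.customer_id AND d.is_current = TRUE
    WHERE d.customer_id IS NULL
"""

try:
    cur = conn.cursor()
    cur.execute(EXPIRE_CHANGED)        # step 1: close out rows whose attributes changed
    cur.execute(INSERT_NEW_VERSIONS)   # step 2: insert current versions for new/changed keys
    conn.commit()
finally:
    conn.close()
```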

Posted 2 months ago

Apply

5 - 8 years

15 - 30 Lacs

Chennai

Hybrid


Job Title: Data Engineer
Designation: Manager
Location: Chennai (relocation cost can be covered if you are relocating)
Experience Level: 5-8 years

Job Summary: We are seeking an experienced Manager - Data Engineer to join our dynamic team. In this role, you will be responsible for designing, implementing, and maintaining data infrastructure on Azure, with an extensive focus on Azure Databricks. You will work hand in hand with our analytics team to support data-driven decision making across different external clients in a variety of industries.

Scope of Work:
- Design, build, and maintain scalable data pipelines using Azure Data Factory (ADF), Fivetran, and other Azure services.
- Administer, monitor, and troubleshoot SQL Server databases, ensuring high performance and availability.
- Develop and optimize SQL queries and stored procedures to support data transformation and retrieval.
- Implement and maintain data storage solutions in Azure, including Azure Databricks, Azure SQL Database, Azure Blob Storage, and data lakes.
- Collaborate with business analysts, clients, and stakeholders to deliver insightful reports and dashboards using Power BI.
- Develop scripts to automate data processing tasks using languages such as Python, PowerShell, or similar.
- Ensure data security and compliance with industry standards and organizational policies.
- Stay updated with the latest technologies and trends in Azure cloud services and data engineering.
- Desired: experience in healthcare data analytics (including familiarity with healthcare data models such as encounter-based or claims-focused models), manufacturing data analytics, or utility analytics.

Ideal Candidate Profile:
- Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field.
- At least 5-8 years of experience in data engineering with a strong focus on Microsoft Azure and Azure Databricks.
- Proven expertise in SQL Server database administration and development.
- Experience in building and optimizing data pipelines, architectures, and data sets on Azure.
- Experience with dbt and Fivetran.
- Familiarity with Azure AI and LLMs, including Azure OpenAI.
- Proficiency in Power BI for creating reports and dashboards.
- Strong scripting skills in Python, PowerShell, or other relevant languages.
- Familiarity with other Azure data services (e.g., Azure Synapse Analytics, Azure Blob Storage, etc.).
- Knowledge of data modeling, ETL processes, and data warehousing concepts.
- Excellent problem-solving skills and the ability to work independently or as part of a team.
- Strong communication and interpersonal skills to collaborate effectively with various teams and understand business requirements.
- Certifications in Azure Data Engineering or related fields.
- Experience with machine learning and data science projects (a huge plus).
- Knowledge of additional BI tools and data integration platforms.

Thanks,
Aukshaya

Posted 2 months ago

Apply

5 - 10 years

10 - 20 Lacs

Chennai, Pune, Noida

Work from Office


Interested candidates can share resumes at deepali.rawat@rsystems.com

Must Have: SQL, dbt, Python, data quality & data modelling
Good to Have: Snowflake DB, Snowpipe, Fivetran

The resource should be an expert in dbt and SQL, and should be able to develop and maintain dbt models, understand data flows, and perform data quality checks and testing of data using dbt.

Posted 2 months ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies