189 dbt Jobs - Page 5

JobPe aggregates job listings for easy access; you submit your application directly on the original job portal.

7.0 - 12.0 years

7 - 11 Lacs

Delhi, India

On-site

Source: Foundit

Key deliverables:
- Enhance and maintain the MDM platform to support business needs
- Develop data pipelines using Snowflake, Python, SQL, and orchestration tools like Airflow
- Monitor and improve system performance and troubleshoot data pipeline issues
- Resolve production issues and ensure platform reliability

Role responsibilities:
- Collaborate with data engineering and analytics teams on scalable solutions
- Apply DevOps practices to streamline deployment and automation
- Integrate cloud-native tools and services (AWS, Azure) with the data platform
- Use dbt and version control (Git) for data transformation and management

Posted 3 weeks ago

Apply

7.0 - 12.0 years

7 - 11 Lacs

Pune, Maharashtra, India

On-site

Source: Foundit

Key deliverables:
- Enhance and maintain the MDM platform to support business needs
- Develop data pipelines using Snowflake, Python, SQL, and orchestration tools like Airflow
- Monitor and improve system performance and troubleshoot data pipeline issues
- Resolve production issues and ensure platform reliability

Role responsibilities:
- Collaborate with data engineering and analytics teams on scalable solutions
- Apply DevOps practices to streamline deployment and automation
- Integrate cloud-native tools and services (AWS, Azure) with the data platform
- Use dbt and version control (Git) for data transformation and management

Posted 3 weeks ago

Apply

9.0 - 13.0 years

25 - 35 Lacs

Hyderabad

Hybrid

Source: Naukri

Senior Data Engineer
- You are familiar with AWS and Azure cloud.
- You have extensive knowledge of Snowflake; SnowPro Core certification is a must-have.
- You have used DBT in at least one project to deploy models in production.
- You have configured and deployed Airflow and integrated various operators in Airflow (especially DBT and Snowflake).
- You can design and build release pipelines and understand the Azure DevOps ecosystem.
- You have an excellent understanding of Python (especially PySpark) and can write metadata-driven programs.
- You are familiar with Data Vault (Raw, Business) and with concepts such as Point in Time tables and the Semantic Layer.
- You are resilient in ambiguous situations and can clearly articulate problems in a business-friendly way.
- You believe in documenting processes, managing the artifacts, and evolving them over time.
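To illustrate the Airflow-plus-dbt-plus-Snowflake integration this role describes, here is a minimal sketch of a DAG that runs dbt and then a Snowflake check. The DAG id, connection id, project path, and table name are hypothetical, and it assumes Airflow 2.4+ with the apache-airflow-providers-snowflake package and the dbt CLI installed.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator

# Hypothetical identifiers; replace with your own project path and connection.
DBT_PROJECT_DIR = "/opt/dbt/analytics"
SNOWFLAKE_CONN_ID = "snowflake_default"

with DAG(
    dag_id="dbt_snowflake_daily",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # Airflow 2.4+ keyword
    catchup=False,
) as dag:
    # Run dbt models against Snowflake via the CLI.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command=f"dbt run --project-dir {DBT_PROJECT_DIR} --profiles-dir {DBT_PROJECT_DIR}",
    )

    # Post-transformation check executed directly in Snowflake.
    row_count_check = SnowflakeOperator(
        task_id="row_count_check",
        snowflake_conn_id=SNOWFLAKE_CONN_ID,
        sql="SELECT COUNT(*) FROM analytics.fct_orders",
    )

    dbt_run >> row_count_check
```

A dedicated dbt operator (e.g., from astronomer-cosmos) could replace the BashOperator, but the shell invocation keeps the sketch self-contained.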

Posted 3 weeks ago

Apply

8.0 - 12.0 years

35 - 50 Lacs

Chennai

Remote

Source: Naukri

Experience with modern data warehouses (Snowflake, BigQuery, Redshift) and graph databases. Experience designing and building efficient data pipelines for the ingestion and transformation of data into a data warehouse. Proficiency in Python, dbt, Git, SQL, AWS, and Snowflake.

Posted 3 weeks ago

Apply

5.0 - 10.0 years

15 - 22 Lacs

Hyderabad

Work from Office

Source: Naukri

- Data engineering, data management, Snowflake, SQL, data modeling, and cloud-native data architecture
- AWS, Azure, or Google Cloud (Snowflake on cloud platforms)
- ETL tools such as Informatica, Talend, or dbt
- Python or shell scripting

Posted 3 weeks ago

Apply

7.0 - 11.0 years

20 - 35 Lacs

Gandhinagar, Ahmedabad

Hybrid

Source: Naukri

Job Title: Senior Data Engineer
Experience: 8 to 10 years
Location: Ahmedabad & Gandhinagar
Employment Type: Full-time

Our client is a leading provider of advanced solutions for capital markets, specializing in cutting-edge trading infrastructure and software. With a global presence and a strong focus on innovation, the company empowers professional traders, brokers, and financial institutions to execute high-speed, high-performance trading strategies across multiple asset classes. Their technology is known for its reliability, low latency, and scalability, making it a preferred choice for firms seeking a competitive edge in dynamic financial environments.

Role & responsibilities:
- Design, develop, and maintain scalable and reliable data pipelines using DBT and Airflow.
- Work extensively with Snowflake to optimize data storage, transformation, and access.
- Develop and maintain efficient ETL/ELT processes in Python to support analytical and operational workloads.
- Ensure high standards of data quality, consistency, and security across systems.
- Collaborate with cross-functional teams to understand data requirements and translate them into technical solutions.
- Monitor and troubleshoot data pipelines, resolving issues proactively.
- Optimize the performance of existing data workflows and recommend improvements.
- Document data engineering processes and solutions effectively.

Preferred candidate profile:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field
- 8-10 years of experience in data engineering or related roles
- Strong knowledge of SQL and data warehousing principles
- Familiarity with version control (e.g., Git) and CI/CD practices
- Excellent problem-solving skills and attention to detail
- Strong communication and collaboration abilities

Preferred skills:
- Experience with cloud platforms like AWS, GCP, or Azure
- Exposure to data governance and security best practices
- Knowledge of modern data architecture and real-time processing frameworks

Competitive benefits offered by our client:
- Relocation support: an additional relocation allowance to assist with moving expenses.
- Comprehensive health benefits: medical, dental, and vision coverage.
- Flexible work schedule: hybrid work model with an expectation of just 2 days on-site per week.
- Generous paid time off (PTO): 21 days per year, with the ability to roll over 1 day into the following year. Additionally, 1 day per year is allocated for volunteering, 2 training days per year for uninterrupted professional development, and 1 extra PTO day during milestone years.
- Paid holidays & early dismissals: a robust paid holiday schedule with early dismissal on select days, plus generous parental leave for all genders, including adoptive parents.
- Tech resources: a rent-to-own program offering employees a company-provided Mac/PC laptop and/or mobile phone of their choice, along with a tech accessories budget for monitors, headphones, keyboards, and other office equipment.
- Health & wellness subsidies: contributions toward gym memberships and health/wellness initiatives to support your well-being.
- Milestone anniversary bonuses: special bonuses to celebrate key career milestones.
- Inclusive & collaborative culture: a forward-thinking, culture-based organisation that values diversity and inclusion and fosters collaborative teams.

Posted 3 weeks ago

Apply

4.0 - 7.0 years

20 - 35 Lacs

Noida

Remote

Source: Naukri

Job Title: Senior Data Engineer with Looker experience
Relevant Experience: 5 to 10 years
Key Skills: Looker dashboards, LookML, DBT, GCP, and BigQuery
Location: Remote / Work from Home
Note: We need to fill this position on priority, so candidates with a shorter notice period or immediate availability will be given preference.

Embark on an exciting journey into the realm of software development with 3Pillar! We extend an invitation for you to join our team and gear up for a thrilling adventure. At 3Pillar, our focus is on leveraging cutting-edge technologies that revolutionize industries by enabling data-driven decision making. As a Data Engineer, you will hold a crucial position within our dynamic team, actively contributing to thrilling projects that reshape data analytics for our clients, providing them with a competitive advantage in their respective industries. If you are passionate about data analytics solutions that make a real-world impact, consider this your pass to the captivating world of Data Science and Engineering!

Key Responsibilities:
- Understand the business requirements.
- Design, develop, and maintain data pipelines using dbt to transform and load data into a data warehouse (e.g., Snowflake, BigQuery).
- Create and manage data models within dbt to ensure data consistency and accuracy for analysis.
- Build and maintain Looker dashboards and Explores, translating complex data into intuitive visualizations for business users.
- Collaborate with business stakeholders to understand data requirements and translate them into effective data visualizations within Looker.
- Monitor data quality and implement data quality checks to ensure data integrity.
- Optimize data pipelines for performance and scalability.
- Troubleshoot data issues and resolve data quality discrepancies.
- Implement data governance practices to ensure data security and compliance.

Minimum Qualifications:
- Demonstrated expertise with a minimum of 5+ years of experience as a data engineer or in a similar role
- Advanced SQL skills and experience with relational and non-relational databases
- Deep understanding of data warehousing concepts
- Extensive experience with dbt (data build tool)
- Expertise in Looker dashboard development and administration
- Hands-on experience with Google Cloud services such as BigQuery, Dataflow, and Cloud Storage
- Strong Python/PySpark skills with hands-on experience
- Strong exposure to writing ETL
- Proven track record of designing and implementing data pipelines using dbt
- Experience building and maintaining Looker dashboards and Explores
- Familiarity with data governance practices and data security protocols
- Experience collecting data from multiple sources and integrating it based on business needs

If interested, please share your resume at Kiran.Dhanak@3pillarglobal.com or call 96533-53608.

Regards,
Kiran Dhanak
TA Team, 3Pillar
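As a rough sketch of the dbt-to-Looker handoff on GCP that this posting implies, the snippet below rebuilds a reporting table in BigQuery that a Looker dashboard could read. The project, dataset, and table names are hypothetical; in practice the source table would come from the dbt models mentioned above.

```python
from google.cloud import bigquery

# Hypothetical project/dataset names; the dbt models referenced in the posting
# would produce the source table, and Looker would read the aggregated result.
client = bigquery.Client(project="my-analytics-project")

sql = """
CREATE OR REPLACE TABLE reporting.daily_orders AS
SELECT order_date, COUNT(*) AS order_count, SUM(order_total) AS revenue
FROM analytics.fct_orders
GROUP BY order_date
"""

# Run the aggregation and wait for completion before Looker dashboards refresh.
client.query(sql).result()
print("reporting.daily_orders rebuilt")
```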

Posted 3 weeks ago

Apply

3.0 - 8.0 years

5 - 10 Lacs

Chennai

Hybrid

Source: Naukri

Duration: 8 months
Work Type: Onsite

Position Description: Looking for qualified Data Scientists who can develop scalable solutions to complex real-world problems using Machine Learning, Big Data, Statistics, and Optimization. Potential candidates should have hands-on experience in applying first-principles methods, machine learning, data mining, and text mining techniques to build analytics prototypes that work on massive datasets. Candidates should have experience in manipulating both structured and unstructured data in various formats, sizes, and storage mechanisms. Candidates should have excellent problem-solving skills with an inquisitive mind to challenge existing practices. Candidates should have exposure to multiple programming languages and analytical tools and be flexible about using the requisite tools and languages for the problem at hand.

Skills Required: Machine Learning, GenAI, LLM
Skills Preferred: Python, Google Cloud Platform, BigQuery

Experience Required: 3+ years of hands-on experience using machine learning/text mining tools and techniques such as clustering, classification, decision trees, random forests, support vector machines, deep learning, neural networks, reinforcement learning, and other numerical algorithms

Experience Preferred: 3+ years of experience in at least one of the following languages: Python, R, MATLAB, SAS. Experience with Google Cloud Platform (GCP), including Vertex AI, BigQuery, DBT, NoSQL databases, and the Hadoop ecosystem

Education Required: Bachelor's Degree
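For context on the classification techniques listed above (decision trees, random forests, and so on), here is a minimal, self-contained scikit-learn sketch on synthetic data; it stands in for the real datasets and is not tied to this employer's stack.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the "massive datasets" mentioned in the posting.
X, y = make_classification(n_samples=5000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# A random forest classifier, one of the techniques named in the requirements.
model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)

print(classification_report(y_test, model.predict(X_test)))
```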

Posted 3 weeks ago

Apply

5.0 - 9.0 years

7 - 11 Lacs

Bengaluru

Work from Office

Source: Naukri

We are looking for a Senior Data Engineer who will design, build, and maintain scalable data pipelines and ingestion frameworks. The ideal candidate must have experience with DBT, orchestration tools like Airflow or Prefect, and cloud platforms such as AWS. Responsibilities include developing ELT pipelines, optimizing queries, implementing CI/CD, and integrating with AWS services. Strong SQL, Python, and data modeling skills are essential. The role also involves working with real-time and batch processing, ensuring high performance and data integrity.

Posted 3 weeks ago

Apply

6.0 - 9.0 years

15 - 30 Lacs

Noida, Pune, Chennai

Work from Office

Source: Naukri

Job Description

Candidate must-have experience: SQL, dbt, Snowflake, data quality & data modelling, Snowpipe, Fivetran. Candidates can share resumes at deepali.rawat@rsystems.com.

Roles & Responsibilities:
1. Ensure reliable and scalable data pipelines to support healthcare operations.
2. Maintain data availability with proactive exception handling and recovery mechanisms.
3. Perform data quality checks to ensure accuracy, completeness, and consistency.
4. Detect and handle alerts early to prevent data discrepancies and processing failures.
5. Develop and optimize data models for efficient storage, retrieval, and analytics.
6. Prepare and structure data for reporting, compliance, and decision-making.
7. Work with Snowflake to manage data warehousing and performance tuning.
8. Implement and optimize DBT workflows for data transformation and governance.
9. Leverage SQL and Python for data processing, automation, and validation.
10. Experience with Snowpipe and Fivetran is a plus for automating data ingestion.

Posted 3 weeks ago

Apply

8.0 - 9.0 years

8 - 9 Lacs

Gurgaon / Gurugram, Haryana, India

On-site

Source: Foundit

Overview: We are looking for a skilled Snowflake Developer with 8+ years of experience in developing and managing data warehouse solutions using Snowflake. The ideal candidate should have expertise in stored procedures, SQL scripting, and DBT development using models, macros, and jobs, along with a strong understanding of DWH concepts, experience developing ETL solutions, and experience implementing CI/CD pipelines using Bitbucket, Jenkins, DBT, and Snowflake. The candidate should also have experience collaborating with stakeholders to gather requirements, develop logic, and deploy solutions.

In this role, you will:
- Manage and maintain the Snowflake platform, ensuring optimal performance and uptime.
- Design and implement Snowflake architecture, considering best practices for scalability, security, and compliance.
- Conduct performance optimization activities to ensure efficient use of resources and credits.
- Oversee governance and compliance practices, enabling the right audit logs and ensuring data security using RBAC, masking, etc.
- Perform POCs to evaluate new features and functionalities.
- Enable and configure new features on the Snowflake platform.
- Develop and implement integration design strategies using AWS services such as S3, Lambda, SQS, and Kinesis.
- Design and implement API-based integrations to ensure seamless data flow between systems.
- Collaborate with cross-functional teams to ensure the successful implementation of Snowflake projects.
- Utilize programming languages, particularly Python, to develop custom solutions and automation scripts.

Here's what you need:
- Proven experience working with Snowflake and AWS cloud platforms.
- In-depth knowledge of Snowflake architecture, design, and best practices.
- Strong understanding of compliance and governance practices, with the ability to enable and manage audit logs.
- Expertise in performance optimization and credit usage management on the Snowflake platform.
- Experience with AWS services such as S3, Lambda, SQS, and Kinesis.
- Proficiency in API-based integrations and data integration strategies.
- Strong programming skills, particularly in Python.
- Excellent collaboration and communication skills, with the ability to work effectively with cross-functional teams.

Experience: 8 - 9 years
Salary: Not Disclosed
Location: Gurugram
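As a hedged illustration of the RBAC and masking duties mentioned above, the sketch below issues a few governance statements through the Snowflake Python connector. All account, role, database, and column names are hypothetical placeholders.

```python
import snowflake.connector

# Hypothetical credentials and identifiers; typically sourced from a secrets manager.
conn = snowflake.connector.connect(
    account="my_account",
    user="platform_admin",
    password="********",
    role="SECURITYADMIN",
    warehouse="ADMIN_WH",
)

statements = [
    # Role-based access control: grant read access to an analyst role.
    "GRANT USAGE ON DATABASE ANALYTICS TO ROLE ANALYST",
    "GRANT SELECT ON ALL TABLES IN SCHEMA ANALYTICS.REPORTING TO ROLE ANALYST",
    # Dynamic data masking for a sensitive column.
    """CREATE MASKING POLICY IF NOT EXISTS email_mask AS (val STRING)
       RETURNS STRING ->
       CASE WHEN CURRENT_ROLE() IN ('ANALYST') THEN val ELSE '***MASKED***' END""",
    "ALTER TABLE ANALYTICS.REPORTING.CUSTOMERS MODIFY COLUMN EMAIL SET MASKING POLICY email_mask",
]

cur = conn.cursor()
try:
    for stmt in statements:
        cur.execute(stmt)
finally:
    cur.close()
    conn.close()
```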

Posted 3 weeks ago

Apply

11.0 - 12.0 years

11 - 12 Lacs

Gurgaon / Gurugram, Haryana, India

On-site

Source: Foundit

Overview: We are looking for a skilled Snowflake Developer with 8+ years of experience in developing and managing data warehouse solutions using Snowflake. The ideal candidate should have expertise in stored procedures, SQL scripting, and DBT development using models, macros, and jobs, along with a strong understanding of DWH concepts, experience developing ETL solutions, and experience implementing CI/CD pipelines using Bitbucket, Jenkins, DBT, and Snowflake. The candidate should also have experience collaborating with stakeholders to gather requirements, develop logic, and deploy solutions.

In this role, you will:
- Manage and maintain the Snowflake platform, ensuring optimal performance and uptime.
- Design and implement Snowflake architecture, considering best practices for scalability, security, and compliance.
- Conduct performance optimization activities to ensure efficient use of resources and credits.
- Oversee governance and compliance practices, enabling the right audit logs and ensuring data security using RBAC, masking, etc.
- Perform POCs to evaluate new features and functionalities.
- Enable and configure new features on the Snowflake platform.
- Develop and implement integration design strategies using AWS services such as S3, Lambda, SQS, and Kinesis.
- Design and implement API-based integrations to ensure seamless data flow between systems.
- Collaborate with cross-functional teams to ensure the successful implementation of Snowflake projects.
- Utilize programming languages, particularly Python, to develop custom solutions and automation scripts.

Here's what you need:
- Proven experience working with Snowflake and AWS cloud platforms.
- In-depth knowledge of Snowflake architecture, design, and best practices.
- Strong understanding of compliance and governance practices, with the ability to enable and manage audit logs.
- Expertise in performance optimization and credit usage management on the Snowflake platform.
- Experience with AWS services such as S3, Lambda, SQS, and Kinesis.
- Proficiency in API-based integrations and data integration strategies.
- Strong programming skills, particularly in Python.
- Excellent collaboration and communication skills, with the ability to work effectively with cross-functional teams.

Posted 3 weeks ago

Apply

2 - 3 years

10 - 12 Lacs

Gurgaon

Work from Office

Source: Naukri

Role & responsibilities:
- Develop and maintain ETL pipelines using Python & SQL.
- Work with Airflow for workflow orchestration.
- Implement data transformations using dbt.
- Optimize database performance and manage data warehousing.
- Collaborate on scalable data solutions using C#/Java.

Preferred candidate profile:
- 2-3 years of experience in data engineering.
- Strong expertise in SQL, Python, and ETL processes.
- Hands-on experience with Airflow, dbt, and modern data platforms.
- Knowledge of cloud data services (AWS/GCP/Azure) is a plus.
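A minimal sketch of the kind of Python-and-SQL ETL step this role describes, using pandas and SQLAlchemy; the connection strings, table names, and derived column are hypothetical.

```python
import pandas as pd
from sqlalchemy import create_engine

# Hypothetical connection strings for a source system and a warehouse.
source = create_engine("postgresql+psycopg2://etl_user:***@source-db:5432/sales")
warehouse = create_engine("postgresql+psycopg2://etl_user:***@warehouse:5432/analytics")

# Extract: pull yesterday's orders from the operational database.
orders = pd.read_sql("SELECT * FROM orders WHERE order_date = CURRENT_DATE - 1", source)

# Transform: basic cleanup and a derived column.
orders["order_total"] = orders["quantity"] * orders["unit_price"]
orders = orders.dropna(subset=["customer_id"])

# Load: append into a staging table that downstream dbt models build on.
orders.to_sql("stg_orders", warehouse, schema="staging", if_exists="append", index=False)
```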

Posted 2 months ago

Apply

10 - 19 years

15 - 25 Lacs

Bengaluru

Remote

Source: Naukri

We are seeking a skilled Architect with expertise in Azure DevOps automation, focusing on Snowflake and DBT (Data Build Tool). The ideal candidate will be responsible for designing, implementing, and maintaining automated workflows and pipelines to support efficient and scalable data platform solutions. This role requires strong technical expertise in DevOps practices, automation, and cloud-based data technologies.

Key Responsibilities:
- Design and implement Azure DevOps pipelines and workflows to automate Snowflake and DBT processes.
- Develop and maintain CI/CD pipelines for data transformation, integration, and deployment.
- Collaborate with data engineers, analysts, and stakeholders to understand requirements and deliver efficient solutions.
- Ensure the scalability, reliability, and security of automated processes and workflows.
- Monitor and troubleshoot pipeline performance, identifying and resolving bottlenecks or issues.
- Develop and maintain technical documentation for workflows, best practices, and configurations.
- Stay updated with industry trends and emerging tools to enhance automation capabilities.

Required Skills and Qualifications:
- Proven experience as an Architect or Senior Engineer specializing in Azure DevOps automation.
- In-depth knowledge of Snowflake architecture and its integrations.
- Hands-on experience with DBT for data transformation and modeling.
- Proficiency in scripting languages (Python, PowerShell, etc.) for automation.
- Strong understanding of CI/CD principles and best practices.
- Experience with version control systems like Git.
- Familiarity with cloud-based data platforms and services.
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration abilities.

Preferred Qualifications:
- Certifications in Azure or Snowflake are a plus.
- Experience with other data tools and platforms is advantageous.

Posted 2 months ago

Apply

5 - 10 years

0 - 3 Lacs

Chennai, Bengaluru, Noida

Hybrid

Source: Naukri

We are hosting an Open Walk-in Drive in Pune on 5th April (Saturday) 2025.

Details of the Walk-in Drive:
Date: 5th April (Saturday) 2025
Experience: 5 to 10 years
Time: 9.30 AM to 4:00 PM
Point of Contact: Aishwarya G / aishwaryag5@hexaware.com
Venue: Hexaware Technologies Ltd, Phase 3, Hinjewadi Rajiv Gandhi Infotech Park, Hinjewadi, Pune, Pimpri-Chinchwad, Maharashtra 411057

Key Skills and Experience:
- Must have 5-10 years of experience in data warehouse, ETL, and BI projects
- Must have at least 4+ years of experience in Snowflake; expertise in Snowflake architecture is a must
- Must have at least 3+ years of experience and a strong hold on Python/PySpark
- Must have experience implementing complex stored procedures and standard DWH and ETL concepts
- Proficient in Oracle database, complex PL/SQL, Unix shell scripting, performance tuning, and troubleshooting
- Good to have experience with AWS services and creating DevOps templates for various AWS services
- Experience using GitHub and Jenkins
- Good communication and analytical skills
- Snowflake certification is desirable

What to Bring:
- Updated resume
- Photo ID and a passport-size photo
- Mention "Aishwarya G" at the top of your resume.

How to Register: To express your interest and confirm your participation, please reply to this email with your updated resume attached. Walk-ins are also welcome on the day of the event. This is an excellent opportunity to showcase your skills, network with industry professionals, and explore the exciting possibilities that await you at Hexaware Technologies. If you have any questions or require further information, please feel free to reach out to me at aishwaryag5@hexaware.com. We look forward to meeting you and exploring the potential of having you as a valuable member of our team.

Note: Candidates with less than 4 years of total experience will not be screen-selected to attend the interview.

Posted 2 months ago

Apply

5 - 10 years

0 - 3 Lacs

Pune, Coimbatore, Mumbai (All Areas)

Hybrid

Source: Naukri

We are hosting an Open Walk-in Drive in Pune on 5th April (Saturday) 2025.

Details of the Walk-in Drive:
Date: 5th April (Saturday) 2025
Experience: 5 to 10 years
Time: 9.30 AM to 4:00 PM
Point of Contact: Aishwarya G / aishwaryag5@hexaware.com
Venue: Hexaware Technologies Ltd, Phase 3, Hinjewadi Rajiv Gandhi Infotech Park, Hinjewadi, Pune, Pimpri-Chinchwad, Maharashtra 411057

Key Skills and Experience:
- Must have 5-10 years of experience in data warehouse, ETL, and BI projects
- Must have at least 4+ years of experience in Snowflake; expertise in Snowflake architecture is a must
- Must have at least 3+ years of experience and a strong hold on Python/PySpark
- Must have experience implementing complex stored procedures and standard DWH and ETL concepts
- Proficient in Oracle database, complex PL/SQL, Unix shell scripting, performance tuning, and troubleshooting
- Good to have experience with AWS services and creating DevOps templates for various AWS services
- Experience using GitHub and Jenkins
- Good communication and analytical skills
- Snowflake certification is desirable

What to Bring:
- Updated resume
- Photo ID and a passport-size photo
- Mention "Aishwarya G" at the top of your resume.

How to Register: To express your interest and confirm your participation, please reply to this email with your updated resume attached. Walk-ins are also welcome on the day of the event. This is an excellent opportunity to showcase your skills, network with industry professionals, and explore the exciting possibilities that await you at Hexaware Technologies. If you have any questions or require further information, please feel free to reach out to me at aishwaryag5@hexaware.com. We look forward to meeting you and exploring the potential of having you as a valuable member of our team.

Note: Candidates with less than 4 years of total experience will not be screen-selected to attend the interview.

Posted 2 months ago

Apply

2 - 6 years

9 - 12 Lacs

Pune

Work from Office

Source: Naukri

We are looking for a Data Warehouse Developer to design, develop, and deploy ETL pipelines that meet client requirements. Proficiency in SQL, ETL tools, and cloud platforms is required. Strong analytical, problem-solving, and communication skills are essential.

Posted 2 months ago

Apply

4 - 9 years

7 - 12 Lacs

Chennai, Bengaluru, Hyderabad

Work from Office

Source: Naukri

Job Summary: We're looking for an experienced professional with strong expertise in Snowflake (AWS), Airflow, DBT, Python, and SQL to develop and optimize scalable data solutions. The ideal candidate will have a deep understanding of data warehousing, ETL/ELT pipelines, cloud platforms, and analytics reporting. This role requires hands-on experience in building, managing, and optimizing data pipelines while ensuring data integrity, security, and compliance.

Key Responsibilities:
- Design, develop, and optimize Snowflake-based data solutions.
- Implement and manage ETL/ELT workflows using DBT, Airflow, Informatica, Pentaho, or Fivetran.
- Write and optimize SQL queries for efficient data retrieval and transformation.
- Work with AWS cloud services (Lambda, S3, SNS/SQS, EC2) for data automation and integration.
- Develop and maintain data pipelines to support analytics and reporting needs.
- Ensure data quality, transformation, normalization, and aggregation as per business requirements.
- Perform query performance tuning and troubleshooting in production environments.
- Support CI/CD deployments, change management, and root cause analysis (RCA).
- Develop functional business metrics across domains such as finance, retail, and telecom.
- Collaborate with cross-functional teams to ensure data security, compliance, and governance.

Qualifications & skills:

Mandatory Skills:
- Snowflake (AWS): 4+ years of experience in advanced SQL and Snowflake development.
- Airflow: experience in workflow orchestration and scheduling.
- DBT (Data Build Tool): hands-on expertise in data transformation and modeling.
- Python: 3+ years of experience in advanced scripting and automation.
- SQL: strong query optimization and data processing skills.

Technical Skills:
- Data warehousing: 4+ years of experience in data modeling, star schema, normalization/denormalization.
- ETL/ELT development: 3+ years of experience in DBT, Informatica, Pentaho, or Fivetran.
- Cloud platforms: 3+ years of hands-on experience with AWS or any cloud environment.
- Data analytics & reporting: 4+ years of experience in data profiling, metric development, and performance tuning.

Soft Skills:
- Strong written and verbal communication skills for stakeholder collaboration.
- Ability to work in a team environment and support cross-functional projects.
- Experience working in enterprise environments, following best practices for CI/CD, security, change management, RCA, and on-call rotations.

Preferred Qualifications:
- Technical certifications in AWS / Snowflake.
- 4+ years of experience in ETL/ELT tools (DBT, Informatica, Fivetran).
- 4+ years of experience in industry-specific metric development (finance, retail, telecom).
- Team leadership experience and exposure to large-scale data support environments.

Location: Hyderabad, Bengaluru, Chennai, Pune, Kolkata
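To illustrate the AWS S3-to-Snowflake integration this posting calls for, here is a small sketch that bulk-loads staged S3 files with a COPY INTO statement via the Snowflake Python connector. The stage, table, and credential values are hypothetical, and an external stage pointing at the S3 bucket is assumed to exist.

```python
import snowflake.connector

# Hypothetical connection details; the external stage is assumed to point at an S3 bucket.
conn = snowflake.connector.connect(
    account="my_account",
    user="etl_user",
    password="********",
    warehouse="LOAD_WH",
    database="RAW",
    schema="SALES",
)

copy_sql = """
COPY INTO RAW.SALES.ORDERS
FROM @raw_sales_stage/orders/
FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)
ON_ERROR = 'ABORT_STATEMENT'
"""

cur = conn.cursor()
try:
    cur.execute(copy_sql)   # Bulk-load the staged S3 files into Snowflake.
    print(cur.fetchall())   # COPY INTO returns one result row per loaded file.
finally:
    cur.close()
    conn.close()
```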

Posted 2 months ago

Apply

4 - 9 years

6 - 11 Lacs

Chennai, Bengaluru, Hyderabad

Work from Office

Source: Naukri

Job Summary: We're looking for an experienced professional with strong expertise in Snowflake (AWS), Airflow, DBT, Python, and SQL to develop and optimize scalable data solutions. The ideal candidate will have a deep understanding of data warehousing, ETL/ELT pipelines, cloud platforms, and analytics reporting. This role requires hands-on experience in building, managing, and optimizing data pipelines while ensuring data integrity, security, and compliance.

Key Responsibilities:
- Design, develop, and optimize Snowflake-based data solutions.
- Implement and manage ETL/ELT workflows using DBT, Airflow, Informatica, Pentaho, or Fivetran.
- Write and optimize SQL queries for efficient data retrieval and transformation.
- Work with AWS cloud services (Lambda, S3, SNS/SQS, EC2) for data automation and integration.
- Develop and maintain data pipelines to support analytics and reporting needs.
- Ensure data quality, transformation, normalization, and aggregation as per business requirements.
- Perform query performance tuning and troubleshooting in production environments.
- Support CI/CD deployments, change management, and root cause analysis (RCA).
- Develop functional business metrics across domains such as finance, retail, and telecom.
- Collaborate with cross-functional teams to ensure data security, compliance, and governance.

Qualifications & skills:

Mandatory Skills:
- Snowflake (AWS): 4+ years of experience in advanced SQL and Snowflake development.
- Airflow: experience in workflow orchestration and scheduling.
- DBT (Data Build Tool): hands-on expertise in data transformation and modeling.
- Python: 3+ years of experience in advanced scripting and automation.
- SQL: strong query optimization and data processing skills.

Technical Skills:
- Data warehousing: 4+ years of experience in data modeling, star schema, normalization/denormalization.
- ETL/ELT development: 3+ years of experience in DBT, Informatica, Pentaho, or Fivetran.
- Cloud platforms: 3+ years of hands-on experience with AWS or any cloud environment.
- Data analytics & reporting: 4+ years of experience in data profiling, metric development, and performance tuning.

Soft Skills:
- Strong written and verbal communication skills for stakeholder collaboration.
- Ability to work in a team environment and support cross-functional projects.
- Experience working in enterprise environments, following best practices for CI/CD, security, change management, RCA, and on-call rotations.

Preferred Qualifications:
- Technical certifications in AWS / Snowflake.
- 4+ years of experience in ETL/ELT tools (DBT, Informatica, Fivetran).
- 4+ years of experience in industry-specific metric development (finance, retail, telecom).
- Team leadership experience and exposure to large-scale data support environments.

Location: Hyderabad, Bengaluru, Chennai, Pune, Kolkata

Posted 2 months ago

Apply

5 - 8 years

13 - 19 Lacs

Pune

Work from Office

Source: Naukri

Note: The candidate should be available for a face-to-face interview on 12th April and should be ready to work from the office 5 days a week.

A Data Engineer is responsible for designing, building, and maintaining robust data pipelines and architectures that enable the collection, transformation, and storage of large datasets. The role ensures data quality and reliability, supports data-driven decision-making, and facilitates the integration of various data sources into centralized systems.

Responsibilities:
- Develop and manage data pipelines for extracting, loading, and transforming data from multiple sources.
- Work with open-source and cloud-based databases (e.g., PostgreSQL, Snowflake, BigQuery, Redshift).
- Automate database operations and ETL tasks using programming languages such as Python and frameworks like Spark.
- Implement CI/CD practices and version control to streamline deployments.
- Ensure efficient and reliable orchestration using tools like Apache Airflow, Prefect, or Dagster.
- Experience working on API integration and real-time streaming.

Tech stack:
- Databases: preferably PostgreSQL and MongoDB, or any RDBMS/NoSQL databases
- Programming languages: Python, Spark, SQL, DBT
- Orchestration: Apache Airflow, Prefect, NiFi
- Cloud: AWS (e.g., S3, Redshift, Glue), Google Cloud Platform (e.g., BigQuery, Cloud Composer)
- Streaming: Apache Kafka / Google Cloud Pub/Sub
- DevOps: Docker/Kubernetes, Git
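As a rough sketch of the real-time streaming piece of this stack, the snippet below consumes JSON events from Kafka with the kafka-python client; the topic, broker address, and downstream loader are hypothetical.

```python
import json

from kafka import KafkaConsumer  # kafka-python

# Hypothetical topic and broker address.
consumer = KafkaConsumer(
    "orders_events",
    bootstrap_servers="localhost:9092",
    group_id="data-eng-ingest",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    auto_offset_reset="earliest",
)

# Consume events and hand them to a loader (e.g., a batched insert into PostgreSQL).
for message in consumer:
    event = message.value
    print(f"offset={message.offset} order_id={event.get('order_id')}")
    # load_to_postgres(event)  # hypothetical downstream loader
```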

Posted 2 months ago

Apply

5 - 10 years

20 - 35 Lacs

Chennai

Hybrid

Source: Naukri

Responsibilities:
- Design, build, and test end-to-end data pipelines for data ingestion, integration, and curation.
- Implement automation for data workflows and testing to enhance efficiency and reliability, with data quality checks.
- Continuously optimize data pipelines to support diverse workloads and business requirements, improving performance over time.
- Create, maintain, and scale our cloud-based data platform, ensuring high availability and scalability.
- Address and minimize technical debt to ensure a streamlined and efficient codebase.
- Develop reusable frameworks and components to streamline data engineering processes and boost development efficiency.
- Lead key data engineering projects from inception to successful delivery, taking ownership and ensuring completion with minimal oversight.
- Collaborate closely with analysts, business partners, and other stakeholders to understand data requirements and deliver high-quality solutions.
- Document data processes, workflows, and systems comprehensively to maintain clarity and transparency.

What you bring to the table:
- Experience with data processing tools such as Snowflake, DBT, Databricks, Azure (ADF and Fabric), and GCP BigQuery.
- Expertise in database optimization (partitioning, group and sort keys, indexes, query optimization).
- Hands-on coding experience with languages like Python, PySpark, and SQL to access, extract, manipulate, and summarize data.
- Programming and/or scripting experience: Python, PySpark.
- Experience in automation and testing of data workflows, preferably Azure ADF.
- Familiarity with a broad base of analytical methods like data modeling, variable-based transformation and summarization, and algorithmic development.

What's needed - basic qualifications:
- 6+ years of hands-on experience designing solutions on cloud computing platforms like Snowflake, Microsoft Azure, or Google Cloud.
- 8+ years of experience with cloud-based databases, preferably Snowflake or Google BigQuery.
- 7+ years of experience developing scalable solutions using Python or PySpark.
- 8+ years of working experience writing complex SQL.
- 8+ years of working in Agile environments.
- 5+ years of experience in CI/CD.

Note: Immediate joiners or candidates serving notice period ending by month end are preferred.

Posted 2 months ago

Apply

9 - 14 years

30 - 40 Lacs

Chennai, Pune, Bengaluru

Work from Office

Source: Naukri

Position: Integration Architect (DBT + Snowflake)
Location: Pune/Chennai/Nagpur/Bengaluru

Purpose of the Position: As a Senior Data Integration Developer/Architect (DBT), this role seeks candidates passionate about specialized skills in Snowflake technology and features. You will be instrumental in assisting our clients by developing models that facilitate their advancement in utilizing Snowflake effectively.

Key Result Areas and Activities:
1. Expertise and Knowledge Sharing: Develop and share expertise in DBT & Snowflake data modelling and development. Actively mine and disseminate organizational experience and expertise across teams and clients.
2. Support and Collaboration: Support Cloud and Data Engineering COE initiatives. Collaborate with management to understand and align with company objectives.
3. Real-Time Data and Performance: Ensure DBT solutions are correctly built for collecting real-time data. Perform and deliver effectively in large and complex environments.
4. Pipeline and Architecture Design: Design, build, test, and maintain Snowflake architectures and data pipelines.
5. Compliance and Security: Ensure compliance with data governance and security policies.

Must have:
- Expertise in Snowflake architecture, understanding of models, and cloud platform integration with Snowflake
- ETL development, SQL scripting, and working knowledge of stored procedures
- Proficiency in designing and maintaining data warehouses and data marts
- Strong skills in ETL processes and tools (e.g., Informatica, Talend, SnapLogic)
- Strong problem-solving skills and the ability to work effectively in a collaborative team environment
- Experience working on data warehouse/ETL projects
- 4+ years of Snowflake ETL experience and 3+ years of DBT experience, or equivalent
- Experience with cloud data platforms (e.g., AWS, Azure)
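A minimal sketch of driving dbt builds programmatically, assuming dbt-core 1.5+ (which exposes a dbtRunner entry point) and a project already configured for a Snowflake target; the model tag used for selection is hypothetical.

```python
# Assumes dbt-core 1.5+ installed alongside dbt-snowflake, and a configured project/profile.
from dbt.cli.main import dbtRunner

runner = dbtRunner()

# Build only the models tagged for the daily batch, then run their tests.
for args in (["run", "--select", "tag:daily"], ["test", "--select", "tag:daily"]):
    result = runner.invoke(args)
    if not result.success:
        raise RuntimeError(f"dbt {' '.join(args)} failed")
```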

Posted 2 months ago

Apply

2 years

0 Lacs

Chennai

Work from Office

Source: Naukri

Job Description: Data Engineer

Position Details
Position Title: Data Engineer
Department: Data Engineering
Location: Chennai
Employment Type: Full-Time

About the Role
We are seeking a highly skilled and motivated Data Engineer with expertise in Snowflake to join our dynamic team. In this role, you will design, build, and optimize scalable data pipelines and cloud-based data infrastructure to ensure efficient data flow across systems. You will collaborate closely with data scientists, analysts, and business stakeholders to provide clean, accessible, and high-quality data for analytics and decision-making. The ideal candidate is passionate about cloud data platforms, data modeling, and performance optimization, with hands-on experience in Snowflake and modern data engineering tools.

Key Responsibilities
1. Data Pipeline Development & Optimization
- Design, develop, and maintain scalable ETL/ELT data pipelines using Snowflake, dbt, and Apache Airflow.
- Optimize Snowflake query performance, warehouse sizing, and cost efficiency.
- Automate data workflows to ensure seamless integration between structured and unstructured data sources.
2. Data Architecture & Integration
- Design and implement data models and schemas optimized for analytics and operational workloads.
- Manage Snowflake multi-cluster warehouses, role-based access controls (RBAC), and security best practices.
- Integrate data from multiple sources, including APIs, relational databases, NoSQL databases, and third-party services.
3. Infrastructure & Performance Management
- Monitor and optimize Snowflake storage, query execution plans, and resource utilization.
- Implement data governance, security policies, and compliance within Snowflake.
- Troubleshoot and resolve performance bottlenecks in data pipelines and cloud storage solutions.
4. Collaboration & Continuous Improvement
- Work with cross-functional teams to define data requirements and ensure scalable solutions.
- Document technical designs, architecture, and processes for data pipelines and Snowflake implementations.
- Stay updated with the latest advancements in cloud data engineering and Snowflake best practices.

Qualifications

Education & Experience
- Bachelor's or Master's degree in Computer Science, Information Technology, Engineering, or a related field.
- 2+ years of experience in data engineering, with a strong focus on cloud-based data platforms.
- Proven expertise in Snowflake, including performance tuning, cost management, and data sharing capabilities.
- Experience working with cloud platforms (AWS, GCP, or Azure) and distributed computing frameworks (Spark, Hadoop, etc.).

Technical Skills
- Strong SQL skills for query optimization and data modeling in Snowflake.
- Experience with ETL tools such as Apache Airflow, dbt, Talend, Informatica, or Matillion.
- Proficiency in Python, Scala, or Java for data processing and automation.
- Familiarity with Kafka, Kinesis, or other streaming data solutions.
- Understanding of data warehousing concepts, partitioning, and indexing strategies.

Preferred Qualifications
- SnowPro Certification or an equivalent cloud data engineering certification.
- Experience with containerization (Docker, Kubernetes) and CI/CD for data workflows.
- Knowledge of machine learning pipelines and MLOps.

Benefits
- Competitive salary and performance-based bonuses.
- Health insurance.
- Flexible working hours and remote work options.
- Professional development opportunities, including Snowflake training, certifications, and conferences.
- Collaborative and inclusive work environment.

How to Apply
Follow these steps to apply for the Data Engineer position:
1. Submit your resume/CV. Ensure your resume is updated and highlights your relevant skills, experience, and achievements in data engineering.
2. Write a cover letter (optional but recommended). It should cover why you are interested in the role, your relevant experience and achievements in data engineering, and how your skills align with the job requirements.
3. Provide supporting documents (optional but recommended). Include links to GitHub repositories, research papers, or portfolio projects showcasing your work in data engineering. If you don't have links, you can attach files (e.g., PDFs) of your projects or research papers.
4. Send your application. Email your resume, cover letter, and supporting documents to: krishnamoorthi.somasundaram@nulogic.io
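As a hedged example of the Snowflake warehouse-sizing and cost-efficiency work described above, this sketch resizes a warehouse and lists the slowest recent queries from ACCOUNT_USAGE; the warehouse and credential names are hypothetical.

```python
import snowflake.connector

# Hypothetical account and credentials.
conn = snowflake.connector.connect(
    account="my_account", user="platform_admin", password="********", role="SYSADMIN"
)
cur = conn.cursor()
try:
    # Keep the transform warehouse small and aggressively auto-suspended to control credits.
    cur.execute("ALTER WAREHOUSE TRANSFORM_WH SET WAREHOUSE_SIZE = 'SMALL' AUTO_SUSPEND = 60")

    # Surface the most expensive queries from the last day as candidates for tuning.
    cur.execute("""
        SELECT query_text, total_elapsed_time / 1000 AS seconds, warehouse_name
        FROM SNOWFLAKE.ACCOUNT_USAGE.QUERY_HISTORY
        WHERE start_time > DATEADD('day', -1, CURRENT_TIMESTAMP())
        ORDER BY total_elapsed_time DESC
        LIMIT 10
    """)
    for row in cur.fetchall():
        print(row)
finally:
    cur.close()
    conn.close()
```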

Posted 2 months ago

Apply

2 - 6 years

9 - 19 Lacs

Pune, Bengaluru, Noida

Work from Office

Source: Naukri

Requirement 1

Location: Bangalore/Pune/Noida (contract)
Experience: 3 to 6 years

Job Description – GCP Data Engineer
Skills required: big data workflows (ETL/ELT), hands-on Python, hands-on SQL, any cloud (GCP and BigQuery preferred), Airflow (good knowledge of Airflow features, operators, scheduling, etc.)
Skills that would be an advantage: DBT, Kafka
Experience level: 4-5 years
Note: The candidate will take a coding test (Python and SQL) during the interview process, conducted through CoderPad; the panel will set it at run-time.

Requirement 2

DBT Data Engineer
Job location: Bangalore/Pune/Noida
Experience: 2 to 4 years

The Data Engineer who will fit has at least 2 years of hands-on experience in this role, most of it in e-commerce companies. We expect them to have been focusing on building data pipelines: ETL/ELT processes, modelling data in dbt to clean and process it, connecting to APIs to pull and push data to peripheral applications, working with Python, and automating/optimising data infrastructures. They will work as part of a team of 6: senior and junior data engineers, and analysts. Our infrastructure is built on Google Cloud Platform: we use Python, data streaming, Google BigQuery, dbt, Prefect, and Tableau. We are introducing AI features and predictive functionality in this environment.

Tasks:
- Design, develop, and maintain scalable data pipelines, ETL processes, and data integration workflows using Google BigQuery, dbt, Prefect, and Tableau.
- Create and implement data models and machine learning algorithms for efficient data transformation.
- Identify and troubleshoot data-related issues, proposing and implementing effective solutions.
- Connect to peripheral applications (Exponea/Bloomreach, Salesforce, Odoo, Magento) to send and receive data.

Requirements:
- Minimum of 2 years of experience as a data engineer, handling e-commerce data projects and building data pipelines.
- At minimum a Bachelor's degree, ideally in Engineering or Science.
- Strong proficiency in Python and SQL.
- Proven experience in building ETL/ELT processes.
- Proven experience with data modelling, ideally in dbt.
- Proven experience with Google BigQuery.
- Ability to understand the broader technical infrastructure and the role of the data platform within it.
- Familiarity with Bloomreach Exponea, Salesforce, Tableau, Magento, and Odoo will be advantageous.

Requirement 3

Posted 2 months ago

Apply

7 - 12 years

15 - 25 Lacs

Delhi NCR, Bengaluru, Hyderabad

Hybrid

Source: Naukri

Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose, the relentless pursuit of a world that works better for people, we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.

Inviting applications for the role of Principal Consultant - Sr. Snowflake Data Engineer (Snowflake + Python + Cloud)! In this role, the Sr. Snowflake Data Engineer is responsible for providing technical direction and leading a group of one or more developers to address a goal.

Job Description:
- Experience in the IT industry
- Working experience building productionized data ingestion and processing pipelines in Snowflake
- Strong understanding of Snowflake architecture; fully well-versed in data warehousing concepts
- Expertise and excellent understanding of Snowflake features and integration of Snowflake with other data processing tools
- Able to create data pipelines for ETL/ELT
- Good to have DBT experience
- Excellent presentation and communication skills, both written and verbal
- Ability to problem-solve and architect in an environment with unclear requirements
- Able to create high-level and low-level design documents based on requirements
- Hands-on experience in configuration, troubleshooting, testing, and managing data platforms, on premises or in the cloud
- Awareness of data visualisation tools and methodologies
- Work independently on business problems and generate meaningful insights
- Good to have some experience/knowledge of Snowpark, Streamlit, or GenAI (not mandatory)
- Should have experience implementing Snowflake best practices
- Snowflake SnowPro Core certification will be an added advantage

Roles and Responsibilities:
- Requirement gathering, creating design documents, providing solutions to customers, working with offshore teams, etc.
- Writing SQL queries against Snowflake and developing scripts to extract, load, and transform data.
- Hands-on experience with Snowflake utilities such as SnowSQL, bulk copy, Snowpipe, Tasks, Streams, Time Travel, Cloning, Optimizer, Metadata Manager, data sharing, stored procedures and UDFs, and Snowsight.
- Experience with the Snowflake cloud data warehouse and AWS S3 buckets or Azure Blob Storage containers for integrating data from multiple source systems.
- Should have some experience with AWS services (S3, Glue, Lambda) or Azure services (Blob Storage, ADLS Gen2, ADF).
- Should have good experience in Python/PySpark integration with Snowflake and cloud (AWS/Azure), with the ability to leverage cloud services for data processing and storage.
- Proficiency in the Python programming language, including knowledge of data types, variables, functions, loops, conditionals, and other Python-specific concepts.
- Knowledge of ETL (Extract, Transform, Load) processes and tools, and the ability to design and develop efficient ETL jobs using Python and PySpark.
- Should have some experience with Snowflake RBAC and data security.
- Should have good experience implementing CDC or SCD Type-2.
- Should have good experience implementing Snowflake best practices.
- In-depth understanding of data warehouse and ETL concepts and data modelling.
- Experience in requirement gathering, analysis, design, development, and deployment.
- Should have experience building data ingestion pipelines.
- Optimize and tune data pipelines for performance and scalability.
- Able to communicate with clients and lead a team.
- Proficiency in working with Airflow or other workflow management tools for scheduling and managing ETL jobs.
- Good to have experience in deployment using CI/CD tools and experience with repositories like Azure Repos, GitHub, etc.

Qualifications we seek in you!
Minimum qualifications: B.E./Master's in Computer Science, Information Technology, or Computer Engineering, or any equivalent degree with good IT experience and relevant experience as a Senior Snowflake Data Engineer.

Skill matrix: Snowflake, Python/PySpark, AWS/Azure, ETL concepts, data modeling and data warehousing concepts.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values diversity and inclusion, respect and integrity, customer focus, and innovation. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a "starter kit," paying to apply, or purchasing equipment or training.
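Since the posting asks for CDC/SCD Type-2 experience, here is a simplified, hypothetical two-step SCD Type-2 load in Snowflake driven from Python: expire changed current rows, then insert fresh versions. Table and column names are illustrative only.

```python
import snowflake.connector

# Hypothetical connection and table names; STG.CUSTOMER_CHANGES holds the staged
# change set and DWH.DIM_CUSTOMER is the Type-2 history table.
conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="********",
    warehouse="TRANSFORM_WH", database="DWH", schema="PUBLIC",
)

close_out = """
UPDATE DWH.DIM_CUSTOMER d
SET is_current = FALSE, valid_to = CURRENT_TIMESTAMP()
WHERE d.is_current
  AND EXISTS (
        SELECT 1 FROM STG.CUSTOMER_CHANGES s
        WHERE s.customer_id = d.customer_id
          AND (s.email <> d.email OR s.segment <> d.segment)
  )
"""

insert_new = """
INSERT INTO DWH.DIM_CUSTOMER (customer_id, email, segment, valid_from, valid_to, is_current)
SELECT s.customer_id, s.email, s.segment, CURRENT_TIMESTAMP(), NULL, TRUE
FROM STG.CUSTOMER_CHANGES s
LEFT JOIN DWH.DIM_CUSTOMER d
  ON d.customer_id = s.customer_id AND d.is_current
WHERE d.customer_id IS NULL OR s.email <> d.email OR s.segment <> d.segment
"""

cur = conn.cursor()
try:
    cur.execute(close_out)   # Expire the current row for customers whose attributes changed.
    cur.execute(insert_new)  # Insert a fresh current version (and brand-new customers).
finally:
    cur.close()
    conn.close()
```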

Posted 2 months ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
