Senior Data Engineer

6 - 9 years

22 - 25 Lacs

Posted: 1 day ago | Platform: Naukri

Work Mode: Hybrid

Job Type: Full Time

Job Description

About the Role

Senior Data Engineer

Key Responsibilities

  • Design and implement scalable data pipelines using dbt-core, Python, and SQL to support analytics, reporting, and data science initiatives.
  • Design and optimize data models in Snowflake to support efficient querying and storage.
  • Develop and maintain our data warehouse, ensuring data quality, governance, and performance.
  • Collaborate with cross-functional teams, including data analysts, data architects, data scientists, and business stakeholders, to understand data needs and deliver robust solutions.
  • Establish and enforce best practices for version control (Git), CI/CD pipelines, and data pipeline monitoring.
  • Mentor and guide junior data engineers, fostering a culture of technical excellence and continuous improvement.
  • Evaluate and recommend new tools and technologies to enhance the data platform.
  • Provide ongoing support for existing ELT/ETL processes and procedures.
  • Identify tools and technologies to be used in the project, as well as reusable objects that could be customized for it.

Required Qualifications

  • Bachelor's degree in computer science or a related field (16 years of formal education related to engineering).
  • 6+ years of experience in data engineering or a related field.
  • Expert-level proficiency in SQL and Python for data transformation and automation.
  • Experience with dbt-core for data modeling and transformation.
  • Strong hands-on experience with cloud platforms (Microsoft Azure) and cloud data platforms (Snowflake).
  • Proficiency with Git and collaborative development workflows. Familiarity with Microsoft VSCode or similar IDEs. Knowledge of Azure DevOps or GitLab development operations and job-scheduling tools.
  • Solid understanding of modern data warehousing architecture, dimensional modeling, ELT/ETL frameworks, and data modeling techniques.
  • Excellent communication skills and the ability to translate complex technical concepts for non-technical stakeholders.
  • Proven expertise in designing and implementing batch and streaming data pipelines to support near-real-time and large-scale data processing needs.

Preferred Qualifications

  • Experience working in a cloud-native environment (AWS, Azure, or GCP).
  • Familiarity with data governance, security, and compliance standards.
  • Prior experience with Apache Kafka (Confluent).
  • Artificial Intelligence (AI) experience is a plus.
  • Hands-on experience with orchestration tools (e.g., Airflow, Prefect) is a plus.
