Data Engineer

4 - 8 years

18 - 25 Lacs

Posted: 1 day ago | Platform: Naukri


Work Mode: Hybrid

Job Type: Full Time

Job Description

Role & responsibilities

  • Design, develop, and maintain scalable data pipelines and ETL/ELT workflows using Databricks, Google BigQuery, SQL, and cloud-native tools (a minimal Databricks pipeline sketch follows this list).
  • Build and optimize batch and streaming data pipelines to support analytics, reporting, and business intelligence use cases.
  • Collaborate with business stakeholders, product teams, analytics engineers, and data analysts to gather requirements and deliver data solutions.
  • Develop and manage data models, schemas, and transformations, ensuring data quality, integrity, and consistency.
  • Optimize SQL queries, partitioning, clustering, and indexing for performance and cost efficiency.
  • Support BI tools and dashboards by providing clean, reliable, and analytics-ready datasets.
  • Implement and monitor data quality checks, validation rules, and error handling across pipelines.
  • Troubleshoot and resolve data pipeline failures, performance issues, and data inconsistencies across environments (dev, test, prod).
  • Ensure compliance with data governance, data security, access controls, and privacy standards.
  • Work directly with clients and external stakeholders to gather requirements, present deliverables, and manage expectations.
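
To ground the first bullet above, the indented snippet below is a minimal, illustrative sketch of a batch ETL step on Databricks using PySpark and Delta Lake. It is not part of the posting itself; the table and column names (raw_events, clean_events, event_id, event_ts) are hypothetical placeholders.

    # Minimal batch ETL sketch for Databricks (PySpark + Delta Lake).
    # All table and column names below are hypothetical placeholders.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("clean-events-etl").getOrCreate()

    # Extract: read the raw landing table registered in the metastore.
    raw = spark.read.table("raw_events")

    # Transform: de-duplicate, apply a simple data-quality rule, derive a date column.
    clean = (
        raw.dropDuplicates(["event_id"])
           .filter(F.col("event_ts").isNotNull())
           .withColumn("event_date", F.to_date("event_ts"))
    )

    # Load: write an analytics-ready Delta table, partitioned by date.
    (clean.write
          .format("delta")
          .mode("overwrite")
          .partitionBy("event_date")
          .saveAsTable("clean_events"))

A production pipeline of the kind described above would add incremental or merge logic, validation checks, and orchestration (for example Airflow or Databricks Workflows), as the later bullets note.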

Preferred candidate profile

  • 4-8 years of experience as a Data Engineer, Analytics Engineer, or ETL Developer.
  • Advanced proficiency in SQL (complex queries, window functions, optimization, performance tuning).
  • Strong hands-on experience with Google BigQuery (partitioning, clustering, cost optimization); a BigQuery sketch follows this list.
  • Experience building data pipelines using Databricks (Apache Spark, Delta Lake).
  • Solid understanding of ETL/ELT architecture, data warehousing, dimensional modeling, and star/snowflake schemas.
  • Experience with Python and/or Scala for data processing and automation.
  • Familiarity with cloud platforms such as Google Cloud Platform (GCP), Azure, or AWS.
  • Experience with pipeline orchestration tools (Airflow, Databricks Workflows, or similar) preferred.
  • Knowledge of data governance, security, IAM, and compliance frameworks is a plus.
  • Strong client-facing, communication, and problem-solving skills.
  • Ability to work independently in a hybrid or remote work environment.
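
As a companion to the BigQuery bullet above, here is a small, illustrative Python sketch (google-cloud-bigquery client) of the partitioning, clustering, and cost-control ideas the profile mentions. The project, dataset, table, and column names are hypothetical.

    # Sketch: create a partitioned + clustered BigQuery table and run a
    # cost-guarded query. All names (my-project.analytics.events, etc.) are
    # hypothetical placeholders, not part of this posting.
    from google.cloud import bigquery

    client = bigquery.Client()  # uses application-default credentials/project

    # Partition by event_date and cluster by customer_id so queries prune scans.
    table = bigquery.Table(
        "my-project.analytics.events",
        schema=[
            bigquery.SchemaField("event_date", "DATE"),
            bigquery.SchemaField("customer_id", "STRING"),
            bigquery.SchemaField("amount", "NUMERIC"),
        ],
    )
    table.time_partitioning = bigquery.TimePartitioning(
        type_=bigquery.TimePartitioningType.DAY, field="event_date"
    )
    table.clustering_fields = ["customer_id"]
    client.create_table(table, exists_ok=True)

    # Cost guardrail: cap the bytes a query may bill before it runs.
    job_config = bigquery.QueryJobConfig(maximum_bytes_billed=10 * 1024**3)  # 10 GiB
    query = """
        SELECT customer_id, SUM(amount) AS total_amount
        FROM `my-project.analytics.events`
        WHERE event_date BETWEEN '2024-01-01' AND '2024-01-31'  -- partition pruning
        GROUP BY customer_id
    """
    rows = client.query(query, job_config=job_config).result()

Filtering on the partition column and clustering on a frequently filtered key are the standard levers behind the "partitioning, clustering, cost optimization" skills listed above.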

7dxperts

IT Services and IT Consulting

London, England
