Director / Associate Director – Data Engineering

Experience: 15–25 Years

Compensation: 30–50 Lacs

Posted: 1 day ago | Platform: LinkedIn

Work Mode

Hybrid

Job Type

Full Time

Job Description

Experience: 15–25 Years
Locations: Hyderabad, Pune, Bangalore
Work Mode: Hybrid

Role Overview

We are hiring a Director / Associate Director – Data Engineering to lead architecture, delivery, and team management across multiple data engineering engagements. The ideal candidate will bring deep expertise in designing modern data platforms, building scalable data pipelines, and leading teams across technologies and cloud environments (AWS, Azure, or GCP). This is a client-facing leadership role focused on delivering enterprise-scale data engineering solutions and growing high-performing teams.

Key Responsibilities

  • Lead end-to-end data engineering strategy and solution delivery for multiple projects.
  • Collaborate with enterprise clients to define data architectures, platform modernization strategies, and transformation roadmaps.
  • Drive the design and implementation of cloud-native data pipelines, data lakes, and data warehouses.
  • Manage project budgets, resourcing, timelines, and stakeholder communications.
  • Build, lead, and mentor high-performing data engineering teams across cloud platforms.
  • Define and enforce engineering standards, governance, and best practices.
  • Ensure quality, scalability, and performance across all data solutions.

Required Skills & Qualifications

Experience & Leadership
  • 10–15 years in data engineering or data platform development, including at least three years in director-level or senior leadership roles.
  • Proven success in delivering large-scale, cloud-based data solutions.
  • Strong people leadership, client management, and delivery ownership experience.

Data Engineering & Tools

  • Strong experience building ETL/ELT pipelines with tools such as Spark, PySpark, Python, SQL, Airflow, and dbt.
  • Solid knowledge of data lake/lakehouse/warehouse architectures.
  • Experience with batch and real-time streaming data processing.
  • Strong data modeling skills (star/snowflake schemas), along with data partitioning and performance tuning.

Cloud & Platform Expertise

  • Experience with any major cloud provider – AWS, Azure, or Google Cloud Platform (GCP).
  • Deep understanding of cloud-native data architecture, including storage, compute, networking, and security considerations.
  • Hands-on experience with services like:
    • AWS: Redshift, Glue, S3, Lambda, EMR, Kinesis
    • Azure: Data Lake, Synapse, ADF, Databricks, Event Hub
    • GCP: BigQuery, Dataflow, Pub/Sub, Composer

Governance & Quality

  • Knowledge of data cataloging, data quality, lineage, and governance frameworks.
  • Familiarity with security and compliance requirements (e.g., GDPR, HIPAA) in data platforms.

Preferred Qualifications

  • Certifications in cloud platforms (AWS, Azure, or GCP).
  • Experience with CI/CD pipelines, DevOps for data, and infrastructure as code (Terraform/CloudFormation).
  • Exposure to analytics and BI tools (Power BI, Tableau, Looker).
  • Familiarity with data mesh, data fabric, or modern data architecture patterns.
Skills: azure, spark, data quality, data warehouses, terraform, data governance, airflow, python, ci/cd, elt, architecture, dbt, leadership, data engineering, data modeling, data architecture, cloudformation, pyspark, cloud platforms, data lakes, sql, gcp, etl, pipelines, devops, cloud, aws
