Manager, Data Engineering


Pune, Maharashtra, India
Full-time | On-site
Region: India


Position Overview:


We're seeking a Manager, Data Engineering to lead our growing data engineering team based in Pune. This leadership role focuses on building scalable, secure, and high-performance data platforms, pipelines, and products for our global, multi-tenant SaaS applications.


You will work closely with cross-functional teams including Product, Architecture, UX, QA, and DevOps to deliver robust data solutions that power products and reporting systems - especially in the areas of ETL pipelines, cloud-native data warehousing, and unstructured data lakes.


Key Responsibilities

👥 Team Leadership & Development

  • Lead, mentor, and retain a high-performing team of data engineers
  • Conduct regular 1:1s, performance reviews, and growth planning
  • Foster a collaborative team culture and instill best practices

📦 Project & Delivery Management

  • Drive delivery of data solutions aligned with sprint and release goals
  • Ensure on-time delivery, high code quality, and scalability
  • Facilitate agile ceremonies: sprint planning, retrospectives, and stand-ups

🧩 Technical Execution & Architecture

  • Architect and guide the development of scalable ETL/ELT pipelines
  • Build and maintain data lake solutions using AWS tools to manage unstructured and semi-structured data
  • Work with large-scale datasets from diverse sources including APIs, logs, files, and internal systems
  • Optimize performance, security, and maintainability of data pipelines
  • Promote usage of tools such as Snowflake, dbt, Python, and SQL

🔐 Data Governance & Best Practices

  • Ensure adherence to internal coding standards and data security guidelines
  • Implement best practices for data modeling, quality checks, and documentation
  • Collaborate with architecture and infrastructure teams on cloud cost optimization and performance


✅ Required Skills & Experience

  • 12 - 15 years in Data Engineering, Data Architecture, or similar roles
  • Minimum 3 years in a leadership or managerial capacity
  • Proven experience building robust ETL pipelines, preferably for multi-tenant SaaS platforms
  • Strong hands-on technical expertise with:
      • AWS Services: S3, Glue, Lambda, Redshift, EMR
      • Data Platforms: Snowflake, dbt
      • Programming: Python, SQL
      • Unstructured Data Handling: data lakes, JSON, XML, log data
  • Expertise in SQL-based data warehousing and RDBMS systems
  • Knowledge of CI/CD, version control (GitHub), and Agile/Scrum methodologies
  • Ability to balance technical depth with stakeholder communication and delivery tracking
  • Understanding of modern lakehouse architecture and tools such as Apache Hudi, Iceberg, or Delta Lake


⭐ Good to Have

  • Experience in product-based companies (SaaS, ESG, Supply Chain domains preferred)
  • Familiarity with data security standards (e.g., GDPR, SOC2)
  • Experience with orchestration tools (Airflow, Step Functions), data cataloging, or cost optimization


📩 Interested candidates can share their CVs at:

geet.prometheus@gmail.com

