6 - 9 years

24 - 29 Lacs

Posted: 4 days ago | Platform: GlassDoor

Work Mode

On-site

Job Type

Full Time

Job Description

Position Summary

We are seeking an experienced Python Data Engineer with 6–9 years of hands-on experience in designing, building, and optimizing scalable data pipelines and data platforms. The ideal candidate will have strong expertise in Python, cloud data services, distributed systems, and ETL frameworks. You will work closely with data scientists, analysts, and business stakeholders to ensure reliable, high-quality data delivery across the organization.

Key Responsibilities

  • Design, develop, and maintain scalable ETL/ELT pipelines using Python and modern data engineering tools.
  • Build and optimize data ingestion frameworks to integrate structured, semi-structured, and unstructured data.
  • Develop reusable Python libraries, utilities, and automation scripts for data processing.
  • Work with large datasets and implement best practices in data quality, metadata management, and logging.
  • Optimize pipeline performance, improve data reliability, and reduce processing latency.
  • Collaborate with cross-functional teams to understand data needs and translate them into technical solutions.
  • Implement and maintain data models, schemas, and data warehouse features.
  • Work with cloud platforms (AWS/Azure/GCP) to build scalable, cloud-native data systems.
  • Integrate data pipelines with workflow orchestration tools such as Airflow, Prefect, or Dagster.
  • Ensure compliance with security standards, data governance, and industry best practices.
  • Monitor, troubleshoot, and debug production data pipelines.
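As a rough illustration of the ETL/ELT pattern the responsibilities above describe, here is a minimal extract-transform-load sketch in plain Python. All record fields and function names are hypothetical; a production pipeline would run these steps under an orchestrator such as Airflow, Prefect, or Dagster rather than inline.

```python
import json

# Hypothetical in-memory "source": raw order records as they might
# arrive from an ingestion layer (field names are illustrative only).
RAW_ORDERS = [
    '{"order_id": 1, "amount": "120.50", "region": "south"}',
    '{"order_id": 2, "amount": "80.00",  "region": "north"}',
    '{"order_id": 3, "amount": "bad",    "region": "south"}',  # malformed row
]

def extract(raw_rows):
    """Parse raw JSON strings into dicts (extract step)."""
    return [json.loads(row) for row in raw_rows]

def transform(rows):
    """Cast types and drop rows that fail validation (transform step)."""
    clean = []
    for row in rows:
        try:
            row["amount"] = float(row["amount"])
            clean.append(row)
        except ValueError:
            # In a real pipeline this row would be logged or dead-lettered,
            # not silently dropped.
            continue
    return clean

def load(rows):
    """Aggregate per region (stand-in for a warehouse load step)."""
    totals = {}
    for row in rows:
        totals[row["region"]] = totals.get(row["region"], 0.0) + row["amount"]
    return totals

totals = load(transform(extract(RAW_ORDERS)))
print(totals)  # {'south': 120.5, 'north': 80.0}
```

The malformed third row is rejected during the transform step, which is the kind of data-quality handling the role calls out.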

Python Data Engineer

Required Skills and Qualifications:

  • Strong proficiency in Python and PySpark for data engineering tasks (ETL/ELT, automation, data processing).
  • Solid experience with Azure Data Services:
    • Azure Synapse Analytics
    • Azure Data Lake Storage (Gen2)
    • Azure Data Factory, Azure Databricks, Azure Logic Apps
    • Azure DevOps
  • Good understanding of data warehouse design concepts (star/snowflake schema, fact/dimension tables).
  • Advanced skills in SQL and PL/SQL for writing queries, stored procedures, and optimizing performance.
  • Experience integrating data from multiple structured and unstructured sources.
  • Familiarity with data governance, security, and compliance in a cloud environment.
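To make the star-schema concept above concrete, here is a minimal sketch using Python's built-in sqlite3 module: one fact table joined to one dimension table, then aggregated by a dimension attribute. Table and column names are illustrative, not taken from any real warehouse.

```python
import sqlite3

# Minimal star schema: a fact table (fact_sales) referencing a
# dimension table (dim_product). All names are hypothetical.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, category TEXT);
    CREATE TABLE fact_sales  (sale_id INTEGER PRIMARY KEY,
                              product_id INTEGER, amount REAL);
    INSERT INTO dim_product VALUES (1, 'books'), (2, 'games');
    INSERT INTO fact_sales  VALUES (10, 1, 25.0), (11, 1, 15.0), (12, 2, 60.0);
""")

# Typical analytical query: aggregate fact rows by a dimension attribute.
cur.execute("""
    SELECT d.category, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_product d ON d.product_id = f.product_id
    GROUP BY d.category
    ORDER BY d.category
""")
result = cur.fetchall()
print(result)  # [('books', 40.0), ('games', 60.0)]
conn.close()
```

The same fact/dimension join pattern scales up to Synapse or Databricks SQL; only the engine and data volumes change.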

DevOps & CI/CD

  • Familiarity with Git, version control strategies, and CI/CD pipelines.
  • Experience with containerization (Docker) and basics of Kubernetes (optional).

Testing & Quality

  • Experience writing unit tests, integration tests, and ensuring data quality.
  • Knowledge of data validation tools (Great Expectations, Deequ, etc.) is a plus.
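The expectation-style checks mentioned above can be sketched in a few lines of plain Python. This is loosely in the spirit of tools like Great Expectations, but uses no external library; the check names and sample rows are hypothetical.

```python
# Sample rows standing in for a dataset under validation.
rows = [
    {"id": 1, "email": "a@example.com", "age": 34},
    {"id": 2, "email": "b@example.com", "age": 29},
]

def expect_not_null(rows, column):
    """Every row must have a non-None value in `column`."""
    return all(row.get(column) is not None for row in rows)

def expect_between(rows, column, low, high):
    """Every value in `column` must fall within [low, high]."""
    return all(low <= row[column] <= high for row in rows)

# A pipeline would typically fail or alert when any check is False.
checks = {
    "id not null": expect_not_null(rows, "id"),
    "age in range": expect_between(rows, "age", 0, 120),
}
print(checks)  # {'id not null': True, 'age in range': True}
```

Dedicated validation frameworks add suites of built-in expectations, reporting, and integration with orchestrators, but the underlying idea is the same: declarative assertions evaluated against each batch of data.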

Job Types: Full-time, Permanent

Pay: ₹200,000.00 - ₹243,000.00 per month

Benefits:

  • Health insurance
  • Provident Fund

Work Location: In person
