Data Engineer - Python/ETL

Experience: 5-8 years

Salary: 0 Lacs

Posted: 1 week ago | Platform: LinkedIn

Work Mode: On-site

Job Type: Full Time

Job Description

About The Role

We are seeking an experienced Data Engineer - Python with 5-8 years of hands-on expertise in building scalable data solutions. The ideal candidate will design, develop, and optimize ETL pipelines, ensure data quality and reliability, and collaborate with cross-functional teams to enable data-driven decision-making.

Key Responsibilities

  • Design, develop, and maintain ETL/ELT pipelines using Python, PySpark, and SQL (an illustrative sketch follows this list).
  • Build and optimize scalable data pipelines leveraging AWS services (Glue, Lambda, S3, Athena, Step Functions).
  • Implement and manage Data Warehousing solutions with strong knowledge of SCD (Type 1, Type 2) and Medallion Architecture.
  • Develop efficient data models and ensure partitioning, indexing, and performance optimization in big data environments.
  • Ensure high standards of data quality, governance, and security across pipelines and platforms.
  • Collaborate with data scientists, analysts, and business stakeholders to translate business requirements into technical solutions.
  • Monitor and troubleshoot production pipelines to ensure reliability and scalability.
  • Contribute to automation, process improvements, and documentation for data engineering workflows.
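
To illustrate the kind of pipeline this role involves, the sketch below shows a minimal PySpark job that promotes raw files from a bronze (raw) layer to a silver (cleansed) layer and writes partitioned Parquet. The bucket paths, schema, and partition column are placeholders for illustration, not details taken from this posting.

    # Minimal bronze -> silver PySpark job (illustrative only; paths and columns are assumed).
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("orders_bronze_to_silver").getOrCreate()

    # Read raw CSV files landed in the bronze layer on S3.
    bronze = spark.read.option("header", True).csv("s3://example-lake/bronze/orders/")

    # Basic cleansing: type casting, de-duplication, and a derived partition column.
    silver = (
        bronze
        .withColumn("order_ts", F.to_timestamp("order_ts"))
        .withColumn("amount", F.col("amount").cast("double"))
        .dropDuplicates(["order_id"])
        .withColumn("order_date", F.to_date("order_ts"))
    )

    # Write partitioned Parquet to the silver layer so Athena/Glue queries can prune partitions.
    (
        silver.write
        .mode("overwrite")
        .partitionBy("order_date")
        .parquet("s3://example-lake/silver/orders/")
    )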

Required Skillsets

  • 5-8 years of proven experience in Data Engineering.
  • Strong proficiency in Python, PySpark, and SQL for data processing.
  • Solid understanding of ETL/ELT design principles and experience with AWS services (Glue, Lambda, S3, Athena, Step Functions).
  • Hands-on experience with Data Warehousing concepts, SCD (Type 1, Type 2), and Medallion Architecture (a simplified SCD Type 2 sketch follows this list).
  • Expertise in data modeling, partitioning strategies, and query performance tuning.
  • Strong problem-solving and debugging skills in big data environments.
  • Excellent communication skills to explain technical concepts to non-technical stakeholders.
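
Since SCD Type 2 handling appears in both the responsibilities and the required skills, a simplified PySpark merge is sketched below. The table paths, the customer_id and address columns, and the single tracked attribute are assumptions made for illustration, and the updates feed is assumed to carry the same business columns as the dimension.

    # Simplified SCD Type 2 merge in PySpark (illustrative only; schema is assumed).
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("scd2_sketch").getOrCreate()

    dim = spark.read.parquet("s3://example-lake/silver/dim_customer/")
    updates = spark.read.parquet("s3://example-lake/bronze/customer_updates/")

    current = dim.filter(F.col("is_current"))

    # Customers whose tracked attribute changed in this batch.
    changed_keys = (
        current.alias("d")
        .join(updates.alias("u"), F.col("d.customer_id") == F.col("u.customer_id"))
        .filter(F.col("d.address") != F.col("u.address"))
        .select(F.col("d.customer_id").alias("customer_id"))
    )

    # Expire the current version of each changed customer.
    to_expire = current.join(changed_keys, "customer_id", "left_semi")
    expired = (
        to_expire
        .withColumn("is_current", F.lit(False))
        .withColumn("valid_to", F.current_date())
    )

    # Append a new current version carrying the updated attributes.
    new_versions = (
        updates.join(changed_keys, "customer_id", "left_semi")
        .withColumn("valid_from", F.current_date())
        .withColumn("valid_to", F.lit(None).cast("date"))
        .withColumn("is_current", F.lit(True))
    )

    # Rows not expired in this batch are carried over unchanged.
    kept = dim.exceptAll(to_expire)
    result = kept.unionByName(expired).unionByName(new_versions)
    result.write.mode("overwrite").parquet("s3://example-lake/silver/dim_customer_v2/")

On Databricks or Delta Lake the same logic is usually expressed with a MERGE statement, but the steps are the same: expire the old current row and insert a new current one.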

Nice-to-Have Skills

  • Experience with DBT for data transformation and testing.
  • Exposure to Databricks and the Lakehouse architecture.
  • Familiarity with CI/CD for data pipelines and infrastructure-as-code (Terraform); a small pipeline-test sketch follows below.
  • Knowledge of data security, compliance, and governance best practices.
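
As a small example of the CI/CD item above, a pytest-style data-quality check of the sort that could run on every commit is sketched below. The clean_orders function and its rules are hypothetical and only stand in for whatever transformation logic a real pipeline would test.

    # Hypothetical transformation plus a data-quality test that could run in CI (pytest + pandas).
    import pandas as pd

    def clean_orders(df: pd.DataFrame) -> pd.DataFrame:
        """Illustrative transformation: drop duplicate orders and negative amounts."""
        return (
            df.drop_duplicates(subset=["order_id"])
              .query("amount >= 0")
              .reset_index(drop=True)
        )

    def test_clean_orders_removes_duplicates_and_bad_amounts():
        raw = pd.DataFrame({"order_id": [1, 1, 2, 3], "amount": [10.0, 10.0, -5.0, 7.5]})
        cleaned = clean_orders(raw)
        assert cleaned["order_id"].is_unique
        assert (cleaned["amount"] >= 0).all()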
(ref:hirist.tech)
