Experience

5 - 8 years

Salary

15 - 18 Lacs

Posted: 3 weeks ago | Platform: Naukri

Work Mode

Hybrid

Job Type

Full Time

Job Description

Key Responsibilities

• Design, develop, and maintain scalable ETL pipelines using PySpark (a sketch follows below).
• Build and orchestrate data workflows using Apache Airflow.
• Develop reusable Python modules for data ingestion and transformation.
• Collaborate with data scientists and analysts to understand data needs and build robust solutions.
• Optimize Spark jobs for performance and cost-efficiency.
• Monitor and troubleshoot data pipeline failures and latency issues.

Required Skills

• Strong hands-on experience in Python programming.
• In-depth knowledge of PySpark and big data processing.
• Proficiency in developing and scheduling DAGs in Apache Airflow (a DAG sketch follows below).
• Experience working with SQL, data lakes, and data warehouses.
• Familiarity with Git, Linux, and CI/CD tools.
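By way of illustration only (not part of the original posting), here is a minimal sketch of the kind of PySpark ETL pipeline the responsibilities describe. The source and destination paths, dataset, column names, and application name are all hypothetical assumptions:

# A minimal PySpark ETL sketch. The "orders" CSV source and the Parquet
# destination paths are illustrative assumptions, not from the posting.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Extract: read raw CSV with a header row and inferred schema (hypothetical path).
raw = spark.read.csv("s3://raw-zone/orders/", header=True, inferSchema=True)

# Transform: drop rows missing key fields, fix types, stamp a load date.
clean = (
    raw.dropna(subset=["order_id", "amount"])
       .withColumn("amount", F.col("amount").cast("double"))
       .withColumn("load_date", F.current_date())
)

# Load: write partitioned Parquet to the curated zone (hypothetical path).
clean.write.mode("overwrite").partitionBy("load_date").parquet(
    "s3://curated-zone/orders/"
)

spark.stop()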
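Likewise, a minimal sketch of scheduling such a job as a daily Airflow DAG (assuming Airflow 2.4+); the dag_id, schedule, and stand-in Python callable are illustrative assumptions, and a real pipeline might hand off to SparkSubmitOperator instead:

# A minimal Airflow DAG sketch for daily scheduling. All names here are
# hypothetical; the callable is a placeholder for the PySpark job above.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def run_orders_etl():
    # Placeholder: in practice this would submit the PySpark job or call
    # into a reusable ingestion module.
    print("running orders ETL")


with DAG(
    dag_id="orders_etl_daily",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    etl_task = PythonOperator(
        task_id="run_orders_etl",
        python_callable=run_orders_etl,
    )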

Advance Career Solutions

Business Consulting and Services

Los Angeles

2-10 Employees

51 Jobs

Key People

• John Doe, CEO
• Jane Smith, COO
