ETL Data Engineer

5 - 10 years

7 - 11 Lacs

Posted: 2 days ago | Platform: Naukri

Work Mode

Work from Office

Job Type

Full Time

Job Description

We are looking for a passionate and experienced Data Engineer to join our team and help build scalable, reliable, and efficient data pipelines, primarily on Google Cloud Platform (GCP) and secondarily on Amazon Web Services (AWS). You will work with cutting-edge technologies to process structured and unstructured data, enabling data-driven decision-making across the organization.

Key Responsibilities
  • Design, develop, and maintain robust data pipelines and ETL/ELT workflows using PySpark, Python, and SQL.
  • Build and manage data ingestion and transformation processes from various sources including Hive, Kafka, and cloud-native services.
  • Orchestrate workflows using Apache Airflow and ensure timely and reliable data delivery.
  • Work with large-scale big data systems to process structured and unstructured datasets.
  • Implement data quality checks, monitoring, and alerting mechanisms.
  • Collaborate with cross-functional teams including data scientists, analysts, and product managers to understand data requirements.
  • Optimize data processing for performance, scalability, and cost-efficiency.
  • Ensure compliance with data governance, security, and privacy standards.
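As an illustrative sketch of the data quality checks and alerting mentioned above (not part of the posting; all rule names and record fields are hypothetical), a minimal row-level validation step in Python might look like:

```python
from dataclasses import dataclass, field

@dataclass
class QualityReport:
    total: int = 0
    failures: dict = field(default_factory=dict)  # rule name -> failure count

def run_quality_checks(rows, rules):
    """Apply named predicate rules to each record; count failures per rule."""
    report = QualityReport()
    for row in rows:
        report.total += 1
        for name, predicate in rules.items():
            if not predicate(row):
                report.failures[name] = report.failures.get(name, 0) + 1
    return report

# Hypothetical rules for an ingested batch of payment records.
rules = {
    "id_present": lambda r: r.get("id") is not None,
    "amount_non_negative": lambda r: r.get("amount", 0) >= 0,
}
rows = [
    {"id": 1, "amount": 10.0},
    {"id": None, "amount": 5.0},
    {"id": 3, "amount": -2.0},
]
report = run_quality_checks(rows, rules)
# report.failures == {"id_present": 1, "amount_non_negative": 1}
```

In a production pipeline, a non-empty `failures` dict would typically trigger an alert or quarantine the batch before loading downstream.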
Required Skills & Qualifications
  • 5+ years of experience in data engineering or related roles.
  • Strong programming skills in Python and PySpark.
  • Proficiency in SQL and experience with Hive.
  • Hands-on experience with Apache Airflow for workflow orchestration.
  • Experience with Kafka for real-time data streaming.
  • Solid understanding of big data ecosystems and distributed computing.
  • Experience with GCP services (BigQuery, Dataflow, Dataproc).
  • Ability to work with both structured (e.g., relational databases) and unstructured (e.g., logs, images, documents) data.
  • Familiarity with CI/CD tools and version control systems (e.g., Git).
  • Knowledge of containerization (Docker) and orchestration (Kubernetes).
  • Exposure to data cataloging and governance tools (e.g., AWS Lake Formation, Google Data Catalog).
  • Understanding of data modeling and architecture principles.
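The skills above center on Python and SQL for ELT-style transformations. As a hedged, self-contained sketch (using Python's stdlib sqlite3 in place of a real warehouse; table and column names are invented for illustration), a tiny transform step might look like:

```python
import sqlite3

# Stand-in for a warehouse: an in-memory SQLite database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_events (user_id INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO raw_events VALUES (?, ?)",
    [(1, 10.0), (1, 5.0), (2, 7.5), (2, None)],
)

# Transform: filter out NULL amounts and aggregate into a reporting table.
conn.execute("""
    CREATE TABLE user_totals AS
    SELECT user_id, SUM(amount) AS total
    FROM raw_events
    WHERE amount IS NOT NULL
    GROUP BY user_id
""")
totals = dict(conn.execute("SELECT user_id, total FROM user_totals"))
# totals == {1: 15.0, 2: 7.5}
```

The same pattern (load raw, filter, aggregate into a derived table) scales up in PySpark or BigQuery; only the engine and connection details change.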
Soft Skills
  • Strong analytical and problem-solving abilities.
  • Excellent communication and collaboration skills.
  • Ability to work in Agile/Scrum environments.
  • Ownership mindset and attention to detail.

Job Category:

ETL Data Engineer

Job Location:

On Site

Experience:

5+ Years

