Data Engineer

4 - 7 years

15 - 30 Lacs

Posted: 3 weeks ago | Platform: Naukri


Work Mode: Work from Office

Job Type: Full Time

Job Description

Key Responsibilities

Design, build, and manage scalable, reliable, and secure ETL/ELT data pipelines using tools such as Apache Spark, Apache Flink, Airflow, and Databricks.

Develop and maintain data architecture, ensuring efficient data modeling, warehousing, and data flow across systems.

Collaborate with data scientists, analysts, and business teams to understand data requirements and implement robust solutions.

Work with cloud platforms (AWS, Azure, or GCP) to build and optimize data lake and data warehouse environments (e.g., Redshift, Snowflake, BigQuery).

Implement CI/CD pipelines for data infrastructure using tools such as Jenkins, Git, Terraform, and related DevOps tools.

Apply data quality and governance best practices to ensure accuracy, completeness, and consistency of data.

Monitor data pipelines, diagnose issues, and ensure data availability and performance.
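The ETL/ELT responsibilities above can be sketched, in heavily simplified form, as a plain-Python extract-transform-load job. This is an illustrative sketch only: the table and field names (`orders`, `user_id`, `amount`) are invented, and stdlib `csv`/`sqlite3` stand in for Spark and a real warehouse.

```python
import csv
import io
import sqlite3

def extract(raw_csv: str) -> list:
    """Extract: parse raw CSV text into a list of record dicts."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows: list) -> list:
    """Transform: drop incomplete records and cast fields to typed tuples
    (a toy version of the data-quality checks mentioned above)."""
    out = []
    for r in rows:
        if r.get("user_id") and r.get("amount"):
            out.append((int(r["user_id"]), float(r["amount"])))
    return out

def load(rows: list, conn: sqlite3.Connection) -> int:
    """Load: idempotent upsert into the target table; returns row count."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders (user_id INTEGER PRIMARY KEY, amount REAL)"
    )
    conn.executemany("INSERT OR REPLACE INTO orders VALUES (?, ?)", rows)
    conn.commit()
    return conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]

# One pipeline run: the second record is dropped for a missing amount.
raw = "user_id,amount\n1,10.5\n2,\n3,7.25\n"
conn = sqlite3.connect(":memory:")
loaded = load(transform(extract(raw)), conn)
```

The idempotent upsert in `load` is the detail worth noting: rerunning a failed batch must not duplicate rows, which is the same property orchestrators like Airflow rely on when retrying tasks.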

Requirements

5-8 years of proven experience in data engineering or related roles.

Strong programming skills in Python (including PySpark) and SQL.

Experience with big data technologies such as Apache Spark, Apache Flink, Hadoop, Hive, and HBase.

Proficient in building data pipelines using orchestration tools like Apache Airflow.

Hands-on experience with at least one major cloud platform (AWS, Azure, GCP), including services like S3, ADLS, Redshift, Snowflake, or BigQuery.

Experience with data modeling, data warehousing, and real-time/batch data processing.

Familiarity with CI/CD practices, Git, and Terraform or similar infrastructure-as-code tools.

Ability to design for scalability, maintainability, and high availability.
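The SQL proficiency asked for above often comes down to warehouse patterns like deduplicating to the latest record per key with a window function. A minimal sketch, using sqlite3 as a stand-in warehouse; the `events` table and its columns are invented for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE events (user_id INT, event_ts TEXT, status TEXT);
INSERT INTO events VALUES
  (1, '2024-01-01', 'pending'),
  (1, '2024-01-03', 'shipped'),
  (2, '2024-01-02', 'pending');
""")

# Latest status per user: ROW_NUMBER over a per-key partition,
# ordered newest-first, keeping only rank 1 -- a common batch-dedup pattern.
latest = conn.execute("""
    SELECT user_id, status FROM (
        SELECT user_id, status,
               ROW_NUMBER() OVER (
                   PARTITION BY user_id ORDER BY event_ts DESC
               ) AS rn
        FROM events
    ) WHERE rn = 1
    ORDER BY user_id
""").fetchall()
```

The same `ROW_NUMBER() OVER (PARTITION BY ...)` shape works in Redshift, Snowflake, BigQuery, and Spark SQL, which is why it tends to come up in data-engineering interviews.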

Preferred Qualifications

Bachelor's degree in Information Technology, Computer Information Systems, Computer Engineering, or Computer Science.

Certifications in cloud platforms (e.g., AWS Certified Data Analytics, Google Professional Data Engineer).

Experience with containerization tools such as Docker and orchestration with Kubernetes.

Experience with workflow automation.

Familiarity with machine learning pipelines and serving infrastructure.

Experience in implementing data governance, data lineage, and metadata management practices.

Exposure to modern data stack tools like dbt, Kafka, or Fivetran.

Razorthink

Artificial Intelligence / SaaS

San Francisco
