Experience: 3 - 8 years

Salary: 15 - 25 Lacs

Posted: 1 hour ago | Platform: Naukri

Work Mode: Work from Office

Job Type: Full Time

Job Description

Job Title: Data Engineer

Location: Gurgaon

Experience Level: 3+ Years

Job Summary

We are seeking a highly motivated Data Engineer with 3+ years of experience in building robust data pipelines, integrating APIs, and managing cloud-based data infrastructure. You will play a crucial role in designing and maintaining our data architecture to enable real-time analytics and business insights. This role requires proficiency in Python and SQL, familiarity with Airbyte, Airflow, and cloud platforms (preferably GCP), and a strong understanding of ETL/ELT workflows.
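
For a sense of the day-to-day work, here is a minimal sketch of the kind of daily ELT DAG this role involves, assuming Airflow 2.4+ with the TaskFlow API and the google-cloud-bigquery client; the API endpoint, project, dataset, and field names are hypothetical placeholders.

```python
from datetime import datetime

import requests
from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False, tags=["elt"])
def orders_elt_pipeline():
    @task
    def extract() -> list[dict]:
        # Pull raw records from a (hypothetical) third-party REST API.
        resp = requests.get("https://api.example.com/v1/orders", timeout=30)
        resp.raise_for_status()
        return resp.json()

    @task
    def transform(records: list[dict]) -> list[dict]:
        # Keep only the fields the downstream model needs.
        return [
            {"order_id": r["id"], "amount": r["amount"], "created_at": r["created_at"]}
            for r in records
        ]

    @task
    def load(rows: list[dict]) -> None:
        # Stream the cleaned rows into a BigQuery table (placeholder table ID).
        from google.cloud import bigquery

        client = bigquery.Client()
        errors = client.insert_rows_json("my-project.analytics.orders_raw", rows)
        if errors:
            raise RuntimeError(f"BigQuery insert failed: {errors}")

    load(transform(extract()))


orders_elt_pipeline()
```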

Key Responsibilities

Design, build, and maintain scalable and reliable data pipelines using Python.

Develop and manage ETL/ELT workflows using Airbyte and Apache Airflow.

Write and optimize complex SQL queries for data extraction, transformation, and reporting.

Integrate data from various third-party APIs and internal sources.

Collaborate with data analysts, data scientists, and product teams to deliver high-quality datasets.

Manage and optimize BigQuery data storage and schema design for performance and cost efficiency (an illustrative sketch follows this list).

Monitor data quality and implement validation checks to ensure data integrity.

Maintain documentation for data pipelines, workflows, and data models.
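
As an illustration of the BigQuery storage and query responsibilities above, here is a minimal sketch using the google-cloud-bigquery client to create a date-partitioned, clustered table and run a partition-pruned aggregation; the project, dataset, and column names are hypothetical.

```python
from google.cloud import bigquery

client = bigquery.Client(project="my-project")

# Partition by event timestamp and cluster by customer_id so typical queries
# scan only the partitions and blocks they need, keeping costs predictable.
table = bigquery.Table(
    "my-project.analytics.events",
    schema=[
        bigquery.SchemaField("event_id", "STRING", mode="REQUIRED"),
        bigquery.SchemaField("customer_id", "STRING"),
        bigquery.SchemaField("event_ts", "TIMESTAMP", mode="REQUIRED"),
        bigquery.SchemaField("payload", "STRING"),
    ],
)
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY, field="event_ts"
)
table.clustering_fields = ["customer_id"]
client.create_table(table, exists_ok=True)

# Filtering on the partitioning column lets BigQuery prune old partitions
# instead of scanning the whole table.
query = """
    SELECT customer_id, COUNT(*) AS events
    FROM `my-project.analytics.events`
    WHERE event_ts >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 7 DAY)
    GROUP BY customer_id
"""
for row in client.query(query).result():
    print(row.customer_id, row.events)
```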

Required Skills & Qualifications

Programming: Strong proficiency in Python for data pipeline development and automation.

SQL: Expertise in writing efficient queries for data manipulation and aggregation.

ETL Tools: Hands-on experience with Airbyte and Apache Airflow.

Data Warehousing/Databases: Experience with PostgreSQL, MongoDB, Google BigQuery, or similar cloud-based data warehouses.

Data Integration: Familiarity with integrating data via RESTful APIs (see the paginated-fetch sketch after this list).

Visualization: Experience using Google Data Studio; familiarity with Power BI is a plus.

Version Control: Proficient in Git for code versioning and collaboration.
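
To make the REST API integration point above concrete, here is a small sketch of a paginated fetch with the requests library; the endpoint URL and the {"results": ..., "next": ...} response shape are assumptions for illustration.

```python
import requests


def fetch_all(url: str, api_key: str) -> list[dict]:
    """Follow pagination links until the API reports no further pages."""
    session = requests.Session()
    session.headers.update({"Authorization": f"Bearer {api_key}"})

    records: list[dict] = []
    while url:
        resp = session.get(url, timeout=30)
        resp.raise_for_status()
        body = resp.json()
        records.extend(body["results"])
        url = body.get("next")  # None on the last page ends the loop
    return records


if __name__ == "__main__":
    rows = fetch_all("https://api.example.com/v1/customers", api_key="...")
    print(f"fetched {len(rows)} records")
```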

Preferred Skills

Cloud Platforms: Working knowledge of Google Cloud Platform (GCP) and AWS.

Data Modeling: Understanding of dimensional modeling and performance optimization techniques.

Nice to Have

Experience with streaming data architectures (e.g., Kafka or Pub/Sub); a brief sketch follows this list.

Exposure to Terraform or Infrastructure as Code (IaC) tools.

Familiarity with data governance and compliance standards.
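
For the streaming item above, here is a minimal sketch of publishing a pipeline event to Google Cloud Pub/Sub with the google-cloud-pubsub client; the project ID and topic name are placeholders.

```python
import json

from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-project", "raw-events")

# Pub/Sub messages are raw bytes, so serialize the event as UTF-8 JSON.
event = {"event_id": "abc123", "amount": 42.5}
future = publisher.publish(topic_path, json.dumps(event).encode("utf-8"))
print("published message", future.result())  # result() blocks until the ack
```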
