Data Engineer

4 - 9 years

16 - 20 Lacs

Posted: 5 hours ago | Platform: Naukri


Work Mode: Hybrid

Job Type: Full Time

Job Description

GCP Data Engineer

Experience: 4+ years

Location:

Walk-In Details

Date:

Address: Newmark House, 403, 4th Floor, Plot no. 56, Patrika Nagar, HITEC City, Hyderabad, Telangana 500081

Contact:

About the Role

We are hiring Senior Data Engineers to design and operate large-scale data pipelines on Google Cloud Platform.

Key Responsibilities

  • Design, develop, and maintain robust ELT/ETL pipelines on GCP using Dataflow (Apache Beam), Dataproc (Spark), and Cloud Composer (Airflow).
  • Model and optimize datasets in BigQuery (partitioning, clustering, materialized views, UDFs).
  • Build streaming and near-real-time ingestion pipelines using Pub/Sub, Dataflow, and CDC frameworks.
  • Implement data quality checks, validation frameworks, and SLAs; monitor pipelines using Cloud Monitoring and Cloud Logging.
  • Optimize performance and cost efficiency across GCS, Dataproc autoscaling, and BigQuery slot management.
  • Contribute to coding standards, CI/CD best practices, observability, and documentation; participate in peer code reviews.
  • Collaborate with Analytics, BI, and ML teams to ensure reliable and production-ready datasets with strong data contracts.
  • Support production operations and participate in on-call rotations for critical pipeline support (as needed).
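The data quality checks mentioned above can be as simple as row-level null-rate thresholds enforced before a load. A minimal pure-Python sketch follows; the column names and the 1% threshold are illustrative only (in practice a framework such as Great Expectations, also referenced below, would own these rules):

```python
# Minimal sketch of batch-level data quality checks; `order_id`,
# `amount`, and the 1% null-rate threshold are illustrative.

def validate_rows(rows, required=("order_id", "amount"), max_null_rate=0.01):
    """Return (passed, report) for a batch of dict records.

    `report` maps each required column to its observed null rate.
    """
    total = len(rows)
    null_counts = {col: 0 for col in required}
    for row in rows:
        for col in required:
            if row.get(col) is None:
                null_counts[col] += 1
    report = {
        col: (count / total if total else 0.0)
        for col, count in null_counts.items()
    }
    passed = all(rate <= max_null_rate for rate in report.values())
    return passed, report

batch = [
    {"order_id": 1, "amount": 10.0},
    {"order_id": 2, "amount": None},  # fails: 50% nulls in `amount`
]
ok, report = validate_rows(batch)
```

A check like this would typically run as its own pipeline step, failing the run (and paging on-call, per the SLA bullet) before bad data reaches BigQuery.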

Required Skills & Experience

  • Hands-on expertise in the GCP data stack: BigQuery, Dataflow (Apache Beam), Dataproc, Cloud Storage, Pub/Sub, Cloud Composer (Airflow).
  • Strong command of Spark (PySpark or Scala) for batch processing.
  • Solid understanding of Airflow DAG design (idempotency, retries, SLAs, backfills).
  • Advanced SQL and data modeling (star/snowflake schemas, SCD, partitioning strategies).
  • Proficiency in Python (preferred) or Scala/Java for data engineering.
  • Experience with Git and CI/CD tools (Cloud Build, GitHub Actions, or GitLab CI).
  • Familiarity with GCP security and governance concepts (IAM, service accounts, secrets management, VPC-SC basics).
  • Strong debugging skills, a proactive ownership mindset, and clear communication with both technical and non-technical stakeholders.
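On the SCD point above: a Type 2 slowly changing dimension keeps history by expiring the current row and appending a new version when a tracked attribute changes. The sketch below models this in plain Python for illustration; in a real warehouse this would be a BigQuery MERGE, and the keys and columns here are hypothetical:

```python
from datetime import date

# In-memory sketch of a Type 2 SCD update: close out changed rows,
# append new current versions. `customer_id` and `city` are illustrative.

def scd2_apply(dim_rows, incoming, key, tracked, today):
    """Apply SCD Type 2 semantics to a list of dimension rows."""
    current = {r[key]: r for r in dim_rows if r["is_current"]}
    for new in incoming:
        old = current.get(new[key])
        if old is None:
            # Brand-new key: insert as the current version.
            dim_rows.append({**new, "valid_from": today,
                             "valid_to": None, "is_current": True})
        elif any(old[c] != new[c] for c in tracked):
            # Tracked attribute changed: expire old row, add new version.
            old["valid_to"] = today
            old["is_current"] = False
            dim_rows.append({**new, "valid_from": today,
                             "valid_to": None, "is_current": True})
    return dim_rows

dim = [{"customer_id": 1, "city": "Hyderabad",
        "valid_from": date(2024, 1, 1), "valid_to": None, "is_current": True}]
dim = scd2_apply(dim, [{"customer_id": 1, "city": "Pune"}],
                 key="customer_id", tracked=["city"], today=date(2025, 1, 1))
# dim now holds the expired Hyderabad row plus a current Pune row.
```

The same expire-then-insert pattern underlies idempotent Airflow backfills: re-running the merge for a day must not duplicate versions, which is why the change check compares tracked attributes rather than inserting unconditionally.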

Good-to-Have Skills

  • Experience with Snowflake (migration, performance tuning, tasks/streams).
  • Knowledge of dbt, Great Expectations, or other data quality/testing frameworks.
  • Terraform for Infrastructure as Code on GCP.
  • Exposure to Kafka or similar streaming tools; Cloud Run/Functions for integration services.
  • Familiarity with BI tools such as Looker, Looker Studio, Tableau, or Power BI.
  • GCP Professional Data Engineer Certification is an added advantage.

Randomtrees

Technology - Machine Learning

