GCP Data Engineer

Experience: 5 - 10 years

Salary: 15 - 25 Lacs

Posted: 1 day ago | Platform: Naukri


Work Mode: Hybrid

Job Type: Full Time

Job Description

Role Overview

Position: GCP Data Engineer

Key Responsibilities

  • Design, build, and manage data pipelines and ETL/ELT workflows on GCP using tools such as Cloud Composer (Airflow), Dataflow, Dataproc, or Data Fusion.
  • Develop, optimize, and maintain data models in BigQuery for analytics and reporting use cases.
  • Implement DBT transformations to manage and version-control SQL-based data transformations.
  • Write efficient and optimized SQL queries for data extraction, aggregation, and transformation.
  • Build and maintain dashboards and reports in Looker, ensuring accuracy, usability, and scalability for business users.
  • Develop data solutions using Python or PySpark for automation, transformations, and advanced data processing.
  • Collaborate with data analysts, scientists, and business stakeholders to define requirements and deliver high-quality solutions.
  • Ensure data quality, governance, and security best practices are followed in all pipelines.
  • Participate in Agile/Scrum delivery cycles and support production deployments.
  • Stay updated on emerging cloud data and GenAI trends, contributing innovative ideas for adoption.
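For illustration only (this sketch is not part of the posting), the kind of extract-filter-aggregate step these pipelines run can be shown in a few lines of plain Python; all field names here are hypothetical, and in practice logic like this would live inside a Cloud Composer (Airflow) task or a Dataflow/PySpark job.

```python
from collections import defaultdict

def transform(rows):
    """Toy ETL step: filter bad records, then aggregate amounts by region.

    A stand-in for pipeline logic; real jobs would read from and write to
    GCP services rather than in-memory lists.
    """
    totals = defaultdict(float)
    for row in rows:
        # Data-quality filter: skip records missing fields the
        # downstream model expects.
        if not row.get("region") or row.get("amount") is None:
            continue
        totals[row["region"]] += float(row["amount"])
    return dict(totals)

raw = [
    {"region": "EMEA", "amount": 120.0},
    {"region": "APAC", "amount": 80.5},
    {"region": "EMEA", "amount": 19.5},
    {"region": None, "amount": 10.0},  # dropped by the quality filter
]
print(transform(raw))  # {'EMEA': 139.5, 'APAC': 80.5}
```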

Required Skills & Competencies

  • BigQuery – Advanced SQL, data modeling, query optimization.
  • Orchestration: Airflow / Cloud Composer (must have).
  • ETL/ELT Tools: Dataflow, Dataproc, or Data Fusion (at least one).
  • Programming: Python or PySpark (must have).
  • DBT: Building and maintaining transformations (must have).
  • SQL: Strong query writing and optimization skills.
  • Visualization: Looker (must have – building dashboards, LookML).
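As a rough illustration of the aggregation SQL the role calls for, the snippet below runs a grouped-and-ordered query; sqlite3 is used only so the example is self-contained, and the BigQuery dialect differs in places (partitioned tables, functions such as APPROX_COUNT_DISTINCT). Table and column names are invented for the example.

```python
import sqlite3

# In-memory stand-in for a warehouse table of raw events.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id TEXT, event TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [("u1", "purchase", 30.0), ("u1", "purchase", 20.0),
     ("u2", "purchase", 5.0), ("u2", "view", 0.0)],
)

# Per-user purchase totals, highest spenders first: filter early,
# aggregate, then order — the usual shape of a reporting query.
rows = conn.execute(
    """
    SELECT user_id, SUM(amount) AS total
    FROM events
    WHERE event = 'purchase'
    GROUP BY user_id
    ORDER BY total DESC
    """
).fetchall()
print(rows)  # [('u1', 50.0), ('u2', 5.0)]
```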

Nice to Have Skills

  • GKE (Google Kubernetes Engine) – for containerized workloads.
  • Cloud Spanner – familiarity with a distributed relational database.
  • Harness – CI/CD pipeline automation.
  • GenAI exposure – for integration with data workflows.

Soft Skills

  • Strong problem-solving and analytical thinking.
  • Excellent communication and stakeholder management skills.
  • Ability to work independently as well as collaboratively in cross-functional teams.
  • Experience in Agile/Scrum environments.
