GCP Data Engineer

4 - 6 years

15 - 20 Lacs

Posted: Just now | Platform: Naukri


Work Mode: Work from Office

Job Type: Full Time

Job Description

GCP Data Engineer

Roles & Responsibilities

  • Design & Develop Data Pipelines
    • Design, develop, and maintain scalable, secure, and reliable data pipelines using GCP services (BigQuery, Dataflow, Cloud Composer, GCS, etc.).
    • Build end-to-end data ingestion and transformation workflows from various source systems (databases, APIs, flat files, etc.).
  • Data Migration & Onboarding
    • Work with client business and technical teams to build complex data migration pipelines and seamlessly onboard new data products to the GCP environment.
    • Streamline and optimize data migration workflows to add measurable value to client data hubs and analytics platforms.
  • Data Transformation & Curation
    • Build robust ETL/ELT processes to cleanse, transform, and curate data into analytics-ready datasets and views.
    • Ensure data quality, integrity, and consistency across stages with proper validation, logging, and monitoring.
  • Frameworks & Reusability
    • Design and build reusable frameworks for ingesting different data patterns (batch, streaming, CDC, etc.), reducing time-to-market for new data products.
    • Create and maintain technical documentation for pipelines, frameworks, and processes.
  • Collaboration & Agile Delivery
    • Work in an agile environment, participating in sprint planning, daily stand-ups, and retrospectives.
    • Collaborate with data architects, analysts, SMEs, and business stakeholders to translate requirements into scalable data solutions.
  • Deployment, CI/CD & Operations
    • Use tools like Spinnaker, Jenkins, and Git to implement CI/CD for data pipelines and related components.
    • Deploy and manage data workloads on GKE or other GCP compute services where required.
    • Monitor performance, optimize SQL queries and pipeline performance, and troubleshoot issues in production.
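To illustrate the cleanse-transform-curate and validation work described above, here is a minimal, hypothetical Python sketch (the `order_id`/`amount`/`order_date` field names are invented for the example, not taken from the posting):

```python
from dataclasses import dataclass
from datetime import datetime


@dataclass
class CuratedOrder:
    """An analytics-ready record produced by the curation step."""
    order_id: str
    amount: float
    order_date: datetime


def curate(raw_rows):
    """Cleanse raw source rows into curated records.

    Rows missing an order_id, or with an unparseable amount or date,
    are collected separately so they can be logged and monitored
    instead of silently dropped.
    """
    curated, rejected = [], []
    for row in raw_rows:
        try:
            order_id = row["order_id"].strip()
            if not order_id:
                raise ValueError("empty order_id")
            curated.append(CuratedOrder(
                order_id=order_id,
                amount=float(row["amount"]),
                order_date=datetime.strptime(row["order_date"], "%Y-%m-%d"),
            ))
        except (KeyError, ValueError):
            rejected.append(row)
    return curated, rejected
```

In a production pipeline this logic would typically live inside a Dataflow/Beam `DoFn` or a BigQuery SQL transform; the sketch only shows the validate-then-curate shape the responsibilities call for.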

Required Experience

  • Total Experience: 4-6 years in data engineering / ETL / big data roles.
  • Relevant Experience: Minimum 3 years working on GCP-based data engineering projects.

Must-Have Skills (Primary)

  • Python (OOP concepts): Strong programming skills with clean, modular, and testable code.
  • GCP Data Services:
    • Dataflow (or Apache Beam) for building scalable data pipelines.
    • Cloud Composer (Airflow) for workflow orchestration and scheduling.
    • BigQuery for data warehousing, analytics, and performance-optimized SQL.
  • Strong understanding of ETL/ELT concepts, data integration patterns, and data warehousing fundamentals.
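The "clean, modular, and testable" OOP style asked for above might look like the following hypothetical sketch of a reusable pipeline-stage pattern (the `DropNulls`/`RenameField` stages are illustrative, not from the posting):

```python
from abc import ABC, abstractmethod
from typing import Iterable, List


class PipelineStep(ABC):
    """One reusable, independently testable stage of a pipeline."""

    @abstractmethod
    def process(self, rows: Iterable[dict]) -> List[dict]:
        ...


class DropNulls(PipelineStep):
    """Reject rows where a required field is missing or None."""

    def __init__(self, required_field: str):
        self.required_field = required_field

    def process(self, rows):
        return [r for r in rows if r.get(self.required_field) is not None]


class RenameField(PipelineStep):
    """Rename a field, e.g. to match a curated schema."""

    def __init__(self, old: str, new: str):
        self.old, self.new = old, new

    def process(self, rows):
        return [{**{k: v for k, v in r.items() if k != self.old},
                 self.new: r.get(self.old)} for r in rows]


def run_pipeline(rows, steps):
    """Apply each step in order; steps compose freely."""
    for step in steps:
        rows = step.process(rows)
    return list(rows)
```

Each stage can be unit-tested in isolation, and the same pattern maps naturally onto Beam `PTransform`s or Airflow tasks when the logic is lifted into Dataflow or Cloud Composer.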

Secondary Skills (Good Working Knowledge)

  • Agile methodologies (Scrum/Kanban) – working in sprint-based delivery.
  • Spinnaker – deployments in GCP environments.
  • Jenkins – building CI/CD pipelines.
  • Git – version control and collaborative development.
  • GKE (Google Kubernetes Engine) – understanding of containerized workloads and orchestration.

Nice-to-Have Skills (Preferred, Not Mandatory)

  • Data Modelling Basics – star/snowflake schemas, dimensional modelling, understanding of fact and dimension tables.
  • Tableau Basics – exposure to building or supporting dashboards and BI analytics.
  • SQL Performance Tuning – experience optimizing complex queries and improving BigQuery performance.

Educational Qualification

  • B.Tech / B.E. / M.Tech / MCA or equivalent degree in Computer Science, IT, Data Engineering, or related fields.

Desired Candidate Profile

  • Strong analytical and problem-solving skills with attention to detail.
  • Ability to work independently as well as part of a collaborative, cross-functional team.
  • Good communication skills to interact with both technical and non-technical stakeholders.
  • Proactive mindset with a focus on automation, reusability, and continuous improvement.
