GCP Data Engineer (4+ years)

Experience: 4 years


Posted: 2 days ago | Platform: LinkedIn


Work Mode: On-site
Job Type: Full Time

Job Description

GCP Data Engineer

Key Responsibilities:

  • Design and implement scalable and efficient data pipelines using GCP services, especially BigQuery, Dataflow, Cloud Storage, and Pub/Sub (a minimal pipeline sketch follows this list).
  • Develop data ingestion frameworks to handle structured and semi-structured data from various sources in batch and real-time modes.
  • Transform raw data into clean, curated datasets for downstream consumption using ETL/ELT best practices.
  • Optimize BigQuery queries and table structures for performance and cost efficiency.
  • Implement robust data quality and validation checks throughout the data pipeline lifecycle.
  • Collaborate with data analysts, scientists, and business stakeholders to understand data requirements and deliver high-quality solutions.
  • Manage metadata, data lineage, and cataloging using tools such as Data Catalog.
  • Monitor, troubleshoot, and optimize data workflows to ensure system reliability and data integrity.
  • Follow best practices in version control, CI/CD, and infrastructure-as-code for data engineering workflows.
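For illustration only, a minimal sketch of the kind of streaming pipeline described in the first responsibility, written in Python with Apache Beam (Dataflow). The Pub/Sub topic, BigQuery table, and field names are hypothetical placeholders, not part of this posting:

```python
# Sketch: Pub/Sub JSON events -> parse -> BigQuery (streaming).
# Topic, table, and field names below are hypothetical.
import argparse
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_event(message: bytes) -> dict:
    """Decode a Pub/Sub message into a row matching the BigQuery schema."""
    record = json.loads(message.decode("utf-8"))
    return {
        "event_id": record.get("event_id"),
        "event_type": record.get("event_type"),
        "event_ts": record.get("event_ts"),
    }


def run() -> None:
    parser = argparse.ArgumentParser()
    parser.add_argument("--input_topic", required=True)   # projects/<project>/topics/<topic>
    parser.add_argument("--output_table", required=True)  # <project>:<dataset>.<table>
    args, beam_args = parser.parse_known_args()

    options = PipelineOptions(beam_args, streaming=True, save_main_session=True)

    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(topic=args.input_topic)
            | "ParseJson" >> beam.Map(parse_event)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                args.output_table,
                schema="event_id:STRING,event_type:STRING,event_ts:TIMESTAMP",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )


if __name__ == "__main__":
    run()
```

A pipeline along these lines would typically be submitted to Dataflow with the DataflowRunner plus the usual project, region, and staging options.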

Required Skills & Qualifications:

  • Bachelor’s or Master’s degree in Computer Science, Information Technology, Engineering, or a related field.
  • 4+ years of experience in Data Engineering, with a strong focus on GCP.
  • Expertise in BigQuery, Dataflow (Apache Beam), and Cloud Storage.
  • Proficiency in SQL, Python, or Java for data processing and pipeline development.
  • Experience with ETL/ELT design patterns and tools.
  • Familiarity with Pub/Sub, Cloud Functions, and Composer (Airflow) is a plus.
  • Strong understanding of data modeling, data warehousing concepts, and performance tuning (see the partitioning sketch after this list).
  • Experience working in Agile/Scrum environments with DevOps and CI/CD principles.
  • Excellent problem-solving and communication skills.
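As a concrete illustration of the performance-tuning and cost-efficiency points above, curated BigQuery tables are commonly partitioned and clustered so that downstream queries scan less data. The sketch below uses the google-cloud-bigquery Python client; the project, dataset, and column names are hypothetical:

```python
# Sketch: create a partitioned, clustered curated table in BigQuery.
# Project, dataset, and column names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")

ddl = """
CREATE TABLE IF NOT EXISTS `my-project.analytics.events_curated`
PARTITION BY DATE(event_ts)          -- date-filtered queries scan only matching partitions
CLUSTER BY event_type, customer_id   -- co-locates rows that are commonly filtered together
AS
SELECT event_id, event_type, customer_id, event_ts
FROM `my-project.raw.events`
WHERE event_ts IS NOT NULL
"""

client.query(ddl).result()  # waits for the DDL job to finish
```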

Preferred Qualifications:

  • GCP Professional Data Engineer Certification.
  • Experience with data security, governance, and compliance standards.
  • Exposure to other cloud platforms (AWS, Azure) is an added advantage.
  • Knowledge of machine learning workflows and MLOps is a plus.
