GCP Data Engineer

7 - 10 years

20 - 27 Lacs

Posted: 1 day ago | Platform: Naukri

Work Mode

Hybrid

Job Type

Full Time

Job Description

Key Responsibilities:

  • Design, develop, and maintain scalable and efficient data pipelines and ETL/ELT workflows using GCP services.
  • Architect and implement data warehouse solutions using BigQuery, ensuring optimal performance and cost efficiency.
  • Collaborate with data scientists, analysts, and business stakeholders to understand data requirements and deliver actionable insights.
  • Optimize and manage large-scale datasets, ensuring data quality, integrity, and security.
  • Develop and enforce best practices for data governance, data modeling, and data lifecycle management.
  • Implement streaming data solutions using tools like Dataflow, Pub/Sub, and Apache Beam.
  • Monitor and troubleshoot data pipelines, ensuring high availability and reliability.
  • Lead the migration of on-premise data systems to GCP, ensuring seamless integration and minimal downtime.
  • Mentor and guide junior engineers, fostering a culture of continuous learning and innovation.
  • Stay updated with the latest advancements in GCP and data engineering technologies, and recommend improvements to existing systems.
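The streaming-pipeline responsibility above usually comes down to per-element parse-and-validate logic applied to each incoming message before records reach BigQuery. A minimal stdlib-only sketch of that step (the field names and schema are illustrative assumptions, not from this posting; in a real Dataflow job this function would be wrapped in an Apache Beam `DoFn`):

```python
import json
from datetime import datetime, timezone
from typing import Optional

# Illustrative event schema -- these field names are assumptions.
REQUIRED_FIELDS = {"event_id", "user_id", "event_ts"}

def parse_event(raw: bytes) -> Optional[dict]:
    """Parse and validate one raw Pub/Sub-style message.

    Returns a cleaned record, or None for malformed input so the
    pipeline can route it to a dead-letter sink instead of crashing.
    """
    try:
        record = json.loads(raw.decode("utf-8"))
    except (UnicodeDecodeError, ValueError):
        return None
    if not isinstance(record, dict) or not REQUIRED_FIELDS.issubset(record):
        return None
    # Normalize the epoch timestamp to UTC ISO-8601 for BigQuery loads.
    ts = datetime.fromtimestamp(record["event_ts"], tz=timezone.utc)
    record["event_ts"] = ts.isoformat()
    return record
```

Returning `None` rather than raising keeps a long-running streaming job alive when bad data arrives; the pipeline can branch invalid elements to a dead-letter topic for later inspection.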

Required Skills and Qualifications:

  • 10+ years of experience in data engineering, with a focus on cloud-based solutions.
  • Expertise in GCP services, including but not limited to:
      • BigQuery
      • Dataflow
      • Pub/Sub
      • Cloud Storage
      • Cloud Composer (Airflow)
      • Cloud Functions
  • Strong proficiency in SQL and experience with BigQuery SQL for complex queries and performance optimization.
  • Hands-on experience with Python or Java for building data pipelines and automation.
  • Deep understanding of data modeling, data warehousing, and schema design.
  • Experience with streaming data processing and tools like Apache Beam or Kafka.
  • Familiarity with CI/CD pipelines and version control systems like Git.
  • Strong knowledge of data security and compliance standards (e.g., GDPR, HIPAA).
  • Proven experience in migrating on-premise data systems to GCP.
  • Excellent problem-solving skills and the ability to work in a fast-paced, dynamic environment.
  • Strong communication and collaboration skills, with the ability to work effectively with both technical and non-technical stakeholders.
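On the BigQuery performance-and-cost points above, one standard optimization is to always filter on a table's partition column so only the needed partitions are scanned. A hedged sketch of building such a query (the table name and the `event_date` DATE partition column are hypothetical, used only for illustration):

```python
def daily_events_query(table: str, day: str) -> str:
    """Build a BigQuery Standard SQL query whose WHERE clause filters on
    an assumed DATE partition column (`event_date`), limiting bytes scanned."""
    return (
        f"SELECT user_id, COUNT(*) AS events "
        f"FROM `{table}` "
        f"WHERE event_date = DATE '{day}' "  # partition pruning happens here
        f"GROUP BY user_id"
    )
```

For example, `daily_events_query("myproject.analytics.events", "2024-01-01")` yields a query that scans only the 2024-01-01 partition instead of the whole table, which directly reduces on-demand query cost.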
