GCP Data Engineer

5 - 10 years

15 - 19 Lacs

Posted: 2 days ago | Platform: Naukri

Work Mode

Remote

Job Type

Full Time

Job Description

Mandatory skills: Advanced Python and SQL; GCP services: BigQuery, Dataflow, Dataproc and Pub/Sub.

Key Responsibilities

  • Design, develop and optimize scalable data pipelines and ETL workflows using Google Cloud Platform (GCP), particularly leveraging BigQuery, Dataflow, Dataproc and Pub/Sub.
  • Design and manage secure, efficient data integrations involving Snowflake and BigQuery.
  • Write, test and maintain high-quality Python code for data extraction, transformation and loading (ETL), analytics and automation tasks.
  • Use Git for collaborative version control, code reviews and managing data engineering projects.
  • Implement infrastructure-as-code practices using Pulumi for cloud resource management and automation within GCP environments.
  • Apply clean room techniques to design and maintain secure data sharing environments in alignment with privacy standards and client requirements.
  • Collaborate with cross-functional teams (data scientists, business analysts, product teams) to deliver data solutions, troubleshoot issues and assure data integrity throughout the lifecycle.
  • Optimize performance of batch and streaming data pipelines, ensuring reliability and scalability.
  • Maintain documentation on processes, data flows and configurations for operational transparency.
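As an illustration of the ETL work described above, here is a minimal sketch of an extract-transform-load step in plain Python. The record fields (`user_id`, `amount`) are hypothetical, and an in-memory list stands in for the load target; a production pipeline would use Dataflow/Apache Beam or the BigQuery client libraries instead.

```python
# Minimal ETL sketch using only the standard library.
# Field names ("user_id", "amount") are hypothetical examples.

def extract(raw_rows):
    """Extract: parse raw CSV-like strings into dicts."""
    return [dict(zip(("user_id", "amount"), row.split(","))) for row in raw_rows]

def transform(rows):
    """Transform: cast amounts to float and drop malformed records."""
    cleaned = []
    for row in rows:
        try:
            cleaned.append({"user_id": row["user_id"], "amount": float(row["amount"])})
        except (KeyError, ValueError):
            continue  # skip rows that fail validation
    return cleaned

def load(rows, sink):
    """Load: append cleaned rows to a sink (a list stands in for BigQuery here)."""
    sink.extend(rows)
    return len(rows)

raw = ["u1,10.5", "u2,not-a-number", "u3,7"]
sink = []
loaded = load(transform(extract(raw)), sink)
```

The same extract/transform/load separation carries over directly to a Beam `Pipeline`, where each step becomes a `PTransform` and the sink is a BigQuery table.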

________________________________________

Required Skills

  • Strong hands-on experience of 5+ years with GCP core data services: BigQuery, Dataflow, Dataproc and Pub/Sub.
  • Proficiency in data engineering development using Python.
  • Deep familiarity with Snowflake data modeling, secure data sharing and advanced query optimization.
  • Proven experience with Git for source code management and collaborative development.
  • Demonstrated ability using Pulumi (or similar IaC tools) for deployment and support of cloud infrastructure.
  • Practical understanding of clean room concepts in cloud data warehousing, including privacy/compliance considerations.
  • Solid skills in debugging complex issues within data pipelines and cloud environments.
  • Effective communication and documentation skills.

________________________________________

Great to Have

  • GCP certification (e.g., Professional Data Engineer).
  • Experience working in regulated environments (telecom/financial/healthcare) with data privacy and compliance focus.
  • Exposure to additional GCP services such as Cloud Storage, Cloud Functions or Kubernetes.
  • Demonstrated success collaborating in agile, distributed teams.
  • Experience with data visualization tools (e.g., Tableau, Looker).

TELUS International

Telecommunications / Customer Experience

Edmonton
