GCP Data Engineer

2 - 6 years

5 - 10 Lacs

Posted: 13 hours ago | Platform: Foundit


Work Mode

On-site

Job Type

Full Time

Job Description

Key Responsibilities:

  • Design & Implement Data Pipelines: Develop and optimize ETL/ELT pipelines using Dataflow, BigQuery, and Cloud Composer (Airflow).
  • Data Integration: Work with structured and unstructured data sources, integrating data from on-premises and cloud-based systems.
  • Data Warehousing & Modeling: Design high-performance data models in BigQuery, ensuring scalability and cost efficiency.
  • Automation & Infrastructure as Code (IaC): Implement Terraform for provisioning GCP resources and automate deployments.
  • Streaming & Batch Processing: Work with Pub/Sub, Dataflow (Apache Beam), and Kafka for real-time and batch data processing.
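As an illustrative sketch only (not part of the posting), the ETL pattern named in the first bullet can be outlined in plain Python. In a real Dataflow deployment the transform would be an Apache Beam `DoFn` and the load step would target BigQuery; here both are replaced with stdlib stand-ins, and the sample data and field names are invented for the example.

```python
import csv
import io
import json

# Stand-in source: in production this would be a Cloud Storage or Pub/Sub read.
RAW_CSV = "user_id,event,amount\n1,purchase,19.99\n2,refund,-5.00\n"

def extract(raw):
    """Parse raw CSV rows into dicts (the 'E' in ETL)."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows):
    """Cast types and drop negative-amount rows (the 'T')."""
    out = []
    for row in rows:
        amount = float(row["amount"])
        if amount > 0:
            out.append({"user_id": int(row["user_id"]),
                        "event": row["event"],
                        "amount": amount})
    return out

def load(rows, sink):
    """Write newline-delimited JSON, the format BigQuery batch loads accept (the 'L')."""
    for row in rows:
        sink.write(json.dumps(row) + "\n")

if __name__ == "__main__":
    sink = io.StringIO()
    load(transform(extract(RAW_CSV)), sink)
    print(sink.getvalue(), end="")
```

The three-stage structure mirrors how a Beam pipeline chains a read, a `ParDo`, and a sink, which is why interviewers for roles like this often ask candidates to walk through exactly this decomposition.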

Required Skills & Qualifications:

  • Education: Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
  • 7+ years of experience in data engineering, cloud data solutions, and pipeline development.
  • GCP Expertise: Hands-on experience with BigQuery, Scala, Dataflow, Pub/Sub, Cloud Storage, Cloud Composer (Airflow), Vertex AI, and IAM Policies.
  • Programming: Proficiency in Python and SQL.
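To illustrate the Python-plus-SQL combination the posting asks for, here is a minimal sketch using the stdlib `sqlite3` module as a stand-in for a BigQuery client (the table name, schema, and sample rows are invented for the example):

```python
import sqlite3

# In production this would be a BigQuery client; sqlite3 is a stdlib stand-in.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id INTEGER, event TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [(1, "purchase", 10.5), (1, "purchase", 4.5), (2, "refund", -5.0)],
)

# Parameterized aggregation query: the ? placeholders keep user input
# out of the SQL string, the same discipline BigQuery query parameters enforce.
total = conn.execute(
    "SELECT SUM(amount) FROM events WHERE user_id = ? AND event = ?",
    (1, "purchase"),
).fetchone()[0]
print(total)  # 15.0
```

Swapping the connection object for a warehouse client while keeping the parameterized-query style is the usual path from a local prototype to a production BigQuery job.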

Fractal Analytics

Analytics and Artificial Intelligence

Bangalore
