GCP BigQuery

4 - 6 years

10 - 18 Lacs

Posted: 1 week ago | Platform: Naukri


Work Mode: Hybrid

Job Type: Full Time

Job Description

Job Title: Data Engineer (GCP BigQuery)

Location:

Experience: 4 - 6 years

Employment Type: Full Time

About the Role

We are looking for a skilled Data Engineer with strong experience in Google Cloud Platform (GCP) and BigQuery to design, build, and optimize data pipelines and analytics solutions. The ideal candidate will have hands-on experience in large-scale data processing, ETL development, and data modeling, and will collaborate closely with data analysts, data scientists, and business stakeholders.

Key Responsibilities

  • Design, develop, and maintain scalable ETL/ELT data pipelines on GCP (BigQuery, Dataflow, Cloud Composer, Pub/Sub, etc.).
  • Build and optimize data models and data marts in BigQuery for analytical and reporting use cases.
  • Ensure data quality, integrity, and security across the data lifecycle.
  • Implement data transformation logic using SQL, Python, and Cloud Dataflow/Dataproc.
  • Collaborate with business and analytics teams to understand data requirements and deliver efficient solutions.
  • Automate workflows and orchestrate pipelines using Cloud Composer (Airflow); a minimal DAG sketch follows this list.
  • Monitor and optimize BigQuery performance and manage cost efficiency.
  • Support CI/CD deployment processes and maintain version control using Git.
  • Work with structured and unstructured data sources (APIs, streaming, flat files, etc.).
  • Develop and maintain data documentation and best practices for data engineering.
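
As an illustration of the orchestration responsibility above, here is a minimal Cloud Composer (Airflow) DAG sketch that loads a daily extract from Cloud Storage into a BigQuery staging table and then builds a reporting mart. The project, bucket, dataset, and table names are placeholders, not details from this posting.

```python
# Minimal Cloud Composer (Airflow 2.x) DAG sketch: load a daily CSV extract
# from Cloud Storage into BigQuery, then run a transformation query.
# All project/bucket/table names below are illustrative placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_orders_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",  # one run per day, driven by Composer
    catchup=False,
) as dag:
    # Land the raw extract in a staging table
    load_raw = GCSToBigQueryOperator(
        task_id="load_raw_orders",
        bucket="example-landing-bucket",
        source_objects=["orders/{{ ds }}/*.csv"],
        destination_project_dataset_table="example-project.staging.orders",
        source_format="CSV",
        skip_leading_rows=1,
        write_disposition="WRITE_TRUNCATE",
    )

    # Transform staging data into the reporting mart with standard SQL
    build_mart = BigQueryInsertJobOperator(
        task_id="build_orders_mart",
        configuration={
            "query": {
                "query": """
                    INSERT INTO `example-project.analytics.fct_orders`
                    SELECT order_id, customer_id, order_ts, amount
                    FROM `example-project.staging.orders`
                    WHERE DATE(order_ts) = '{{ ds }}'
                """,
                "useLegacySql": False,
            }
        },
    )

    load_raw >> build_mart
```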

Required Skills & Qualifications

  • Bachelor's degree in Computer Science, Engineering, or a related field.
  • 4-6 years of experience in data engineering or a similar role.
  • Strong expertise in Google Cloud Platform (GCP) services, especially:
  1. BigQuery (query optimization, partitioning, clustering; see the sketch after this list)
  2. Dataflow / Dataproc / Cloud Composer / Pub/Sub
  3. Cloud Storage and IAM
  • Strong SQL skills for data analysis, transformation, and optimization.
  • Proficiency in Python or Java/Scala for ETL/ELT workflows.
  • Experience with data modeling (dimensional/star schema) and data warehouse architecture.
  • Familiarity with CI/CD, Git, and infrastructure-as-code (Terraform, Cloud Deployment Manager).
  • Experience in handling large datasets, performance tuning, and cost optimization in BigQuery.
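
To illustrate the BigQuery optimization skills listed above, the sketch below uses the google-cloud-bigquery Python client to create a date-partitioned, clustered table and to query a single partition. Project, dataset, and column names are hypothetical placeholders.

```python
# Sketch: create a date-partitioned, clustered BigQuery table, then query a
# single partition so only that day's data is scanned (placeholder names).
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

# Partitioning by day and clustering on common filter columns reduces both
# scan cost and query latency for typical reporting queries.
client.query(
    """
    CREATE TABLE IF NOT EXISTS `example-project.analytics.fct_orders`
    PARTITION BY DATE(order_ts)
    CLUSTER BY customer_id, region
    AS
    SELECT order_id, customer_id, region, order_ts, amount
    FROM `example-project.staging.orders`
    """
).result()

# Filtering on the partition column prunes all other partitions.
rows = client.query(
    """
    SELECT region, SUM(amount) AS revenue
    FROM `example-project.analytics.fct_orders`
    WHERE DATE(order_ts) = '2024-01-01'
    GROUP BY region
    """
).result()

for row in rows:
    print(row.region, row.revenue)
```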

Preferred Skills

  • Exposure to machine learning pipelines or data lakehouse architecture.
  • Experience integrating GCP data with Looker / Tableau / Power BI.
  • Knowledge of Apache Beam, Spark, or Airflow DAGs (a minimal Beam pipeline sketch follows this list).
  • GCP Certification (e.g., Professional Data Engineer) is a strong plus.
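
For the Apache Beam item above, here is a minimal batch pipeline sketch that parses CSV lines from Cloud Storage and appends them to a BigQuery table. It runs locally on the DirectRunner or on Dataflow by passing --runner=DataflowRunner; the bucket, table, and schema names are illustrative, not from this posting.

```python
# Minimal Apache Beam sketch: read CSV lines from Cloud Storage, parse them,
# and append the rows to a BigQuery table (placeholder bucket/table names).
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_line(line: str) -> dict:
    """Turn 'order_id,customer_id,amount' into a BigQuery row dict."""
    order_id, customer_id, amount = line.split(",")
    return {"order_id": order_id, "customer_id": customer_id, "amount": float(amount)}


def run() -> None:
    # Supply --project/--region/--temp_location and --runner on the command line.
    options = PipelineOptions()
    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadCsv" >> beam.io.ReadFromText(
                "gs://example-bucket/orders/*.csv", skip_header_lines=1
            )
            | "Parse" >> beam.Map(parse_line)
            | "WriteToBQ" >> beam.io.WriteToBigQuery(
                "example-project:analytics.orders",
                schema="order_id:STRING,customer_id:STRING,amount:FLOAT",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )


if __name__ == "__main__":
    run()
```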

PwC India

Business Consulting and Services

Kolkata, West Bengal
