GCP Data Engineer

Experience: 4 - 7 years

Salary: 1 - 2 Lacs

Posted: 1 day ago | Platform: Naukri

Work Mode: Work from Office

Job Type: Full Time

Job Description

GCP Data Engineer (Mid Level)

Position Overview:

We are seeking a mid-level GCP Data Engineer with 4+ years of experience in ETL, Data Warehousing, and Data Engineering. The ideal candidate will have hands-on experience with GCP tools, solid data analysis skills, and a strong understanding of Data Warehousing principles.

Qualifications:

4+ years of experience in ETL & Data Warehousing

Should have excellent leadership & communication skills

Should have experience in developing Data Engineering solutions using Airflow, GCP BigQuery, Cloud Storage, Dataflow, Cloud Functions, Pub/Sub, Cloud Run, etc. (an illustrative Airflow sketch follows this list)

Should have built solution automations using any of the above ETL tools

Should have executed at least 2 GCP Cloud Data Warehousing projects

Should have worked on at least 2 projects using Agile/SAFe methodology

Should have mid-level experience in PySpark and Teradata

Should have working experience with DevOps tools like GitHub, Jenkins, Cloud Native, etc.; with semi-structured data formats like JSON, Parquet, and/or XML files; and should have written complex SQL queries for data analysis and extraction

Should have an in-depth understanding of Data Warehousing, Data Analysis, Data Profiling, Data Quality & Data Mapping
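
For context on the orchestration skills listed above, the sketch below shows a minimal Airflow DAG that loads Parquet files from Cloud Storage into a BigQuery staging table and then publishes a de-duplicated reporting table with a SQL step. It is illustrative only; the bucket, project, dataset, table, and column names are hypothetical placeholders, not details of this role's environment.

```python
# Illustrative sketch only: all resource names below are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="gcs_to_bigquery_daily_load",  # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Load semi-structured Parquet files landed in Cloud Storage into a BigQuery staging table.
    load_parquet = GCSToBigQueryOperator(
        task_id="load_parquet_to_staging",
        bucket="example-landing-bucket",              # placeholder bucket
        source_objects=["sales/{{ ds }}/*.parquet"],  # placeholder object path
        destination_project_dataset_table="example_project.staging.sales",
        source_format="PARQUET",
        write_disposition="WRITE_TRUNCATE",
    )

    # Run a SQL step that de-duplicates the staging data and publishes it for reporting.
    publish = BigQueryInsertJobOperator(
        task_id="publish_to_reporting",
        configuration={
            "query": {
                "query": """
                    CREATE OR REPLACE TABLE `example_project.reporting.sales` AS
                    SELECT * EXCEPT(row_num)
                    FROM (
                        SELECT *,
                               ROW_NUMBER() OVER (
                                   PARTITION BY order_id ORDER BY updated_at DESC
                               ) AS row_num
                        FROM `example_project.staging.sales`
                    )
                    WHERE row_num = 1
                """,
                "useLegacySql": False,
            }
        },
    )

    load_parquet >> publish
```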

Roles and Responsibilities:

  • Design, develop, and deploy data pipelines and ETL processes using GCP services such as Cloud Composer (Airflow) and Dataflow.
  • Implement data integration solutions, ensuring data flows efficiently and reliably between various data sources and destinations.
  • Collaborate with data architects and analysts to understand data requirements and translate them into technical specifications.
  • Build and maintain scalable and optimized data storage solutions using Cloud Storage, BigQuery, and other relevant GCP services.
  • Develop and manage data transformation and cleansing processes to ensure data quality and accuracy (a PySpark sketch of such a step follows this list).
  • Monitor and troubleshoot data pipelines to identify and resolve issues in a timely manner.
  • Optimize data pipelines for performance, cost, and scalability.
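
As a rough illustration of the transformation and cleansing responsibilities above, the following is a minimal PySpark sketch that reads semi-structured JSON from Cloud Storage, standardizes types, drops malformed rows and duplicates, and writes curated Parquet partitioned by date. Paths and column names are hypothetical placeholders.

```python
# Illustrative sketch only: paths, column names, and app name are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_cleansing").getOrCreate()

# Read semi-structured JSON landed in Cloud Storage.
raw = spark.read.json("gs://example-landing-bucket/orders/")  # placeholder path

# Basic cleansing: standardize types, drop malformed rows, remove duplicate orders.
cleaned = (
    raw
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
    .dropna(subset=["order_id", "order_ts"])
    .dropDuplicates(["order_id"])
)

# Write the curated output as Parquet, partitioned by order date, ready for loading into BigQuery.
(
    cleaned
    .withColumn("order_date", F.to_date("order_ts"))
    .write.mode("overwrite")
    .partitionBy("order_date")
    .parquet("gs://example-curated-bucket/orders/")  # placeholder path
)
```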
