Data Engineer - Google Cloud Platform

Experience: 4 years

Salary: 0 Lacs

Sadar, Uttar Pradesh, India

Posted: 3 days ago | Platform: LinkedIn

Skills Required

Data, GCP, design, architecture, ETL, extraction, support, test strategies, reliability engineering, agile development, analysis, management, security, Python, automation, SQL, scheduling, orchestration, Apache Airflow, Unix, monitoring, analytics, code, Terraform, Ansible, Puppet, configuration, Linux, scripting, debugging, software, mathematics, certification, communication

Work Mode

On-site

Job Type

Full Time

Job Description

GCP Data Engineer

We are looking for a GCP Data Engineer to design, develop, and maintain scalable data pipelines and cloud-based data platforms. You will build and optimize data workflows, implement robust data solutions using Google Cloud Platform (GCP) technologies, and collaborate closely with cross-functional teams to deliver high-impact, data-driven insights. The role requires a deep understanding of data architecture, the GCP ecosystem, and ETL/ELT processes, along with the ability to lead, mentor, and execute with precision.

Key Responsibilities

- Design, build, and maintain robust data extraction, transformation, and loading (ETL/ELT) pipelines across both on-premises and cloud platforms.
- Develop and support data products, pipelines, and analytical platforms leveraging GCP services.
- Perform application impact assessments and requirement reviews, and provide accurate work estimates.
- Create test strategies and implement site reliability engineering (SRE) measures for data systems.
- Participate in agile development sprints and contribute to solution design reviews.
- Mentor and guide junior data engineers on best practices and design patterns.
- Lead root cause analysis and resolution of critical data operations and post-implementation issues.
- Conduct technical data stewardship activities, including metadata management, data security, and privacy-by-design principles.
- Use Python and GCP technologies to automate data workflows and transformations (an illustrative BigQuery load sketch follows this description).
- Work with SQL for data modeling, transformations, and analytical queries.
- Automate job scheduling and orchestration using Control-M, Apache Airflow, or Prefect (an illustrative Airflow DAG sketch also follows).
- Write Unix shell scripts to support automation and monitoring of data operations.
- Support BI/analytics teams with structured and well-modeled data.
- Use Infrastructure as Code (IaC) tools such as Terraform, Ansible, or Puppet for automated deployments and configuration management.

Required Skills & Technologies

- Strong experience with Python, SQL, and Unix/Linux scripting.
- Proficiency with GCP data services.
- Experience designing and managing ETL/ELT pipelines across hybrid environments.
- Working knowledge of orchestration tools such as Apache Airflow, Control-M, or Prefect.
- Understanding of modern data warehousing and cloud-based analytics architecture.
- Familiarity with Infrastructure as Code using Terraform, Puppet, or Ansible.
- Strong debugging and problem-solving abilities in complex data environments.
- Ability to work in Agile teams and deliver in short sprint cycles.

Qualifications

- Bachelor's degree in Computer Science, Software Engineering, Data Science, Mathematics, or a related field.
- 4+ years of hands-on experience in data engineering.
- 2+ years of experience in data architecture and solution design.
- GCP Data Engineer certification is preferred.
- Excellent communication skills and the ability to collaborate with cross-functional teams.

(ref:hirist.tech)
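For illustration only: a minimal sketch of the kind of Python-based GCP automation described in the responsibilities above, using the official google-cloud-bigquery client. The project, bucket, dataset, and table names are hypothetical placeholders, not part of this posting.

```python
# Hedged sketch: batch-load a CSV from Cloud Storage into BigQuery.
# All resource names below are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")  # hypothetical project ID

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,  # skip the CSV header row
    autodetect=True,      # infer the table schema from the file
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)

load_job = client.load_table_from_uri(
    "gs://my-bucket/exports/orders.csv",  # hypothetical source file
    "my-gcp-project.analytics.orders",    # hypothetical destination table
    job_config=job_config,
)
load_job.result()  # block until the load job finishes

table = client.get_table("my-gcp-project.analytics.orders")
print(f"Loaded {table.num_rows} rows.")
```

In practice a script like this would be parameterized and triggered by a scheduler rather than run by hand, which is where the orchestration tools named above come in.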
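Similarly, a minimal sketch of the orchestration work mentioned above, assuming Apache Airflow 2.4+ (where the `schedule` argument replaced `schedule_interval`). The DAG ID and task bodies are hypothetical placeholders.

```python
# Hedged sketch: a daily three-step ETL DAG in Apache Airflow 2.4+.
# The DAG ID and task logic are hypothetical placeholders.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    pass  # placeholder: pull raw records from the source system


def transform():
    pass  # placeholder: clean and reshape the extracted records


def load():
    pass  # placeholder: write the transformed records to the warehouse


with DAG(
    dag_id="example_gcp_etl",  # hypothetical DAG ID
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    # Run the steps strictly in order: extract -> transform -> load.
    extract_task >> transform_task >> load_task
```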
