Consultant – Cloud Data Engineer – GCP

Experience: 2–3 years

Salary: 0 Lacs

Posted: 1 day ago | Platform: LinkedIn

Work Mode: On-site

Job Type: Full Time

Job Description

Key Responsibilities:

  • Design, develop, and manage scalable, secure, and efficient data pipelines on Google Cloud Platform (GCP) to process and transform large datasets from multiple sources.
  • Implement and optimize GCP data services such as BigQuery, Cloud Storage, Cloud SQL, Dataflow, Dataproc, Pub/Sub, and Cloud Functions.
  • Architect and maintain ETL/ELT processes to ensure efficient data ingestion, transformation, and storage using Cloud Data Fusion, Dataflow (Apache Beam), or Composer (Airflow).
  • Architect and implement data models on GCP to support efficient data storage and retrieval in BigQuery, Cloud Spanner, or Firestore.
  • Collaborate with data architects to design and implement data lakes, data warehouses, and data marts on GCP.
  • Build and maintain data integration workflows using tools like Cloud Composer (Apache Airflow), Cloud Functions, or Dataflow.
  • Utilize GCP DevOps tools for source control, build automation, release management, and infrastructure as code (IaC) using Terraform or Deployment Manager.
  • Implement CI/CD pipelines using Cloud Build to automate the build, test, and deployment of data pipelines and infrastructure changes, ensuring rapid and reliable delivery of data solutions.
  • Utilize big data technologies such as Dataproc (Apache Spark, Hadoop, etc.) to handle large volumes of data and perform complex analytics tasks.
  • Implement real-time data processing solutions using streaming technologies like Pub/Sub and Dataflow (Apache Beam) to enable timely insights and actions.
  • Implement data governance policies and security controls within GCP environments to protect sensitive data and ensure compliance with regulatory requirements such as GDPR, HIPAA, or PCI DSS.
  • Optimize data pipelines and processing workflows for performance, scalability, and cost-effectiveness on GCP, leveraging BigQuery optimizations, autoscaling, and cost management techniques.
  • Monitor, troubleshoot, and optimize the performance of data pipelines and cloud-based systems to ensure high availability, low latency, and scalability.
  • Work closely with cross-functional teams to understand business requirements and translate them into scalable GCP data solutions.
  • Ensure data security and compliance with industry best practices, including IAM roles, VPC configurations, encryption, and access controls.
  • Conduct regular code reviews, provide feedback, and ensure adherence to best practices and standards.
  • Stay up to date with the latest GCP services and cloud technologies to drive innovation within the team.
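The streaming-ingestion responsibilities above (Pub/Sub into BigQuery via Dataflow) typically hinge on a parse-and-normalize step applied to each incoming record. Below is a minimal plain-Python sketch of that kind of transform; the field names (`user_id`, `event_type`, `ts`) are hypothetical, and in a real Dataflow job this logic would sit inside an Apache Beam `DoFn` with invalid records routed to a dead-letter sink:

```python
import json
from datetime import datetime, timezone
from typing import Optional


def normalize_event(raw: str) -> Optional[dict]:
    """Parse one raw JSON event and normalize it for warehouse loading.

    Returns None for records that fail parsing or validation so the
    caller can route them to a dead-letter destination instead of
    failing the whole pipeline.
    """
    try:
        event = json.loads(raw)
    except json.JSONDecodeError:
        return None
    # Hypothetical required fields; a real pipeline would validate
    # against the upstream producer's schema.
    if "user_id" not in event or "ts" not in event:
        return None
    # Normalize the epoch timestamp to UTC ISO-8601, the form
    # BigQuery's TIMESTAMP type ingests cleanly.
    ts = datetime.fromtimestamp(event["ts"], tz=timezone.utc)
    return {
        "user_id": str(event["user_id"]),
        "event_type": event.get("event_type", "unknown"),
        "event_time": ts.isoformat(),
    }
```

Keeping the transform a pure function like this makes it unit-testable outside the Beam runner, which is one common way teams keep streaming pipelines maintainable.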

Job Qualifications:

  • Bachelor’s degree in Computer Science, Engineering, or a related field.
  • 2–3 years of experience as a Cloud Data Engineer or similar role, with a strong focus on GCP cloud services.
  • Proficiency in designing and building scalable data architectures using GCP data services such as BigQuery, Cloud Storage, Dataflow, Cloud SQL, and Dataproc.
  • Strong experience with ETL/ELT frameworks and tools like Cloud Composer (Apache Airflow), Cloud Data Fusion, or Dataflow (Apache Beam).
  • Expertise in SQL, Python, or other programming languages used in data engineering.
  • Hands-on experience with data lakes, data warehouses, and data pipeline orchestration on GCP.
  • Familiarity with CI/CD pipelines and infrastructure-as-code (IaC) tools like Terraform, Deployment Manager, or Cloud Build.
  • Understanding of data governance, security, and compliance standards, including encryption, IAM policies, and role-based access control (RBAC).
  • Experience in data modeling, data normalization, and performance optimization for cloud-based data solutions.
  • GCP Certifications such as Google Cloud Professional Data Engineer or Google Cloud Professional Cloud Architect.
  • Experience with Apache Spark or Databricks on GCP.
  • Knowledge of machine learning workflows and working with data science teams is a plus.
  • Familiarity with DevOps practices and tools such as Docker and Kubernetes.
  • Excellent communication and collaboration skills, with the ability to work effectively in a team environment.
  • Strong problem-solving and troubleshooting skills, with a proactive approach to identifying and resolving issues.
  • Ability to adapt to a fast-paced, agile environment and manage multiple priorities effectively.
