GCP Data Engineer

2 - 5 years

7 - 8 Lacs

Posted: 7 hours ago | Platform: Naukri


Work Mode

Work from Office

Job Type

Full Time

Job Description





The Opportunity: Full Stack Data Engineer

We're seeking a highly skilled and experienced Full Stack Data Engineer to play a pivotal role in the development and maintenance of our Enterprise Data Platform. In this role, you'll be responsible for designing, building, and optimizing scalable data pipelines within our Google Cloud Platform (GCP) environment. You'll work with GCP-native technologies such as BigQuery, Dataflow, and Pub/Sub, ensuring data governance, security, and optimal performance. This is a fantastic opportunity to leverage your full-stack expertise, collaborate with talented teams, and establish best practices for data engineering at Ford.
What You'll Bring (Qualifications)


  • Bachelor's degree in Computer Science, Information Technology, Information Systems, Data Analytics, or a related field (or equivalent combination of education and experience).

  • 3-5 years of experience in Data Engineering or Software Engineering, with at least 2 years of hands-on experience building and deploying cloud-based data platforms (GCP preferred).

  • Strong proficiency in SQL, Java, and Python, with practical experience in designing and deploying cloud-based data pipelines using GCP services like BigQuery, Dataflow, and DataProc.

  • Solid understanding of Service-Oriented Architecture (SOA) and microservices, and their application within a cloud data platform.

  • Experience with relational databases (e.g., PostgreSQL, MySQL), NoSQL databases, and columnar databases (e.g., BigQuery).

  • Knowledge of data governance frameworks, data encryption, and data masking techniques in cloud environments.

  • Familiarity with CI/CD pipelines, Infrastructure as Code (IaC) tools like Terraform and Tekton, and other automation frameworks.

  • Excellent analytical and problem-solving skills, with the ability to troubleshoot complex data platform and microservices issues.

  • Experience in monitoring and optimizing cost and compute resources for processes in GCP technologies (e.g., BigQuery, Dataflow, Cloud Run, DataProc).

  • A passion for data, innovation, and continuous learning.
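At the core of the pipeline work described above is keyed aggregation of raw events before loading them into a warehouse such as BigQuery. As a minimal, hypothetical sketch (plain Python only, no GCP dependencies — the function name `aggregate_events` and the event shape are illustrative assumptions, standing in for what a Dataflow/Beam pipeline would do with `GroupByKey`/`CombinePerKey`):

```python
from collections import defaultdict


def aggregate_events(events):
    """Group raw events by user_id and sum their values --
    a plain-Python stand-in for the keyed aggregation step
    of a streaming or batch data pipeline."""
    totals = defaultdict(float)
    for event in events:
        totals[event["user_id"]] += event["value"]
    return dict(totals)


# Example: three raw events collapse into per-user totals
events = [
    {"user_id": "a", "value": 2.0},
    {"user_id": "b", "value": 1.5},
    {"user_id": "a", "value": 3.0},
]
print(aggregate_events(events))  # {'a': 5.0, 'b': 1.5}
```

In a production GCP pipeline this step would run as a distributed transform, with the results written to a BigQuery table rather than returned in memory.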














What You'll Do (Responsibilities)

  • Data Pipeline Architect & Builder
  • End-to-End Integration Expert
  • GCP Data Solutions Leader
  • Data Governance & Security Champion
  • Data Workflow Orchestrator
  • Performance Optimization Driver
  • Collaborative Innovator
  • Automation & Reliability Advocate
  • Effective Communicator
  • Continuous Learner
  • Business Impact Translator
  • Documentation & Knowledge Sharer
Ford

Automotive

Dearborn
