Data Engineer (AI/ML | Python | GCP) - Experience Level: 5+ years - WFH


Posted: 3 days ago | Platform: LinkedIn


Work Mode: Remote

Job Type: Full Time

Job Description

🔥 Senior Data Engineer (AI/ML | Python | GCP) - WFH

THIS IS A FULLY REMOTE WORKING OPPORTUNITY.

WE NEED IMMEDIATE JOINERS or someone who can join in less than 1 month.

If you are interested and fulfill the criteria mentioned below, please share the following information:
1. Email ID
2. Years of relevant experience
3. Updated resume
4. CCTC/ECTC
5. Notice period

Key Responsibilities

1. Design & Develop Scalable Data Pipelines

  • Build high-throughput, low-latency pipelines to process massive datasets efficiently.
  • Ensure pipelines can handle 5–10 TB of structured and unstructured data at scale.

2. Data Filtering & Curation

  • Optimize data ingestion and transformation for downstream models.
  • Implement intelligent scanning mechanisms to select relevant data for AI/ML models.

3. AI/ML Integration

  • Leverage GenAI models (e.g., Gemini Pro, OpenAI GPT) under the Vertex AI umbrella.
  • Build and fine-tune custom models when LLMs are cost-prohibitive or less effective.
  • Design multi-layered model pipelines balancing accuracy, cost, and performance.

4. Cloud Infrastructure

  • Architect scalable solutions primarily on GCP (Vertex AI, BigQuery, GKE, Cloud Run, PostgreSQL, Dataflow, Pub/Sub); a minimal pipeline sketch follows this list.
  • Azure knowledge is a plus but not mandatory.
  • Build and maintain robust monitoring systems.

5. Performance Optimization

  • Ensure real-time, low-latency processing for high-volume workloads.
  • Optimize GPU/CPU-intensive environments to manage cost and efficiency.

6. Collaboration & Ownership

  • Work closely with data scientists, ML engineers, and solution architects.
  • Take end-to-end ownership of assigned tasks and deliverables, working independently where needed.
  • Collaborate with global teams and manage 3–4 hours of timezone overlap when required.
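
As a rough illustration of the pipeline work described above, here is a minimal Apache Beam sketch of a streaming Dataflow job that reads JSON events from Pub/Sub and appends them to BigQuery. This is not part of the posting; the project, subscription, bucket, and table names are hypothetical placeholders.

```python
# Minimal sketch: streaming Beam/Dataflow pipeline from Pub/Sub to BigQuery.
# "my-project", "raw-events", "my-bucket", and "analytics.events" are hypothetical.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions


def parse_event(message: bytes) -> dict:
    """Decode a Pub/Sub payload into a row matching the BigQuery table schema."""
    record = json.loads(message.decode("utf-8"))
    return {"event_id": record["id"], "payload": json.dumps(record)}


def run() -> None:
    options = PipelineOptions(
        runner="DataflowRunner",             # use "DirectRunner" for local testing
        project="my-project",
        region="us-central1",
        temp_location="gs://my-bucket/tmp",
    )
    options.view_as(StandardOptions).streaming = True

    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                subscription="projects/my-project/subscriptions/raw-events")
            | "ParseJson" >> beam.Map(parse_event)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "my-project:analytics.events",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER)
        )


if __name__ == "__main__":
    run()
```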

Required Skills & Qualifications

1. Programming

  • Expert-level Python (mandatory).

2. Data Engineering

  • Hands-on experience with streaming pipelines and large-scale data ingestion frameworks.

3. AI/ML Expertise

  • Experience integrating and fine-tuning LLMs and GenAI models (see the Vertex AI sketch after this list).
  • Comfortable working with custom model development where LLMs cannot be applied.
  • Familiarity with OpenAI, Gemini, Vertex AI, or equivalent platforms.

4. Cloud

  • GCP (mandatory): experience with Vertex AI, BigQuery, GKE, Cloud Run, PostgreSQL, Dataflow, Pub/Sub, etc.
  • Azure experience is a plus.

5. Scalability & Low Latency

  • Proven ability to design and implement high-performance, distributed data solutions.

6. High-Volume Data Processing

  • Experience processing massive datasets efficiently within strict timeframes.
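
For the LLM integration points above, a minimal sketch of calling a managed Gemini model through the Vertex AI SDK could look like the following. The project ID and prompt are hypothetical, and the exact model name depends on what is available in your region.

```python
# Minimal sketch, assuming the google-cloud-aiplatform SDK and a GCP project
# ("my-project", hypothetical) with the Vertex AI API enabled.
import vertexai
from vertexai.generative_models import GenerativeModel

# Initialise the SDK against a specific project and region.
vertexai.init(project="my-project", location="us-central1")

# Use a managed Gemini model for a simple data-curation style classification step.
model = GenerativeModel("gemini-1.5-pro")

response = model.generate_content(
    "Classify this record as RELEVANT or IRRELEVANT for model training:\n"
    '{"event": "user_signup", "country": "IN", "plan": "free"}'
)
print(response.text)
```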

Good to Have

  • Experience with MLOps and deploying models at scale.
  • Knowledge of data governance and compliance when working with sensitive data.
  • Hands-on exposure to event-driven architectures and microservices (see the Pub/Sub sketch below).
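
As a small illustration of the event-driven pattern mentioned above, publishing a message to a Pub/Sub topic with the official client might look like this; the project and topic names are hypothetical.

```python
# Minimal sketch using the google-cloud-pubsub client; "my-project" and
# "curation-events" are hypothetical names, not part of this posting.
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-project", "curation-events")

# Publish a small JSON payload; downstream microservices subscribe to this topic.
future = publisher.publish(topic_path, b'{"dataset": "curated_batch", "status": "ready"}')
print("Published message id:", future.result())
```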
