GenAI Engineering - MLSE/Lead

10 years

Posted: 1 week ago | Platform: LinkedIn

Work Mode

On-site

Job Type

Full Time

Job Description

Qualification

We are looking for a Senior Data Engineer who combines deep data engineering expertise with hands-on experience in Generative AI and Agentic AI system development on AWS Cloud. This role is ideal for someone who can design, build, and deploy production-grade GenAI workflows, integrating LLMs, vector databases, and orchestration frameworks with the same rigor as a traditional data system.

Key Responsibilities:

  • Design and maintain data pipelines and AI data infrastructure on AWS (Glue, Lambda, S3, Redshift, Step Functions, Athena, etc.).
  • Develop and deploy LLM-based applications and Agentic AI workflows using frameworks like LangChain, LlamaIndex, or AutoGen.
  • Build RAG (Retrieval-Augmented Generation) pipelines using AWS services (S3 + Bedrock + SageMaker + OpenSearch/Vector DB).
  • Implement agentic reasoning, tool calling, and orchestration for multi-agent workflows.
  • Containerize and deploy AI services using Docker, ECS, or EKS, ensuring scalability, cost-efficiency, and observability.
  • Integrate AWS Bedrock, SageMaker, or OpenAI APIs with internal data systems and applications.
  • Set up monitoring, tracing, and model observability using AWS CloudWatch, X-Ray, or third-party LLMOps tools.
  • Collaborate with ML engineers, data scientists, and architects to take GenAI prototypes to production-ready deployments.

Required Skills & Experience:

  • 6–10 years of total experience in Data Engineering with a strong AWS background.
  • Proficiency in Python and SQL with hands-on work in PySpark, Airflow, or Glue.
  • Hands-on experience with GenAI solutions in real-world environments (not just demos or PoCs).
  • Working knowledge of Agentic AI frameworks (LangChain, LlamaIndex, AutoGen, or similar).
  • Experience with RAG architecture, vector databases (OpenSearch, Pinecone, FAISS, Chroma, or Milvus), and embedding models.
  • Understanding of LLMOps, prompt lifecycle management, and performance monitoring.
  • Practical experience deploying workloads on AWS ECS/EKS, setting up CI/CD pipelines, and managing runtime performance.
  • Familiarity with IAM, VPC, Secrets Manager, and security best practices in cloud environments.

Nice To Have:

  • Experience with AWS Bedrock for model hosting or SageMaker for fine-tuning and evaluation.
  • Exposure to multi-agent architectures and autonomous task orchestration.
  • Contributions to open-source GenAI projects or internal AI platform initiatives.

Experience

6 to 12 years

Job Reference Number

13295
