Big Data Developer

3 - 5 years

4 - 8 Lacs

Posted: 7 hours ago | Platform: Naukri


Work Mode: Work from Office

Job Type: Full Time

Job Description

Tech Stack & Platform Overview

The solution is built for scalable, enterprise-grade Agentic AI solutions, enabling efficient data ingestion, processing, and delivery to AI agents. Key components include:

  • Core data ingestion and transformation pipelines (see the sketch below)
  • Data layers supporting RAG workflows and AI agents
  • LLM observability and logging data pipelines
  • AgentOps-related operational data pipelines
  • Integration with agent frameworks such as CrewAI, AutoGen, and LangGraph
  • A modular, cloud-native architecture for rapid iteration and deployment

Mandatory Technical Skills

Python, Google Cloud Platform
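
The posting itself contains no code. Purely as a hypothetical illustration of how Python and Google Cloud Platform might be used for the kind of ingestion and transformation pipeline listed among the key components above, here is a minimal sketch; the bucket, table, and field names are invented placeholders, not details from the role.

```python
"""Minimal, hypothetical sketch: read raw JSONL documents from Cloud Storage,
normalize them, and load them into a BigQuery table that a RAG data layer
could query. All resource and field names are placeholders."""
import json

from google.cloud import bigquery, storage

RAW_BUCKET = "example-raw-docs"            # placeholder bucket name
RAG_TABLE = "example-project.rag.chunks"   # placeholder BigQuery table ID

def transform(record: dict) -> dict:
    """Normalize one raw record into the shape a downstream RAG layer expects."""
    return {
        "doc_id": record["id"],
        "text": record["body"].strip(),
        "source": record.get("source", "unknown"),
    }

def ingest(blob_name: str) -> None:
    """Download one JSONL blob, transform each line, and load the rows."""
    raw = storage.Client().bucket(RAW_BUCKET).blob(blob_name).download_as_text()
    rows = [transform(json.loads(line)) for line in raw.splitlines() if line.strip()]
    load_job = bigquery.Client().load_table_from_json(rows, RAG_TABLE)
    load_job.result()  # block until the load finishes so failures surface here

if __name__ == "__main__":
    ingest("incoming/docs.jsonl")
```

In practice a pipeline like this would usually run under an orchestrator (e.g. Cloud Composer/Airflow, or Dataflow for streaming), but the transform-then-load shape stays the same.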

Nice-to-Have Requirements

  • Hands-on experience designing, building, and deploying data pipelines
  • Experience with ETL/ELT pipelines for structured and unstructured data
  • Familiarity with batch and streaming data processing
  • Understanding of data quality, monitoring, and reliability practices (see the sketch after this list)
  • Exposure to cloud-based data platforms and deployment workflows
  • Basic understanding of how data pipelines support AI/LLM or RAG use cases
  • Ability to collaborate with AI engineers, platform engineers, and product teams
  • Willingness to learn and adapt in a fast-paced environment
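
Again purely as a hypothetical illustration (not taken from the posting): the data quality, monitoring, and reliability practices mentioned above often reduce to a small validation gate that runs before data is loaded. The required fields and the 5% failure threshold below are invented examples.

```python
"""Minimal, hypothetical sketch of a pre-load data-quality gate. The required
fields and the failure threshold are invented examples."""
import logging

REQUIRED_FIELDS = ("doc_id", "text", "source")
MAX_DROP_RATIO = 0.05  # fail the batch if more than 5% of records are malformed

def quality_gate(records: list[dict]) -> list[dict]:
    """Keep well-formed records, log a simple metric, and fail noisy batches."""
    good = [r for r in records if all(r.get(field) for field in REQUIRED_FIELDS)]
    dropped = len(records) - len(good)
    logging.info("quality gate: %d of %d records passed", len(good), len(records))
    if records and dropped / len(records) > MAX_DROP_RATIO:
        raise ValueError(f"{dropped} malformed record(s) exceed the allowed threshold")
    return good

if __name__ == "__main__":
    logging.basicConfig(level=logging.INFO)
    sample = [
        {"doc_id": "1", "text": "hello", "source": "crm"},
        {"doc_id": "2", "text": "world", "source": "wiki"},
    ]
    print(quality_gate(sample))
```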
Qualifications

Years of Experience: 3 to 5 years

We offer
  • Opportunity to work on bleeding-edge projects
  • Work with a highly motivated and dedicated team
  • Competitive salary
  • Flexible schedule
  • Benefits package - medical insurance, sports
  • Corporate social events
  • Professional development opportunities
  • Well-equipped office

Grid Dynamics

Information Technology and Services

Los Altos
