AI Engineer (Conversational Analytics & GenAI Systems)

Experience: 3–5 years
Posted: 1 month ago | Platform: Naukri
Work Mode: Hybrid
Job Type: Full Time

Job Description

About the Product

You will work on IRISS's conversational analytics platform, a GenAI-powered chatbot that transforms natural-language queries into validated, compliant, and tenant-aware SQL and visual insights. The platform lets users ask business questions like "Show me last month's motor temperature anomalies in Plant 3" and get immediate, accurate dashboards and reports, generated safely through AI-driven data pipelines.

Our AI stack:
- Interprets user intent using LLMs.
- Generates validated, policy-compliant SQL.
- Executes and visualizes data with context and feedback loops.
- Powers a Retrieval-Augmented Generation (RAG) framework integrated with existing IoT and analytics microservices.
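The stack above can be pictured as a short pipeline: an LLM turns the question into SQL, a policy layer validates it, and only then is it executed against tenant data. Below is a minimal, hedged sketch of that flow; the LLM step is stubbed with a canned query, and the function names and the tenant-scoping rule are illustrative assumptions, not IRISS's actual implementation.

```python
import sqlite3

def generate_sql(question: str, tenant_id: int) -> str:
    # Stand-in for the LLM step (Bedrock, OpenAI, etc. in production):
    # returns a canned query so the sketch is runnable offline.
    return (
        "SELECT ts, temperature FROM motor_readings "
        f"WHERE tenant_id = {tenant_id} AND temperature > 90"
    )

def validate_sql(sql: str, tenant_id: int) -> str:
    # Policy layer: read-only, single-statement, tenant-scoped queries only.
    if not sql.lstrip().upper().startswith("SELECT"):
        raise ValueError("only SELECT statements are allowed")
    if ";" in sql.rstrip().rstrip(";"):
        raise ValueError("multiple statements are not allowed")
    if f"tenant_id = {tenant_id}" not in sql:
        raise ValueError("query must be scoped to the caller's tenant")
    return sql

def answer(question: str, tenant_id: int, conn: sqlite3.Connection) -> list:
    # Interpret -> validate -> execute; results feed the visualization layer.
    sql = validate_sql(generate_sql(question, tenant_id), tenant_id)
    return conn.execute(sql).fetchall()
```

A real deployment would use parameterized queries and a full SQL parser rather than string checks, but the shape of the interpret/validate/execute loop is the same.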

Role Overview

You will design, develop, and maintain the AI chatbot platform that serves as the intelligence layer for our SaaS ecosystem. This includes:

- Architecting end-to-end conversational pipelines, from LLM prompt design to data retrieval
- Integrating vector-based search systems and RAG pipelines into our service mesh
- Leveraging AWS AI/ML and orchestration services such as Bedrock, Kendra, OpenSearch, Lambda, ECS, and S3 to build scalable and secure infrastructure
- Partnering with full-stack and front-end engineers to embed AI features directly into user workflows

Core Stack & Technologies

AI/ML & Data Intelligence

- Python 3.10+ (FastAPI, LangChain, Haystack, or equivalent)
- LLMs: OpenAI, Anthropic, Hugging Face, or open-source models (LLaMA, Mistral, Falcon)
- RAG Systems: FAISS, Pinecone, OpenSearch Vector Store, or ChromaDB
- Prompt Orchestration: LangChain, Semantic Kernel, or internal tooling
- Data Validation & Safety: SQL sanitization layers and policy enforcement modules
- Visualization Layer: Chart.js or D3.js integration for generated insights

Cloud & Infrastructure

- AWS Bedrock, Kendra, OpenSearch, Lambda, S3, CloudWatch, ECS, and EC2
- API Gateway for AI microservices
- Redis or DynamoDB for caching and conversation state
- OpenTelemetry for observability
- CI/CD using GitHub Actions, AWS CDK, and Docker-based microservices

Front-End & Integration

- Works closely with Angular 18+ applications and .NET/Python backend microservices
- Exposes APIs to the Full-Stack and Front-End teams for seamless user interactions
- Implements real-time feedback mechanisms for model evaluation and tuning

Key Responsibilities

- Architect, develop, and maintain the GenAI chatbot platform from the ground up
- Build multi-turn conversation flows and contextual memory for data queries
- Implement RAG pipelines using vector databases and curated embeddings
- Integrate open-source and commercial LLMs through APIs or local deployment
- Create safety and compliance modules that validate SQL and policy rules before execution
- Collaborate with backend engineers to design AI microservices that scale horizontally
- Deploy, monitor, and optimize models using AWS Bedrock, Kendra, and OpenSearch
- Maintain observability and feedback loops for improving model accuracy and reliability
- Partner with front-end teams to deliver chat-first analytics interfaces
- Contribute to documentation, testing, and architectural decision records for AI systems
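The multi-turn conversation flows mentioned above typically keep a bounded window of recent turns per conversation and fold them into the next prompt. A minimal sketch, assuming an in-process store (production systems would back this with Redis or DynamoDB, as listed in the infrastructure stack); the class and method names are illustrative, not an existing API:

```python
from collections import defaultdict, deque

class ConversationMemory:
    """Bounded per-conversation history for contextual prompting."""

    def __init__(self, max_turns: int = 5):
        # deque(maxlen=...) silently drops the oldest turn once full.
        self.turns = defaultdict(lambda: deque(maxlen=max_turns))

    def add(self, conversation_id: str, role: str, text: str) -> None:
        self.turns[conversation_id].append((role, text))

    def build_prompt(self, conversation_id: str, question: str) -> str:
        # Fold retained history plus the new question into one prompt string.
        history = "\n".join(f"{r}: {t}" for r, t in self.turns[conversation_id])
        return f"{history}\nuser: {question}" if history else f"user: {question}"
```

The fixed window is a deliberate simplification: it bounds token cost per request, at the price of forgetting older context that a summarization step would otherwise preserve.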

Qualifications

Must-Have

- Bachelor's or Master's degree in Computer Science, Data Science, or a related field
- Minimum 3 years of experience developing and deploying AI-powered applications or chatbots
- Strong Python expertise (FastAPI, Flask, or Django for microservices)
- Experience with LLM integration (OpenAI, Bedrock, Hugging Face, or local models)
- Hands-on experience with the AWS ecosystem, including Bedrock, Kendra, OpenSearch, ECS, Lambda, and CloudWatch
- Deep understanding of RAG architecture, vector databases, and embeddings-based retrieval
- Knowledge of prompt design, model orchestration, and AI safety validation
- Familiarity with SQL and multi-tenant data systems
- Experience with Docker, Git-based CI/CD, and microservice architectures

Nice-to-Have

- Experience fine-tuning or hosting open-source LLMs (LLaMA, Mistral, Falcon)
- Understanding of LangChain Agents or Semantic Kernel pipelines
- Familiarity with Angular and .NET ecosystems for end-to-end integration
- Exposure to observability frameworks such as OpenTelemetry, Prometheus, or Grafana
- Knowledge of enterprise data governance and AI compliance frameworks
- Contributions to open-source AI projects or custom LLM integrations

What You Will Build

- A conversational analytics chatbot capable of generating real-time, compliant SQL queries
- RAG pipelines that fetch and embed domain knowledge across tenants
- Context-aware AI microservices integrated with IRISS's monitoring and reporting systems
- Evaluation dashboards for prompt performance, latency, and query accuracy
- Continuous learning and feedback loops to improve the GenAI system over time
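At the core of the RAG pipelines described above is embeddings-based retrieval: rank stored documents by vector similarity to the query and pass the top matches to the LLM as context. The sketch below uses a toy bag-of-words "embedding" and cosine similarity so it runs offline; in production a model-produced vector and a store such as FAISS or OpenSearch Vector Store would replace both, and the function names here are assumptions for illustration.

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Toy bag-of-words vector; a real system would call an embedding model.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse term-count vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    # Rank all documents by similarity to the query; return the top k.
    q = embed(query)
    ranked = sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]
```

Tenant awareness, as the bullet notes, would mean partitioning the document set (or filtering retrieval results) by tenant before ranking.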

Development Environment

- Python 3.10+, FastAPI, LangChain
- AWS Bedrock, OpenSearch, Kendra, Lambda, ECS
- Angular 18+ for embedded UIs
- Node.js 16+, Yarn, VS Code
- GitHub Actions and AWS CDK for CI/CD
- Dockerized microservices architecture

Company: Iriss Inc
Industry: Cybersecurity
Location: Tech City
