7.0 - 11.0 years
0 Lacs
Karnataka
On-site
You are a seasoned AI Solution Architect with over 7 years of experience, including a strong background in Generative AI and Large Language Models (LLMs) such as OpenAI, Claude, and Gemini. Your role will involve architecting end-to-end LLM solutions for chatbot applications, semantic search, summarization, and domain-specific assistants. You will design modular, scalable LLM workflows, leverage Databricks Unity Catalog for centralized governance, and collaborate with data engineers to optimize model performance and deployment.

**Key Responsibilities:**
- Architect end-to-end LLM solutions for various applications.
- Design modular, scalable LLM workflows.
- Leverage Databricks Unity Catalog for centralized governance.
- Collaborate with data engineers on dataset ingestion and optimization.
- Integrate feedback-loop systems for continuous refinement.
- Optimize model performance, latency, and cost.
- Oversee secure deployment of models in production.
- Guide teams on data quality and responsible AI practices.

**Qualifications Required:**
- 7+ years in AI/ML solution architecture, with 2+ years focused on LLMs and Generative AI.
- Strong experience with OpenAI, Claude, and Gemini, and with integrating LLM APIs.
- Proficiency in Databricks, including Unity Catalog, Delta Lake, and MLflow.
- Deep understanding of data governance and metadata management.
- Hands-on experience with chatbot frameworks and LLM orchestration tools.
- Strong Python development skills.
- Familiarity with feedback loops and continuous-learning patterns.
- Experience deploying models in cloud-native and hybrid environments.

As an AI Solution Architect at NTT DATA, you will be part of a global innovator of business and technology services committed to helping clients innovate, optimize, and transform for long-term success. With a diverse team across 50 countries and a focus on digital and AI infrastructure, NTT DATA is a trusted partner for organizations worldwide. As part of the NTT Group, a leader in R&D investment, you will contribute to shaping the digital future confidently and sustainably.
Posted 23 hours ago
7.0 - 11.0 years
0 Lacs
Karnataka
On-site
NTT DATA strives to hire exceptional, innovative, and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking an AI Solution Architect to join our team in Bangalore, Karnataka (IN-KA), India (IN).

Role: AI Solution Architect
Experience: 7+ years
Notice Period: 30-60 days

Project Overview:
We are seeking a seasoned AI Architect with strong experience in Generative AI and Large Language Models (LLMs), including OpenAI, Claude, and Gemini, to lead the design, orchestration, and deployment of intelligent solutions across complex use cases. You will architect conversational systems, feedback loops, and LLM pipelines with robust data governance, leveraging the Databricks platform and Unity Catalog for enterprise-scale scalability, lineage, and compliance.

Role Scope / Deliverables:
- Architect end-to-end LLM solutions for chatbot applications, semantic search, summarization, and domain-specific assistants.
- Design modular, scalable LLM workflows, including prompt orchestration, retrieval-augmented generation (RAG), vector store integration, and real-time inference pipelines.
- Leverage Databricks Unity Catalog for:
  - Centralized governance of AI training and inference datasets
  - Managing metadata, lineage, access controls, and audit trails
  - Cataloging feature tables, vector embeddings, and model artifacts
- Collaborate with data engineers and platform teams to ingest, transform, and catalog datasets used for fine-tuning and prompt optimization.
- Integrate feedback-loop systems (e.g., user input, signal-driven reinforcement, RLHF) to continuously refine LLM performance.
- Optimize model performance, latency, and cost using a combination of fine-tuning, prompt engineering, model selection, and token-usage management.
- Oversee secure deployment of models in production, including access control, auditability, and compliance alignment via Unity Catalog.
- Guide teams on data quality, discoverability, and responsible AI practices in LLM usage.

Key Skills:
- 7+ years in AI/ML solution architecture, with 2+ years focused on LLMs and Generative AI.
- Strong experience working with OpenAI (GPT-4/o), Claude, and Gemini, and integrating LLM APIs into enterprise systems.
- Proficiency in Databricks, including Unity Catalog, Delta Lake, MLflow, and cluster orchestration.
- Deep understanding of data governance, metadata management, and data lineage in large-scale environments.
- Hands-on experience with chatbot frameworks, LLM orchestration tools (LangChain, LlamaIndex), and vector databases (e.g., FAISS, Weaviate, Pinecone).
- Strong Python development skills, including notebooks, REST APIs, and LLM orchestration pipelines.
- Ability to map business problems to AI solutions, with strong architectural thinking and stakeholder communication.
- Familiarity with feedback loops and continuous-learning patterns (e.g., RLHF, user scoring, prompt iteration).
- Experience deploying models in cloud-native and hybrid environments (AWS, Azure, or GCP).

Preferred Qualifications:
- Experience fine-tuning or optimizing open-source LLMs (e.g., LLaMA, Mistral) with tools like LoRA/QLoRA.
- Knowledge of compliance requirements (HIPAA, GDPR, SOC 2) in AI systems.
- Prior work building secure, governed LLM applications in highly regulated industries.
- Background in data cataloging, enterprise metadata management, or ML model registries.

About NTT DATA:
NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize, and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, and the development, implementation, and management of applications, infrastructure, and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com
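The RAG workflow this role centers on can be sketched in miniature. The snippet below is an illustrative toy, not NTT DATA's stack: it uses bag-of-words vectors and cosine similarity in place of a real embedding model and vector database (FAISS, Weaviate, Pinecone), and stops at prompt construction rather than calling an LLM; all function names are assumptions for the sketch.

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Toy "embedding": bag-of-words term counts. A real pipeline would
    # call an embedding model and store vectors in a vector database.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse term-count vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    # Retrieval step: rank documents by similarity to the query, keep top k.
    q = embed(query)
    ranked = sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

def build_prompt(query: str, docs: list[str]) -> str:
    # Augmentation step: prepend retrieved context before the LLM call.
    context = "\n".join(f"- {d}" for d in retrieve(query, docs))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

docs = [
    "Unity Catalog provides centralized governance for data and AI assets.",
    "Delta Lake adds ACID transactions to data lakes.",
    "MLflow tracks experiments, models, and deployments.",
]
prompt = build_prompt("What does Unity Catalog govern?", docs)
```

The same retrieve-augment-generate shape holds when the toy pieces are swapped for production ones; orchestration frameworks like LangChain or LlamaIndex mainly standardize these steps.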
Posted 3 weeks ago
3.0 - 7.0 years
0 Lacs
Karnataka
On-site
You will be joining Birlasoft, a leading organization at the forefront of merging domain expertise, enterprise solutions, and digital technologies to redefine business outcomes. Emphasizing a consultative, design-thinking approach, we empower customers to operate their businesses with efficiency and innovation. As part of the multibillion-dollar CKA Birla Group, Birlasoft's team of over 12,500 professionals upholds the Group's distinguished 162-year legacy. We prioritize Diversity, Equity, and Inclusion (DEI) practices and Corporate Social Responsibility (CSR) initiatives, building not only businesses but also inclusive and sustainable communities. Come join us in shaping a future where technology seamlessly aligns with purpose.

We are currently looking for a skilled and proactive StreamSets or Denodo Platform Administrator to manage and enhance our enterprise data engineering and analytics platforms. This position requires hands-on expertise in overseeing large-scale Snowflake data warehouses and StreamSets data pipelines, with a focus on robust troubleshooting, automation, and monitoring. The ideal candidate will ensure platform reliability, performance, security, and compliance while collaborating closely with data engineers, DevOps, and support teams. The role is based in Pune, Hyderabad, Noida, or Bengaluru and requires a minimum of 5 years of experience.

Key Requirements:
- Bachelor's or Master's degree in Computer Science, IT, or a related field (B.Tech./MCA preferred).
- Minimum of 3 years of hands-on experience in Snowflake administration.
- 5+ years of experience managing StreamSets pipelines in enterprise-grade environments.
- Strong familiarity with AWS services, particularly S3, IAM, Lambda, and EC2.
- Working knowledge of ServiceNow, Jira, Git, Grafana, and Denodo.
- Understanding of data modeling, ETL/ELT best practices, and modern data platform architectures.
- Experience with DataOps, DevSecOps, and cloud-native deployment principles is advantageous.
- Certification in Snowflake or AWS is highly desirable.

If you possess the required qualifications and are passionate about leveraging your platform-administration expertise to drive impactful business outcomes, we invite you to apply and join our dynamic team at Birlasoft.
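The monitoring work this role describes often reduces to freshness checks against an SLA. A minimal stdlib sketch follows; the function, pipeline names, and SLA are illustrative assumptions, not StreamSets or Snowflake APIs (a real setup would pull timestamps from the StreamSets REST API or Snowflake query history and raise Grafana/ServiceNow alerts).

```python
from datetime import datetime, timedelta, timezone

def stale_pipelines(last_success: dict[str, datetime],
                    sla: timedelta,
                    now: datetime) -> list[str]:
    # Flag any pipeline whose last successful run is older than the SLA.
    return sorted(name for name, ts in last_success.items()
                  if now - ts > sla)

now = datetime(2024, 1, 1, 12, 0, tzinfo=timezone.utc)
runs = {
    "orders_ingest": now - timedelta(minutes=10),  # healthy
    "events_ingest": now - timedelta(hours=3),     # breaches the 1h SLA
}
overdue = stale_pipelines(runs, sla=timedelta(hours=1), now=now)
# overdue -> ["events_ingest"]
```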
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
Karnataka
On-site
As a Senior Python Engineer at our company, you will apply deep expertise in data engineering and API development to drive technical excellence and autonomy. Your primary responsibility will be leading the development of scalable backend systems and data infrastructure that power AI-driven applications across our platform.

Responsibilities:
- Design, develop, and maintain high-performance APIs and microservices using Python frameworks such as FastAPI and Flask.
- Build and optimize scalable data pipelines, ETL/ELT processes, and orchestration frameworks.
- Use AI development tools such as GitHub Copilot, Cursor, or CodeWhisperer to enhance engineering velocity and code quality.
- Architect resilient, modular backend systems integrated with databases such as PostgreSQL, MongoDB, and Elasticsearch.
- Manage workflows and event-driven architectures using tools such as Airflow, Dagster, or Temporal.io.
- Collaborate with cross-functional teams to deliver production-grade systems in cloud environments (AWS/GCP/Azure) with high test coverage, observability, and reliability.

Requirements:
- 5+ years of hands-on experience in Python backend/API development.
- Strong background in data engineering.
- Proficiency in AI-enhanced development environments such as Copilot, Cursor, or equivalent tools.
- Solid experience with Elasticsearch, PostgreSQL, and scalable data solutions.
- Familiarity with Docker, CI/CD, and cloud-native deployment practices.
- Ability to take ownership of features from idea to production.

Nice to have:
- Experience with distributed workflow engines such as Temporal.io.
- Background in AI/ML systems (PyTorch or TensorFlow).
- Familiarity with LangChain, LLMs, and vector search tools (e.g., FAISS, Pinecone).
- Exposure to weak supervision, semantic search, or agentic AI workflows.

Join us to build infrastructure for cutting-edge AI products and work in a collaborative, high-caliber engineering environment.
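The ETL/ELT pipeline work this role describes follows a common extract-transform-load shape. The stdlib sketch below illustrates that pattern only; the record fields, function names, and dict-as-sink are assumptions for the example, with a dict standing in for a real store such as PostgreSQL or Elasticsearch.

```python
from dataclasses import dataclass

@dataclass
class Event:
    user_id: str
    amount: float

def extract(raw_rows: list[dict]) -> list[Event]:
    # Extract: parse raw records into typed objects, skipping bad rows.
    events = []
    for row in raw_rows:
        try:
            events.append(Event(str(row["user_id"]), float(row["amount"])))
        except (KeyError, ValueError):
            continue  # a production pipeline would log or dead-letter these
    return events

def transform(events: list[Event]) -> dict[str, float]:
    # Transform: aggregate spend per user.
    totals: dict[str, float] = {}
    for e in events:
        totals[e.user_id] = totals.get(e.user_id, 0.0) + e.amount
    return totals

def load(totals: dict[str, float], sink: dict) -> None:
    # Load: upsert aggregates into the target store.
    sink.update(totals)

raw = [{"user_id": "a", "amount": "3.5"},
       {"user_id": "a", "amount": 1.5},
       {"user_id": "b", "amount": "oops"}]  # malformed row is skipped
store: dict[str, float] = {}
load(transform(extract(raw)), store)
# store -> {"a": 5.0}
```

Orchestrators like Airflow or Dagster mainly schedule and monitor stages of this shape; the per-stage logic stays plain Python.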
Posted 1 month ago
6.0 - 10.0 years
0 Lacs
Hyderabad, Telangana
On-site
You should have 6-10+ years of experience in AI/ML development, including at least 3 years of hands-on experience with Generative AI, RAG frameworks, and Agentic AI systems.

Responsibilities:
- Design, develop, and optimize RAG pipelines using frameworks such as LangChain, LlamaIndex, or custom-built stacks.
- Implement Agentic AI architectures involving task-based agents, stateful memory, planning-execution workflows, and tool augmentation.
- Perform model fine-tuning, embedding generation, and evaluation of LLM outputs, incorporating human and automated feedback loops.
- Build and enforce guardrails to ensure safe, compliant, and robust model behavior, including prompt validation, output moderation, and access controls.
- Collaborate with cross-functional teams to deploy solutions in cloud-native environments such as Azure OpenAI, AWS Bedrock, or Google Vertex AI.
- Contribute to system observability via dashboards and logging, and support post-deployment model monitoring and optimization.

Required Skills:
- Proven production experience with RAG frameworks such as LangChain, LlamaIndex, or custom-built solutions.
- Solid understanding of Agentic AI design patterns.
- Strong expertise in LLM fine-tuning, vector embeddings, evaluation strategies, and feedback integration.
- Experience implementing AI guardrails.
- Proficiency in Python, LLM APIs (OpenAI, Anthropic, Cohere, etc.), and vector database integration.
- Familiarity with CI/CD pipelines, API integrations, and cloud-native deployment patterns is a plus.

Preferred Qualifications:
- Experience on AI projects in regulated environments (banking domain).
- Hands-on experience with cloud AI platforms such as Azure OpenAI, AWS Bedrock, or Google Vertex AI.
- Knowledge of prompt engineering, RLHF, and LLM observability frameworks.
- Experience building or leveraging internal LLM evaluation harnesses, agent orchestration layers, or compliance dashboards.
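Guardrails of the kind this listing mentions (prompt validation, output moderation) are often layered checks before and after the model call. The sketch below is a toy illustration under assumed names: the blocklist patterns, functions, and the stubbed `fake_llm` are all invented for the example, and production systems would use dedicated moderation APIs or guardrail frameworks instead of two regex lists.

```python
import re

# Assumed example patterns: one prompt-injection shape, one SSN-shaped string.
BLOCKED_INPUT = [r"ignore (all |previous )*instructions"]
BLOCKED_OUTPUT = [r"\b\d{3}-\d{2}-\d{4}\b"]

def validate_prompt(prompt: str) -> bool:
    # Input guardrail: reject prompts matching known injection patterns.
    return not any(re.search(p, prompt, re.IGNORECASE) for p in BLOCKED_INPUT)

def moderate_output(text: str) -> str:
    # Output guardrail: redact disallowed patterns before returning text.
    for p in BLOCKED_OUTPUT:
        text = re.sub(p, "[REDACTED]", text)
    return text

def guarded_call(prompt: str, model) -> str:
    # Wrap the model call with both guardrails.
    if not validate_prompt(prompt):
        return "Request rejected by input guardrail."
    return moderate_output(model(prompt))

fake_llm = lambda p: "Customer SSN is 123-45-6789."  # stand-in for an LLM API
safe = guarded_call("Summarize the account notes.", fake_llm)
# safe -> "Customer SSN is [REDACTED]."
```

Access controls, the third guardrail named in the listing, sit outside this call path (identity checks before the request ever reaches the model).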
Posted 1 month ago