19 Bedrock Jobs

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

10.0 - 14.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

The ideal candidate should have hands-on experience in Collibra Workflows, asset model creation, cataloguing, and assessment creation. Additionally, exposure to AI platforms such as OpenAI and Bedrock, and to integration platforms like SnapLogic and MuleSoft, is required. A deep understanding and practical knowledge of IDEs such as Eclipse or PyCharm, or of any workflow designer, is essential. Experience with one or more of the following languages is preferred: Java, JavaScript, Groovy, Python. A deep understanding and hands-on experience of CI/CD processes and tooling (e.g., GitHub) is also necessary. Candidates should have experience working in DevOps teams using Kubernetes-based tooling and converting a business workflow into an automated set of actions. Proven knowledge of scripting and a willingness to learn new languages are expected. Excellent written and spoken English, strong interpersonal skills, and a collaborative approach to delivery are crucial, as is enthusiasm for great documentation, including high-level designs, low-level designs, coding standards, and knowledge-base articles.

Desirable qualifications include an engineering degree in IT/Computer Science with a minimum of 10 years of experience. Knowledge and experience of the Collibra Data Governance platform, and exposure to AI models, AI governance, data policies, and governance, are advantageous. Basic AWS knowledge is a plus, as is familiarity with integration technologies like MuleSoft and SnapLogic. Excellent Jira skills are desired, including the ability to rapidly write JQL on the fly and to save JQL queries, filters, and views for publishing to fellow engineers and senior stakeholders. Candidates should have experience creating documentation in Confluence and with Agile practices, preferably having been part of an Agile team for several years.

Joining Virtusa means becoming part of a team that values teamwork, quality of life, and professional and personal development. With a global team of 27,000 people, Virtusa aims to provide exciting projects, opportunities, and work with state-of-the-art technologies throughout your career with us. Great minds, great potential, and a dynamic environment await you at Virtusa, where collaboration and excellence are nurtured.

Posted 1 week ago

Apply

5.0 - 10.0 years

20 - 25 Lacs

Pune

Hybrid

Greetings from Intelliswift, an LTTS Company.

Role: Fullstack Developer
Work Location: Pune
Experience: 5 to 8 years

Job Summary: As a Fullstack Developer specializing in generative AI and cloud technologies, you will design, build, and maintain end-to-end applications on AWS. You'll leverage services such as Bedrock, SageMaker, LangChain, and Amplify to integrate AI/ML capabilities, architect scalable infrastructure, and deliver seamless front-end experiences using React. You'll partner with UX/UI designers, ML engineers, DevOps teams, and product stakeholders to take features from concept through production deployment.

Job Description:
- 5+ years of professional experience as a Fullstack Developer building scalable web applications.
- Proficiency in Python and/or JavaScript/TypeScript; strong command of modern frameworks (React, Node.js).
- Hands-on AWS expertise: Bedrock, SageMaker, Amplify, Lambda, API Gateway, DynamoDB/RDS, CloudWatch, IAM, VPC.
- Architect and develop full-stack solutions using React for the front end, Python/Node.js for the back end, and AWS Lambda/API Gateway or containers for serverless services.
- Integrate generative AI capabilities leveraging AWS Bedrock, LangChain retrieval-augmented pipelines, and custom prompt engineering to power intelligent assistants and data-driven insights (see the sketch below).
- Design and manage AWS infrastructure using CDK/CloudFormation for VPCs, IAM policies, S3, DynamoDB/RDS, and ECS/EKS.
- Implement DevOps/MLOps workflows: establish CI/CD pipelines (CodePipeline, CodeBuild, Jenkins), containerization (Docker), automated testing, and rollout strategies.
- Develop interactive UIs in React: translate Figma/Sketch designs into responsive components, integrate with backend APIs, and harness AWS Amplify for accelerated feature delivery.
- Solid understanding of AI/ML concepts, including prompt engineering, generative AI frameworks (LangChain), and model deployment patterns.
- Experience designing and consuming APIs: RESTful and GraphQL.
- DevOps/MLOps skills: CI/CD pipeline creation, containerization (Docker), orchestration (ECS/EKS), infrastructure as code.
- Cloud architecture know-how: security groups, network segmentation, high-availability patterns, cost optimization.
- Excellent problem-solving ability and strong communication skills to collaborate effectively across distributed teams.

Share your updated profile with details at shakambnari.nayak@intelliswift.com.
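
The Bedrock integration this role centers on usually begins with a call like the following. A minimal sketch using boto3's Converse API, assuming the caller has IAM access to Bedrock and that the model ID shown (a Claude 3 Haiku ID) is enabled in the account:

```python
import boto3

# Minimal sketch: invoke a foundation model through Amazon Bedrock's
# Converse API. Region and model ID are assumptions; use whatever model
# is enabled in your account.
bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

response = bedrock.converse(
    modelId="anthropic.claude-3-haiku-20240307-v1:0",  # assumed model ID
    messages=[{"role": "user", "content": [{"text": "Summarize this support ticket: ..."}]}],
    inferenceConfig={"maxTokens": 512, "temperature": 0.2},
)
print(response["output"]["message"]["content"][0]["text"])
```

In a LangChain retrieval-augmented pipeline, the retrieved document chunks would be folded into the user message before a call like this.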

Posted 1 week ago

Apply

2.0 - 6.0 years

0 Lacs

Surat, Gujarat

On-site

At devx, we specialize in helping some of India's most innovative brands unlock growth through AI-powered and cloud-native solutions built in collaboration with AWS. As a rapidly growing consultancy, we are dedicated to solving real-world business challenges with cutting-edge technology.

We are currently seeking a proactive and customer-focused AWS Solutions Architect to join our team. In this position, you will work directly with clients to craft scalable, secure, and cost-effective cloud architectures that tackle significant business problems. Your role will bridge business requirements and technical implementation, establishing you as a trusted advisor to our clients.

Key Responsibilities:
- Engage with clients to understand their business goals and translate them into cloud-based architectural solutions.
- Design, deploy, and document AWS architectures, emphasizing scalability, security, and performance.
- Develop solution blueprints and work closely with engineering teams to ensure successful execution.
- Conduct workshops, presentations, and in-depth technical discussions with client teams.
- Stay informed about the latest AWS offerings and best practices, integrating them into solution designs.
- Collaborate with sales, product, and engineering teams to deliver comprehensive solutions.

Qualifications:
- Minimum of 2 years of experience designing and implementing solutions on AWS.
- Proficiency in fundamental AWS services such as EC2, S3, Lambda, RDS, API Gateway, IAM, and VPC.
- Proficiency in fundamental AI/ML and data services such as Bedrock, SageMaker, Glue, Athena, and Kinesis.
- Proficiency in fundamental DevOps services such as ECS, EKS, CI/CD pipelines, Fargate, and Lambda.
- Excellent written and verbal communication skills in English.
- Comfortable in client-facing roles, with the ability to lead technical dialogues and establish credibility with stakeholders.
- Ability to balance technical detail with business context, effectively communicating value to decision-makers.

Location: Surat, Gujarat. Note: this is an on-site role in Surat, Gujarat; please apply only if you are willing to relocate.

Posted 1 week ago

Apply

4.0 - 9.0 years

20 - 35 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

4+ years of experience in Java development with Spring Boot. Experience integrating AI/ML models into backend systems. Must have experience with at least one GenAI tool or ML framework, such as Bedrock, OpenAI, TensorFlow, AI21, Anthropic, Cohere, Stability, or scikit-learn. Strong understanding of RESTful API design and microservices. Familiarity with AI/ML tools and frameworks (e.g., Python, TensorFlow, scikit-learn). Experience with cloud platforms (AWS, GCP, or Azure). Knowledge of containerization (Docker, Kubernetes) and event-driven architectures.

Preferred Qualifications: Experience with GenAI platforms (e.g., AWS Bedrock, OpenAI). Understanding of MLOps practices and model lifecycle management. Bachelor's or Master's degree in Computer Science, Engineering, or a related field.

Posted 1 week ago

Apply

3.0 - 6.0 years

0 - 3 Lacs

Bengaluru, Mumbai (All Areas)

Work from Office

Role & responsibilities:
- Implement and manage AIOps platforms for intelligent monitoring, alerting, anomaly detection, and root cause analysis (RCA).
- End-to-end knowledge of vLLM model hosting and inferencing (see the sketch below).
- Advanced knowledge of public cloud platforms such as AWS and Azure.
- Build and maintain machine learning pipelines and models for predictive maintenance, anomaly detection, and noise reduction.
- Experience in production support and real-time issue handling.
- Design dashboards and visualizations to provide operational insights to stakeholders.
- Working knowledge of Bedrock, SageMaker, EKS, Lambda, etc.
- 1 to 2 years of experience with Jenkins and GoCD for build/deploy pipelines.
- Hands-on experience with open-source and self-hosted model APIs using SDKs.
- Drive data-driven decisions by analyzing operational data and generating reports on system health, performance, and availability.
- Basic knowledge of KServe and Ray Serve inferencing.
- Good knowledge of high-level scaling using Karpenter, KEDA, and system-based vertical/horizontal scaling.
- Strong knowledge of the Linux operating system, or Linux certification.
- Previous experience with Helm chart deployments and Terraform template and module creation is highly recommended.

Secondary Responsibilities:
- Proven experience in AIOps and DevOps, with a strong background in cloud technologies (AWS, Azure, Google Cloud).
- Proficiency in tools such as Kubeflow, KServe, ONNX, and containerization technologies (Docker, Kubernetes).
- Experience with enterprise-level infrastructure, including tools like Terraform and Helm, and hosting on on-prem servers.
- Previous experience in fintech or AI-based tech companies is highly desirable.
- Ability to manage workloads effectively in a production environment.
- Excellent communication and collaboration skills, with a strong focus on cross-functional teamwork.
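
For the vLLM hosting and self-hosted model API items above, a minimal client-side inference sketch, assuming a vLLM server is already running locally (vLLM exposes an OpenAI-compatible endpoint) and an illustrative model name:

```python
from openai import OpenAI

# Minimal sketch: query a self-hosted vLLM server through its
# OpenAI-compatible API. Endpoint and model name are assumptions;
# the server would be started with e.g. `vllm serve <model>`.
client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed")

completion = client.chat.completions.create(
    model="mistralai/Mistral-7B-Instruct-v0.2",  # assumed hosted model
    messages=[{"role": "user", "content": "Summarize last night's alert storm."}],
    temperature=0.1,
)
print(completion.choices[0].message.content)
```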

Posted 1 week ago

Apply

5.0 - 10.0 years

50 - 60 Lacs

Bengaluru

Work from Office

Job Title: AI/ML Architect (GenAI, LLMs & Enterprise Automation)
Location: Bangalore
Experience: 8+ years (including 4+ years in AI/ML architecture on cloud platforms)

Role Summary: We are seeking an experienced AI/ML Architect to define and lead the design, development, and scaling of GenAI-driven solutions across our learning and enterprise platforms. This is a senior technical leadership role in which you will work closely with the CTO and product leadership to architect intelligent systems powered by LLMs, RAG pipelines, and multi-agent orchestration. You will own the AI solution architecture end to end, from model selection and training frameworks to infrastructure, automation, and observability. The ideal candidate will have deep expertise in GenAI systems and a strong grasp of production-grade deployment practices across the stack.

Must-Have Skills:
- AI/ML solution architecture experience with production-grade systems
- Strong background in LLM fine-tuning (SFT, LoRA, PEFT) and RAG frameworks
- Experience with vector databases (FAISS, Pinecone) and embedding generation (see the retrieval sketch below)
- Proficiency in LangChain, LangGraph, LangFlow, and prompt engineering
- Deep cloud experience (AWS: Bedrock, ECS, Lambda, S3, IAM)
- Infrastructure automation using Terraform, CI/CD via GitHub Actions or CodePipeline
- Backend API architecture using FastAPI or Node.js
- Monitoring and observability using Langfuse, LangWatch, OpenTelemetry
- Python, Bash scripting, and low-code/no-code tools (e.g., n8n)

Bonus Skills:
- Hands-on experience with multi-agent orchestration frameworks (CrewAI, AutoGen)
- Experience integrating AI/chatbots into web, mobile, or LMS platforms
- Familiarity with enterprise security, data governance, and compliance frameworks
- Exposure to real-time analytics and event-driven architecture

You'll Be Responsible For:
- Defining the AI/ML architecture strategy and roadmap
- Leading design and development of GenAI-powered products and services
- Architecting scalable, modular, and automated AI systems
- Driving experimentation with new models, APIs, and frameworks
- Ensuring robust integration between the model, infrastructure, and application layers
- Providing technical guidance and mentorship to engineering teams
- Enabling production-grade performance, monitoring, and governance
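
For the vector database and RAG items above, a minimal sketch of the retrieval step using FAISS; the random vectors stand in for real embeddings (e.g., from a Bedrock embedding model or a sentence-transformer), and the dimensions and corpus size are illustrative:

```python
import numpy as np
import faiss  # pip install faiss-cpu

# Index placeholder document embeddings and retrieve nearest neighbours
# for a query vector; in a full RAG pipeline the retrieved chunks would
# be stuffed into the LLM prompt.
dim, n_docs = 384, 1000
doc_embeddings = np.random.rand(n_docs, dim).astype("float32")

index = faiss.IndexFlatL2(dim)  # exact L2 search; swap for IVF/HNSW at scale
index.add(doc_embeddings)

query = np.random.rand(1, dim).astype("float32")
distances, doc_ids = index.search(query, k=5)
print("Top-5 document ids:", doc_ids[0])
```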

Posted 1 week ago

Apply

4.0 - 6.0 years

15 - 20 Lacs

Bengaluru

Work from Office

At MAXIMUS, we are growing our Digital Solutions team to better serve our organization and our clients in the government, health, and human services space. We believe that great outcomes define our success, and we like to turn bold ideas into delightful solutions. We use a methodology grounded in design thinking, lean, and agile to help solve complicated problems in a cost-effective, rapid, and precise manner. As part of our India-based AI organization, this position is responsible for the design, development, implementation, and maintenance of AWS Textract-based solutions (see the sketch below).

The candidate will have the following experience:
- 4+ years of experience in Python development.
- Proven experience with Amazon Textract, including hands-on experience integrating and optimizing Textract-based solutions.
- Strong knowledge of AWS services, including S3, Lambda, IAM, and DynamoDB.
- Experience with RESTful APIs and microservices architecture.
- Familiarity with document processing and OCR technologies.
- Solid understanding of software development best practices, including version control, testing, and code review processes.
- Strong problem-solving skills and attention to detail.
- Experience with other AWS AI/ML services such as SageMaker, Rekognition, Comprehend, and Bedrock.
- Familiarity with containerization and orchestration tools such as Docker and Kubernetes.
- Understanding of data privacy and security best practices.
- Experience with Agile/Scrum methodologies.
- Excellent communication and collaboration skills.

The ideal candidate will demonstrate:
- A good mix of hands-on experience in the above areas.
- Strong analytical skills and experience working with business stakeholders to analyze business requirements.
- The ability to quickly pick up new technology and present its value proposition.
- Excellent communication and technical presentation skills.

Essential Duties and Responsibilities:
- Develop and maintain Python-based applications and scripts that use Amazon Textract for document data extraction.
- Integrate Textract with other AWS services such as S3, Lambda, and DynamoDB.
- Optimize and scale Textract processing workflows to handle large volumes of documents efficiently.
- Collaborate with data engineers and product managers to refine document processing solutions.
- Implement error handling and monitoring to ensure the reliability of data extraction processes.
- Write and maintain comprehensive documentation for the developed solutions.
- Stay updated on the latest features and best practices for Amazon Textract and other AWS technologies.
- Collaborate with business analysts, stakeholders, and other team members to gather requirements, define project scope, and ensure alignment with business objectives.
- Work closely with QA teams to ensure thorough testing and validation of Appian applications.
- Ensure production issues are resolved in a timely manner.
- Work closely with Maximus onsite architects and delivery leads on solutioning new projects and creating work estimates.
- Stay online and collaborate frequently with onsite stakeholders via MS Teams during the first half of the day in EST.
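
The core Textract workflow this role maintains looks roughly like the following. A minimal sketch, assuming a synchronous single-page document in S3 (bucket and key are placeholders); multi-page PDFs would instead use the asynchronous start_document_text_detection job API:

```python
import boto3

# Minimal sketch: extract text lines from a single-page document in S3
# with Amazon Textract. Production code would add error handling,
# retries, and async jobs for multi-page PDFs.
textract = boto3.client("textract", region_name="us-east-1")

response = textract.detect_document_text(
    Document={"S3Object": {"Bucket": "my-docs-bucket", "Name": "claim-form.png"}}  # placeholders
)

lines = [b["Text"] for b in response["Blocks"] if b["BlockType"] == "LINE"]
print("\n".join(lines))
```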

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

As a GenAI Developer at Vipracube Tech Solutions, you will develop and optimize AI models, implement AI algorithms, collaborate with cross-functional teams, research emerging AI technologies, and deploy AI solutions. This full-time role requires 5 to 6 years of experience and is based in Pune, with the flexibility of some work from home.

Your key responsibilities will include fine-tuning large language models tailored to marketing and operational use cases, and building generative AI solutions on platforms such as OpenAI (GPT, DALL-E, Whisper) and agentic AI platforms such as LangGraph and AWS Bedrock. You will also build robust pipelines using Python, NumPy, and Pandas, apply traditional ML techniques, handle CI/CD and MLOps, use AWS cloud services, collaborate using tools like Cursor, and communicate effectively with stakeholders and clients.

To excel in this role, you should have 5+ years of relevant AI/ML development experience, a strong portfolio of AI projects in marketing or operations domains, and a proven ability to work independently and meet deadlines. Join our dynamic team and contribute to creating smart, efficient, and future-ready digital products for businesses and startups.

Posted 2 weeks ago

Apply

5.0 - 8.0 years

14 - 22 Lacs

Bengaluru, Mumbai (All Areas)

Work from Office

Hiring for a top IT company. Designation: Python Developer. Skills: Python + PySpark. Location: Bangalore/Mumbai. Experience: 5-8 years. Best CTC. Contact: 9783460933, 9549198246, 9982845569, 7665831761, 6377522517, 7240017049. Team Converse

Posted 3 weeks ago

Apply

7.0 - 12.0 years

20 - 30 Lacs

Kolkata, Hyderabad, Bengaluru

Work from Office

Responsibilities:
- Design, develop, and deploy scalable AI/ML solutions using AWS services such as Amazon Bedrock, SageMaker, Amazon Q, Amazon Lex, Amazon Connect, and Lambda.
- Implement and optimize large language model (LLM) applications using Amazon Bedrock, including prompt engineering, fine-tuning, and orchestration for specific business use cases.
- Build and maintain end-to-end machine learning pipelines using SageMaker for model training, tuning, deployment, and monitoring.
- Integrate conversational AI and virtual assistants using Amazon Lex and Amazon Connect, with seamless user experiences and real-time inference.
- Leverage AWS Lambda for event-driven execution of model inference, data preprocessing, and microservices (see the sketch below).
- Design and maintain scalable and secure data pipelines and AI workflows, ensuring efficient data flow to and from Redshift and other AWS data stores.
- Implement data ingestion, transformation, and model inference for structured and unstructured data using Python and the AWS SDKs.
- Collaborate with data engineers and scientists to support development and deployment of ML models on AWS.
- Monitor AI/ML applications in production, ensuring optimal performance, low latency, and cost efficiency across all AI/ML services.
- Ensure implementation of AWS security best practices, including IAM policies, data encryption, and compliance with industry standards.
- Drive the integration of Amazon Q for enterprise AI-based assistance and automation across internal processes and systems.
- Participate in architecture reviews and recommend best-fit AWS AI/ML services for evolving business needs.
- Stay up to date with the latest advancements in AWS AI services, LLMs, and industry trends to inform technology strategy and innovation.
- Prepare documentation for ML pipelines, model performance reports, and system architecture.

Qualifications we seek in you:

Minimum Qualifications:
- Proven hands-on experience with Amazon Bedrock, SageMaker, Lex, Connect, Lambda, and Redshift.
- Strong knowledge and application experience with large language models (LLMs) and prompt engineering techniques.
- Experience building production-grade AI applications using AWS AI or other generative AI services.
- Solid programming experience in Python for ML development, data processing, and automation.
- Proficiency in designing and deploying conversational AI/chatbot solutions using Lex and Connect.
- Experience with Redshift for data warehousing and analytics integration with ML solutions.
- Good understanding of AWS architecture, scalability, availability, and security best practices.
- Familiarity with AWS development, deployment, and monitoring tools (CloudWatch, CodePipeline, etc.).
- Strong understanding of MLOps practices, including model versioning, CI/CD pipelines, and model monitoring.
- Strong communication and interpersonal skills to collaborate with cross-functional teams and stakeholders.
- Ability to troubleshoot performance bottlenecks and optimize cloud resources for cost-effectiveness.

Preferred Qualifications:
- AWS certification in Machine Learning, Solutions Architect, or AI services.
- Experience with other AI tools (e.g., Anthropic Claude, OpenAI APIs, or Hugging Face).
- Knowledge of streaming architectures and services like Kafka or Kinesis.
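
The event-driven inference bullet above typically reduces to a Lambda handler along these lines. A minimal sketch, assuming the function's execution role has Bedrock invoke permissions and that the triggering event carries a plain "prompt" field (both assumptions):

```python
import json
import boto3

bedrock = boto3.client("bedrock-runtime")

def lambda_handler(event, context):
    # Forward an incoming prompt to a Bedrock model and return the
    # completion; model ID and event shape are illustrative.
    prompt = event.get("prompt", "")
    response = bedrock.converse(
        modelId="anthropic.claude-3-haiku-20240307-v1:0",  # assumed model ID
        messages=[{"role": "user", "content": [{"text": prompt}]}],
    )
    answer = response["output"]["message"]["content"][0]["text"]
    return {"statusCode": 200, "body": json.dumps({"answer": answer})}
```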

Posted 3 weeks ago

Apply

2.0 - 4.0 years

18 - 30 Lacs

Bengaluru

Work from Office

Roles and Responsibilities:
- Design, develop, and deploy data science models using Python, AWS, and LLM (large language model) technologies.
- Collaborate with cross-functional teams to identify business problems and design solutions that leverage GenAI capabilities.
- Develop scalable data pipelines using Bedrock to integrate various data sources into SageMaker model development.
- Conduct exploratory data analysis, feature engineering, and model evaluation to ensure high accuracy and reliability of results.
- Provide technical guidance on best practices for implementing machine learning algorithms in production environments.
- Analyze large, complex healthcare datasets, including electronic health records (EHR) and claims data.
- Develop statistical models for patient risk stratification, treatment optimization, population health management, and revenue cycle optimization (a sketch follows this list).
- Build models for clinical decision support, patient outcome prediction, and care quality improvement.
- Create and maintain automated data pipelines for real-time analytics and reporting.
- Work with healthcare data standards (HL7 FHIR, ICD-10, CPT, SNOMED CT) and ensure regulatory compliance.
- Develop and deploy models in cloud environments while creating visualizations for stakeholders.
- Present findings and recommendations to cross-functional teams including clinicians, product managers, and executives.
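
As referenced in the risk-stratification bullet, work of this kind often starts with a simple baseline model. A minimal sketch on synthetic stand-in data; real work would use de-identified EHR/claims features and go through compliance review:

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for de-identified patient features.
rng = np.random.default_rng(42)
df = pd.DataFrame({
    "age": rng.integers(18, 90, 2000),
    "num_chronic_conditions": rng.integers(0, 8, 2000),
    "er_visits_last_year": rng.poisson(1.0, 2000),
})
risk = 0.03 * df["age"] + 0.4 * df["num_chronic_conditions"] + 0.5 * df["er_visits_last_year"]
df["readmitted"] = (risk + rng.normal(0, 1, 2000) > risk.mean()).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    df.drop(columns="readmitted"), df["readmitted"], test_size=0.25, random_state=0
)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))
```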

Posted 3 weeks ago

Apply

8.0 - 13.0 years

32 - 45 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Job Title: Data Architect
Location: Bangalore, Hyderabad, Chennai, Pune, Gurgaon (hybrid, 2-3 days WFO)
Experience: 8+ years

Position Overview: We are seeking a highly skilled and strategic Data Architect to design, build, and maintain the organization's data architecture. The ideal candidate will be responsible for aligning data solutions with business needs, ensuring data integrity, and enabling scalable and efficient data flows across the enterprise. This role requires deep expertise in data modeling, data integration, cloud data platforms, and governance practices.

Key Responsibilities:
- Architectural Design: Define and implement enterprise data architecture strategies, including data warehousing, data lakes, and real-time data systems.
- Data Modeling: Develop and maintain logical, physical, and conceptual data models to support analytics, reporting, and operational systems.
- Platform Management: Select and oversee implementation of cloud and on-premises data platforms (e.g., Snowflake, Redshift, BigQuery, Azure Synapse, Databricks).
- Integration & ETL: Design robust ETL/ELT pipelines and data integration frameworks using tools such as Apache Airflow, Informatica, dbt, or native cloud services (a PySpark sketch follows below).
- Data Governance: Collaborate with stakeholders to implement data quality, data lineage, metadata management, and security best practices.
- Collaboration: Work closely with data engineers, analysts, software developers, and business teams to ensure seamless and secure data access.
- Performance Optimization: Tune databases, queries, and storage strategies for performance, scalability, and cost-efficiency.
- Documentation: Maintain comprehensive documentation for data structures, standards, and architectural decisions.

Required Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- 5+ years of experience in data architecture, data engineering, or database development.
- Strong expertise in data modeling and relational and NoSQL databases (e.g., PostgreSQL, MySQL, MongoDB, Cassandra).
- Experience with modern data platforms and cloud ecosystems (AWS, Azure, or GCP).
- Hands-on experience with data warehousing solutions and tools (e.g., Snowflake, Redshift, BigQuery).
- Proficiency in SQL and data scripting languages (e.g., Python, Scala).
- Familiarity with data privacy regulations (e.g., GDPR, HIPAA) and security standards.

Tech Stack:
- AWS Cloud: S3, EC2, EMR, Lambda, IAM, Snowflake DB
- Databricks: Spark/PySpark, Python
- Good knowledge of Bedrock and Mistral AI
- RAG & NLP: LangChain and LangRAG
- LLMs: Anthropic Claude, Mistral, LLaMA, etc.
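
As noted in the Integration & ETL bullet, pipelines in this stack are often expressed in PySpark. A minimal sketch under assumed paths and column names (all placeholders):

```python
from pyspark.sql import SparkSession, functions as F

# Read raw events, curate them, and write a partitioned table back.
spark = SparkSession.builder.appName("curate-orders").getOrCreate()

orders = spark.read.parquet("s3://raw-zone/orders/")  # placeholder path
curated = (
    orders
    .filter(F.col("status") == "COMPLETED")
    .withColumn("order_date", F.to_date("created_at"))
    .groupBy("order_date", "region")
    .agg(F.sum("amount").alias("daily_revenue"))
)
curated.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://curated-zone/daily_revenue/"  # placeholder path
)
```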

Posted 4 weeks ago

Apply

3.0 - 5.0 years

20 - 35 Lacs

Pune

Hybrid

Role Overview: Monitor, evaluate, and optimize AI/LLM workflows in production environments. Ensure reliable, efficient, and high-quality AI system performance by building out an LLM Ops platform that is self-serve for the engineering and data science departments.

Key Responsibilities:
- Collaborate with data scientists and software engineers to integrate an LLM Ops platform (Opik by CometML) for existing AI workflows.
- Identify valuable performance metrics (accuracy, quality, etc.) for AI workflows and create ongoing sampled evaluation processes in the LLM Ops platform that alert when metrics drop below thresholds (a sketch follows this list).
- Collaborate across teams to create datasets and benchmarks for new AI workflows.
- Run experiments on datasets and optimize performance via model changes and prompt adjustments.
- Debug and troubleshoot AI workflow issues.
- Optimize inference costs and latency while maintaining accuracy and quality.
- Develop automations for the LLM Ops platform so that data scientists and software engineers can self-serve integration with the AI workflows they build.

Requirements:
- Strong Python programming skills.
- Experience with generative AI models and tools (OpenAI, Anthropic, Bedrock, etc.).
- Knowledge of fundamental statistical concepts and tools in data science, such as heuristic and non-heuristic measurements in NLP (BLEU, WER, sentiment analysis, LLM-as-judge, etc.), standard deviation, and sampling rate, plus a high-level understanding of how modern AI models work (knowledge cutoffs, context windows, temperature, etc.).
- Familiarity with AWS.
- Understanding of prompt engineering concepts.
- People skills: you will be expected to collaborate frequently with other teams to help perfect their AI workflows.

Experience Level: 3-5 years of experience in LLM/AI Ops, MLOps, Data Science, or MLE.
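
The sampled-evaluation responsibility referenced above boils down to a loop like this. A platform-agnostic sketch; the scorer is a placeholder for an LLM-as-judge or heuristic metric (BLEU, WER, etc.), and a real setup would log scores and alerts to the LLM Ops platform rather than print:

```python
import random
import statistics

QUALITY_THRESHOLD = 0.8  # illustrative threshold
SAMPLE_RATE = 0.05       # fraction of production traffic to evaluate

def judge(response: str) -> float:
    """Placeholder scorer; swap in an LLM-as-judge or heuristic metric."""
    return 1.0 if response.strip() else 0.0

def evaluate(production_responses: list[str]) -> None:
    # Score a random sample and alert when mean quality drops below threshold.
    sample = [r for r in production_responses if random.random() < SAMPLE_RATE]
    if not sample:
        return
    mean_score = statistics.mean(judge(r) for r in sample)
    if mean_score < QUALITY_THRESHOLD:
        print(f"ALERT: sampled quality {mean_score:.2f} below {QUALITY_THRESHOLD}")

evaluate(["A helpful answer.", "", "Another answer."] * 100)
```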

Posted 1 month ago

Apply

8.0 - 13.0 years

32 - 45 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Job Title: Data Architect
Location: Bangalore, Hyderabad, Chennai, Pune, Gurgaon (hybrid, 2-3 days WFO)
Experience: 8+ years

Position Overview: We are seeking a highly skilled and strategic Data Architect to design, build, and maintain the organization's data architecture. The ideal candidate will be responsible for aligning data solutions with business needs, ensuring data integrity, and enabling scalable and efficient data flows across the enterprise. This role requires deep expertise in data modeling, data integration, cloud data platforms, and governance practices.

Key Responsibilities:
- Architectural Design: Define and implement enterprise data architecture strategies, including data warehousing, data lakes, and real-time data systems.
- Data Modeling: Develop and maintain logical, physical, and conceptual data models to support analytics, reporting, and operational systems.
- Platform Management: Select and oversee implementation of cloud and on-premises data platforms (e.g., Snowflake, Redshift, BigQuery, Azure Synapse, Databricks).
- Integration & ETL: Design robust ETL/ELT pipelines and data integration frameworks using tools such as Apache Airflow, Informatica, dbt, or native cloud services.
- Data Governance: Collaborate with stakeholders to implement data quality, data lineage, metadata management, and security best practices.
- Collaboration: Work closely with data engineers, analysts, software developers, and business teams to ensure seamless and secure data access.
- Performance Optimization: Tune databases, queries, and storage strategies for performance, scalability, and cost-efficiency.
- Documentation: Maintain comprehensive documentation for data structures, standards, and architectural decisions.

Required Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- 5+ years of experience in data architecture, data engineering, or database development.
- Strong expertise in data modeling and relational and NoSQL databases (e.g., PostgreSQL, MySQL, MongoDB, Cassandra).
- Experience with modern data platforms and cloud ecosystems (AWS, Azure, or GCP).
- Hands-on experience with data warehousing solutions and tools (e.g., Snowflake, Redshift, BigQuery).
- Proficiency in SQL and data scripting languages (e.g., Python, Scala).
- Familiarity with data privacy regulations (e.g., GDPR, HIPAA) and security standards.

Tech Stack:
- AWS Cloud: S3, EC2, EMR, Lambda, IAM, Snowflake DB
- Databricks: Spark/PySpark, Python
- Good knowledge of Bedrock and Mistral AI
- RAG & NLP: LangChain and LangRAG
- LLMs: Anthropic Claude, Mistral, LLaMA, etc.

Posted 1 month ago

Apply

8.0 - 13.0 years

32 - 45 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Job Title: Data Scientist Architect
Location: Pan India (hybrid)
Experience: 8+ years

Position Overview: We are seeking a Data Scientist Architect to lead and drive data science and architecture initiatives within Brillio. The ideal candidate will have a deep understanding of data science, data engineering, and architecture, and be highly proficient in implementing cutting-edge solutions using tools like Databricks, AWS, and Bedrock/Mistral. The role requires extensive experience designing, building, and deploying large-scale data systems and machine learning models, along with the ability to lead and mentor cross-functional teams. As a Data Scientist Architect, you will have the opportunity to innovate and make a lasting impact across our diverse client base, providing tailored solutions that drive their data strategy forward.

Key Responsibilities:
- Lead Data Architecture & Science Initiatives: Design and implement advanced data architectures and solutions to support complex data science and machine learning workflows. Build and deploy scalable, production-grade data pipelines and models leveraging cloud platforms like AWS and tools like Databricks. Architect solutions involving large-scale data ingestion, transformation, and storage, focusing on performance, scalability, and reliability.
- Platform Development & Integration: Implement and manage cloud-based infrastructure for data engineering, analytics, and machine learning on platforms like AWS, leveraging services such as S3, Lambda, and EC2. Work with Bedrock/Mistral to deploy and manage machine learning models at scale, ensuring continuous optimization and improvement.

Skills and Qualifications:
- Experience: 8+ years of experience in data science and data architecture, with a focus on large-scale data systems and cloud platforms. Proven track record of leading data science architecture projects from inception to deployment.
- Technical Skills: Proficiency in Databricks, AWS (S3, EC2, Lambda, Redshift, SageMaker, etc.), and Bedrock/Mistral.

Posted 1 month ago

Apply

6.0 - 9.0 years

14 - 22 Lacs

Pune, Chennai

Work from Office

Hiring for a top IT company. Designation: Python Developer. Skills: AWS SDK + AI services integration. Location: Pune/Chennai. Experience: 6-8 years. Best CTC. Contact: Surbhi 9887580624, Anchal 9772061749, Gitika 8696868124, Shivani 7375861257. Team Converse

Posted 1 month ago

Apply

8.0 - 13.0 years

20 - 25 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Job Description: Agentic AI Solution Designer

We need a strong candidate who is well versed in AWS Bedrock agentic AI solutions, and who can also support other open-source agentic frameworks and hyperscaler-based solution design. This person must be vocal and a good communicator, as they will work in close proximity with the client delivery team.

AI Consultant: a hands-on expert who can take a design and turn it into a prototype within a 2-3 week time frame so that it can be demoed to the client.

- AWS Bedrock competence: demonstrated proficiency in utilizing and integrating with Amazon Bedrock, including a strong understanding of its foundation models and capabilities (a sketch of an agent invocation follows below).
- GenAI agents: deep understanding of the concepts, architectures, and practical implementation of generative AI agents, including experience with relevant frameworks and methodologies.
- Python: excellent programming skills in Python, with experience developing and deploying applications.
- Solution design leadership: lead the technical design and architecture of innovative GenAI agent app solutions tailored to specific customer requirements and business objectives.
- Customer engagement: act as the primary technical point of contact during the pre-sales process, effectively communicating the value proposition and technical capabilities of our GenAI agent offerings to both technical and non-technical audiences.
- Requirements gathering: deeply understand customer business challenges and translate them into clear, concise technical requirements and solution specifications.
- Proposal development: create compelling technical proposals, presentations, and demonstrations that showcase our GenAI agent app solutions and their potential impact.
- Technical consultation and collaboration: work closely with, and provide expert technical guidance to, the sales team, service delivery units, and customers throughout the pre-sales cycle.
- Solution presentation: confidently present proposed solutions, address technical questions, and articulate the technical advantages of our offerings.
- Hands-on development: actively participate in the design, development, and prototyping of new features, functionalities, and improvements for our GenAI agents platform.
- Proof of concept (POC) development: build and demonstrate functional POCs to validate technical feasibility and explore new application areas for GenAI agents.
- Code contribution: write efficient Python code for R&D projects.
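
The Bedrock agentic work described above is typically demoed against an agent already configured in the account. A minimal invocation sketch using boto3's bedrock-agent-runtime client; the agent ID, alias, and input are placeholders:

```python
import uuid
import boto3

runtime = boto3.client("bedrock-agent-runtime", region_name="us-east-1")

response = runtime.invoke_agent(
    agentId="AGENT_ID",             # placeholder
    agentAliasId="AGENT_ALIAS_ID",  # placeholder
    sessionId=str(uuid.uuid4()),
    inputText="Check the order status for customer 1042.",
)

# The completion arrives as an event stream of chunks.
answer = "".join(
    event["chunk"]["bytes"].decode("utf-8")
    for event in response["completion"]
    if "chunk" in event
)
print(answer)
```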

Posted 1 month ago

Apply

5.0 - 8.0 years

0 - 2 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

We are seeking an AI/ML engineer or GenAI developer who will leverage Amazon Bedrock to build and optimize intelligent systems for automated email organization, with a strong emphasis on prompt engineering, collaboration, testing, and adherence to best practices. The engineer:
- Develops and fine-tunes Bedrock-based GenAI models for email categorization (a sketch follows this list)
- Implements prompt engineering techniques for accurate email processing
- Collaborates with data engineers to optimize email categorization workflows
- Conducts testing to refine prompt structures and improve model responses
- Ensures compliance with AI governance and security best practices
- Works with business users to align AI-generated responses with organizational needs
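
The categorization-plus-prompt-engineering combination above might look like the following. A minimal sketch, assuming a Claude model enabled in Bedrock and an illustrative category set; real code would validate the model's JSON output defensively:

```python
import json
import boto3

bedrock = boto3.client("bedrock-runtime")

PROMPT = """Classify the email below into exactly one of:
["billing", "support", "sales", "spam", "other"].
Respond with JSON only, e.g. {{"category": "support"}}.

Email:
{email}"""

def categorize(email: str) -> str:
    response = bedrock.converse(
        modelId="anthropic.claude-3-haiku-20240307-v1:0",  # assumed model ID
        messages=[{"role": "user", "content": [{"text": PROMPT.format(email=email)}]}],
        inferenceConfig={"temperature": 0.0},  # deterministic labels
    )
    text = response["output"]["message"]["content"][0]["text"]
    return json.loads(text)["category"]

print(categorize("Hi, my last invoice was charged twice."))
```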

Posted 2 months ago

Apply

6.0 - 8.0 years

15 - 18 Lacs

Hyderabad

Work from Office

Job Title: Amazon Connect Development Expert
Location: Hyderabad (on-site/hybrid as applicable)
Notice period: immediate to 15 days

Job Summary: We are looking for an experienced Amazon Connect development expert to design and implement cloud-native contact center solutions using AWS services. This role demands strong expertise in Amazon Connect, Lex, Lambda, and generative AI services like Amazon Q or Bedrock. If you're passionate about building intelligent, scalable, and secure contact center solutions, this is the perfect opportunity to leverage your skills.

Key Responsibilities:
- Architect, develop, and deploy cloud-based contact center solutions using Amazon Connect.
- Integrate Amazon Lex to create intelligent IVR and conversational flows (a fulfillment-handler sketch follows below).
- Build and manage backend logic using AWS Lambda (Node.js, Python, etc.) for real-time processing.
- Enhance customer interactions by integrating Amazon Q / Bedrock for generative AI capabilities.
- Collaborate with cross-functional teams to gather requirements and deliver end-to-end solutions.
- Ensure scalability, security, and availability using AWS best practices.
- Monitor, troubleshoot, and optimize solutions with Amazon CloudWatch and other tools.

Mandatory Skills:
- Amazon Connect: deep expertise in contact flow design, routing profiles, and integrations.
- Amazon Lex: hands-on experience building voice and chat bots.
- AWS Lambda: proficiency in writing serverless functions (Node.js, Python).
- Amazon Q / Bedrock: experience integrating generative AI into customer support workflows.
- Strong programming skills in Node.js or Python.

Good-to-Have Skills:
- Amazon DynamoDB: NoSQL data modeling and querying.
- Amazon S3: data storage and management for call recordings/logs.
- Amazon CloudWatch: monitoring, logging, and alert setup.
- AWS Secrets Manager: secure access management.
- Amazon Kinesis: real-time streaming and analytics.
- Amazon EventBridge: event-driven architecture and integrations.
- Amazon API Gateway: API development and security.

Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent experience).
- Minimum of 3 years of hands-on experience with Amazon Connect and related AWS services.
- AWS certifications (e.g., AWS Certified Developer, Solutions Architect) are a plus.

Preferred Traits:
- Strong analytical and problem-solving abilities.
- Excellent communication and documentation skills.
- Comfortable working in Agile/Scrum environments.
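
The Lex fulfillment bullet above usually lands in a Lambda handler shaped like this. A minimal sketch of a Lex V2 fulfillment response; the reply logic is a placeholder, and real handlers would branch per intent and call backend systems:

```python
def lambda_handler(event, context):
    # Lex V2 passes the recognized intent in sessionState.
    intent = event["sessionState"]["intent"]["name"]
    reply = f"Thanks, I've logged your {intent} request."  # placeholder logic

    return {
        "sessionState": {
            "dialogAction": {"type": "Close"},
            "intent": {"name": intent, "state": "Fulfilled"},
        },
        "messages": [{"contentType": "PlainText", "content": reply}],
    }
```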

Posted 2 months ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
