5.0 - 7.0 years
0 Lacs
Pune, Maharashtra, India
Remote
Entity: Customers & Products
Job Family Group: Project Management Group

Job Description: As bp transitions to an integrated energy company, we must adapt to a changing world and maintain strong performance. bp's customers & products (C&P) business area is setting up a business and technology centre (BTC) in CITY, COUNTRY. This will support the delivery of an enhanced customer experience and drive innovation by building global capabilities at scale, using technology, and developing deep expertise. The BTC will be a core and connected part of our business, bringing together colleagues who report into their respective part of C&P and work together with other functions across bp. This is an exciting time to join bp and the customers & products BTC!

Job Title: Data Modeller SME Lead

About the role: As the Data Modeller Senior SME for Castrol, you will collaborate with business partners across Digital Operational Excellence, Technology, and Castrol's PUs, HUBs, functions, and Markets to model and sustain curated datasets within the Castrol data ecosystem. The role ensures agile, continuous improvement of curated datasets aligned with the Data Modelling Framework and supports analytics, data science, operational MI, and the broader Digital Business Strategy. On top of the data lake we have now enabled the MLOps environment (PySpark Pro) and Gurobi, with direct connections to run advanced analytics and data science queries and algorithms written in Python. This enables the data analyst and data science teams to incubate insights in an agile way. The Data Modeller will contribute to and enable the growth of data science skills and capabilities within the role, the team, and the wider Castrol data analyst/science community; data science experience is a plus, but basic skills are sufficient to start.

Experience & Education:
Education: Degree in an analytical field (preferably IT or engineering) or 5+ years of relevant experience.
Experience: Proven track record in delivering data models and curated datasets for major transformation projects. Broad understanding of multiple data domains and their integration points. Strong problem-solving and collaborative skills with a strategic approach.

Skills & Competencies:
- Expertise in data modelling and data wrangling of highly complex, high-dimensional data (ER Studio, Gurobi, SageMaker PRO).
- Proficiency in translating analytical insights from high-dimensional data.
- Skilled in Power BI data modelling and proof-of-concept design for data and analytics dashboarding.
- Proficiency in data science tools such as Python, Amazon SageMaker, GAMS, AMPL, ILOG, AIMMS, or similar.
- Ability to work across multiple levels of detail, including analytics, MI, statistics, data, process design principles, operating model intent, and systems design.
- Strong influencing skills to use expertise and experience to shape value delivery.
- Demonstrated success in multi-functional deployments and performance optimization.

BP Behaviors for Successful Delivery:
- Respect: Build trust through clear relationships.
- Excellence: Apply standard processes and strive for executional completion.
- One Team: Collaborate to improve team efficiency.

You will work with: You will be part of a 20-member Global Data & Analytics team, operating peer to peer in a team of seasoned global experts on Process, Data, Advanced Analytics, and Data Science.
The Global Data & Analytics team reports into the Castrol Digital Enablement team, which manages the digital estate for Castrol and enhances scalability and process and data integration. This D&A team is the driving force behind the Data & Analytics strategy, managing the harmonized data lake and the business intelligence derived from it in support of the business strategy, and is a key pillar of value enablement through fast and accurate insights. As the Data Modeller SME Lead you will be exposed to a wide variety of collaborators across all layers of the Castrol leadership and our partners in GBS and Technology. Through data governance at the value centre you gain great exposure to operations and the ability to influence and inspire change through value proposition engagements.

Travel Requirement: Negligible travel should be expected with this role.
Relocation Assistance: This role is eligible for relocation within country.
Remote Type: This position is not available for remote working.

Skills: Change control, Commissioning, start-up and handover, Conflict Management, Construction, Cost estimating and cost control (Inactive), Design development and delivery, Frameworks and methodologies, Governance arrangements, Performance management, Portfolio Management, Project and construction safety, Project execution planning, Project HSSE, Project Leadership, Project Team Management, Quality, Requirements Management, Reviews, Risk Management, Schedule and resources, Sourcing Management, Stakeholder Management, Strategy and business case, Supplier Relationship Management

Legal Disclaimer: We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, socioeconomic status, neurodiversity/neurocognitive functioning, veteran status or disability status. Individuals with an accessibility need may request an adjustment/accommodation related to bp's recruiting process (e.g., accessing the job application, completing required assessments, participating in telephone screenings or interviews, etc.). If you would like to request an adjustment/accommodation related to the recruitment process, please contact us. If you are selected for a position and depending upon your role, your employment may be contingent upon adherence to local policy. This may include pre-placement drug screening, medical review of physical fitness for the role, and background checks.
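To make the curated-dataset work described in this role a little more concrete, here is a minimal, illustrative PySpark sketch of building a curated table on top of a data lake; the table and column names are assumptions for illustration only, not details of the Castrol data ecosystem.

```python
# Minimal, illustrative PySpark sketch of curating a dataset from a data lake table.
# Table and column names ("sales_raw", "sales_curated", etc.) are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("curated_sales").getOrCreate()

raw = spark.table("sales_raw")  # hypothetical raw source table
curated = (
    raw.dropDuplicates(["order_id"])                          # remove duplicate orders
       .withColumn("order_month", F.trunc("order_date", "month"))
       .groupBy("market", "order_month")
       .agg(F.sum("net_revenue").alias("net_revenue"))        # monthly revenue per market
)
curated.write.mode("overwrite").saveAsTable("sales_curated")  # hypothetical curated target
```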
Posted 3 days ago
3.0 - 7.0 years
3 - 7 Lacs
Mumbai, Maharashtra, India
On-site
Support investment portfolio reporting and quantitative analyses for a leading alternative asset manager. Develop and improve workflow efficiency through automation. Assist with development, calibration, prototyping, and documentation of risk models. Assist with preparation of materials for clients and senior management. Assist with the development of a robust risk management framework and analytics across portfolios. Analyse time series data to identify and report any trends or errors/exceptions, using quantitative techniques and machine learning models. Develop and maintain dynamic dashboards for risk reporting.

Key Skills: Experienced in data science, machine learning, and quantitative modelling. Proficient in programming languages such as Python and SQL, and libraries such as pandas, NumPy, and Matplotlib. Experienced in Amazon SageMaker, with a commanding grasp of Excel. Experienced in web scraping, using libraries like Selenium and Requests. Basic knowledge of BI tools like Tableau/Sigma Computing. Experience with risk analytics platforms (e.g., FactSet, RiskMetrics, Bloomberg).
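As a rough illustration of the kind of time-series exception check this role describes, here is a small pandas sketch that flags outlier daily returns; the file and column names are hypothetical, and a production check would use richer data and models.

```python
# Illustrative pandas sketch: flag daily returns that deviate from a rolling band.
# File and column names ("portfolio_nav.csv", "nav") are hypothetical.
import pandas as pd

nav = pd.read_csv("portfolio_nav.csv", parse_dates=["date"], index_col="date")["nav"]
returns = nav.pct_change()

roll_mean = returns.rolling(60).mean()       # 60-day rolling mean
roll_std = returns.rolling(60).std()         # 60-day rolling volatility
exceptions = returns[(returns - roll_mean).abs() > 3 * roll_std]

print(exceptions)                            # dates to investigate and report
```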
Posted 3 days ago
6.0 - 7.0 years
7 - 15 Lacs
Chennai, Bengaluru
Work from Office
Experience with AWS services (EC2, ECS, S3, CloudWatch, SQS, SNS, AWS Lambda, AWS CLI, Amazon EMR, Amazon MSK, Amazon SageMaker, RDS, ELB, AWS Aurora) and API Gateway is preferred.
Posted 4 days ago
10.0 - 12.0 years
1 - 3 Lacs
Hyderabad, Telangana, India
On-site
Job description: Experienced AI/ML Engineer with expertise in Machine Learning, Deep Learning, NLP, and Generative AI, and strong expertise in LLMs, Retrieval-Augmented Generation (RAG), Agentic AI, and MLOps to develop scalable and production-ready AI solutions.

Key Responsibilities:
- Develop & Optimize AI/ML Solutions: Build and deploy LLM, RAG, Agentic AI, and GenAI applications.
- ML Pipeline Development: Implement and automate scalable ML pipelines using MLflow, AWS SageMaker, Databricks, and PySpark.
- MLOps Implementation: Establish CI/CD workflows, model versioning, monitoring, and automated retraining.
- Cloud & Infrastructure: Leverage AWS AI/ML services (SageMaker, Bedrock, Lambda, Step Functions, ECS/EKS) for scalable AI solutions.
- Data Engineering with PySpark: Optimize large-scale ETL workflows, data pipelines, and distributed data processing.
- LLMs & RAG Applications: Fine-tune and deploy LLMs integrated with vector databases (FAISS, Pinecone, ChromaDB).
- NLP & Deep Learning: Work with transformers, embeddings, multi-modal AI models, and text processing frameworks.
- Orchestration & Containerization: Deploy and manage AI workloads using Kubernetes (EKS), Docker, and CI/CD pipelines.

Senior Data Engineer with 10 to 12 years of IT experience and 6 to 8 years of experience in Artificial Intelligence, Deep Learning, and Machine Learning. Bachelor's degree in computer science engineering or a related field. Able to lead the AI/ML team, working closely with all team members and the client. Strong hands-on experience in Python coding and the main Python and data science libraries. Manage and direct processes and R&D (research and development) to meet the needs of our AI strategy. Understand company and client challenges and how integrating AI capabilities can help lead to solutions. As a Machine Learning Engineer, you will play a crucial role in the development and implementation of cutting-edge artificial intelligence products. Hands-on experience with AWS services. Lead cross-functional teams in identifying and prioritizing key areas of a partner's business where AI solutions can drive significant business benefit. Analyse and explain AI and machine learning (ML) solutions while setting and maintaining high ethical standards. Strong hands-on experience with SQL queries, CI/CD pipelines, Kubernetes, etc.
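To illustrate one of the building blocks listed above (LLMs integrated with a vector database), here is a hedged sketch of the retrieval step of a RAG pipeline using FAISS and sentence-transformers; the embedding model name and toy documents are assumptions, not part of this posting.

```python
# Illustrative RAG retrieval step with FAISS; the embedding model and documents are assumptions.
import faiss
from sentence_transformers import SentenceTransformer

docs = ["Policy A covers water damage.", "Policy B covers fire damage."]  # toy corpus
model = SentenceTransformer("all-MiniLM-L6-v2")      # assumed embedding model
emb = model.encode(docs, normalize_embeddings=True)  # unit-norm embeddings

index = faiss.IndexFlatIP(emb.shape[1])              # inner product == cosine similarity here
index.add(emb)

query = model.encode(["What covers fire damage?"], normalize_embeddings=True)
scores, ids = index.search(query, 1)                 # top-1 passage
print(docs[ids[0][0]])                               # context passed to the LLM prompt
```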
Posted 1 week ago
8.0 - 12.0 years
20 - 22 Lacs
Pune
Work from Office
Develop and deploy ML models using SageMaker. Automate data pipelines and training processes. Monitor and optimize model performance. Ensure model governance and reproducibility.
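As a hedged sketch of the develop-and-deploy workflow described above, the snippet below trains and deploys a scikit-learn model with the SageMaker Python SDK; the IAM role ARN, S3 path, and entry script are placeholders, not values from this posting.

```python
# Minimal SageMaker Python SDK sketch: train a scikit-learn model and deploy an endpoint.
# The IAM role, S3 input path, and train.py entry script are placeholders.
import sagemaker
from sagemaker.sklearn import SKLearn

session = sagemaker.Session()
estimator = SKLearn(
    entry_point="train.py",                                   # hypothetical training script
    role="arn:aws:iam::123456789012:role/SageMakerRole",      # placeholder execution role
    instance_type="ml.m5.large",
    framework_version="1.2-1",
    sagemaker_session=session,
)
estimator.fit({"train": "s3://example-bucket/train/"})        # placeholder training data
predictor = estimator.deploy(initial_instance_count=1, instance_type="ml.m5.large")
```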
Posted 1 month ago
8.0 - 12.0 years
20 - 22 Lacs
Bengaluru
Work from Office
Develop and deploy ML models using SageMaker. Automate data pipelines and training processes. Monitor and optimize model performance. Ensure model governance and reproducibility.
Posted 1 month ago
8.0 - 12.0 years
13 - 23 Lacs
Hyderabad, Bengaluru, Delhi / NCR
Work from Office
Must-have skills: Amazon SageMaker
Good-to-have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: BE
Share CV at neha.mandal@mounttalent.com

Summary: As an AI/ML Engineer, you will be responsible for developing applications and systems that utilize AI tools and Cloud AI services. Your typical day will involve working with Amazon SageMaker, applying GenAI models, and ensuring production-ready quality with a proper cloud or on-prem application pipeline.

Roles & Responsibilities:
- Design and develop machine learning models using Amazon SageMaker and other relevant tools.
- Collaborate with cross-functional teams to integrate AI solutions into existing systems and applications.
- Ensure production-ready quality of AI applications with a proper cloud or on-prem application pipeline.
- Apply GenAI models as part of the solution, including deep learning, neural networks, chatbots, and image processing.
- Stay updated with the latest advancements in AI and machine learning technologies, integrating innovative approaches for sustained competitive advantage.

Professional & Technical Skills:
- Must-have: Expertise in Amazon SageMaker.
- Good-to-have: Experience with deep learning, neural networks, chatbots, and image processing.
- Strong understanding of machine learning algorithms and techniques.
- Experience with cloud-based AI services and tools.
- Experience with production-ready quality assurance processes.
- Solid grasp of programming languages such as Python, Java, or C++.
Posted 1 month ago
8.0 - 12.0 years
20 - 22 Lacs
Pune, Bengaluru, Delhi / NCR
Work from Office
Develop and deploy ML models using SageMaker. Automate data pipelines and training processes. Monitor and optimize model performance. Ensure model governance and reproducibility.
Posted 1 month ago
10.0 - 12.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Job Category: Sales

Job Details: About Salesforce. We're Salesforce, the Customer Company, inspiring the future of business with AI + Data + CRM. Leading with our core values, we help companies across every industry blaze new trails and connect with customers in a whole new way. And, we empower you to be a Trailblazer, too - driving your performance and career growth, charting new paths, and improving the state of the world. If you believe in business as the greatest platform for change and in companies doing well and doing good - you've come to the right place.

Overview of the Role: We have an outstanding opportunity for an expert AI and Data Cloud Solutions Engineer to work with our trailblazing customers in crafting ground-breaking customer engagement roadmaps demonstrating the Salesforce applications and platform across the machine learning and LLM/GPT domains in India. The successful applicant will have a track record in driving business outcomes through technology solutions, with experience in engaging at the C-level with business and technology groups.

Responsibilities:
- Primary pre-sales technical authority for all aspects of AI usage within the Salesforce product portfolio - existing Einstein ML-based capabilities and new (2023) generative AI.
- Majority of time (60%+) will be customer/external facing.
- Evangelisation of Salesforce AI capabilities.
- Assessing customer requirements and use cases and aligning these to Salesforce capabilities.
- Solution proposals, working with Architects and wider Solution Engineer (SE) teams.

Preferred Qualifications:
- Expertise in an AI-related subject (ML, deep learning, NLP, etc.).
- Familiar with technologies such as OpenAI, Google Vertex, Amazon SageMaker, Snowflake, Databricks, etc.

Required Qualifications: Experience will be evaluated based on the core proficiencies of the role.
- 4+ years working directly in the commercial technology space with AI products and solutions.
- Data knowledge - data science, data lakes and warehouses, ETL, ELT, data quality.
- AI knowledge - application of algorithms and models to solve business problems (ML, LLMs, GPT).
- 10+ years working in a sales, pre-sales, consulting or related function in a commercial software company.
- Strong focus and experience in pre-sales or implementation is required.
- Experience in demonstrating customer engagement solutions, understanding and driving use cases and customer journeys, and the ability to draw a "day in the life of" across different LOBs.
- Business analysis / business case / return-on-investment construction.
- Demonstrable experience in presenting and communicating complex concepts to large audiences.
- A broad understanding of and ability to articulate the benefits of CRM, Sales, Service and Marketing Cloud offerings.
- Strong verbal and written communication skills with a focus on needs analysis, positioning, business justification, and closing techniques.
- Continuous learning demeanour with a demonstrated history of self-enablement and advancement in both technology and behavioural areas.
- Building reference models/ideas/approaches for inclusion of GPT-based products within wider Salesforce solution architectures, especially involving Data Cloud.
- Alignment with customer security and privacy teams on trust capabilities and values of our solution(s).
- Presenting at multiple customer events, from single-account sessions through to major strategic events (World Tour, Dreamforce).
- Representing Salesforce at other events (subject to PM approval).
- Sales and SE organisation education and enablement, e.g.
roadmap - all roles across all product areas.
- Bridge/primary contact point to product management.
- Provide thought leadership in how large enterprise organisations can drive customer success through digital transformation.
- Ability to uncover the challenges and issues a business is facing by running successful and targeted discovery sessions and workshops.
- Be an innovator who can build new solutions using out-of-the-box thinking.
- Demonstrate business value of our AI solutions using solution presentations, demonstrations and prototypes.
- Build roadmaps that clearly articulate how partners can implement and accept solutions to move from current to future state.
- Deliver functional and technical responses to RFPs/RFIs.
- Work as an excellent teammate by contributing, learning and sharing new knowledge.
- Demonstrate a conceptual knowledge of how to integrate cloud applications with existing business applications and technology.
- Lead multiple customer engagements concurrently.
- Be self-motivated, flexible, and take initiative.

Accommodations: If you require assistance due to a disability applying for open positions please submit a request via this .

Posting Statement: Salesforce is an equal opportunity employer and maintains a policy of non-discrimination with all employees and applicants for employment. What does that mean exactly? It means that at Salesforce, we believe in equality for all. And we believe we can lead the path to equality in part by creating a workplace that's inclusive, and free from discrimination. Any employee or potential employee will be assessed on the basis of merit, competence and qualifications - without regard to race, religion, color, national origin, sex, sexual orientation, gender expression or identity, transgender status, age, disability, veteran or marital status, political viewpoint, or other classifications protected by law. This policy applies to current and prospective employees, no matter where they are in their Salesforce employment journey. It also applies to recruiting, hiring, job assignment, compensation, promotion, benefits, training, assessment of job performance, discipline, termination, and everything in between. Recruiting, hiring, and promotion decisions at Salesforce are fair and based on merit. The same goes for compensation, benefits, promotions, transfers, reduction in workforce, recall, training, and education.
Posted 1 month ago
4.0 - 8.0 years
10 - 20 Lacs
Bengaluru
Remote
Job Summary: We are looking for a skilled AI/ML Developer to join our team, responsible for integrating Amazon Connect with Amazon Lex and Salesforce. The ideal candidate will leverage Amazon Lex to understand and respond to customer queries (such as booking a rental showing), call Salesforce for data processing, and integrate the response back into the customer conversation seamlessly within Amazon Connect. This is a key role for automating and enhancing customer interactions using AI-driven technologies and cloud services.

Key Responsibilities:

Integration Development & Solution Design:
- Design and develop intelligent customer service workflows using Amazon Connect and Amazon Lex to handle customer requests, such as booking rental showings, by processing customer inputs (e.g., voice commands) and querying Salesforce for information.
- Build and maintain Amazon Connect contact flows that integrate seamlessly with Amazon Lex to automate customer interactions, such as understanding requests and responding in a human-like manner.
- Integrate Salesforce with Amazon Lex via AWS Lambda to retrieve relevant data, such as availability for rental showings or property details, and pass that information back to Amazon Lex for dynamic responses.

Bot Development and AI Model Management:
- Leverage Amazon Lex to design and implement conversational AI models that can handle complex interactions, including NLP (Natural Language Processing) to understand customer intent.
- Ensure the Lex bot can query Salesforce for available data, using Lambda functions for API calls to Salesforce.
- Implement and optimize AI models to respond to customers in a human-like manner, ensuring natural conversation flow from the bot.

AWS Lambda & Salesforce Integration:
- Develop AWS Lambda functions to facilitate communication between Amazon Lex and Salesforce, ensuring seamless data flow. For example, when a customer asks for details, Lex will query Salesforce for available data and then provide a confirmation to the user.
- Ensure that Lambda functions are designed to handle real-time API requests to Salesforce, extract relevant data, and return that data to Amazon Lex for appropriate responses (a minimal sketch of such a handler follows the preferred skills list below).

Qualifications:
- Bachelor's or Master's degree in Computer Science, Artificial Intelligence, Data Science, or a related field.
- Minimum of 5+ years of experience in AI/ML development with a strong focus on AWS technologies, especially Amazon Connect, Lambda, and Bedrock.
- Hands-on experience with AWS services like Amazon Lex, Amazon Polly, Amazon Transcribe, Amazon Rekognition, and Amazon SageMaker.
- Proven experience working with Amazon Bedrock for deploying machine learning models at scale.
- Strong knowledge of serverless architecture and AWS Lambda to process real-time events and data streams.
- Experience with NLP (Natural Language Processing) and speech-to-text technologies.
- Familiarity with AI/ML frameworks such as TensorFlow, PyTorch, or Scikit-learn.
- Knowledge of data pipelines, ETL processes, and data integration techniques for AI/ML applications.
- Strong experience with REST APIs, AWS SDKs, and integrating third-party services into the AWS ecosystem.
- Understanding of Amazon Connect architecture and contact flow customization.
- Proficiency in Python, JavaScript, or Node.js for implementing AI/ML models and Lambda functions.
Preferred Skills:
- AWS Certified Cloud Practitioner (Mandatory)
- AWS Certified Developer Associate (Mandatory)
- AWS Certified Machine Learning - Specialty or similar AWS certifications (Good to have)
- AWS Certified Solutions Architect Associate (Good to have)
- Experience with AWS Glue, Amazon SageMaker, and Amazon Kendra for advanced AI/ML integrations.
- Experience with chatbot development and voice AI systems using Amazon Lex and Amazon Polly.
- Strong problem-solving skills, with the ability to troubleshoot and optimize complex systems.
- Excellent communication and team collaboration skills, with the ability to present complex technical information clearly.
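As referenced above, here is a hedged sketch of an AWS Lambda fulfillment handler that lets an Amazon Lex V2 bot look up a showing time in Salesforce; the slot, object, and field names and the environment-variable credentials are illustrative assumptions, and a real integration would add validation and error handling.

```python
# Illustrative Lex V2 fulfillment Lambda that queries Salesforce for a showing slot.
# Slot name, Salesforce object/fields, and env-var credentials are hypothetical.
import os
from simple_salesforce import Salesforce  # third-party Salesforce REST client

def lambda_handler(event, context):
    slots = event["sessionState"]["intent"]["slots"]
    property_id = slots["PropertyId"]["value"]["interpretedValue"]  # hypothetical slot

    sf = Salesforce(
        username=os.environ["SF_USER"],
        password=os.environ["SF_PASSWORD"],
        security_token=os.environ["SF_TOKEN"],
    )
    # Hypothetical custom object holding available showing times.
    result = sf.query(
        f"SELECT Slot_Time__c FROM Showing__c WHERE Property__c = '{property_id}' LIMIT 1"
    )
    records = result.get("records", [])
    answer = records[0]["Slot_Time__c"] if records else "no open slots"

    # Lex V2 expects the fulfilled intent state and a plain-text message back.
    return {
        "sessionState": {
            "dialogAction": {"type": "Close"},
            "intent": {"name": event["sessionState"]["intent"]["name"], "state": "Fulfilled"},
        },
        "messages": [{"contentType": "PlainText", "content": f"The next available showing is {answer}."}],
    }
```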
Posted 1 month ago
5.0 - 7.0 years
27 - 30 Lacs
Bengaluru
Work from Office
Principal Developer - ML/Prompt Engineer

Technologies: Amazon Bedrock, RAG models, Java, Python, C or C++, AWS Lambda

Responsibilities:
- Responsible for developing, deploying, and maintaining a Retrieval-Augmented Generation (RAG) model in Amazon Bedrock, our cloud-based platform for building and scaling generative AI applications.
- Design and implement a RAG model that can generate natural language responses, commands, and actions based on user queries and context, using the Anthropic Claude model as the backbone.
- Integrate the RAG model with Amazon Bedrock, our platform that offers a choice of high-performing foundation models from leading AI companies and Amazon via a single API, along with a broad set of capabilities to build generative AI applications with security, privacy, and responsible AI.
- Optimize the RAG model for performance, scalability, and reliability, using best practices and robust engineering methodologies.
- Design, test, and optimize prompts to improve performance, accuracy, and alignment of large language models across diverse use cases.
- Develop and maintain reusable prompt templates, chains, and libraries to support scalable and consistent GenAI applications.

Skills/Qualifications:
- Experience in programming with at least one software language, such as Java, Python, or C/C++.
- Experience in working with generative AI tools, models, and frameworks, such as Anthropic, OpenAI, Hugging Face, TensorFlow, PyTorch, or Jupyter.
- Experience in working with RAG models or similar architectures, such as RAG, Ragna, or Pinecone.
- Experience in working with Amazon Bedrock or similar platforms, such as AWS Lambda, Amazon SageMaker, or Amazon Comprehend.
- Ability to design, iterate, and optimize prompts for various LLM use cases (e.g., summarization, classification, translation, Q&A, and agent workflows).
- Deep understanding of prompt engineering techniques (zero-shot, few-shot, chain-of-thought, etc.) and their effect on model behavior.
- Familiarity with prompt evaluation strategies, including manual review, automatic metrics, and A/B testing frameworks.
- Experience building prompt libraries, reusable templates, and structured prompt workflows for scalable GenAI applications.
- Ability to debug and refine prompts to improve accuracy, safety, and alignment with business objectives.
- Awareness of prompt injection risks and experience implementing mitigation strategies.
- Familiarity with prompt tuning, parameter-efficient fine-tuning (PEFT), and prompt chaining methods.
- Familiarity with continuous deployment and DevOps tools preferred.
- Experience with Git preferred.
- Experience working in agile/scrum environments.
- Successful track record interfacing and communicating effectively across cross-functional teams.
- Good communication, analytical and presentation skills, problem-solving skills, and a learning attitude.

Mandatory Key Skills: Amazon Bedrock, RAG models, Java, Python, C++, AWS Lambda, Anthropic, OpenAI, Hugging Face, TensorFlow, PyTorch, Jupyter, Amazon SageMaker, DevOps tools, Prompt Engineering
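For context on the RAG responsibilities above, here is a hedged sketch of the generation step against an Anthropic Claude model on Amazon Bedrock via boto3; the model ID and the hard-coded "retrieved" passage are assumptions standing in for a real retrieval layer.

```python
# Illustrative Bedrock call for the generation step of a RAG flow with Anthropic Claude.
# The model ID and retrieved_context are assumptions; retrieval would normally query a vector store.
import json
import boto3

bedrock = boto3.client("bedrock-runtime")
retrieved_context = "Product X supports deployments in regions A and B."  # stand-in for retrieval output

body = {
    "anthropic_version": "bedrock-2023-05-31",
    "max_tokens": 300,
    "messages": [{
        "role": "user",
        "content": f"Answer using only this context:\n{retrieved_context}\n\nQuestion: Where can Product X be deployed?",
    }],
}
response = bedrock.invoke_model(
    modelId="anthropic.claude-3-sonnet-20240229-v1:0",  # assumed model ID
    body=json.dumps(body),
)
print(json.loads(response["body"].read())["content"][0]["text"])
```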
Posted 2 months ago
7.0 - 14.0 years
7 - 13 Lacs
Bengaluru / Bangalore, Karnataka, India
On-site
Role Responsibilities:
- Build and deploy ML models using Amazon SageMaker
- Integrate AI/ML solutions with cloud/on-prem pipelines
- Apply GenAI techniques like neural networks and image processing
- Ensure production-grade quality in AI solutions

Job Requirements:
- Proven expertise in Amazon SageMaker and ML frameworks
- Strong programming skills in Python (Java/C++ a plus)
- Experience with GenAI tools and AI cloud services
- Understanding of model lifecycle, from training to deployment
Posted 2 months ago
11 - 20 years
20 - 30 Lacs
Jaipur
Work from Office
Responsibilities:

1. Strategic Leadership:
a. Define and drive the overall MLOps strategy and roadmap for the organization, aligning it with business objectives and technical capabilities.
b. Oversee the design, development, and implementation of MLOps platforms, frameworks, and processes.
c. Foster a culture of innovation and continuous improvement within the MLOps team.

2. Technical Architecture:
a. Design and implement scalable, reliable, and efficient MLOps architectures.
b. Select and integrate appropriate tools, technologies, and frameworks to support the ML lifecycle.
c. Ensure compliance with industry best practices and standards for MLOps.

3. Team Management:
a. Lead and mentor a team of MLOps engineers and architects.
b. Foster collaboration and knowledge sharing among team members.
c. Provide technical guidance and support to data scientists and engineers.

4. Innovation and Research:
a. Stay up to date with emerging MLOps trends and technologies.
b. Research and evaluate new tools and techniques to enhance MLOps capabilities.
c. Contribute to the development of innovative MLOps solutions.

Minimum Required Skills:
- 11+ years of experience preferred.
- Proven track record of designing and implementing large-scale ML pipelines and infrastructure.
- Experience with distributed computing frameworks (Spark, Hadoop).
- Knowledge of graph databases and AutoML libraries.
- Bachelor's/Master's degree in computer science, analytics, mathematics, or statistics.
- Strong experience in Python and SQL.
- Solid understanding and knowledge of containerization technologies (Docker, Kubernetes).
- Proficient in CI/CD pipelines, model monitoring, and MLOps platforms (Kubeflow, MLflow).
- Proficiency in cloud platforms, containerization, and ML frameworks (TensorFlow, PyTorch).
- Certifications in cloud platforms or ML technologies are a plus.
- Extensive experience with cloud platforms (AWS, GCP, Azure) and containerization technologies (Docker, Kubernetes).
- Strong problem-solving and analytical skills.
- Ability to plan, execute, and take ownership of tasks.

Keywords: MLOps Architect - Azure DevOps - Docker - Kubernetes - TensorFlow - MLflow - Pipeline - Machine Learning Platform Engineer - Data Science Platform Engineer - DevOps Engineer (with ML focus) - AI Engineer - Data Engineer - Cloud Engineer (with ML focus) - Software Engineer (with ML focus) - Model Deployment Specialist - MLOps Architect - CI/CD - PyTorch - Scikit-learn - Cloud Computing - Big Data - Azure - Azure Machine Learning - GCP - Vertex AI - AWS - Amazon SageMaker
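To ground the MLOps platform skills listed above (model monitoring, MLflow), here is a small, illustrative MLflow tracking sketch; the experiment name and toy model are assumptions, and production pipelines would log far more.

```python
# Illustrative MLflow tracking run: log a parameter, a metric, and the model artifact.
# Experiment name and toy dataset/model are hypothetical.
import mlflow
import mlflow.sklearn
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=42)

mlflow.set_experiment("mlops-demo")                  # hypothetical experiment name
with mlflow.start_run():
    model = RandomForestClassifier(n_estimators=100).fit(X_tr, y_tr)
    acc = accuracy_score(y_te, model.predict(X_te))
    mlflow.log_param("n_estimators", 100)
    mlflow.log_metric("accuracy", acc)
    mlflow.sklearn.log_model(model, "model")         # versioned artifact for later deployment
```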
Posted 2 months ago