3.0 - 5.0 years
8 - 10 Lacs
Chennai
Work from Office
Dear Candidate, we are hiring for an Oracle EBS Technical Consultant, Chennai location. Interested candidates can share their profile to ponsubhasri.narasingam@doyensys.com
Job Requirement
Project Role: Develop PL/SQL code and RICE components from scratch. Debug, fix, and enhance existing processes; provide technical solutions for any requirement or issue; provide usual production support.
Project Role Description: Oracle Apps Technical Consultant
Work Experience: 3-5 years relevant
Work Location: Chennai
Technical Expertise - Must Have Skills:
- Hands-on interface/conversion experience.
- Considerable experience in RDF and XML Publisher report development.
- Good exposure to key modules such as AP/AR/GL/PO/OM/INV.
- Strong experience in developing SQL and PL/SQL scripts.
Good to Have Skills:
- Development life cycle (Dev, UAT, and PROD migration) experience.
- Release management, production support processes, ticketing tools, and documentation.
Key Responsibilities: Same as the project role.
Professional Attributes: Good oral and email communication.
Educational Qualification: Not specific.
Behavioral Attributes: Good attitude, discipline, and commitment to work.
Posted 1 day ago
10.0 - 15.0 years
11 - 15 Lacs
Bengaluru
Work from Office
What You'll Do
Whether you're onsite or sharing your expertise via the cloud, you'll deliver top-class support and inspire customer loyalty. As an HCM Technical Consultant, you will be a versatile, hard-working consultant working independently with Oracle Cloud technologies in a fast-paced environment, aligned with Oracle methodologies and practices. This is an intermediate consulting position operating independently, with some assistance and mentorship, to provide quality work products to a project team or customer that align with Oracle methodologies and practices. You will perform standard duties and tasks with some variation to implement Oracle products and technology to meet customer specifications. Standard assignments are accomplished without assistance by exercising independent judgment, within defined policies and processes, to deliver functional and technical solutions on moderately complex customer engagements.
Required Skills/Experience - What You'll Bring
You have that rare combination: a sharp technical brain and a head for business. You'll use this to help customers achieve real-world success with our products. We also look for:
- Comprehensive experience in analysis, design, testing, and implementation of business systems involving Oracle HRMS Applications R12.
- Expert knowledge in Payroll integration, Core HR integration, HRMS data conversions, custom development, customization, extension, and personalization.
- 8-10 years of professional experience in Oracle E-Business Suite: Oracle Core HR, Oracle Payroll, PMS, OLM, SSHR, OTL, PL/SQL, Fast Formula, Oracle Workflow, XML/RDF reports and Oracle interfaces, Absence Management, Talent Management.
- Experience with XML/RDF reports and PL/SQL interfaces into the HRMS system; payroll processing, prepayment, costing, and transfer to GL.
- Programming experience in creating procedures, functions, packages, and other database objects using SQL and PL/SQL.
- Capacity to work as an individual contributor; hands-on experience is a critical requirement.
- Superb communication skills, written and verbal, are mandatory.
- Good interpersonal skills with the ability to build rapport with all collaborators.
- Ability to articulate ideas and solutions in a clear and concise manner.
- Self-motivated with a lot of energy and drive; ability and willingness to learn.
- A good teammate with good analytical skills.
Posted 1 day ago
6.0 - 10.0 years
10 - 15 Lacs
Bengaluru
Work from Office
What You'll Do
Whether you're onsite or sharing your expertise via the cloud, you'll deliver top-class support and inspire customer loyalty. As an HCM Technical Consultant, you will be a versatile, hard-working consultant working independently with Oracle Cloud technologies in a fast-paced environment, aligned with Oracle methodologies and practices. This is an intermediate consulting position operating independently, with some assistance and mentorship, to provide quality work products to a project team or customer that align with Oracle methodologies and practices. You will perform standard duties and tasks with some variation to implement Oracle products and technology to meet customer specifications. Standard assignments are accomplished without assistance by exercising independent judgment, within defined policies and processes, to deliver functional and technical solutions on moderately complex customer engagements.
Required Skills/Experience - What You'll Bring
You have that rare combination: a sharp technical brain and a head for business. You'll use this to help customers achieve real-world success with our products. We also look for:
- Comprehensive experience in analysis, design, testing, and implementation of business systems involving Oracle HRMS Applications R12.
- Expert knowledge in Payroll integration, Core HR integration, HRMS data conversions, custom development, customization, extension, and personalization.
- 8-10 years of professional experience in Oracle E-Business Suite: Oracle Core HR, Oracle Payroll, PMS, OLM, SSHR, OTL, PL/SQL, Fast Formula, Oracle Workflow, XML/RDF reports and Oracle interfaces, Absence Management, Talent Management.
- Experience with XML/RDF reports and PL/SQL interfaces into the HRMS system; payroll processing, prepayment, costing, and transfer to GL.
- Programming experience in creating procedures, functions, packages, and other database objects using SQL and PL/SQL.
- Capacity to work as an individual contributor; hands-on experience is a critical requirement.
- Superb communication skills, written and verbal, are mandatory.
- Good interpersonal skills with the ability to build rapport with all collaborators.
- Ability to articulate ideas and solutions in a clear and concise manner.
- Self-motivated with a lot of energy and drive; ability and willingness to learn.
- A good teammate with good analytical skills.
Posted 1 day ago
4.0 - 8.0 years
8 - 17 Lacs
Hyderabad
Work from Office
Job Title: Python Backend Developer
Location: Hyderabad
Employment Type: Full-time
Job Summary
We are seeking an experienced Python Backend Developer to design, develop, and maintain scalable backend services. The role involves building high-performance data pipelines, deploying applications on cloud infrastructure (AWS/EKS), integrating AI/Generative AI models, and working with GraphDB for advanced data querying and analytics.
Key Responsibilities
- Backend Development: Design, develop, and maintain Python-based backend services for real-time and batch data processing.
- Cloud Infrastructure: Deploy and manage containerized applications on AWS (primarily EKS), ensuring scalability, high availability, and fault tolerance.
- Data Processing: Build efficient and scalable workflows for large datasets, covering both batch and real-time processing.
- AI/Generative AI Integration: Collaborate with data scientists and AI engineers to integrate AI/Generative AI models into backend pipelines.
- GraphDB Expertise: Implement and query complex graph databases (RDF/SPARQL) to support semantic querying and advanced analytics.
- Performance Optimization: Monitor, profile, and optimize backend systems for low-latency, high-throughput performance.
- Code Quality & Documentation: Write clean, maintainable, and well-documented code; participate in code reviews and unit testing.
- Collaboration: Work closely with front-end developers, data scientists, and cross-functional teams to deliver end-to-end solutions.
Required Qualifications & Experience
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 4+ years of experience in backend development with Python.
- Strong hands-on experience with AWS (EKS, containerized apps).
- Proven expertise in building and optimizing data pipelines (batch and real-time).
- Knowledge of AI/Generative AI model integration.
- Experience with GraphDB (RDF/SPARQL) preferred.
- Strong problem-solving, debugging, and optimization skills.
Preferred Skills & Attributes
- Familiarity with microservices architecture and CI/CD pipelines.
- Excellent communication and collaboration skills.
- Ability to thrive in a fast-paced, innovative environment.
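For illustration, here is a minimal sketch of the RDF/SPARQL querying this role describes, using the SPARQLWrapper library against a GraphDB-style repository; the endpoint URL, repository name, and entity IRI are assumptions, not part of the posting.

```python
# Minimal sketch: query a SPARQL endpoint (e.g. a GraphDB repository) for
# the neighbours of an entity. Endpoint and IRIs below are hypothetical.
from SPARQLWrapper import SPARQLWrapper, JSON

ENDPOINT = "http://localhost:7200/repositories/demo"  # hypothetical repository

def find_related_entities(entity_iri: str, limit: int = 10) -> list[dict]:
    """Return predicate/neighbour pairs linking the entity to the graph."""
    sparql = SPARQLWrapper(ENDPOINT)
    sparql.setQuery(f"""
        SELECT ?predicate ?neighbour
        WHERE {{ <{entity_iri}> ?predicate ?neighbour . }}
        LIMIT {limit}
    """)
    sparql.setReturnFormat(JSON)
    rows = sparql.query().convert()["results"]["bindings"]
    return [
        {"predicate": r["predicate"]["value"], "neighbour": r["neighbour"]["value"]}
        for r in rows
    ]

if __name__ == "__main__":
    for row in find_related_entities("http://example.org/product/42"):
        print(row["predicate"], "->", row["neighbour"])
```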
Posted 6 days ago
5.0 - 10.0 years
9 - 13 Lacs
Bengaluru
Work from Office
About The Role
Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must have skills: Data Modeling Techniques and Methodologies
Good to have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary
Experience: Overall IT experience - 9+ years; Data Modeling - 5+ years; Data Vault Modeling - 3+ years.
Key Responsibilities:
- Drive discussions with client deal teams to understand business requirements and how the data model fits into implementation and solutioning.
- Develop the solution blueprint, scoping, and estimation for delivery projects and solutioning.
- Drive discovery activities and design workshops with the client; lead strategic road-mapping and operating model design discussions.
- Design and develop Data Vault 2.0-compliant models, including Hubs, Links, and Satellites.
- Design and develop the Raw Data Vault and Business Data Vault.
- Translate business requirements into conceptual, logical, and physical data models.
- Work with source system analysts to understand data structures and lineage.
- Ensure conformance to data modeling standards and best practices.
- Collaborate with ETL/ELT developers to implement data models in a modern data warehouse environment (e.g., Snowflake, Databricks, Redshift, BigQuery).
- Optimize data architecture for performance, scalability, and reliability.
- Document models, data definitions, and metadata.
- Support data governance, quality, and master data management initiatives.
- Participate in code reviews, modeling workshops, and agile ceremonies (if applicable).
Technical Experience
Must Have Skills:
- 9+ years of overall IT experience, 5+ years in Data Modeling, and 3+ years in Data Vault Modeling.
- Design and development of the Raw Data Vault and Business Data Vault.
- Strong understanding of Data Vault 2.0 methodology, including business keys, record tracking, and historical tracking.
- Data modeling experience in Dimensional Modeling/3NF modeling.
- Hands-on experience with data modeling tools (e.g., ER/Studio, ERwin, or similar).
- Hands-on experience with a Data Vault automation tool (e.g., VaultSpeed, WhereScape, biGENIUS-X, dbt, or similar).
- Solid understanding of ETL/ELT processes, data integration, and warehousing concepts.
- Experience with modern cloud data platforms (e.g., Snowflake, Databricks, Azure Synapse, AWS Redshift, or Google BigQuery).
- Familiarity with Data Architecture Principles.
- Excellent SQL skills.
Good to Have Skills:
- Any of these add-on skills: Graph Database Modeling, RDF, Document DB Modeling, Ontology, Semantic Data Modeling.
- Understanding of the Data Analytics on Cloud landscape and Data Lake design knowledge.
- Cloud Data Engineering, Cloud Data Integration.
Professional Experience:
- Strong requirement analysis and technical solutioning skills in Data and Analytics.
- Excellent writing, communication, and presentation skills.
- Eagerness to learn new skills and develop oneself on an ongoing basis.
- Good client-facing and interpersonal skills.
Educational Qualification: B.E. or B.Tech is a must; 15 years full time education.
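For context on the Data Vault 2.0 structures named above, here is an illustrative sketch of a Hub and a Satellite plus the usual hash-key convention; table and column names are hypothetical, and MD5 over a normalized business key is one common choice, not a prescription.

```python
# Illustrative Data Vault 2.0 structures (Hub + Satellite) and a hash key.
# Table/column names are hypothetical; the DDL targets ANSI SQL.
import hashlib

HUB_CUSTOMER_DDL = """
CREATE TABLE hub_customer (
    customer_hk   CHAR(32)     NOT NULL PRIMARY KEY, -- hash of the business key
    customer_bk   VARCHAR(50)  NOT NULL,             -- source business key
    load_dts      TIMESTAMP    NOT NULL,
    record_source VARCHAR(100) NOT NULL
);
"""

SAT_CUSTOMER_DDL = """
CREATE TABLE sat_customer_details (
    customer_hk   CHAR(32)     NOT NULL,  -- references hub_customer
    load_dts      TIMESTAMP    NOT NULL,
    hash_diff     CHAR(32)     NOT NULL,  -- detects attribute-level changes
    customer_name VARCHAR(200),
    country       VARCHAR(50),
    record_source VARCHAR(100) NOT NULL,
    PRIMARY KEY (customer_hk, load_dts)   -- full history, no in-place updates
);
"""

def hash_key(business_key: str) -> str:
    """Deterministic surrogate key over a normalized business key."""
    return hashlib.md5(business_key.strip().upper().encode("utf-8")).hexdigest()

if __name__ == "__main__":
    # In practice the DDL strings are executed through your warehouse client.
    print(HUB_CUSTOMER_DDL, SAT_CUSTOMER_DDL, sep="\n")
    print("hash key for CUST-0001:", hash_key("CUST-0001"))
```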
Posted 1 week ago
2.0 - 5.0 years
9 - 13 Lacs
Bengaluru
Work from Office
About The Role
Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must have skills: Data Modeling Techniques and Methodologies
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary
As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while also engaging in discussions to refine and enhance the overall data architecture. You will be involved in various stages of the data platform lifecycle, ensuring that all components work harmoniously to support the organization's data needs and objectives.
Key Responsibilities:
- Knowledge modeling and ontology design; knowledge of domain-specific standards.
- Develop ontologies and taxonomies using open-source knowledge graph tools and build sophisticated knowledge graphs for structured data representation that align with domain standards.
- Build and refine knowledge graphs for structured data representation, aligning with domain standards.
- Design scalable architectures for linked data and semantic models, integrating data from multiple sources.
- Design metadata schemas and a common data vocabulary, leveraging RDF/OWL to enhance data accessibility.
- Use tools such as Protégé or TopBraid Composer to define and manage ontology structures.
- Develop data models that support semantic search, data extraction, and AI-driven recommendations.
Technical Experience
Must Have Skills:
- Minimum of 2 years in ontology development, knowledge modeling, and graph database management.
- Proficiency in RDF, OWL, SKOS, or SHACL.
- Proficiency in Protégé, TopBraid Composer, or another modeling tool.
- Familiarity with SPARQL or other graph query languages.
- Knowledge of knowledge graph platforms such as Neo4j, Dgraph, Stardog, TopBraid, ArangoDB, and Blazegraph.
- Ability to incorporate domain knowledge into semantic models for actionable business insights.
Good to Have Skills:
- Collaborate with AI/ML teams to implement natural language processing and contextual data retrieval using knowledge graphs.
- Enhance data discovery and search capabilities through graph-based search relevancy and knowledge representation.
- Experience with knowledge graph visualization tools such as Graphistry and Gephi.
- Programming skills in Python or Java to implement custom graph applications and integrations.
- Integrate graph database solutions for efficient data querying and management.
- Familiarity with open-source graph databases and their applications in real-time analytics.
Professional Experience:
- Good communication skills for conveying complex concepts to technical and non-technical stakeholders.
- Ability to work both independently and within a team setting, showing leadership in best practices.
- Proactive, innovative, and detail-oriented, with a strong focus on emerging technologies.
Educational Qualification:
- Bachelor's or Master's degree in Information Science, Data Science, Knowledge Management, or a related field.
- Certifications in ontology management or data standards are highly valued.
Qualification: 15 years full time education
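As a rough illustration of the ontology work this posting describes, the sketch below builds a tiny OWL class hierarchy with rdflib and queries it with SPARQL; the example namespace and class names are hypothetical.

```python
# Tiny ontology with rdflib, then a SPARQL query over the in-memory graph.
from rdflib import Graph, Literal, Namespace, RDF, RDFS
from rdflib.namespace import OWL

EX = Namespace("http://example.org/ontology#")  # hypothetical namespace
g = Graph()
g.bind("ex", EX)

# Declare two OWL classes and a subclass relation
g.add((EX.Equipment, RDF.type, OWL.Class))
g.add((EX.Pump, RDF.type, OWL.Class))
g.add((EX.Pump, RDFS.subClassOf, EX.Equipment))
g.add((EX.Pump, RDFS.label, Literal("Pump")))

# List all subclasses of Equipment together with their labels
query = """
SELECT ?cls ?label WHERE {
    ?cls rdfs:subClassOf ex:Equipment .
    OPTIONAL { ?cls rdfs:label ?label }
}
"""
for row in g.query(query, initNs={"ex": EX, "rdfs": RDFS}):
    print(row.cls, row.label)
```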
Posted 1 week ago
3.0 - 8.0 years
9 - 13 Lacs
Bengaluru
Work from Office
About The Role
Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must have skills: Data Modeling Techniques and Methodologies
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary
As a Data Modeler, you will be responsible for understanding business requirements and data mappings, creating and maintaining data models through different stages using data modeling tools, and handing over the physical design/DDL scripts to the data engineers for implementation. Your role involves creating and maintaining data models while ensuring the performance and quality of deliverables.
Experience: Overall IT experience - 7+ years; Data Modeling - 3+ years.
Key Responsibilities:
- Drive discussions with client teams to understand business requirements and develop data models that fit the requirements.
- Drive discovery activities and design workshops with the client and support design discussions.
- Create data modeling deliverables and obtain sign-off.
- Develop the solution blueprint and scoping; estimate delivery projects.
Technical Experience
Must Have Skills:
- 7+ years of overall IT experience with 3+ years in Data Modeling.
- Data modeling experience in Dimensional Modeling, 3NF modeling, or NoSQL DB modeling.
- Experience on at least one cloud DB design engagement.
- Conversant with modern data platforms.
- Work experience on data transformation and analytics projects; understanding of DWH.
- Instrumental in DB design through all stages of data modeling.
- Experience in at least one leading data modeling tool, e.g., Erwin, ER/Studio, or equivalent.
Good to Have Skills:
- Any of these add-on skills: Data Vault Modeling, Graph Database Modeling, RDF, Document DB Modeling, Ontology, Semantic Data Modeling.
- Understanding of the Data Analytics on Cloud landscape and Data Lake design knowledge.
- Cloud Data Engineering, Cloud Data Integration.
- Familiarity with Data Architecture Principles.
Professional Experience:
- Strong requirement analysis and technical solutioning skills in Data and Analytics.
- Excellent writing, communication, and presentation skills.
- Eagerness to learn new skills and develop oneself on an ongoing basis.
- Good client-facing and interpersonal skills.
Educational Qualification: B.E. or B.Tech is a must; 15 years full time education
Posted 1 week ago
3.0 - 8.0 years
9 - 13 Lacs
Bengaluru
Work from Office
About The Role
Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must have skills: Data Modeling Techniques and Methodologies
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary
As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while also engaging in discussions to refine and enhance the overall data architecture. You will be involved in various stages of the data platform lifecycle, ensuring that all components work harmoniously to support the organization's data needs and objectives.
Experience: Overall IT experience - 7+ years; Data Modeling - 3+ years; Data Vault Modeling - 2+ years.
Key Responsibilities:
- Drive discussions with client deal teams to understand business requirements and how the data model fits into implementation and solutioning.
- Develop the solution blueprint, scoping, and estimation for delivery projects and solutioning.
- Drive discovery activities and design workshops with the client; lead strategic road-mapping and operating model design discussions.
- Design and develop Data Vault 2.0-compliant models, including Hubs, Links, and Satellites.
- Design and develop the Raw Data Vault and Business Data Vault.
- Translate business requirements into conceptual, logical, and physical data models.
- Work with source system analysts to understand data structures and lineage.
- Ensure conformance to data modeling standards and best practices.
- Collaborate with ETL/ELT developers to implement data models in a modern data warehouse environment (e.g., Snowflake, Databricks, Redshift, BigQuery).
- Document models, data definitions, and metadata.
Technical Experience
Must Have Skills:
- 7+ years of overall IT experience, 3+ years in Data Modeling, and 2+ years in Data Vault Modeling.
- Design and development of the Raw Data Vault and Business Data Vault.
- Strong understanding of Data Vault 2.0 methodology, including business keys, record tracking, and historical tracking.
- Data modeling experience in Dimensional Modeling/3NF modeling.
- Hands-on experience with data modeling tools (e.g., ER/Studio, ERwin, or similar).
- Solid understanding of ETL/ELT processes, data integration, and warehousing concepts.
- Experience with modern cloud data platforms (e.g., Snowflake, Databricks, Azure Synapse, AWS Redshift, or Google BigQuery).
- Excellent SQL skills.
Good to Have Skills:
- Any of these add-on skills: Graph Database Modeling, RDF, Document DB Modeling, Ontology, Semantic Data Modeling.
- Hands-on experience with a Data Vault automation tool (e.g., VaultSpeed, WhereScape, biGENIUS-X, dbt, or similar).
- Understanding of the Data Analytics on Cloud landscape and Data Lake design knowledge.
- Cloud Data Engineering, Cloud Data Integration.
Professional Experience:
- Strong requirement analysis and technical solutioning skills in Data and Analytics.
- Excellent writing, communication, and presentation skills.
- Eagerness to learn new skills and develop oneself on an ongoing basis.
- Good client-facing and interpersonal skills.
Educational Qualification: B.E. or B.Tech is a must; 15 years full time education
Posted 1 week ago
8.0 - 13.0 years
12 - 16 Lacs
Hyderabad
Work from Office
Job Description Summary
The person in this role will be the technical team lead and the point of contact between the PM, the Architect, and the people leader. They will work closely with the Product Owner to break features down into detailed technical work chunks to be implemented by the team members, will oversee the detailed technical designs of the individual features, and will need to fully understand the Modeling ecosystem and where it fits in the GridOS context.
Roles and Responsibilities
- Serve as technical lead for the Modeling development team: single point of contact on technical development aspects for the Architect, PO, Scrum Master, and Team Manager; own onboarding and ramp-up processes for team members; own the efficiency and quality of the development process.
- Responsible for the quality of development in terms of software performance, code quality, test automation, code coverage, CI/CD, and documentation.
- Oversee the detailed technical designs of the individual features; provide high-level estimates of the different product features.
- Own technical deliverables during the entire lifecycle of the products; keep product development on track in terms of budget, time, and quality.
- Keep track of developments within the GridOS ecosystem and build bridges with other engineering and services teams.
- Interact with services teams and partner integrator teams to provide processes that ensure the best use of GridOS Modeling products and services.
- Communicate effectively, both verbally and in writing, with peers and team members as an inclusive team member.
- Serve as a technical leader and mentor on complex, integrated implementations within the GridOS Modeling product teams.
- Work in a self-directed fashion to proactively identify system problems, failures, and areas for improvement; track issue resolution, document implemented solutions, and create troubleshooting guides.
- Peer review pull requests.
Education Qualification
For roles outside the USA: Bachelor's degree in Computer Science or STEM majors (Science, Technology, Engineering, and Math) with significant experience. For roles in the USA: Bachelor's degree in Computer Science or STEM majors (Science, Technology, Engineering, and Math).
Desired Characteristics
Technical Expertise:
- Strong understanding of OOP concepts.
- Strong experience with Kubernetes, microservices architectures, and container technology.
- Strong expertise in Java and Python, Maven, and the Spring Boot framework.
- REST API (OpenAPI) and event design; GraphQL schema and service design.
- Graph technologies and frameworks: Apache Jena, Neo4j, GraphDB; experience with RDF and SPARQL.
- Unit and integration test design; CI/CD pipeline design.
- JSON and YAML schemas; event-driven architecture; data streaming technologies such as Apache Kafka.
- Microservice observability and metrics; integration skills.
- Autonomous and able to work asynchronously (due to time zone differences).
- Software and API documentation.
Good to have:
- Data engineering and data architecture expertise.
- Apache Camel and Apache Arrow.
- Experience in the grid or energy software business (AEMS, ADMS, energy markets, SCADA, GIS).
Business Acumen:
- Adept at navigating the organizational matrix: understands people's roles, can foresee obstacles, identify workarounds, leverage resources, and rally teammates.
- Understands how internal and/or external business models work and facilitates active customer engagement.
- Able to articulate the value of what is most important to the business/customer to achieve outcomes.
- Able to produce functional area information in sufficient detail for cross-functional teams to utilize, using presentation and storytelling concepts.
- Possesses extensive knowledge of the full solution catalog within a business unit and proficiency in discussing each area at an advanced level.
- Six Sigma Green Belt Certification or equivalent quality certification.
Leadership:
- Demonstrated working knowledge of the internal organization; foresees obstacles, identifies workarounds, leverages resources, and rallies teammates.
- Demonstrated ability to work with and/or lead blended teams, including third-party partners and customer personnel.
- Demonstrated change management and acceleration capabilities.
- Strong interpersonal skills, including creativity and curiosity, with the ability to effectively communicate and influence across all organizational levels.
- Proven analytical and problem-resolution skills.
- Ability to influence and build consensus with other Information Technology (IT) teams and leadership.
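As an illustration of the event-driven, Kafka-based patterns in the technical expertise list above, here is a minimal Python consumer sketch using the kafka-python package; the topic name, broker address, and message shape are assumptions.

```python
# Sketch: consume model-change events from Kafka. Topic, broker, and the
# payload structure shown are hypothetical stand-ins.
import json
from kafka import KafkaConsumer  # kafka-python package

consumer = KafkaConsumer(
    "grid-model-updates",                      # hypothetical topic
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    auto_offset_reset="earliest",
    group_id="modeling-service",
)

for message in consumer:
    event = message.value
    # e.g. {"asset_id": "...", "change": "updated"} -- assumed payload
    print(f"offset={message.offset} asset={event.get('asset_id')} "
          f"change={event.get('change')}")
```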
Posted 1 week ago
2.0 - 5.0 years
9 - 13 Lacs
Bengaluru
Work from Office
Role Description
We are seeking a Data Science Engineer to contribute to the development of intelligent, autonomous AI systems. The ideal candidate will have a strong background in agentic AI, LLMs, SLMs, vector databases, and knowledge graphs. This role involves deploying AI solutions that leverage Retrieval-Augmented Generation (RAG), multi-agent frameworks, and hybrid search techniques to enhance enterprise applications.
Your key responsibilities
- Design and develop agentic AI applications: utilise frameworks like LangChain, CrewAI, and AutoGen to build autonomous agents capable of complex task execution.
- Implement RAG pipelines: integrate LLMs with vector databases (e.g., Milvus, FAISS) and knowledge graphs (e.g., Neo4j) to create dynamic, context-aware retrieval systems.
- Fine-tune language models: customise LLMs (e.g., Gemini, ChatGPT, Llama) and SLMs (e.g., spaCy, NLTK) using domain-specific data to improve performance and relevance in specialised applications.
- NER models: train OCR- and NLP-leveraged models to parse domain-specific details from documents (e.g., DocAI, Azure AI DIS, AWS IDP).
- Develop knowledge graphs: construct and manage knowledge graphs to represent and query complex relationships within data, enhancing AI interpretability and reasoning.
- Collaborate cross-functionally: work with data engineers, ML researchers, and product teams to align AI solutions with business objectives and technical requirements.
- Optimise AI workflows: employ MLOps practices to ensure scalable, maintainable, and efficient AI model deployment and monitoring.
Your skills and experience
- 4+ years of professional experience in AI/ML development, with a focus on agentic AI systems.
- Proficient in Python, Python API frameworks, and SQL, and familiar with AI/ML frameworks such as TensorFlow or PyTorch.
- Experience in deploying AI models on cloud platforms (e.g., GCP, AWS).
- Experience with LLMs (e.g., GPT-4), SLMs (spaCy), and prompt engineering.
- Understanding of semantic technologies, ontologies, and RDF/SPARQL.
- Familiarity with MLOps tools and practices for continuous integration and deployment.
- Skilled in building and querying knowledge graphs using tools like Neo4j.
- Hands-on experience with vector databases and embedding techniques.
- Experience in developing AI solutions for specific industries such as healthcare, finance, or e-commerce.
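For illustration, a minimal sketch of the vector-retrieval half of a RAG pipeline using FAISS, one of the vector stores named above; random vectors stand in for real embeddings, and the embedding dimension is an assumption.

```python
# Sketch: nearest-neighbour retrieval over document embeddings with FAISS.
# Random vectors stand in for embeddings a real model would produce.
import numpy as np
import faiss

DIM = 384  # typical sentence-embedding size; an assumption here
docs = ["reset a password", "configure SSO", "rotate API keys"]

rng = np.random.default_rng(0)
doc_vecs = rng.random((len(docs), DIM), dtype=np.float32)

index = faiss.IndexFlatL2(DIM)  # exact L2 search, fine for small corpora
index.add(doc_vecs)

query_vec = rng.random((1, DIM), dtype=np.float32)
distances, ids = index.search(query_vec, 2)  # top-2 neighbours
for rank, (i, d) in enumerate(zip(ids[0], distances[0]), start=1):
    print(f"{rank}. {docs[i]} (distance={d:.3f})")
```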
Posted 1 week ago
2.0 - 7.0 years
15 - 19 Lacs
Pune
Work from Office
Project description
You'll be working in the Technology Services GCTO EA Team. Technology Services is a group-wide function that provides integrated and secure infrastructure services for the company by offering best-fit, easy-to-leverage, reliable, and cost-effective technology services and strategic products that provide functionality, strategic insights, and expertise across all business groups globally. As a Software Engineer focused on Microsoft 365 and Azure collaboration solutions, you will join an existing team in the EA Team in Group CTO supporting our strategic initiatives across the bank.
Responsibilities
- Help the Group Enterprise Architecture team develop our Azure-based applications and tooling.
- Design and develop custom collaboration tools and integrations using Microsoft 365 services (e.g., SharePoint, Teams, Outlook, OneDrive, Power Platform).
- Build and deploy cloud-native applications using Azure services such as Azure Functions, Logic Apps, Azure AD, and Azure DevOps.
- Create and maintain APIs and connectors to integrate Microsoft 365 with third-party systems.
- Collaborate with cross-functional teams including product managers, UX designers, and QA engineers to deliver high-quality solutions.
- Monitor system performance and troubleshoot issues to ensure optimal uptime and availability.
- Collaborate with development teams to optimize application performance and reliability.
- Produce clear, commented code and clear, comprehensive documentation.
- Play an active role with technology support teams and ensure deliverables are completed or escalated on time.
- Provide support on any related presentations, communications, and trainings.
Skills
Must have:
- Azure experience.
- Hands-on experience working with Kubernetes, Docker, and other infrastructure technologies.
- Experience developing collaboration tooling.
- Power Automate knowledge and integration with the M365 suite.
- Strong communication skills.
- 2+ years of experience in a Scrum delivery model.
- In-depth experience with Git, JIRA, and GitLab.
- Excellent end-to-end SDLC process understanding.
- Proven track record of delivering complex web apps on tight timelines.
- Understanding of the fundamental design principles behind a scalable application and familiarity with RESTful services.
- Fluent in English, both written and spoken.
- Passionate about development, with a focus on web architecture and design.
- Analytical and logical; a team player, comfortable working with a lead developer and architects for the program.
- An excellent communicator who is adept at handling ambiguity and communicating with both technical and non-technical audiences.
- Comfortable working in cross-functional global teams to effect change.
Nice to have:
- 2+ years of JavaScript and TypeScript experience.
- Power BI knowledge.
- Knowledge of testing and code-quality frameworks, including Fortify, SonarQube, Mocha, and Jest.
- Knowledge and experience of data science and common tools.
- Knowledge of RDF-based graph database solutions, e.g., GraphDB, Anzo.
- Programming in Python and familiarity with machine learning algorithms.
- Familiarity with data retrieval with SQL and Oracle.
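As a small illustration of the Azure development this role involves, here is a minimal HTTP-triggered Azure Function using the Python v2 programming model; the route, parameter, and payload are hypothetical, and a real handler might call Microsoft Graph for M365 data.

```python
# Minimal Azure Functions sketch (Python v2 programming model).
# Save as function_app.py in a Functions project; route/payload are made up.
import azure.functions as func

app = func.FunctionApp(http_auth_level=func.AuthLevel.FUNCTION)

@app.route(route="team-report")
def team_report(req: func.HttpRequest) -> func.HttpResponse:
    """Return a stub JSON payload for a requested team."""
    team = req.params.get("team", "unknown")
    return func.HttpResponse(
        f'{{"team": "{team}", "status": "ok"}}',
        mimetype="application/json",
    )
```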
Posted 1 week ago
3.0 - 7.0 years
8 - 13 Lacs
Bengaluru
Work from Office
Role Description
About Our Engineers
Our engineers work on a diverse range of solutions using cutting-edge technology every day, including our award-winning platforms like Autobahn, Fabric, Glue, and more. Our technology strategy is designed to build new revenue streams and develop innovative ideas that produce a new competitive advantage for the bank, while also fixing our foundations and focusing on the importance of stability and risk management. Corporate Bank Technology understands the client's needs and has a robust strategy and innovative approach to deliver.
About the Team: Cash Management Payment Orchestration
Cash Management Payment Orchestration has end-to-end responsibility for application development and management of the respective application portfolio. The portfolio covers the strategic payment processing build-out and core products that Corporate Bank offers to its international clients, such as DDA/cash accounts, core banking, and payments processing and clearing globally. It is also the global cash settlement platform for all other business lines. Deutsche Bank's Corporate Bank division is a leading provider of cash management, trade finance, and securities finance. We complete green-field projects that deliver the best Corporate Bank - Securities Services products in the world. Our team is diverse, international, and driven by a shared focus on clean code and valued delivery. At every level, agile minds are rewarded with competitive pay, support, and opportunities to excel.
You will work as part of a cross-functional agile delivery team. You will bring an innovative approach to software development, focusing on using the latest technologies and practices as part of a relentless focus on business value. You will be someone who sees engineering as a team activity, with a predisposition to open code, open discussion, and creating a supportive, collaborative environment. You will be ready to contribute to all stages of software delivery, from initial analysis right through to production support.
Your key responsibilities - What You'll Do
As part of our global team, you will work on various components as a Software Engineer. The engineer will be responsible for the DB aspects of the technical application framework that supports our HPE NonStop-based application db-Internet. We are building an excellent technical team to support our critical application, enhancing its current capabilities and looking to create opportunities beyond this to progress into the more modern aspects of our application.
- Product update and support of automation and monitoring tools such as Reflex and Multibatch; enhance monitoring capability to cover more aspects of our applications.
- Ongoing evolution of our disaster recovery strategy, planning, and supporting tools.
- Support and development of our automated test system build process.
- TACL coding and testing of routines to support our application.
- Upgrade activities such as MQ Series, operating system, and hardware upgrades.
- Performance and capacity management aspects of our application.
- Understanding of network segregation and firewalling (ACR); general TCP/IP configuration and encryption.
- Update and adherence to db-Internet security controls.
- Collaborate with teams and individuals across the applications to accomplish common goals.
- Work with the team on non-functional requirements, technical analysis, and design.
Your skills and experience - Skills You'll Need
- Good level of experience in the technical management of HPE NonStop and/or the application Atlas Global Banking/db-Internet.
- Good working knowledge of HPE NonStop products and utilities such as FUP, SQL, ENFORM, TACL, TMF/RDF, SCF, and Safeguard.
- Good working knowledge of OSS, its utilities and directory structures, including an understanding of our internal middleware called Ibus Bridge, its configuration and setup.
- Any knowledge of Java would be advantageous for the future.
- Proven ability to effectively assess and mitigate project risks and dependencies.
- Experienced in effectively communicating with and positively influencing project stakeholders and team members.
Posted 1 week ago
7.0 - 12.0 years
32 - 37 Lacs
Bengaluru
Work from Office
Role Description
We are seeking a seasoned Data Science Engineer to spearhead the development of intelligent, autonomous AI systems. The ideal candidate will have a robust background in agentic AI, LLMs, SLMs, vector databases, and knowledge graphs. This role involves designing and deploying AI solutions that leverage Retrieval-Augmented Generation (RAG), multi-agent frameworks, and hybrid search techniques to enhance enterprise applications.
Your key responsibilities
- Design and develop agentic AI applications: utilise frameworks like LangChain, CrewAI, and AutoGen to build autonomous agents capable of complex task execution.
- Implement RAG pipelines: integrate LLMs with vector databases (e.g., Milvus, FAISS) and knowledge graphs (e.g., Neo4j) to create dynamic, context-aware retrieval systems.
- Fine-tune language models: customise LLMs (e.g., Gemini, ChatGPT, Llama) and SLMs (e.g., spaCy, NLTK) using domain-specific data to improve performance and relevance in specialised applications.
- NER models: train OCR- and NLP-leveraged models to parse domain-specific details from documents (e.g., DocAI, Azure AI DIS, AWS IDP).
- Develop knowledge graphs: construct and manage knowledge graphs to represent and query complex relationships within data, enhancing AI interpretability and reasoning.
- Collaborate cross-functionally: work with data engineers, ML researchers, and product teams to align AI solutions with business objectives and technical requirements.
- Optimise AI workflows: employ MLOps practices to ensure scalable, maintainable, and efficient AI model deployment and monitoring.
Your skills and experience
- 13+ years of professional experience in AI/ML development, with a focus on agentic AI systems.
- Proficient in Python, Python API frameworks, and SQL, and familiar with AI/ML frameworks such as TensorFlow or PyTorch.
- Experience in deploying AI models on cloud platforms (e.g., GCP, AWS).
- Experience with LLMs (e.g., GPT-4), SLMs (spaCy), and prompt engineering.
- Understanding of semantic technologies, ontologies, and RDF/SPARQL.
- Familiarity with MLOps tools and practices for continuous integration and deployment.
- Skilled in building and querying knowledge graphs using tools like Neo4j.
- Hands-on experience with vector databases and embedding techniques.
- Familiarity with RAG architectures and hybrid search methodologies.
- Experience in developing AI solutions for specific industries such as healthcare, finance, or e-commerce.
- Strong problem-solving abilities and analytical thinking.
- Excellent communication skills for cross-functional collaboration.
- Ability to work independently and manage multiple projects simultaneously.
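To illustrate the knowledge-graph querying this role mentions, here is a minimal sketch using the official Neo4j Python driver; the connection details, credentials, and graph schema (Document/Topic nodes, MENTIONS relationship) are hypothetical.

```python
# Sketch: run a parameterized Cypher read query via the Neo4j Python driver.
# URI, credentials, labels, and relationship types below are hypothetical.
from neo4j import GraphDatabase

URI = "bolt://localhost:7687"
driver = GraphDatabase.driver(URI, auth=("neo4j", "password"))

def related_documents(tx, topic: str):
    query = (
        "MATCH (d:Document)-[:MENTIONS]->(t:Topic {name: $topic}) "
        "RETURN d.title AS title LIMIT 5"
    )
    return [record["title"] for record in tx.run(query, topic=topic)]

with driver.session() as session:
    for title in session.execute_read(related_documents, "payments"):
        print(title)
driver.close()
```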
Posted 1 week ago
2.0 - 6.0 years
9 - 13 Lacs
Pune
Work from Office
You'll make a difference by having:
- Strong proficiency in Java and Python.
- Proven experience working with RDF graphs, writing and optimizing SPARQL queries, and developing ontologies using OWL and SHACL.
- Solid understanding and practical experience with RDF reasoning, including rule-based inference, consistency checks, and the use of OWL reasoners.
- Demonstrated experience in designing and implementing robust RESTful APIs and interfaces.
- A strong foundation in software engineering best practices, including Git version control, clean code principles, unit testing, and active participation in code reviews.
- Proficiency in data modeling, particularly with UML class diagrams, and a strong eagerness to learn and apply OWL for ontology modeling.
- Excellent abstract thinking skills, with the ability to translate complex requirements into effective data models and semantic solutions.
- The ability to acquire and apply domain expertise, particularly in modeling templates for systems and equipment.
You'll win us over by:
- Holding a B.E./B.Tech/MCA/M.Tech/M.Sc degree with a good academic record.
- 2-6 years of demonstrable experience in Java and Python development.
- Familiarity with semantic web frameworks and libraries such as Apache Jena and rdflib.
- Hands-on experience with graph databases, specifically GraphDB.
- Knowledge of SHACL rules, performance tuning of shapes, and advanced reasoning techniques.
- Experience with Linked Data principles and formats, including JSON-LD creation and parsing.
This role, based in Pune, is an individual contributor position. You may be required to visit other locations within India and internationally. In return, you'll have the opportunity to work with teams shaping the future.
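As a brief illustration of the RDF/SHACL stack this posting centers on, the sketch below validates a small RDF graph against a SHACL shape using rdflib and pySHACL; the shapes and data are hypothetical.

```python
# Sketch: SHACL validation of an RDF graph with rdflib + pySHACL.
# The ex: namespace, shape, and data are hypothetical examples.
from rdflib import Graph
from pyshacl import validate

shapes = Graph().parse(data="""
@prefix sh:  <http://www.w3.org/ns/shacl#> .
@prefix ex:  <http://example.org/#> .
@prefix xsd: <http://www.w3.org/2001/XMLSchema#> .
ex:EquipmentShape a sh:NodeShape ;
    sh:targetClass ex:Equipment ;
    sh:property [ sh:path ex:serialNumber ;
                  sh:datatype xsd:string ;
                  sh:minCount 1 ] .
""", format="turtle")

data = Graph().parse(data="""
@prefix ex: <http://example.org/#> .
ex:pump1 a ex:Equipment .   # missing ex:serialNumber -> should not conform
""", format="turtle")

conforms, _, report_text = validate(data, shacl_graph=shapes)
print("Conforms:", conforms)
print(report_text)
```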
Posted 1 week ago
2.0 - 6.0 years
0 Lacs
Karnataka
On-site
You will be joining SAP, a company focused on enabling you to bring out your best and help the world run better. The company culture emphasizes collaboration and a shared passion for creating a workplace that values differences, embraces flexibility, and is aligned with purpose-driven and future-focused work. At SAP, you will experience a highly collaborative and caring team environment that prioritizes learning and development, acknowledges individual contributions, and provides a variety of benefit options to choose from.
As a Knowledge Engineer (f/m/d) for Enterprise Knowledge Graphs at SAP, you will have the opportunity to contribute to the development of Knowledge Graphs as a source of explicit knowledge across multiple SAP domains. Your role will involve supporting the integration of Knowledge Graphs in various tasks such as Generative AI applications, designing and building large-scale Knowledge Graphs using business data, and collaborating with knowledge and data engineering teams and stakeholders to meet requirements.
To excel in this role, you should have a Bachelor's or Master's degree in computer science, artificial intelligence, physics, mathematics, or related disciplines. Professional experience with Knowledge Graphs and their application in a business context would be advantageous. Knowledge of the RDF Knowledge Graph technology stack and semantic/knowledge modeling, and experience with Knowledge Graph databases, are desirable. Additionally, familiarity with the latest trends in Knowledge Graphs, data science knowledge, Python proficiency, and strong communication and collaboration skills are essential for this role.
The AI organization at SAP is dedicated to seamlessly infusing AI into all enterprise applications, allowing customers, partners, and developers to enhance business processes and generate significant business value. By joining the international AI team at SAP, you will be part of an innovative environment with ample opportunities for personal development and global collaboration.
At SAP, inclusivity, health, well-being, and flexible working models are emphasized to ensure that every individual, regardless of background, feels included and can perform at their best. The company values diversity and unique capabilities, investing in employees to inspire confidence and help them realize their full potential. SAP is an equal opportunity workplace and an affirmative action employer committed to creating a better and more equitable world.
If you are interested in applying for employment at SAP and require accommodation or special assistance, please reach out to the Recruiting Operations Team at Careers@sap.com. SAP employees can also explore roles eligible for the SAP Employee Referral Program under specific conditions outlined in the SAP Referral Policy. Background verification with an external vendor may be required for successful candidates.
Join SAP, where you can bring out your best and contribute to innovations that help customers worldwide work more efficiently and effectively, ensuring challenges receive the solutions they deserve.
Posted 2 weeks ago
16.0 - 20.0 years
0 Lacs
Karnataka
On-site
Are you ready to help shape the future of healthcare? Join GSK, a global biopharma company with a special purpose: to unite science, technology, and talent to get ahead of disease together. GSK aims to positively impact the health of billions of people and deliver stronger, more sustainable shareholder returns. As an organization where people can thrive, GSK is committed to preventing and treating diseases on a global scale. By joining GSK at this exciting moment, you can contribute to the mission of getting Ahead Together.
As the Principal Data Engineer at GSK, you will play a crucial role in transforming the commercial manufacturing and supply chain organization. Your responsibilities will include increasing capacity and speed for transferring new products from the R&D organization. Data and AI are essential components in achieving these goals, ultimately helping to launch medicines quicker and have a positive impact on patients.
The primary purpose of your role is to take technical accountability for the CMC Knowledge Graph. You will drive forward its design and implementation by providing technical direction and oversight to the development team. Additionally, you will collaborate with Product Management, business representatives, and other Tech & Data experts to ensure that the CMC Knowledge Graph meets the business requirements.
Your role will involve supporting the CMC Knowledge System Director, product managers, business leaders, and stakeholders in identifying opportunities where Knowledge Graph and other Data & AI capabilities can transform GSK's CMC and New Product Introduction processes. You will provide technical leadership for other Data & AI products in the CMC/NPI portfolio. Your immediate priority will be to lead the technical work required to transition an existing proof-of-concept CMC Knowledge Graph and its associated analytics use cases into a fully fledged, sustainable Data & AI product. This will involve leading the technical design, development, testing, and release of the CMC Knowledge Graph and other Data & AI solutions in the CMC/NPI portfolio.
To succeed in this role, you should have a proven track record in delivering complex data engineering projects in a cloud environment, preferably Azure, with a total of 16+ years of experience. Strong technical expertise in designing, developing, and supporting Knowledge Graphs is essential, along with proficiency in graph technologies such as RDF, OWL, SPARQL, and Cypher. Experience in leading and managing technical teams, data modeling/ontologies, data integration, data transformation techniques, programming skills, and familiarity with DevOps principles and CI/CD practices are also required. An understanding of pharmaceutical industry data, domain knowledge within CMC, and knowledge of GxP compliance requirements would be a plus.
By joining GSK, you will be part of a global biopharma company dedicated to uniting science, technology, and talent to get ahead of disease together. GSK focuses on preventing and treating diseases with vaccines and specialty and general medicines, and invests in core therapeutic areas such as infectious diseases, HIV, respiratory/immunology, and oncology. If you are looking for a place where you can be inspired, encouraged, and challenged to be the best you can be, join GSK on this journey to get Ahead Together.
Posted 2 weeks ago
4.0 - 8.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
You are a highly skilled Data Scientist with expertise in AI agents, generative AI, and knowledge engineering, tasked with enhancing AI-driven cloud governance solutions. Your main focus will be on advancing multi-agent systems, leveraging LLMs, and integrating knowledge graphs (OWL ontologies) in a Python environment. Working at the intersection of machine learning, AI-driven automation, and cloud governance, you will be responsible for designing intelligent agents that adapt dynamically to cloud ecosystems. Your contributions will play a crucial role in FinOps, SecOps, CloudOps, and DevOps by providing scalable, AI-enhanced decision-making, workflows, and monitoring.
Your key responsibilities will include designing, developing, and optimizing LLM-based multi-agent systems for cloud governance; implementing agent collaboration using frameworks like LangChain, AutoGen, or open-source MAS architectures; and developing adaptive AI workflows to enhance governance, compliance, and cost optimization. You will also apply generative AI techniques such as GPT-4, Google Gemini, and fine-tuned BERT models to knowledge representation and reasoning; design and manage knowledge graphs, OWL ontologies, and SPARQL queries for intelligent decision-making; and enhance AI agent knowledge retrieval using symbolic reasoning and semantic search. Additionally, you will develop embedding-based search models for retrieving and classifying cloud governance documents; fine-tune BERT, OpenAI embeddings, or custom transformer models for document classification and recommendation; and integrate discrete event simulation (DES) or digital twins for adaptive cloud governance modeling.
In the realm of cloud governance and automation, your tasks will involve working with multi-cloud environments (AWS, Azure, GCP, OCI) to extract, analyze, and manage structured and unstructured cloud data; implementing AI-driven policy recommendations for FinOps, SecOps, and DevOps workflows; and collaborating with CloudOps engineers and domain experts to enhance AI-driven automation and monitoring.
To qualify for this role, you need at least 4 years of experience in Data Science, AI, or Knowledge Engineering; strong proficiency in Python and relevant ML/AI libraries (PyTorch, TensorFlow, scikit-learn); hands-on experience with knowledge graphs, OWL ontologies, RDF, and SPARQL; expertise in LLMs, NLP, and embedding-based retrieval; familiarity with multi-agent systems, LangChain, and AutoGen; and experience working with cloud platforms (AWS, Azure, GCP) and AI-driven cloud governance. Preferred qualifications include experience with knowledge-driven AI applications in cloud governance, FinOps, or SecOps; an understanding of semantic search, symbolic AI, or rule-based reasoning; familiarity with event-driven architectures, digital twins, or discrete event simulation (DES); and a background in MLOps, AI pipelines, and cloud-native ML deployments.
In return, you will have the opportunity to work on cutting-edge AI agent ecosystems for cloud governance in a collaborative environment that brings together AI, knowledge engineering, and cloud automation. Competitive compensation, benefits, and flexible work arrangements (remote/hybrid) are also part of the package. If you thrive in a fast-paced environment, demonstrate intellectual curiosity, and have a passion for applying advanced AI techniques to solve real-world cybersecurity challenges, this role is for you.
Posted 2 weeks ago
3.0 - 8.0 years
13 - 17 Lacs
Gurugram
Work from Office
Position Summary:
We are seeking a highly skilled Factory Digital Twin Developer and Factory Data Scientist to join our dynamic development team. In this role, you will drive the development and implementation of Factory Digital Twins, supporting the Digital Governance and Excellence Team. You will be responsible for extracting value from available data sources by deploying data analytics methods, models, algorithms, and visualizations. You will offer and transfer those solutions to the manufacturing locations with the goal of extracting business-relevant insights as a foundation for management reporting, transparency, and decision-making for balanced growth. As a successful candidate, you will demonstrate the ability to drive complex, multi-functional strategic initiatives that impact the planning, organization, and implementation of our product portfolio in the market worldwide.
How You'll Make an Impact (responsibilities of the role)
- Development and maintenance of algorithms, modules, and libraries in the Factory Digital Twin Core, based on a structured elicitation of requirements from the factories and the manufacturing network.
- Understanding, modeling, and deployment of behavioral logic, constraints, and interdependencies in the domain of the factory shopfloor.
- Analysis of factory data related to the three domains Products, Processes, and Resources (PPR), and development of ETL (Extraction, Transformation and Loading) pipelines to generate standardized input for Factory Digital Twin simulation.
- Conducting simulation analytics, deriving recommendations and measures, and optimizing brownfield or greenfield factories and the manufacturing network.
- Development of AI algorithms for analytics, explanation, generation of improvement scenarios, and optimization of factories and the manufacturing network toward defined situational and business goal functions.
- Development of structures for data and knowledge such as glossaries, taxonomies, ontologies, and knowledge graphs, as well as modeling of logical constraints and rules.
What You Bring (required qualifications and skill sets)
- Successful university degree in Engineering, Automation, Computer Science, Mathematics, Physics, or similar.
- Profound understanding of and experience in factory domains, production, product structures, and manufacturing processes and technologies, especially in factory planning.
- Working knowledge of modeling and Discrete Event Simulation (DES) and of describing and evaluating scenarios with Tecnomatix Plant Simulation; knowledge of developing and using 3D models in a CAD environment (e.g., NX, AutoCAD) is beneficial.
- Experience in developing ETL pipelines and dashboards for data analytics and visualization, in programming (e.g., Python, C++), and in developing databases, ontologies, and queries (RDF, SPARQL, SHACL, SQL).
- Experience in the development and application of AI algorithms, from analytics up to optimization.
- Proficiency in English and readiness to travel globally is preferable.
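To illustrate the ETL pipelines this posting describes, here is a minimal pandas sketch that extracts raw shopfloor records, normalizes them, and writes a simulation-ready table; file and column names are hypothetical, and writing Parquet assumes pyarrow is installed.

```python
# Sketch: a small extract-transform-load step for shopfloor data.
# Input/output paths and the column schema are hypothetical.
import pandas as pd

def extract(path: str) -> pd.DataFrame:
    """Read raw cycle-time records with a parsed timestamp column."""
    return pd.read_csv(path, parse_dates=["timestamp"])

def transform(df: pd.DataFrame) -> pd.DataFrame:
    """Drop incomplete rows, clamp negatives, aggregate per machine/shift."""
    df = df.dropna(subset=["machine_id", "cycle_time_s"])
    df["cycle_time_s"] = df["cycle_time_s"].clip(lower=0)
    return (df.assign(shift=df["timestamp"].dt.hour // 8)  # three 8h shifts
              .groupby(["machine_id", "shift"], as_index=False)
              ["cycle_time_s"].mean())

def load(df: pd.DataFrame, path: str) -> None:
    df.to_parquet(path, index=False)

if __name__ == "__main__":
    load(transform(extract("shopfloor_raw.csv")), "simulation_input.parquet")
```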
Posted 2 weeks ago
15.0 - 20.0 years
10 - 20 Lacs
Bengaluru
Work from Office
Project Role: Integration Architect
Project Role Description: Architect an end-to-end integration solution. Drive client discussions to define the integration requirements and translate the business requirements to the technology solution. Activities include mapping business processes to support applications, defining the data entities, selecting integration technology components and patterns, and designing the integration architecture.
Must have skills: AI Agents & Workflow Integration
Minimum 18 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary
We are seeking a C-suite-facing Industrial AI & Agentic Systems Lead to architect, govern, and scale AI solutions - including AI, multi-agent, LLM-driven, tool-using autonomous systems - across manufacturing, supply chain, and plant operations. You will define the strategy-to-scale journey from high-value use case selection (OEE, yield, PdM, energy, scheduling, autonomous quality) to edge-cloud architectures, MLOps/LLMOps, Responsible & Safe AI, Agentic AI, and IT/OT convergence, delivering hard business outcomes.
Roles & Responsibilities:
1. Strategy & C-Suite Advisory:
- Define an Industrial AI + Agentic AI strategy and roadmap tied to OEE, yield, cost, throughput, energy, and sustainability KPIs, with ROI/payback models.
- Shape operating models (central CoE vs. federated), governance, funding, and product-platform scaling approaches.
- Educate CxO stakeholders on where Agentic AI adds leverage (closed-loop optimization, autonomous workflows, human-in-the-loop decisioning).
2. Architecture & Platforms:
- Design edge-plant-cloud reference architectures for ML + Agentic AI: data ingestion (OPC UA, MQTT, Kafka), vector DB/RAG layers, model registries, policy engines, observability, and safe tool execution.
- Define LLMOps patterns for prompt/version management, agent planning/execution traces, tool catalogs, guardrails, and evaluation harnesses.
3. Agentic AI (Dedicated):
- Architect multi-agent systems (planner-solver-critic patterns) for SOP generation and validation, root-cause analysis and corrective action recommendation, autonomous scheduling and rescheduling, MRO/work order intelligence, and control room copilots orchestrating OT/IT tools.
- Design tooling and action interfaces (function-calling tool registries) to safely let agents interact with MES/ERP/CMMS/SCADA/DCS, simulations (DES, digital twins), and optimization solvers (cuOpt, Gurobi, CP-SAT).
- Establish policy, safety, and constraint frameworks (role-based agent scopes, allow/deny tool lists, human-in-the-loop gates, audit trails).
- Implement RAG + knowledge graph + vector DB stacks for engineering/service manuals, logs, SOPs, and quality records to power grounded agent reasoning.
- Set up evaluation and red-teaming for agent behaviors: hallucination tests, unsafe action prevention, KPI-driven performance scoring.
4. Use Cases & Solutions (Manufacturing Focus):
- Computer vision for autonomous quality (TAO, Triton, TensorRT) with agentic triage and escalation to quality engineers.
- Predictive/prescriptive maintenance with agents orchestrating data retrieval, work order creation, and spare part planning.
- Process and yield optimization where agents run DOE, query historians, simulate scenarios (digital twins), and recommend set-point changes.
- Scheduling and throughput optimization with planner-optimizer agents calling OR/RL solvers.
- GenAI/LLM for manufacturing: copilots and autonomous agents for SOPs, RCA documentation, and PLC/SCADA code refactoring (with strict guardrails).
5. MLOps, LLMOps, Edge AI & Runtime Ops:
- Stand up MLOps + LLMOps: CI/CD for models and prompts, drift detection, lineage, experiment and agent run tracking, safe rollback.
- Architect Edge AI on NVIDIA Jetson/IGX, x86 GPU, and Intel iGPU/OpenVINO, ensuring deterministic latency and TSN/real-time where needed.
- Implement observability for agents (traces, actions, rewards/scores, SLA adherence).
6. Responsible & Safe AI, Compliance & Security:
- Codify Responsible AI and agentic safety policies: transparency, explainability (XAI), auditability, IP protection, privacy, toxicity and jailbreak prevention.
- Align with regulations (e.g., GxP, FDA 21 CFR Part 11, ISO 27001, IEC 62443, ISO 26262, AS9100) for industrial domains.
7. Delivery, GTM & Thought Leadership:
- Serve as chief architect and design authority on large AI + Agentic programs; mentor architects, data scientists/engineers, and MLOps/LLMOps teams.
- Lead pre-sales, solution shaping, executive storytelling, and ecosystem partnership building (NVIDIA, hyperscalers, MES/SCADA, optimization, cybersecurity).
Professional & Technical Skills:
Must Have Skills:
- Proven record of delivering AI at scale in manufacturing with quantified value, and hands-on leadership of LLM/Agentic AI initiatives.
- Deep understanding of shop-floor tech (MES/MOM, SCADA/DCS, historians such as PI/AVEVA, PLC/RTUs) and protocols (OPC UA, MQTT, Modbus, Kafka).
- Expertise in ML and CV stacks (PyTorch/TensorFlow, Triton, TensorRT, TAO Toolkit) and LLM/Agentic stacks (function calling, RAG, vector DBs, prompt/agent orchestration).
- MLOps and LLMOps (MLflow, Kubeflow, SageMaker/Vertex, Databricks, Feast, LangSmith/evaluation frameworks, guardrails).
- Edge AI deployment on NVIDIA/Intel/x86 GPUs, with K8s/K3s, Docker, and Triton Inference Server.
- Strong security and governance for IT/OT and AI/LLM (IEC 62443, Zero Trust, data residency, key/token vaults, prompt security).
- Executive communication: convert complex AI + Agentic architectures into board-level impact narratives.
Good to Have Skills:
- Agentic frameworks: LangGraph, AutoGen, CrewAI, Semantic Kernel, Guardrails, LMQL.
- Optimization and RL: cuOpt, Gurobi, OR-Tools, RLlib, Stable Baselines.
- Digital twins and simulation: NVIDIA Omniverse/Isaac/Modulus, AnyLogic, AspenTech, Siemens.
- Knowledge graphs and semantics: Neo4j, RDF/OWL, SPARQL, ontologies for manufacturing.
- Standards and frameworks: ISA-95, RAMI 4.0, MIMOSA, ISO 8000, DAMA-DMBOK.
- Experience in regulated sectors (Pharma/MedTech, Aero/Defense, Automotive).
- AI/ML/LLM: PyTorch, TensorFlow, ONNX, Triton, TensorRT, TAO Toolkit, RAPIDS, LangChain/LangGraph, AutoGen, Semantic Kernel, Guardrails, OpenVINO.
- MLOps/LLMOps/DataOps: MLflow, Kubeflow, SageMaker, Vertex AI, Databricks, Feast, Airflow/Prefect, Great Expectations, LangSmith, PromptLayer.
- Edge/OT: NVIDIA Jetson/IGX, K3s/K8s, Docker, OPC UA, MQTT, Ignition, PI/AVEVA, ThingWorx.
- Data/Streaming/RAG: Kafka, Flink/Spark, Delta/Iceberg/Hudi, Snowflake/BigQuery/Synapse; vector DBs (Milvus, FAISS, Qdrant, Weaviate); KG (Neo4j).
- Cloud: AWS/Azure/GCP (at least one at expert level), Kubernetes; security (CISSP/IEC 62443) a plus.
- Lean/Six Sigma/TPM credibility with operations is nice to have.
Leadership & Behavioral Competencies:
- C-suite advisor and storyteller with an outcome-first mindset.
- Architectural authority balancing speed, safety, and scale.
- People builder across DS/ML, DE, MLOps/LLMOps, and OT.
- Change leader who can operationalize AI and agents on real shop floors.
Additional Info:- A minimum of 20 years of progressive information technology experience is required.- A Bachelors/master's in engineering CS Data Science (PhD preferred for R&D-heavy roles) is required.- This position is based at Bengaluru location. Qualification 15 years full time education
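The allow/deny tool lists and human-in-the-loop gates called out under the Agentic AI responsibilities can be made concrete with a small registry pattern. Below is a minimal sketch under stated assumptions: the Tool dataclass, the ALLOW_LIST/GATED sets, and the approve() stub are all invented for illustration and are not any particular framework's API.

```python
# Minimal sketch of a guarded tool registry for an industrial agent.
# All names here (Tool, ALLOW_LIST, approve) are illustrative assumptions.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Tool:
    name: str
    scope: str                      # e.g. "read-only" or "actuating"
    run: Callable[[dict], dict]

ALLOW_LIST = {"query_historian"}     # read-only tools agents may call freely
GATED = {"create_work_order"}        # actuating tools need human sign-off

def approve(tool: Tool, args: dict) -> bool:
    """Human-in-the-loop gate: stub that would route to an operator UI."""
    print(f"APPROVAL NEEDED: {tool.name}({args})")
    return False                     # default-deny until a human approves

def dispatch(tool: Tool, args: dict) -> dict:
    if tool.name in ALLOW_LIST:
        return tool.run(args)
    if tool.name in GATED and approve(tool, args):
        return tool.run(args)
    return {"status": "denied", "tool": tool.name}  # audit-trail entry

historian = Tool("query_historian", "read-only",
                 lambda a: {"tag": a["tag"], "value": 42.0})
print(dispatch(historian, {"tag": "FIC-101.PV"}))
```

In practice the approve() stub would route to an operator console, and every dispatch() outcome would be written to an audit log to satisfy the traceability requirements above.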
Posted 2 weeks ago
1.0 - 5.0 years
6 - 10 Lacs
bengaluru
Work from Office
We are seeking a highly skilled Ontology Expert & Knowledge Graph Engineer with expertise in ontology development and knowledge graph implementation. This role will be pivotal in shaping our data infrastructure and ensuring accurate representation and integration of complex data sets. You will leverage industry best practices to design, develop, and maintain ontologies, semantic and syntactic data models, and knowledge graphs that drive data-driven decision-making and innovation within the company.

Job Purpose: The role of Ontology & Knowledge Graph / Data Engineer is to design, develop, implement, and maintain enterprise ontologies in support of the organization's data-driven digitalization strategy. This role combines architecture ownership with hands-on engineering: you will model ontologies, stand up graph infrastructure, build semantic pipelines, and expose graph services that power search, recommendations, analytics, and GenAI solutions for our organization. We are seeking a highly skilled, motivated expert to drive development and shape the future of enterprise AI by designing and implementing large-scale ontologies and knowledge graph solutions. You'll work closely with internal engineering and AI teams to build scalable data models that enable advanced reasoning, semantic search, and agentic AI workflows.

Key Responsibilities:
1. Ontology Development:
- Design and apply ontology principles to improve semantic reasoning and data integration, ensuring alignment with business requirements and industry standards.
- Collaborate with domain experts, product managers, and customers to capture and formalize domain knowledge into ontological structures and vocabularies, and to improve data discoverability.
- Develop and maintain comprehensive ontologies to model various business entities, relationships, and processes.
- Integrate semantic data models with existing data infrastructure and applications.
2. Knowledge Graph Implementation & Data Integration:
- Design and build knowledge graphs based on ontologies (a minimal sketch follows this posting).
- Create and build knowledge graphs from data across multiple sources while ensuring data integrity and consistency.
- Collaborate with data engineers for data ingestion and ensure smooth integration of data from multiple sources.
- Administer and maintain graph database solutions, including both semantic and property graphs.
- Utilize knowledge graphs to enable advanced analytics, search, and recommendation systems.
3. Data Quality and Governance:
- Ensure the quality, accuracy, and consistency of ontologies and knowledge graphs.
- Define and implement data governance processes and standards for ontology development and maintenance.
4. Collaboration and Communication:
- Collaborate with internal engineering teams to align data architecture with GenAI capabilities.
- Leverage AI techniques by aligning knowledge models with RAG pipelines and agent orchestration.
- Work closely with data scientists, software engineers, and business stakeholders to understand their data requirements and provide tailored solutions.
5. Research and Innovation:
- Stay up to date with the latest advancements in NLP, LLMs, and machine learning, and proactively identify opportunities to leverage new technologies for improved solutions.

Experience:
- 4-6 years of industrial experience in AI, Data Science, or Data Engineering.
- 2-3 years of hands-on experience building ontologies and knowledge systems.
- Proficiency with graph databases such as Neo4j and GraphDB (RDF-based).
- Understanding of semantic standards such as OWL, RDF, and related W3C recommendations, as well as property graph approaches.
- Familiarity with GenAI concepts including retrieval-augmented generation and agent-based AI.

Required Knowledge/Skills, Education, and Experience:
- Bachelor's or Master's degree in Computer Science, Data Science, Artificial Intelligence, or a related field; a specialization in natural language processing is preferred.
- Strong knowledge of semantic web technologies, including RDF, OWL, SPARQL, and SHACL.
- Proficiency in Python and other programming languages used for data engineering.
- Experience with NLP and GenAI frameworks (LangChain, LangGraph).
- Good working project experience with cloud computing, e.g., AWS/Azure/GCP services including VPCs, EBS, ALBs, NLBs, EC2, S3, and so on.
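To ground the RDF/OWL and SPARQL skills listed above, here is a minimal sketch using the open-source rdflib library; the ex: namespace and the Machine/Sensor classes are toy examples invented for illustration, not an established ontology.

```python
# Minimal sketch: a tiny manufacturing ontology in rdflib plus a SPARQL query.
from rdflib import Graph, Namespace, RDF, RDFS, Literal

EX = Namespace("http://example.org/mfg#")   # invented namespace
g = Graph()
g.bind("ex", EX)

# Classes and one relationship of a toy equipment ontology
g.add((EX.Machine, RDF.type, RDFS.Class))
g.add((EX.Sensor, RDF.type, RDFS.Class))
g.add((EX.monitoredBy, RDFS.domain, EX.Machine))
g.add((EX.monitoredBy, RDFS.range, EX.Sensor))

# Instance data
g.add((EX.press01, RDF.type, EX.Machine))
g.add((EX.temp42, RDF.type, EX.Sensor))
g.add((EX.press01, EX.monitoredBy, EX.temp42))
g.add((EX.temp42, RDFS.label, Literal("Temperature sensor 42")))

# SPARQL: which sensor labels are attached to each machine?
q = """
SELECT ?machine ?label WHERE {
  ?machine ex:monitoredBy ?sensor .
  ?sensor rdfs:label ?label .
}"""
for machine, label in g.query(q):
    print(machine, label)
```

The same pattern scales to enterprise graphs by swapping the in-memory Graph for a triple store backend; SHACL validation would then sit between ingestion and query.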
Posted 3 weeks ago
4.0 - 9.0 years
11 - 14 Lacs
bengaluru
Work from Office
SUMMARY
Job Role: Oracle EBS with Python Specialist
Location: Pune/Hyderabad/Bangalore
Experience: 4 years of relevant experience in Oracle EBS with Python scripting

Job Description:
- Develop and manage Python scripts for ETL (Extract, Transform, Load) processes from Oracle EBS modules (a minimal sketch follows this posting).
- Automate repetitive tasks in Oracle EBS using Python libraries such as Pandas and NumPy.
- Create RESTful APIs using FastAPI or Flask to connect Oracle EBS with external systems.
- Utilize MongoDB, SQL-based databases, SAS, Hive, and Teradata for backend data operations.
- Optimize Python code for efficient large-scale data processing and transformation.
- Collaborate with Oracle EBS technical teams to customize modules and develop RICE components (Reports, Interfaces, Conversions, Extensions).
- Construct predictive analytics models using Python for supply chain forecasting and inventory optimization.

Requirements:
- Proficiency in Python 3.6 or higher, including libraries for data manipulation and machine learning (Pandas, SciPy, NumPy).
- Experience with Oracle EBS R12 modules, particularly SCM, PO, Inventory, and OPM.
- Strong understanding of SQL, PL/SQL, and Oracle database architecture.
- Knowledge of XML Publisher, Blitz Reports, and Oracle RDF reporting tools.
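As a hedged illustration of the ETL-plus-API pattern described above, the sketch below pulls a few purchase order headers into pandas with the python-oracledb driver and exposes them through a FastAPI endpoint. The DSN, credentials, and row limit are placeholders, not a working configuration.

```python
# A sketch of the ETL + REST pattern: extract PO headers from Oracle EBS
# with python-oracledb, transform in pandas, expose via FastAPI.
import oracledb
import pandas as pd
from fastapi import FastAPI

app = FastAPI()

def extract_po_headers() -> pd.DataFrame:
    conn = oracledb.connect(user="apps", password="***", dsn="ebsdb")  # placeholders
    try:
        cur = conn.cursor()
        cur.execute(
            "SELECT segment1, creation_date "
            "FROM po.po_headers_all WHERE ROWNUM <= 100"
        )
        df = pd.DataFrame(cur.fetchall(), columns=["po_number", "creation_date"])
    finally:
        conn.close()
    df["creation_date"] = pd.to_datetime(df["creation_date"])  # transform step
    return df

@app.get("/po-headers")
def po_headers():
    # Load step for an external consumer: serve the transformed frame as JSON
    return extract_po_headers().to_dict(orient="records")
```

Run with `uvicorn app:app`; a production version would pull credentials from a vault and paginate instead of capping at 100 rows.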
Posted 3 weeks ago
4.0 - 8.0 years
0 Lacs
pune, maharashtra
On-site
The SAP HANA Database and Analytics Core engine team is seeking an intermediate or senior developer to contribute to the development of the Knowledge Graph Database System engine. In this role, you will be responsible for designing, developing features, and maintaining the Knowledge Graph engine that operates within the SAP HANA in-memory database. At SAP, all team members, including management, are actively involved in the codebase, fostering a collaborative and hands-on environment. If you believe you possess the requisite skills and experience to excel in such an atmosphere, we encourage you to apply.

As a developer in this role, you will have the opportunity to:
- Contribute to the development of the Knowledge Graph Database System engine within SAP HANA.
- Design, develop, and maintain features for the Knowledge Graph engine.
- Work closely with a team of experienced professionals to deliver high-performance graph analytics solutions.

The team responsible for the HANA Knowledge Graph develops a high-performance graph analytics database system used by SAP customers, partners, and internal groups as part of the HANA Multi Model Database System. The system is tailored for processing large-scale graph data and executing complex graph queries with optimal efficiency. By leveraging a massively parallel processing (MPP) architecture, the HANA Knowledge Graph maximizes the benefits of distributed computing, adhering to the W3C web standards for graph data and query language, RDF and SPARQL (a generic query sketch follows this posting).

The components of the HANA Knowledge Graph system encompass Storage, Data Load, Query Parsing, Query Planning and Optimization, Query Execution, Transaction Management, Memory Management, Network Communications, System Management, Data Persistence, Backup & Restore, Performance Tuning, and more. This system is poised to play a pivotal role in the development of multiple AI products at SAP.

SAP is renowned for innovative solutions that empower over four hundred thousand customers globally to collaborate more effectively and leverage business insights efficiently. As a market leader in end-to-end business application software, SAP's cloud-based platform, supported by a workforce of over one hundred thousand employees, is committed to driving purposeful innovation and fostering a collaborative team culture. With a focus on inclusion, health, and well-being, SAP ensures that every individual, irrespective of background, is valued and supported to achieve their full potential.

SAP is dedicated to creating an inclusive workplace that celebrates diversity and values the unique contributions of each individual. With a commitment to Equal Employment Opportunity, SAP offers accessibility accommodations to applicants with physical and/or mental disabilities. If you require assistance or accommodation during the application process, please contact the Recruiting Operations Team at Careers@sap.com.

For SAP employees interested in the referral program, specific conditions apply according to the SAP Referral Policy. Successful candidates may be subject to a background verification conducted by an external vendor. Join SAP in unleashing your full potential and contributing to a more equitable and inclusive world.
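Since the engine adheres to W3C RDF and SPARQL, client access typically follows the standard SPARQL-over-HTTP pattern. The sketch below uses the generic SPARQLWrapper library against a placeholder endpoint URL; it illustrates the protocol only, and HANA's own client interface may differ.

```python
# Generic sketch of issuing a SPARQL query to an RDF endpoint.
# The endpoint URL is a placeholder, not a HANA-specific address.
from SPARQLWrapper import SPARQLWrapper, JSON

sparql = SPARQLWrapper("http://localhost:8000/sparql")  # placeholder URL
sparql.setQuery("""
    SELECT ?s ?p ?o WHERE { ?s ?p ?o } LIMIT 10
""")
sparql.setReturnFormat(JSON)

results = sparql.query().convert()
for binding in results["results"]["bindings"]:
    print(binding["s"]["value"], binding["p"]["value"], binding["o"]["value"])
```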
Posted 1 month ago
3.0 - 7.0 years
0 - 0 Lacs
kolkata, west bengal
On-site
The ideal candidate for this role should have hands-on experience in tuning PL/SQL queries so that they execute quickly and at lower cost (a minimal sketch of plan capture follows this posting). You should be proficient in writing efficient queries and have prior experience working with RDF and XML Publisher reports.

In addition to a competitive salary, you will receive an annual bonus and medical insurance coverage of 3 to 5 Lakhs for yourself, your spouse, and your children. The company's insurance benefits include coverage for pre-existing diseases and all day-care procedures, with the option to include family members in the policy.

If you possess the necessary skills in PL/SQL query tuning, RDF, and XML Publisher, we encourage you to apply for this exciting opportunity.
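A typical first step in the query tuning this role asks for is capturing the optimizer's execution plan. The sketch below does this from Python with the python-oracledb driver; the connection details and the sample PO query are placeholders, not a real configuration.

```python
# Minimal sketch: capture an execution plan with EXPLAIN PLAN + DBMS_XPLAN.
import oracledb

conn = oracledb.connect(user="apps", password="***", dsn="ebsdb")  # placeholders
cur = conn.cursor()

# Ask the optimizer for the plan without executing the statement
cur.execute("""
    EXPLAIN PLAN FOR
    SELECT h.segment1, l.line_num
    FROM po_headers_all h JOIN po_lines_all l
      ON h.po_header_id = l.po_header_id
""")

# Read the formatted plan; full table scans here are tuning candidates
cur.execute("SELECT plan_table_output FROM TABLE(DBMS_XPLAN.DISPLAY())")
for (line,) in cur:
    print(line)
conn.close()
```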
Posted 1 month ago
8.0 - 12.0 years
0 Lacs
karnataka
On-site
You are a Senior or Staff AI Scientist responsible for leading the development of domain-specific business ontologies and knowledge engineering infrastructure. You will leverage cutting-edge LLMs, GenAI, and symbolic reasoning to establish an ontology layer that serves as the cornerstone of the data platform. This initiative aims to facilitate intelligent data harmonization, enrichment, and decision automation at scale.

Your key responsibilities will involve spearheading the research and development of ontology learning systems by leveraging state-of-the-art LLMs, structured data mining, and semantic reasoning techniques (a minimal sketch of LLM-driven triple extraction follows this posting). You will be tasked with constructing and expanding domain-specific knowledge graphs that seamlessly integrate both external and internal data sources. Additionally, you will drive innovation in semi-automated ontology construction, concept disambiguation, and alignment using GenAI, contrastive learning, and knowledge distillation methodologies.

Collaboration across departments is crucial to embedding ontology-driven intelligence into pipelines, applications, and decision systems. You will work closely with data platform engineers, AI scientists, and product teams to ensure seamless integration and implementation of these intelligent systems. Furthermore, you will define core metrics to evaluate ontology quality, knowledge coverage, and downstream performance gains such as data harmonization and semantic search capabilities. You will also collaborate with ML infrastructure teams to optimize representation formats (RDF, knowledge graphs, etc.) and enable scalable, low-latency retrieval and reasoning.

To qualify for this position, you should hold a PhD or Master's degree in Computer Science, AI, Data Science, or a related field with a focus on knowledge representation, natural language understanding, or data systems. You should also have a minimum of 8 years of experience in applied AI or ML research with a specific focus on ontologies, knowledge engineering, entity resolution, or semantic systems. Your expertise should extend to working with LLMs (e.g., GPT, LLaMA, PaLM, Claude) for knowledge extraction, generation, or few-shot learning. A strong background in data engineering, including schema mapping, metadata, master data management, or data pipelines at scale, is also essential. You are expected to have a deep understanding of semantic data models such as ontologies, along with hands-on experience with relevant libraries or frameworks. Proficiency in programming languages like Python, frameworks like PyTorch or TensorFlow, and graph technologies such as Neo4j, RDFLib, and SPARQL will be beneficial in fulfilling the requirements of this role.
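One hedged way to picture the semi-automated ontology construction described here is a loop that extracts triples from text and loads them into a graph. In the sketch below, extract_triples() is a hard-coded stand-in for an LLM extraction call, and the ex: namespace is invented for illustration.

```python
# Illustrative sketch of an ontology-learning loop: extract
# (subject, predicate, object) triples from text, load into rdflib.
from rdflib import Graph, Namespace

EX = Namespace("http://example.org/kg#")  # invented namespace

def extract_triples(text: str):
    """Stand-in for an LLM extraction prompt over `text`."""
    return [("AcmeCorp", "suppliesPartTo", "BetaMotors")]

g = Graph()
g.bind("ex", EX)
for s, p, o in extract_triples("Acme Corp supplies brake parts to Beta Motors."):
    g.add((EX[s], EX[p], EX[o]))

print(g.serialize(format="turtle"))
```

In a real pipeline, the extractor's output would pass through entity resolution and concept disambiguation before being merged into the enterprise graph, which is where the evaluation metrics mentioned above come in.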
Posted 1 month ago
2.0 - 5.0 years
9 - 13 Lacs
Bengaluru
Work from Office
About The Role:
Job Title: Data Science Engineer, AS
Location: Bangalore, India

Role Description: We are seeking a Data Science Engineer to contribute to the development of intelligent, autonomous AI systems. The ideal candidate will have a strong background in agentic AI, LLMs, SLMs, vector DBs, and knowledge graphs. This role involves deploying AI solutions that leverage Retrieval-Augmented Generation (RAG), multi-agent frameworks, and hybrid search techniques to enhance enterprise applications.

What we'll offer you:
- 100% reimbursement under childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Accident and term life insurance

Your key responsibilities:
- Design & Develop Agentic AI Applications: Utilise frameworks like LangChain, CrewAI, and AutoGen to build autonomous agents capable of complex task execution.
- Implement RAG Pipelines: Integrate LLMs with vector databases (e.g., Milvus, FAISS) and knowledge graphs (e.g., Neo4j) to create dynamic, context-aware retrieval systems (a minimal retrieval sketch follows this posting).
- Fine-Tune Language Models: Customise LLMs (e.g., Gemini, ChatGPT, Llama) and SLM/NLP toolkits (e.g., spaCy, NLTK) using domain-specific data to improve performance and relevance in specialised applications.
- NER Models: Train OCR- and NLP-based models to parse domain-specific details from documents (e.g., DocAI, Azure AI DIS, AWS IDP).
- Develop Knowledge Graphs: Construct and manage knowledge graphs to represent and query complex relationships within data, enhancing AI interpretability and reasoning.
- Collaborate Cross-Functionally: Work with data engineers, ML researchers, and product teams to align AI solutions with business objectives and technical requirements.
- Optimise AI Workflows: Employ MLOps practices to ensure scalable, maintainable, and efficient AI model deployment and monitoring.

Your skills and experience:
- 4+ years of professional experience in AI/ML development, with a focus on agentic AI systems.
- Proficient in Python, Python API frameworks, and SQL; familiar with AI/ML frameworks such as TensorFlow or PyTorch.
- Experience deploying AI models on cloud platforms (e.g., GCP, AWS).
- Experience with LLMs (e.g., GPT-4), SLM/NLP toolkits (spaCy), and prompt engineering.
- Understanding of semantic technologies, ontologies, and RDF/SPARQL.
- Familiarity with MLOps tools and practices for continuous integration and deployment.
- Skilled in building and querying knowledge graphs using tools like Neo4j.
- Hands-on experience with vector databases and embedding techniques.
- Experience in developing AI solutions for specific industries such as healthcare, finance, or e-commerce.

How we'll support you
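To illustrate the vector-database half of the RAG pipelines mentioned above, here is a minimal FAISS sketch. Random vectors stand in for real text embeddings, so it shows only the index-and-search mechanics, not retrieval quality.

```python
# Minimal sketch of vector retrieval for a RAG pipeline using FAISS.
# Random vectors stand in for real embeddings from a sentence encoder.
import faiss
import numpy as np

dim = 384                              # typical sentence-embedding size
docs = ["reset procedure", "escalation policy", "maintenance checklist"]

rng = np.random.default_rng(0)
doc_vecs = rng.random((len(docs), dim), dtype=np.float32)  # stand-in embeddings

index = faiss.IndexFlatL2(dim)         # exact L2 nearest-neighbour index
index.add(doc_vecs)

query_vec = rng.random((1, dim), dtype=np.float32)
distances, ids = index.search(query_vec, 2)   # top-2 nearest documents
for rank, doc_id in enumerate(ids[0]):
    print(f"{rank + 1}. {docs[doc_id]} (distance={distances[0][rank]:.3f})")
```

In a hybrid-search setup, these vector hits would be merged with keyword or knowledge-graph results before being passed to the LLM as grounding context.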
Posted 1 month ago