5.0 years
3 - 8 Lacs
Hyderābād
Remote
DESCRIPTION The role is based in Munich, Germany (this is not a remote opportunity). We offer immigration and relocation support. The vision of the Ontology Product Knowledge Team is to provide a standardized, semantically rich, easily discoverable, extensible, and universally applicable body of product knowledge that can be consistently utilized across customer shopping experiences, selling partner listing experiences, and internal enrichment of product data. We aim to make product knowledge compelling, easy to use, and feature rich. Our work to build comprehensive product knowledge allows us to semantically understand a customer’s intent – whether that is a shopping mission or a seller offering products. We strive to make these experiences more intuitive for all customers. As an Ontologist, you work on a global team of knowledge builders to deliver world-class, intuitive, and comprehensive taxonomy and ontology models to optimize product discovery for Amazon web and mobile experiences. You collaborate with business partners and engineering teams to deliver knowledge-based solutions to enable product discoverability for customers. In this role, you will directly impact the customer experience as well as the company’s product knowledge foundation.
Tasks and Responsibilities:
Develop logical, semantically rich, and extensible data models for Amazon's extensive product catalog
Ensure our ontologies provide comprehensive domain coverage and are available for both human and machine ingestion and inference
Create new schemas using Generative Artificial Intelligence (generative AI) models
Analyze website metrics and product discovery behaviors to make data-driven decisions on optimizing our knowledge graph data models globally
Expand and refine data retrieval techniques that utilize our extensive knowledge graph
Contribute to team goal setting and future state vision
Drive and coordinate cross-functional projects with a broad range of merchandisers, engineers, designers, and other groups, which may include architecting new data solutions
Develop team operational excellence programs, data quality initiatives, and process simplifications
Evangelize ontology and semantic technologies within and across teams at Amazon
Develop and refine data governance and processes used by global Ontologists
Mentor and influence peers
Inclusive Team Culture: Our team has a global presence: we celebrate diverse cultures and backgrounds within our team and our customer base. We are committed to furthering our culture of inclusion, offering continuous access to internal affinity groups as well as highlighting diversity programs.
Work/Life Harmony: Our team believes that striking the right balance between work and your outside life is key. Our work is not removed from everyday life, but instead is influenced by it. We offer flexibility in working hours and will work with you to facilitate your own balance between your work and personal life.
Career Growth: Our team cares about your career growth, from your initial company introduction and training sessions, to continuous support throughout your entire career at Amazon. We recognize each team member as an individual, and we will build on your skills to help you grow. We have a broad mix of experience levels and tenures, and we are building an environment that celebrates knowledge sharing.
Perks: You will have the opportunity to support CX used by millions of customers daily and to work with data at a scale very few companies can offer.
We have offices around the globe, and you may be considered for global placement. You’ll receive on-the-job training and group development opportunities.
BASIC QUALIFICATIONS
Degree in Library Science, Information Systems, Linguistics or equivalent professional experience
5+ years of relevant work experience in ontology and/or taxonomy roles
Proven skills in data retrieval and data research techniques
Ability to quickly understand complex processes and communicate them in simple language
Experience creating and communicating technical requirements to engineering teams
Ability to communicate to senior leadership (Director and VP levels)
Experience with generative AI (e.g. creating prompts)
Knowledge of Semantic Web technologies (RDF/S, OWL), query languages (SPARQL) and validation/reasoning standards (SHACL, SPIN)
Knowledge of open-source and commercial ontology engineering editors (e.g. Protege, TopQuadrant products, PoolParty)
Detail-oriented problem solver who is able to work in a fast-changing environment and manage ambiguity
Proven track record of strong communication and interpersonal skills
Proficient English language skills
PREFERRED QUALIFICATIONS
Master’s degree in Library Science, Information Systems, Linguistics or other relevant fields
Experience building ontologies in the e-commerce and semantic search spaces
Experience working with schema-level constructs (e.g. higher-level classes, punning, property inheritance)
Proficiency in SQL, SPARQL
Familiarity with the software engineering life cycle
Familiarity with ontology manipulation programming libraries
Exposure to data science and/or machine learning, including graph embedding
Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.
Posted 10 hours ago
8.0 years
0 Lacs
Trivandrum, Kerala, India
Remote
Role: AI/ML Engineer
Location: Remote
Experience: 8 to 12 years
Notice: Immediate only
Interested candidates, share your resume to sunilkumar@xpetize.com
Job description: Seeking a highly experienced and technically adept AI/ML Engineer to spearhead a strategic initiative focused on analyzing annual changes in IRS-published TRCs and identifying their downstream impact on codebases. The role demands deep expertise in machine learning, knowledge graph construction, and software engineering processes. The ideal candidate will have a proven track record of delivering production-grade AI solutions in complex enterprise environments.
Key Responsibilities:
Design and develop an AI/ML-based system to detect and analyze differences in IRS TRC publications year-over-year.
Implement knowledge graphs to model relationships between TRC changes and impacted code modules.
Collaborate with tax domain experts, software engineers, and DevOps teams to ensure seamless integration of the solution into existing workflows.
Define and enforce engineering best practices, including CI/CD, version control, testing, and model governance.
Drive the end-to-end lifecycle of the solution—from data ingestion and model training to deployment and monitoring.
Ensure scalability, performance, and reliability of the deployed system in a production environment.
Mentor junior engineers and contribute to a culture of technical excellence and innovation.
Required Skills & Experience:
8+ years of experience in software engineering, with at least 5 years in AI/ML solution delivery.
Strong understanding of tax-related data structures, especially IRS TRCs, is a plus.
Expertise in building and deploying machine learning models using Python, TensorFlow/PyTorch, and ML Ops frameworks.
Hands-on experience with knowledge graph technologies (e.g., Neo4j, RDF, SPARQL, GraphQL).
Deep familiarity with software architecture, microservices, and API design.
Experience with NLP techniques for document comparison and semantic analysis.
Proven ability to lead cross-functional teams and deliver complex projects on time.
Strong communication and stakeholder management skills.
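For illustration alongside this listing (not part of the posting itself), here is a minimal sketch of the kind of knowledge graph described above, using the Neo4j Python driver. The TRC and CodeModule labels, the IMPACTS relationship, and the connection details are hypothetical assumptions, not an actual schema.

# Hedged sketch: link an IRS TRC change to the code modules it may impact,
# then query the downstream impact. Labels, properties, and credentials are invented.
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

def record_trc_change(tx, trc_code, year, module_paths):
    # MERGE keeps the graph idempotent when the yearly diff job is re-run.
    tx.run(
        """
        MERGE (t:TRC {code: $code, year: $year})
        WITH t
        UNWIND $modules AS path
        MERGE (m:CodeModule {path: path})
        MERGE (t)-[:IMPACTS]->(m)
        """,
        code=trc_code, year=year, modules=module_paths,
    )

def impacted_modules(tx, trc_code):
    result = tx.run(
        "MATCH (:TRC {code: $code})-[:IMPACTS]->(m:CodeModule) RETURN m.path AS path",
        code=trc_code,
    )
    return [record["path"] for record in result]

with driver.session() as session:
    session.execute_write(record_trc_change, "TRC-001", 2024, ["tax/calculators/credits.py"])
    print(session.execute_read(impacted_modules, "TRC-001"))
driver.close()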
Posted 15 hours ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Reference # 322768BR Job Type Full Time Your role Are you a creative and passionate developer with a knack for building state-of-the-art Front-End solutions? Do you possess an innovative engineering mindset and enjoy leveraging cutting-edge technology to develop diverse solutions? We are seeking a hands-on Full Stack Developer to join UBS Group Technology. In this role, you will contribute to the implementation of a micro-service Front-End for our Enterprise Knowledge Graph, revolutionizing how we handle data within the bank. You will: Design minimalistic, user-friendly interfaces to solve complex problems. Develop innovative micro-sites on top of our enterprise knowledge graph. Enhance existing UI components for reuse across our micro-sites. Your team You will be part of a nimble, multi-disciplinary Data Architecture team within Group CTO, collaborating closely with specialists across various areas of Group Technology. Our team provides the foundation for data-driven management, facilitating processes from strategic and architecture planning to demand management, development, and deployment. The team is globally distributed, with members primarily based in Switzerland, the UK, and the US. Your expertise You have: A proven track record in hands-on development and design of Front-End and middleware solutions. A strong command of application, data, and infrastructure architecture disciplines. Experience working in agile, delivery-oriented teams. Desired: Proficiency in JavaScript, Svelte, and CSS. Knowledge of SPARQL and SQL is a plus. Experience with graph visualization. Expertise in designing and consuming RESTful and GraphQL APIs. Experience building modern solutions, including data streaming and cloud-based architectures. Familiarity with graph databases (e.g., GraphDB, Anzograph, Jena, Neo4j) is advantageous. Experience delivering solutions for Data Analytics users, such as PowerBI. You are: Willing to take full ownership of problems and code, with the ability to hit the ground running and deliver exceptional solutions. A strong problem solver who anticipates issues and resolves them proactively. Skilled in communicating effectively with both technical and non-technical audiences. About Us UBS is the world’s largest and the only truly global wealth manager. We operate through four business divisions: Global Wealth Management, Personal & Corporate Banking, Asset Management and the Investment Bank. Our global reach and the breadth of our expertise set us apart from our competitors. We have a presence in all major financial centers in more than 50 countries. How We Hire We may request you to complete one or more assessments during the application process. Join us At UBS, we know that it's our people, with their diverse skills, experiences and backgrounds, who drive our ongoing success. We’re dedicated to our craft and passionate about putting our people first, with new challenges, a supportive team, opportunities to grow and flexible working options when possible. Our inclusive culture brings out the best in our employees, wherever they are on their career journey. We also recognize that great work is never done alone. That’s why collaboration is at the heart of everything we do. Because together, we’re more than ourselves. We’re committed to disability inclusion and if you need reasonable accommodation/adjustments throughout our recruitment process, you can always contact us. Disclaimer / Policy Statements UBS is an Equal Opportunity Employer.
We respect and seek to empower each individual and support the diverse cultures, perspectives, skills and experiences within our workforce.
Posted 1 day ago
2.0 - 5.0 years
9 - 13 Lacs
Bengaluru
Work from Office
About The Role: Job Title- Data Science Engineer, AS Location- Bangalore, India Role Description We are seeking a Data Science Engineer to contribute to the development of intelligent, autonomous AI systems. The ideal candidate will have a strong background in agentic AI, LLMs, SLMs, vector DBs, and knowledge graphs. This role involves deploying AI solutions that leverage Retrieval-Augmented Generation (RAG), multi-agent frameworks, and hybrid search techniques to enhance enterprise applications. What we’ll offer you 100% reimbursement under childcare assistance benefit (gender neutral) Sponsorship for industry-relevant certifications and education Accident and Term life Insurance Your key responsibilities Design & Develop Agentic AI Applications: Utilise frameworks like LangChain, CrewAI, and AutoGen to build autonomous agents capable of complex task execution. Implement RAG Pipelines: Integrate LLMs with vector databases (e.g., Milvus, FAISS) and knowledge graphs (e.g., Neo4j) to create dynamic, context-aware retrieval systems. Fine-Tune Language Models: Customise LLMs (e.g., Gemini, ChatGPT, Llama) and SLMs (e.g., spaCy, NLTK) using domain-specific data to improve performance and relevance in specialised applications. NER Models: Train OCR- and NLP-leveraged models to parse domain-specific details from documents (e.g., DocAI, Azure AI DIS, AWS IDP). Develop Knowledge Graphs: Construct and manage knowledge graphs to represent and query complex relationships within data, enhancing AI interpretability and reasoning. Collaborate Cross-Functionally: Work with data engineers, ML researchers, and product teams to align AI solutions with business objectives and technical requirements. Optimise AI Workflows: Employ MLOps practices to ensure scalable, maintainable, and efficient AI model deployment and monitoring. Your skills and experience 4+ years of professional experience in AI/ML development, with a focus on agentic AI systems. Proficient in Python, Python API frameworks, SQL, and familiar with AI/ML frameworks such as TensorFlow or PyTorch. Experience in deploying AI models on cloud platforms (e.g., GCP, AWS). Experience with LLMs (e.g., GPT-4), SLMs (spaCy), and prompt engineering. Understanding of semantic technologies, ontologies, and RDF/SPARQL. Familiarity with MLOps tools and practices for continuous integration and deployment. Skilled in building and querying knowledge graphs using tools like Neo4j. Hands-on experience with vector databases and embedding techniques. Experience in developing AI solutions for specific industries such as healthcare, finance, or e-commerce. How we’ll support you
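As an aside for readers unfamiliar with the RAG pipelines mentioned in this and the following listings, here is a minimal, dependency-light sketch of the retrieval step; the documents, the toy embeddings, and the cosine-similarity search stand in for a real embedding model and a real vector database such as Milvus or FAISS.

# Hedged RAG-style retrieval sketch (illustrative only, invented data).
import numpy as np

documents = [
    "Payment instructions must be confirmed by two approvers.",
    "Custody accounts are reconciled at end of day.",
]
# Hypothetical embeddings; a real pipeline would call an embedding model.
doc_vectors = np.random.default_rng(0).normal(size=(len(documents), 8))
query_vector = doc_vectors[0] + 0.05  # pretend the query is close to document 0

def top_k(query, vectors, k=1):
    # Cosine similarity, then return the indices of the k best matches.
    sims = vectors @ query / (np.linalg.norm(vectors, axis=1) * np.linalg.norm(query))
    return np.argsort(-sims)[:k]

context = "\n".join(documents[i] for i in top_k(query_vector, doc_vectors))
prompt = f"Answer using only this context:\n{context}\n\nQuestion: Who approves payments?"
print(prompt)  # this prompt would then be sent to whichever LLM the pipeline uses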
Posted 4 days ago
7.0 - 12.0 years
32 - 37 Lacs
Bengaluru
Work from Office
About The Role: Job Title- Data Science Engineer, VP Location- Bangalore, India Role Description We are seeking a seasoned Data Science Engineer to spearhead the development of intelligent, autonomous AI systems. The ideal candidate will have a robust background in agentic AI, LLMs, SLMs, vector DBs, and knowledge graphs. This role involves designing and deploying AI solutions that leverage Retrieval-Augmented Generation (RAG), multi-agent frameworks, and hybrid search techniques to enhance enterprise applications. What we’ll offer you 100% reimbursement under childcare assistance benefit (gender neutral) Sponsorship for industry-relevant certifications and education Accident and Term life Insurance Your key responsibilities Design & Develop Agentic AI Applications: Utilise frameworks like LangChain, CrewAI, and AutoGen to build autonomous agents capable of complex task execution. Implement RAG Pipelines: Integrate LLMs with vector databases (e.g., Milvus, FAISS) and knowledge graphs (e.g., Neo4j) to create dynamic, context-aware retrieval systems. Fine-Tune Language Models: Customise LLMs (e.g., Gemini, ChatGPT, Llama) and SLMs (e.g., spaCy, NLTK) using domain-specific data to improve performance and relevance in specialised applications. NER Models: Train OCR- and NLP-leveraged models to parse domain-specific details from documents (e.g., DocAI, Azure AI DIS, AWS IDP). Develop Knowledge Graphs: Construct and manage knowledge graphs to represent and query complex relationships within data, enhancing AI interpretability and reasoning. Collaborate Cross-Functionally: Work with data engineers, ML researchers, and product teams to align AI solutions with business objectives and technical requirements. Optimise AI Workflows: Employ MLOps practices to ensure scalable, maintainable, and efficient AI model deployment and monitoring. Your skills and experience 13+ years of professional experience in AI/ML development, with a focus on agentic AI systems. Proficient in Python, Python API frameworks, SQL, and familiar with AI/ML frameworks such as TensorFlow or PyTorch. Experience in deploying AI models on cloud platforms (e.g., GCP, AWS). Experience with LLMs (e.g., GPT-4), SLMs (spaCy), and prompt engineering. Understanding of semantic technologies, ontologies, and RDF/SPARQL. Familiarity with MLOps tools and practices for continuous integration and deployment. Skilled in building and querying knowledge graphs using tools like Neo4j. Hands-on experience with vector databases and embedding techniques. Familiarity with RAG architectures and hybrid search methodologies. Experience in developing AI solutions for specific industries such as healthcare, finance, or e-commerce. Strong problem-solving abilities and analytical thinking. Excellent communication skills for cross-functional collaboration. Ability to work independently and manage multiple projects simultaneously. How we’ll support you About us and our teams Please visit our company website for further information: https://www.db.com/company/company.htm We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
Posted 4 days ago
0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Role Overview: We are seeking a motivated Junior AI Testing Engineer to join the team. In this role, you will support the testing of AI models and pipelines, with a special focus on data ingestion into knowledge graphs and knowledge graph administration. You will collaborate with data scientists, engineers, and product teams to ensure the quality, reliability, and performance of AI-driven solutions. Key Responsibilities: AI Model & Pipeline Testing: Design and execute test cases for AI models and data pipelines, ensuring accuracy, stability, and fairness. Knowledge Graph Ingestion: Support the development and testing of Python scripts for data extraction, transformation, and loading (ETL) into enterprise knowledge graphs. Knowledge Graph Administration: Assist in maintaining, monitoring, and troubleshooting knowledge graph environments (e.g., Neo4j, RDF stores), including user access and data integrity. Test Automation: Develop and maintain basic automation scripts (preferably in Python) to streamline testing processes for AI functionalities. Data Quality Assurance: Evaluate and validate the quality of input and output data for AI models, reporting and documenting issues as needed. Bug Reporting & Documentation: Identify, document, and communicate bugs or issues discovered during testing. Maintain clear testing documentation and reports. Collaboration: Work closely with knowledge graph engineers, data scientists, and product managers to understand requirements and deliver robust solutions. Requirements: Education: Bachelor’s degree in Computer Science, Information Technology, or related field. Experience: Ideally some experience in software/AI testing, data engineering, or a similar technical role. Technical Skills: Proficient in Python (must-have) Experience with test case design, execution, and bug reporting Exposure to knowledge graph technologies (e.g., Neo4j, RDF, SPARQL) and data ingestion/ETL processes Analytical & Problem-Solving Skills: Strong attention to detail, ability to analyze data and systems, and troubleshoot issues. Communication: Clear verbal and written communication skills for documentation and collaboration. Preferred Qualifications: Experience with graph query languages (e.g., Cypher, SPARQL) Exposure to cloud platforms (AWS, Azure, GCP) and CI/CD workflows Familiarity with data quality and governance practices
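To illustrate the kind of ingestion-plus-testing work described above, here is a small hedged sketch: a toy ETL step that loads records into an RDF graph with rdflib, followed by a pytest-style check. The namespace, schema, and field names are invented for the example and are not from any actual posting or system.

# Hedged sketch: tiny knowledge-graph ingestion step plus a pytest-style test.
from rdflib import Graph, Literal, Namespace

EX = Namespace("http://example.org/")

def ingest(records):
    # Turn simple dict records into one (subject, predicate, object) triple each.
    g = Graph()
    for rec in records:
        g.add((EX[rec["id"]], EX.name, Literal(rec["name"])))
    return g

def test_ingest_creates_one_triple_per_record():
    g = ingest([{"id": "p1", "name": "Widget"}, {"id": "p2", "name": "Gadget"}])
    assert len(g) == 2
    assert (EX["p1"], EX.name, Literal("Widget")) in g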
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
pune, maharashtra
On-site
We help the world run better by enabling individuals to bring out their best at SAP. Our company culture is centered around collaboration and a shared passion for improving the world's operations. We strive each day to lay the groundwork for the future, fostering a workplace that celebrates diversity, values flexibility, and is dedicated to purpose-driven and forward-thinking work. Within our highly collaborative and supportive team environment, we prioritize learning and development, recognize individual contributions, and offer a range of benefit options for our employees. The SAP HANA Database and Analytics Core engine team is currently seeking an intermediate or senior developer to join our Knowledge Graph Database System engine development efforts. In this role, you will be responsible for designing, developing features, and maintaining our Knowledge Graph engine, which operates within the SAP HANA in-memory database. At SAP, all members of our engineering team, including management, are hands-on and deeply involved in the coding process. If you believe you can thrive in such an environment and possess the required skills and experience, we encourage you to apply without hesitation. As a developer on our team, you will have the opportunity to contribute to the following: The team you will be working with is responsible for the development of the HANA Knowledge Graph, a high-performance graph analytics database system that is accessible to SAP customers, partners, and internal groups as part of the HANA Multi Model Database System. This system is designed to handle large-scale graph data processing and execute complex graph queries with exceptional efficiency. By leveraging a massive parallel processing (MPP) architecture and adhering to the W3C web standards specifications for graph data and query language (RDF and SPARQL), the HANA Knowledge Graph enables organizations to extract insights from their graph datasets, identify patterns, conduct advanced graph analytics, and unlock the value of interconnected data. Key components of the HANA Knowledge Graph System include but are not limited to Storage, Data Load, Query Parsing, Query Planning and Optimization, Query Execution, Transaction Management, Memory Management, Network Communications, System Management, Data Persistence, Backup & Restore, Performance Tuning, and more. This system is poised to play a crucial role in the development of various AI products at SAP. At SAP, we believe in fostering a culture of inclusion, prioritizing the health and well-being of our employees, and offering flexible working models to ensure that everyone, regardless of background, feels valued and can perform at their best. We are dedicated to leveraging the unique capabilities and qualities that each individual brings to our company, investing in our employees' growth, and creating a more equitable world. SAP is an equal opportunity workplace and an affirmative action employer. We are committed to the values of Equal Employment Opportunity and provide accessibility accommodations to applicants with physical and/or mental disabilities. If you require accommodation or special assistance during the application process, please reach out to our Recruiting Operations Team at Careers@sap.com. For SAP employees, only permanent roles are eligible for the SAP Employee Referral Program, subject to the eligibility rules outlined in the SAP Referral Policy. Specific conditions may apply to roles in Vocational Training.
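For context on the W3C standards named above (RDF and SPARQL), a tiny illustrative query follows; it uses the open-source rdflib library rather than HANA itself, and the namespace and data are invented.

# Hedged sketch: build a small RDF graph and query it with SPARQL (rdflib, not HANA).
from rdflib import Graph

g = Graph()
g.parse(data="""
@prefix ex: <http://example.org/> .
ex:pump1 ex:partOf ex:plantA .
ex:valve2 ex:partOf ex:plantA .
""", format="turtle")

results = g.query("""
PREFIX ex: <http://example.org/>
SELECT ?part WHERE { ?part ex:partOf ex:plantA . }
""")
for row in results:
    print(row.part)  # prints the URIs of the two parts of ex:plantA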
Successful candidates may be subject to a background verification conducted by an external vendor. Requisition ID: 396628 | Work Area: Software-Design and Development | Expected Travel: 0 - 10% | Career Status: Professional | Employment Type: Regular Full Time | Additional Locations: #LI-Hybrid
Posted 1 week ago
2.0 - 6.0 years
0 Lacs
delhi
On-site
You are a Data Integration & Modeling Specialist responsible for developing common metamodels, defining integration specifications, and working with semantic web technologies and various data formats. Your expertise will contribute to enterprise-level data integration and standardization initiatives. Your key responsibilities include developing common metamodels by integrating requirements across systems and organizations, defining integration specifications, establishing data standards, and developing logical and physical data models. You will collaborate with stakeholders to align data architectures with organizational needs and industry best practices. Additionally, you will implement and govern semantic data solutions using RDF and SPARQL, perform data transformations and scripting using TCL, Python, and Java, and work with data formats like FRL, VRL, HRL, XML, and JSON to support integration and processing pipelines. Documenting technical specifications and providing guidance on data standards and modeling best practices are also part of your role. The qualifications required for this position include: - 3+ years of experience in developing common metamodels, preferably using NIEM standards. - 3+ years of experience in defining integration specifications, developing data models, and governing data standards within the last 8 years. - 2+ years of recent experience with Tool Command Language (TCL), Python, and Java. - 2+ years of experience with Resource Description Framework (RDF) and SPARQL Query Language. - 2+ years of experience working with Fixed Record Layout (FRL), Variable Record Layout (VRL), Hierarchical Record Layout (HRL), XML, and JSON. This is a contract position located remotely with a duration of 6 months. Join us to bring your technical expertise and collaborative mindset to drive successful data integration and modeling initiatives.
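As a small illustration of the record-layout work mentioned above, the following sketch parses one fixed record layout (FRL) line into JSON in Python; the field names and widths are hypothetical, not an actual agency layout.

# Hedged sketch: convert one fixed-width record into JSON using invented field widths.
import json

FIELDS = [("record_id", 6), ("name", 20), ("state", 2)]  # (field name, width)

def parse_frl(line):
    rec, pos = {}, 0
    for name, width in FIELDS:
        rec[name] = line[pos:pos + width].strip()
        pos += width
    return rec

line = "000042" + "John Q Public".ljust(20) + "NY"  # a made-up 28-character record
print(json.dumps(parse_frl(line)))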
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
hyderabad, telangana
On-site
You will be responsible for designing architectures for meta-learning, self-reflective agents, and recursive optimization loops. Your role will involve building simulation frameworks for behavior grounded in Bayesian dynamics, attractor theory, and teleo-dynamics. Additionally, you will develop systems that integrate graph rewriting, knowledge representation, and neurosymbolic reasoning. Conducting research on fractal intelligence structures, swarm-based agent coordination, and autopoietic systems will be part of your responsibilities. You are expected to advance Mobius's knowledge graph with ontologies supporting logic, agency, and emergent semantics. Integration of logic into distributed, policy-scoped decision graphs aligned with business and ethical constraints is crucial. Furthermore, publishing cutting-edge results and mentoring contributors in reflective system design and emergent AI theory will be part of your duties. Lastly, building scalable simulations of multi-agent, goal-directed, and adaptive ecosystems within the Mobius runtime is an essential aspect of the role. In terms of qualifications, you should have proven expertise in meta-learning, recursive architectures, and AI safety. Proficiency in distributed systems, multi-agent environments, and decentralized coordination is necessary. Strong implementation skills in Python are required, with additional proficiency in C++, functional, or symbolic languages being a plus. A publication record in areas intersecting AI research, complexity science, and/or emergent systems is also desired. Preferred qualifications include experience with neurosymbolic architectures and hybrid AI systems, fractal modeling, attractor theory, complex adaptive dynamics, topos theory, category theory, logic-based semantics, knowledge ontologies, OWL/RDF, semantic reasoners, autopoiesis, teleo-dynamics, biologically inspired system design, swarm intelligence, self-organizing behavior, emergent coordination, and distributed learning systems. In terms of technical proficiency, you should be proficient in programming languages such as Python (required), C++, Haskell, Lisp, or Prolog (preferred for symbolic reasoning), frameworks like PyTorch and TensorFlow, distributed systems including Ray, Apache Spark, Dask, Kubernetes, knowledge technologies like Neo4j, RDF, OWL, SPARQL, experiment management tools like MLflow, Weights & Biases, and GPU and HPC systems like CUDA, NCCL, Slurm. Familiarity with formal modeling tools like Z3, TLA+, Coq, Isabelle is also beneficial. Your core research domains will include recursive self-improvement and introspective AI, graph theory, graph rewriting, and knowledge graphs, neurosymbolic systems and ontological reasoning, fractal intelligence and dynamic attractor-based learning, Bayesian reasoning under uncertainty and cognitive dynamics, swarm intelligence and decentralized consensus modeling, topos theory and the abstract structure of logic spaces, autopoietic, self-sustaining system architectures, and teleo-dynamics and goal-driven adaptation in complex systems.
Posted 2 weeks ago
5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Why MResult? Founded in 2004, MResult is a global digital solutions partner trusted by leading Fortune 500 companies in industries such as pharma & healthcare, retail, and BFSI. MResult’s expertise in data and analytics, data engineering, machine learning, AI, and automation helps companies streamline operations and unlock business value. As part of our team, you will collaborate with top minds in the industry to deliver cutting-edge solutions that solve real-world challenges. What We Offer: At MResult, you can leave your mark on projects at the world’s most recognized brands, access opportunities to grow and upskill, and do your best work with the flexibility of hybrid work models. Great work is rewarded, and leaders are nurtured from within. Our values — Agility, Collaboration, Client Focus, Innovation, and Integrity — are woven into our culture, guiding every decision. What This Role Requires: In this role, you will be a key contributor to MResult’s mission of empowering our clients with data-driven insights and innovative digital solutions. Each day brings exciting challenges and growth opportunities. Here is what you will do: Roles and Responsibilities: • Design and implement machine learning models for classification, regression, clustering, NLP, or computer vision tasks. • Build and maintain scalable ML pipelines for training, validation, and deployment. • Optimize model performance and ensure robustness in production environments. • Build and fine-tune large language models (LLMs) for tasks such as summarization, Q&A, content generation, and code synthesis. • Relational AI experience: Design and implement relational knowledge graphs to model complex data relationships. • Use declarative logic and reasoning frameworks to extract insights from structured data. • Integrate AI models into production systems using APIs or microservices. • Work with DevOps/ML-Ops teams to automate model deployment and monitoring. • Ensure model explainability, fairness, and compliance with ethical AI standards. • Work cross-functionally with product managers, analysts, and engineers. • Document model architecture, data pipelines, and performance metrics. Driving Innovation, Empowering Insights • Participate in code reviews and contribute to best practices in AI/ML strategic development. Key Skills to Succeed in This Role: • Bachelor’s or Master’s degree in Computer Science, Data Science, or an AI-related field. • 5+ years of relevant experience in AI/ML engineering roles. • Hands-on experience with: • Generative AI models (e.g., GPT, LLaMA, Claude, DALL·E) • Python, PyTorch/TensorFlow, Hugging Face Transformers • Relational data modeling and knowledge graph tools (e.g., SQL, RDF, SPARQL, RelationalAI) • Prototyping tools (Streamlit, Gradio, Flask) • Cloud platforms (AWS, Azure) • Strong understanding of NLP, deep learning, and model evaluation techniques. • Experience with vector databases (e.g., FAISS, Pinecone) and retrieval-augmented generation (RAG). • Familiarity with LangChain, LlamaIndex, or similar GenAI orchestration frameworks. • Exposure to CI/CD, Docker, and Kubernetes for ML deployment. • Strong communication and collaboration skills. Manage, Master, and Maximize with MResult: MResult is an equal-opportunity employer committed to building an inclusive environment free of discrimination and harassment. Take the next step in your career with MResult — where your ideas help shape the future
Posted 2 weeks ago
0.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
Location Chennai, Tamil Nadu, India Job ID R-231576 Date posted 15/07/2025 Job Title: Senior Consultant – Data Engineer (Knowledge Graph) Career Level: D2 Introduction to role Are you ready to disrupt an industry and change lives? We are seeking a dynamic Senior Consultant – Data Engineer with expertise in knowledge graph concepts to join our team. Your work will have a direct impact on our ability to develop life-changing medicines, empowering the business to perform at its peak. Dive into a world where cutting-edge science meets leading digital technology platforms and data, all with a passion for impacting lives through data, analytics, AI, machine learning, and more. Accountabilities Collaborate with project teams across diverse domains to understand their data needs and provide expertise in data ingestion and enrichment processes. Design, develop, and maintain scalable data pipelines and ETL workflows for the Knowledge Graph Team. Implement advanced data engineering techniques to ensure optimal performance and reliability of data systems. Work closely with data scientists and analysts to ensure high-quality data for knowledge graph construction and advanced analytics. Troubleshoot and resolve complex issues related to data pipelines, ensuring efficient data flow. Optimize data storage and processing for performance, scalability, and cost-efficiency. Stay updated with the latest trends in data engineering, analytics, and AWS DevOps to drive innovation. Provide DevOps/CloudOps support for the Knowledge Graph Team as needed. Essential Skills/Experience Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field Strong hands-on programming experience (preferably Python) Experience working with any relational database (e.g., PostgreSQL, MySQL, SQL Server) Proficient in version control using GIT Solid understanding of data engineering principles and standard methodologies Desirable Skills/Experience Practical experience with Knowledge Graphs (RDF or LPG models) Proficiency in graph query languages such as SPARQL, Cypher, or Gremlin Hands-on experience with AWS services (e.g., S3, EC2, Lambda, Glue) Experience working with Snowflake for data warehousing or analytics Familiarity with Docker and containerized deployments Experience with data transformation tools like DBT Experience with data orchestration tools such as Apache Airflow Understanding of CI/CD and DevOps practices Knowledge of FAIR (Findable, Accessible, Interoperable, Reusable) data principles When we put unexpected teams in the same room, we unleash bold thinking with the power to inspire life-changing medicines. In-person working gives us the platform we need to connect, work at pace and challenge perceptions. That's why we work, on average, a minimum of three days per week from the office. But that doesn't mean we're not flexible. We balance the expectation of being in the office while respecting individual flexibility. Join us in our unique and ambitious world. At AstraZeneca, we demonstrate technology to impact patients and ultimately save lives. As part of a purpose-led global organization, we push the boundaries of science to discover and develop life-changing medicines. Our work unlocks the potential of science by improving efficiencies and driving productivity through automation and data simplification. With investment behind us, there's no slowing us down—join us at a crucial stage of our journey in becoming a digital and data-led enterprise. Ready to make a meaningful impact? 
Apply now and be part of our innovative team! Date Posted 16-Jul-2025 Closing Date AstraZeneca embraces diversity and equality of opportunity. We are committed to building an inclusive and diverse team representing all backgrounds, with as wide a range of perspectives as possible, and harnessing industry-leading skills. We believe that the more inclusive we are, the better our work will be. We welcome and consider applications to join our team from all qualified candidates, regardless of their characteristics. We comply with all applicable laws and regulations on non-discrimination in employment (and recruitment), as well as work authorization and employment eligibility verification requirements.
Posted 2 weeks ago
8.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
Seeking a highly experienced and technically adept AI/ML Engineer to spearhead a strategic initiative focused on analyzing annual changes in IRS-published TRCs and identifying their downstream impact on codebases. The role demands deep expertise in machine learning, knowledge graph construction, and software engineering processes. The ideal candidate will have a proven track record of delivering production-grade AI solutions in complex enterprise environments. Key Responsibilities: Design and develop an AI/ML-based system to detect and analyze differences in IRS TRC publications year-over-year. Implement knowledge graphs to model relationships between TRC changes and impacted code modules. Collaborate with tax domain experts, software engineers, and DevOps teams to ensure seamless integration of the solution into existing workflows. Define and enforce engineering best practices, including CI/CD, version control, testing, and model governance. Drive the end-to-end lifecycle of the solution—from data ingestion and model training to deployment and monitoring. Ensure scalability, performance, and reliability of the deployed system in a production environment. Mentor junior engineers and contribute to a culture of technical excellence and innovation. Required Skills & Experience: 8+ years of experience in software engineering, with at least 5 years in AI/ML solution delivery. Strong understanding of tax-related data structures, especially IRS TRCs, is a plus. Expertise in building and deploying machine learning models using Python, TensorFlow/PyTorch, and ML Ops frameworks. Hands-on experience with knowledge graph technologies (e.g., Neo4j, RDF, SPARQL, GraphQL). Deep familiarity with software architecture, microservices, and API design. Experience with NLP techniques for document comparison and semantic analysis. Proven ability to lead cross-functional teams and deliver complex projects on time. Strong communication and stakeholder management skills. Preferred Qualifications: Experience working on regulatory or compliance-driven AI applications. Familiarity with code analysis tools and static/dynamic code mapping techniques. Exposure to cloud platforms (Azure, AWS, GCP) and containerization (Docker, Kubernetes). Contributions to open-source AI/ML or graph-based projects. Skill Set Required - Must Have: 1. AI/ML 2. Python, TensorFlow/PyTorch, and ML Ops frameworks 3. Knowledge graph technologies 4. Data migration testing 5. Azure DevOps 6. Azure AI 7. US Tax understanding
Posted 2 weeks ago
6.0 - 8.0 years
7 - 12 Lacs
India, Bengaluru
Work from Office
Hello Visionary! We empower our people to stay resilient and relevant in a constantly evolving world. We’re looking for people who are always searching for creative ways to grow and learn. People who want to make a real impact, now and in the future. Does that sound like you? Then it seems like you’d make a great addition to our vibrant team. We are looking for a Semantic Web ETL Developer. Before our software developers write even a single line of code, they have to understand what drives our customers. What is the environment? What is the user story based on? Implementation means – trying, testing, and improving outcomes until a final solution emerges. Knowledge means exchange – discussions with colleagues from all over the world. Join our Digitalization Technology and Services (DTS) team based in Bangalore. YOU’LL MAKE A DIFFERENCE BY: - Implementing innovative Products and Solution Development processes and tools by applying your expertise in the field of responsibility. JOB REQUIREMENTS/ SKILLS: International experience with global projects and collaboration with intercultural teams is preferred 6 - 8 years’ experience in developing software solutions with the Python language. Experience in research and development processes (software-based solutions and products); in commercial topics; in implementation of strategies, POCs Manage end-to-end development of web applications and knowledge graph projects, ensuring best practices and high code quality. Provide technical guidance and mentorship to junior developers, fostering their growth and development. Design scalable and efficient architectures for web applications, knowledge graphs, and database models. Carry out code standards and perform code reviews, ensuring alignment with standard methodologies like PEP8, DRY, and SOLID principles. Collaborate with frontend developers, DevOps teams, and database administrators to deliver cohesive solutions. Strong and expert-like proficiency in Python web frameworks Django, Flask, FastAPI, and knowledge graph libraries. Experience in designing and developing complex RESTful APIs and microservices architectures. Strong understanding of security standard processes in web applications (e.g., authentication, authorization, and data protection). Extensive experience in building and querying knowledge graphs using Python libraries like RDFLib, Py2neo, or similar. Proficiency in SPARQL for advanced graph data querying. Experience with graph databases like Neo4j, GraphDB, Blazegraph, or AWS Neptune. Experience in expert functions like Software Development / Architecture, Software Testing (Unit Testing, Integration Testing). Excellent in DevOps practices, including CI/CD pipelines, containerization (Docker), and orchestration (Kubernetes). Excellent in Cloud technologies and architecture. Should have exposure to S3, EKS, ECR, AWS Neptune. Exposure to and working experience in the relevant Siemens sector domain (Industry, Energy, Healthcare, Infrastructure and Cities) required. LEADERSHIP QUALITIES Visionary Leadership: Ability to lead the team towards long-term technical goals while managing immediate priorities. Strong Communication: Good interpersonal skills to work effectively with both technical and non-technical stakeholders. Mentorship & Coaching: Foster a culture of continuous learning, skill development, and collaboration within the team. Conflict Resolution: Ability to manage team conflicts and provide constructive feedback to improve team dynamics. Create a better #TomorrowWithUs!
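To illustrate the combination of Python web frameworks and knowledge graph libraries this listing asks for, here is a hedged sketch of a minimal FastAPI endpoint backed by an in-memory rdflib graph; the route, namespace, and data are invented for the example.

# Hedged sketch: a tiny FastAPI micro-service answering a graph lookup via rdflib.
from fastapi import FastAPI
from rdflib import Graph, Literal, Namespace

EX = Namespace("http://example.org/")
app = FastAPI()

graph = Graph()
graph.add((EX.motor1, EX.locatedIn, Literal("Plant Bangalore")))  # toy data

@app.get("/assets/{asset_id}/location")
def asset_location(asset_id: str):
    # Collect all ex:locatedIn objects for the requested asset URI.
    locations = [str(obj) for obj in graph.objects(EX[asset_id], EX.locatedIn)]
    return {"asset": asset_id, "locations": locations}

# Run locally with: uvicorn app:app --reload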
This role is in Bangalore, where you’ll get the chance to work with teams impacting entire cities, countries – and the craft of things to come. We’re Siemens. A collection of over 312,000 minds building the future, one day at a time in over 200 countries. All employment decisions at Siemens are based on qualifications, merit and business need. Bring your curiosity and creativity and help us craft tomorrow. At Siemens, we are always challenging ourselves to build a better future. We need the most innovative and diverse Digital Minds to develop tomorrow ‘s reality. Find out more about the Digital world of Siemens here/digitalminds (http:///digitalminds)
Posted 3 weeks ago
6.0 - 8.0 years
5 - 9 Lacs
India, Bengaluru
Work from Office
Hello Visionary! We empower our people to stay resilient and relevant in a constantly evolving world. We’re looking for people who are always searching for creative ways to grow and learn. People who want to make a real impact, now and in the future. Does that sound like you? Then it seems like you’d make a great addition to our vibrant team. We are looking for a Semantic Web ETL Developer. Before our software developers write even a single line of code, they have to understand what drives our customers. What is the environment? What is the user story based on? Implementation means – trying, testing, and improving outcomes until a final solution emerges. Knowledge means exchange – discussions with colleagues from all over the world. Join our Digitalization Technology and Services (DTS) team based in Bangalore. YOU’LL MAKE A DIFFERENCE BY: - Implementing innovative Products and Solution Development processes and tools by applying your expertise in the field of responsibility. JOB REQUIREMENTS/ SKILLS: International experience with global projects and collaboration with intercultural teams is preferred 6 - 8 years’ experience in developing software solutions with the Python language. Experience in research and development processes (software-based solutions and products); in commercial topics; in implementation of strategies, POCs Manage end-to-end development of web applications and knowledge graph projects, ensuring best practices and high code quality. Provide technical guidance and mentorship to junior developers, fostering their growth and development. Design scalable and efficient architectures for web applications, knowledge graphs, and database models. Carry out code standards and perform code reviews, ensuring alignment with standard methodologies like PEP8, DRY, and SOLID principles. Collaborate with frontend developers, DevOps teams, and database administrators to deliver cohesive solutions. Strong and expert-like proficiency in Python web frameworks Django, Flask, FastAPI, and knowledge graph libraries. Experience in designing and developing complex RESTful APIs and microservices architectures. Strong understanding of security standard processes in web applications (e.g., authentication, authorization, and data protection). Extensive experience in building and querying knowledge graphs using Python libraries like RDFLib, Py2neo, or similar. Proficiency in SPARQL for advanced graph data querying. Experience with graph databases like Neo4j, GraphDB, Blazegraph, or AWS Neptune. Experience in expert functions like Software Development / Architecture, Software Testing (Unit Testing, Integration Testing). Excellent in DevOps practices, including CI/CD pipelines, containerization (Docker), and orchestration (Kubernetes). Excellent in Cloud technologies and architecture. Should have exposure to S3, EKS, ECR, AWS Neptune. Exposure to and working experience in the relevant Siemens sector domain (Industry, Energy, Healthcare, Infrastructure and Cities) required. LEADERSHIP QUALITIES Visionary Leadership: Ability to lead the team towards long-term technical goals while managing immediate priorities. Strong Communication: Good interpersonal skills to work effectively with both technical and non-technical stakeholders. Mentorship & Coaching: Foster a culture of continuous learning, skill development, and collaboration within the team. Conflict Resolution: Ability to manage team conflicts and provide constructive feedback to improve team dynamics. Create a better #TomorrowWithUs!
This role is in Bangalore , where you’ll get the chance to work with teams impacting entire cities, countries – and the craft of things to come. We’re Siemens. A collection of over 312,000 minds building the future, one day at a time in over 200 countries. All employment decisions at Siemens are based on qualifications, merit and business need. Bring your curiosity and creativity and help us craft tomorrow. At Siemens, we are always challenging ourselves to build a better future. We need the most innovative and diverse Digital Minds to develop tomorrow ‘s reality. Find out more about the Digital world of Siemens here/digitalminds (http:///digitalminds)
Posted 3 weeks ago
6.0 - 10.0 years
30 - 35 Lacs
Bengaluru
Work from Office
Job Title- Data Science Engineer, AVP Location- Bangalore, India Role Description We are seeking a seasoned Data Science Engineer to spearhead the development of intelligent, autonomous AI systems. The ideal candidate will have a robust background in agentic AI, LLMs, SLMs, vector DBs, and knowledge graphs. This role involves designing and deploying AI solutions that leverage Retrieval-Augmented Generation (RAG), multi-agent frameworks, and hybrid search techniques to enhance enterprise applications. What we’ll offer you 100% reimbursement under childcare assistance benefit (gender neutral) Sponsorship for industry-relevant certifications and education Accident and Term life Insurance Your key responsibilities Design & Develop Agentic AI Applications: Utilise frameworks like LangChain, CrewAI, and AutoGen to build autonomous agents capable of complex task execution. Implement RAG Pipelines: Integrate LLMs with vector databases (e.g., Milvus, FAISS) and knowledge graphs (e.g., Neo4j) to create dynamic, context-aware retrieval systems. Fine-Tune Language Models: Customise LLMs (e.g., Gemini, ChatGPT, Llama) and SLMs (e.g., spaCy, NLTK) using domain-specific data to improve performance and relevance in specialised applications. NER Models: Train OCR- and NLP-leveraged models to parse domain-specific details from documents (e.g., DocAI, Azure AI DIS, AWS IDP). Develop Knowledge Graphs: Construct and manage knowledge graphs to represent and query complex relationships within data, enhancing AI interpretability and reasoning. Collaborate Cross-Functionally: Work with data engineers, ML researchers, and product teams to align AI solutions with business objectives and technical requirements. Optimise AI Workflows: Employ MLOps practices to ensure scalable, maintainable, and efficient AI model deployment and monitoring. Your skills and experience 8+ years of professional experience in AI/ML development, with a focus on agentic AI systems. Proficient in Python, Python API frameworks, SQL, and familiar with AI/ML frameworks such as TensorFlow or PyTorch. Experience in deploying AI models on cloud platforms (e.g., GCP, AWS). Experience with LLMs (e.g., GPT-4), SLMs (spaCy), and prompt engineering. Understanding of semantic technologies, ontologies, and RDF/SPARQL. Familiarity with MLOps tools and practices for continuous integration and deployment. Skilled in building and querying knowledge graphs using tools like Neo4j. Hands-on experience with vector databases and embedding techniques. Familiarity with RAG architectures and hybrid search methodologies. Experience in developing AI solutions for specific industries such as healthcare, finance, or e-commerce. Strong problem-solving abilities and analytical thinking. Excellent communication skills for cross-functional collaboration. Ability to work independently and manage multiple projects simultaneously. How we’ll support you About us and our teams Please visit our company website for further information: https://www.db.com/company/company.htm We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
Posted 3 weeks ago
5.0 - 9.0 years
10 - 16 Lacs
Pune, Greater Noida, Delhi / NCR
Work from Office
Responsibilities: Create and optimize complex SPARQL (SPARQL Protocol and RDF Query Language) queries to retrieve and analyze data from graph databases. Develop graph-based applications and models to solve real-world problems and extract valuable insights from data. Design, develop, and maintain scalable data pipelines using Python and REST APIs to get data from different cloud platforms. Create and optimize complex SPARQL queries to retrieve and analyze data from graph databases. Study and understand the nodes, edges, and properties in graphs to represent and store data in relational databases. Write clean, efficient, and well-documented code. Troubleshoot and fix bugs. Collaborate with other developers and stakeholders to deliver high-quality solutions. Stay up-to-date with the latest technologies and trends. Qualifications: Strong proficiency in SPARQL and RDF. Strong proficiency in Python and REST APIs. Experience with database technologies such as SQL and SPARQL. Strong problem-solving and debugging skills. Ability to work independently and as part of a team. Preferred Skills: Knowledge of cloud platforms like AWS, Azure, or GCP. Experience with version control systems like GitHub. Understanding of environments, deployment processes, and cloud infrastructure. Share your resume at Aarushi.Shukla@coforge.com if you are an early or immediate joiner.
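As an illustration of the SPARQL-over-REST work described above, the sketch below posts a query to a SPARQL HTTP endpoint with the requests library and reads the standard JSON results; the endpoint URL and predicate are placeholders, not real services.

# Hedged sketch: query a (placeholder) SPARQL endpoint over HTTP and print the bindings.
import requests

ENDPOINT = "https://example.org/sparql"  # placeholder endpoint URL
QUERY = """
SELECT ?device ?site WHERE { ?device <http://example.org/installedAt> ?site . } LIMIT 10
"""

response = requests.post(
    ENDPOINT,
    data={"query": QUERY},
    headers={"Accept": "application/sparql-results+json"},
    timeout=30,
)
response.raise_for_status()
for binding in response.json()["results"]["bindings"]:
    print(binding["device"]["value"], binding["site"]["value"])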
Posted 3 weeks ago
15.0 - 20.0 years
37 - 45 Lacs
Bengaluru
Work from Office
Job Title- Senior Data Science Engineer Lead Location- Bangalore, India Role Description We are seeking a seasoned Data Science Engineer to spearhead the development of intelligent, autonomous AI systems. The ideal candidate will have a robust background in agentic AI, LLMs, SLMs, vector DBs, and knowledge graphs. This role involves designing and deploying AI solutions that leverage Retrieval-Augmented Generation (RAG), multi-agent frameworks, and hybrid search techniques to enhance enterprise applications. Deutsche Bank’s Corporate Bank division is a leading provider of cash management, trade finance and securities finance. We complete green-field projects that deliver the best Corporate Bank - Securities Services products in the world. Our team is diverse, international, and driven by a shared focus on clean code and valued delivery. At every level, agile minds are rewarded with competitive pay, support, and opportunities to excel. You will work as part of a cross-functional agile delivery team. You will bring an innovative approach to software development, focusing on using the latest technologies and practices, as part of a relentless focus on business value. You will be someone who sees engineering as a team activity, with a predisposition to open code, open discussion and creating a supportive, collaborative environment. You will be ready to contribute to all stages of software delivery, from initial analysis right through to production support. What we’ll offer you 100% reimbursement under childcare assistance benefit (gender neutral) Sponsorship for industry-relevant certifications and education Accident and Term life Insurance Your key responsibilities Design & Develop Agentic AI Applications: Utilize frameworks like LangChain, CrewAI, and AutoGen to build autonomous agents capable of complex task execution. Implement RAG Pipelines: Integrate LLMs with vector databases (e.g., Milvus, FAISS) and knowledge graphs (e.g., Neo4j) to create dynamic, context-aware retrieval systems. Fine-Tune Language Models: Customize LLMs and SLMs using domain-specific data to improve performance and relevance in specialized applications. NER Models: Train OCR- and NLP-leveraged models to parse domain-specific details from documents (e.g., DocAI, Azure AI DIS, AWS IDP). Develop Knowledge Graphs: Construct and manage knowledge graphs to represent and query complex relationships within data, enhancing AI interpretability and reasoning. Collaborate Cross-Functionally: Work with data engineers, ML researchers, and product teams to align AI solutions with business objectives and technical requirements. Optimize AI Workflows: Employ MLOps practices to ensure scalable, maintainable, and efficient AI model deployment and monitoring. Your skills and experience 15+ years of professional experience in AI/ML development, with a focus on agentic AI systems. Proficient in Python, Python API frameworks, SQL, and familiar with AI/ML frameworks such as TensorFlow or PyTorch. Experience in deploying AI models on cloud platforms (e.g., GCP, AWS). Experience with LLMs (e.g., GPT-4), SLMs, and prompt engineering. Understanding of semantic technologies, ontologies, and RDF/SPARQL. Familiarity with MLOps tools and practices for continuous integration and deployment. Skilled in building and querying knowledge graphs using tools like Neo4j. Hands-on experience with vector databases and embedding techniques. Familiarity with RAG architectures and hybrid search methodologies.
Experience in developing AI solutions for specific industries such as healthcare, finance, or e-commerce. Strong problem-solving abilities and analytical thinking. Excellent communication skills for cross-functional collaboration. Ability to work independently and manage multiple projects simultaneously. How we’ll support you
Posted 3 weeks ago
5.0 - 8.0 years
30 - 40 Lacs
Bengaluru
Work from Office
Data Engineer and Developer. We are seeking a Data Engineer who will define and build the foundational architecture for our data platform: the bedrock upon which our applications will thrive. You'll collaborate closely with application developers, translating their needs into platform capabilities that turbocharge development. From the start, you'll architect for scale, ensuring our data flows seamlessly through every stage of its lifecycle: collection, modeling, cleansing, enrichment, securing, and storing data in an optimal format. Think of yourself as the mastermind orchestrating an evolving data ecosystem, engineered to adapt and excel amid tomorrow's challenges. We are looking for a Data Engineer with 5+ years of experience who has: Database Versatility: Deep expertise working with relational databases (PostgreSQL, MS SQL, and beyond) as well as NoSQL systems (such as MongoDB, Cassandra, Elasticsearch). Graph Databases: Design and implement scalable graph databases to model complex relationships between entities for use in GenAI agent architectures, using Neo4j, Dgraph, or ArangoDB and query languages such as Cypher, SPARQL, or GraphQL. Data Lifecycle Expertise: Skilled in all aspects of data management: collection, storage, integration, quality, and pipeline design. Programming Proficiency: Adept in programming languages such as Python and Go. Collaborative Mindset: Experienced in partnering with GenAI Engineers and Data Scientists. Modern Data Paradigms: A strong grasp of Data Mesh, Data Products, and Data Fabric. Understanding of DataOps and Domain-Driven Design (DDD) is a plus.
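To make the graph-database requirement above concrete, here is a hedged sketch using the official Neo4j Python driver to run a Cypher query. The connection URI, credentials, and the (:Entity)-[:RELATES_TO]->(:Entity) data model are assumptions for illustration only; a real schema would come from the platform's own modelling work.

```python
from neo4j import GraphDatabase  # pip install neo4j

# Connection details are placeholders for a local Neo4j instance.
URI = "bolt://localhost:7687"
AUTH = ("neo4j", "password")

with GraphDatabase.driver(URI, auth=AUTH) as driver:
    with driver.session() as session:
        # Cypher query over a hypothetical entity-relationship model of the
        # kind a GenAI agent might traverse for grounded context.
        result = session.run(
            "MATCH (e:Entity {name: $name})-[:RELATES_TO]->(n:Entity) "
            "RETURN n.name AS related LIMIT 10",
            name="Invoice",
        )
        for record in result:
            print(record["related"])
```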
Posted 3 weeks ago
6.0 - 8.0 years
0 Lacs
Bengaluru
On-site
Hello Visionary! We empower our people to stay resilient and relevant in a constantly evolving world. We're looking for people who are always searching for creative ways to grow and learn. People who want to make a real impact, now and in the future. Does that sound like you? Then it seems like you'd make a great addition to our vibrant team. We are looking for a Semantic Web ETL Developer. Before our software developers write even a single line of code, they have to understand what drives our customers. What is the environment? What is the user story based on? Implementation means trying, testing, and improving outcomes until a final solution emerges. Knowledge means exchange: discussions with colleagues from all over the world. Join our Digitalization Technology and Services (DTS) team based in Bangalore. YOU'LL MAKE A DIFFERENCE BY: Implementing innovative product and solution development processes and tools by applying your expertise in the field of responsibility. JOB REQUIREMENTS / SKILLS: International experience with global projects and collaboration with intercultural teams is preferred. 6-8 years' experience developing software solutions with the Python language. Experience in research and development processes (software-based solutions and products), in commercial topics, and in the implementation of strategies and POCs. Manage end-to-end development of web applications and knowledge graph projects, ensuring best practices and high code quality. Provide technical guidance and mentorship to junior developers, fostering their growth and development. Design scalable and efficient architectures for web applications, knowledge graphs, and database models. Enforce code standards and perform code reviews, ensuring alignment with standard methodologies like PEP8, DRY, and SOLID principles. Collaborate with frontend developers, DevOps teams, and database administrators to deliver cohesive solutions. Expert-level proficiency in Python web frameworks (Django, Flask, FastAPI) and knowledge graph libraries. Experience in designing and developing complex RESTful APIs and microservices architectures. Strong understanding of standard security processes in web applications (e.g., authentication, authorization, and data protection). Extensive experience in building and querying knowledge graphs using Python libraries like RDFLib, Py2neo, or similar. Proficiency in SPARQL for advanced graph data querying. Experience with graph databases like Neo4j, GraphDB, Blazegraph, or AWS Neptune. Experience in expert functions like software development/architecture and software testing (unit testing, integration testing). Excellent in DevOps practices, including CI/CD pipelines, containerization (Docker), and orchestration (Kubernetes). Excellent in cloud technologies and architecture; should have exposure to S3, EKS, ECR, and AWS Neptune. Exposure to and working experience in the relevant Siemens sector domain (Industry, Energy, Healthcare, Infrastructure and Cities) is required. LEADERSHIP QUALITIES: Visionary Leadership: Ability to lead the team towards long-term technical goals while managing immediate priorities. Strong Communication: Good interpersonal skills to work effectively with both technical and non-technical stakeholders. Mentorship & Coaching: Foster a culture of continuous learning, skill development, and collaboration within the team. Conflict Resolution: Ability to manage team conflicts and provide constructive feedback to improve team dynamics. Create a better #TomorrowWithUs!
This role is in Bangalore, where you'll get the chance to work with teams impacting entire cities, countries, and the craft of things to come. We're Siemens. A collection of over 312,000 minds building the future, one day at a time in over 200 countries. We're dedicated to equality, and we encourage applications that reflect the diversity of the communities we work in. All employment decisions at Siemens are based on qualifications, merit and business need. Bring your curiosity and creativity and help us craft tomorrow. At Siemens, we are always challenging ourselves to build a better future. We need the most innovative and diverse Digital Minds to develop tomorrow's reality. Find out more about the Digital world of Siemens here: www.siemens.com/careers/digitalminds. (A short RDFLib/SPARQL sketch follows this listing.)
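As a small illustration of the RDFLib and SPARQL skills this listing asks for, the sketch below builds an in-memory RDF graph and queries it with SPARQL. The http://example.org/product/ namespace and the triples are invented for the example and are not tied to any real Siemens data model.

```python
from rdflib import Graph, Literal, Namespace, RDF, RDFS

EX = Namespace("http://example.org/product/")  # hypothetical namespace

g = Graph()
g.bind("ex", EX)

# A tiny, invented product taxonomy expressed as RDF triples.
g.add((EX.Laptop, RDF.type, RDFS.Class))
g.add((EX.Ultrabook, RDFS.subClassOf, EX.Laptop))
g.add((EX.Ultrabook, RDFS.label, Literal("Ultrabook")))

# Query the in-memory graph with SPARQL: find subclasses of Laptop.
query = """
PREFIX ex: <http://example.org/product/>
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
SELECT ?cls ?label WHERE {
    ?cls rdfs:subClassOf ex:Laptop .
    OPTIONAL { ?cls rdfs:label ?label }
}
"""
for row in g.query(query):
    print(row[0], row[1])
```

The same query text would also run against a triple store such as GraphDB or Blazegraph once the graph is persisted there.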
Posted 3 weeks ago
4.0 - 9.0 years
10 - 19 Lacs
Pune, Greater Noida, Delhi / NCR
Work from Office
Responsibilities: Create and optimize complex SPARQL (SPARQL Protocol and RDF Query Language) queries to retrieve and analyze data from graph databases. Develop graph-based applications and models to solve real-world problems and extract valuable insights from data. Design, develop, and maintain scalable data pipelines using Python and REST APIs to get data from different cloud platforms. Study and understand the nodes, edges, and properties in graphs to represent and store data in relational databases. Mandatory Skills: Python, RDF, Neo4j, GraphDB, version control systems, API frameworks. Qualifications: Strong proficiency in SPARQL and the RDF data model. Strong proficiency in Python and REST APIs. Experience with database technologies: SQL and SPARQL. Preferred Skills: Knowledge of cloud platforms like AWS, Azure, or GCP. Experience with version control systems like GitHub. Understanding of environments, deployment processes, and cloud infrastructure.
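For the SPARQL querying responsibilities above, here is a minimal Python sketch using SPARQLWrapper against a remote endpoint. The public Wikidata endpoint, the user-agent string, and the sample query are stand-ins chosen only because they are freely queryable; in the role described, the endpoint would be an internal GraphDB or similar triple store.

```python
from SPARQLWrapper import SPARQLWrapper, JSON  # pip install sparqlwrapper

# Public endpoint used purely for illustration.
endpoint = SPARQLWrapper(
    "https://query.wikidata.org/sparql",
    agent="example-sparql-client/0.1 (illustrative)",
)
endpoint.setQuery(
    """
    SELECT ?item ?itemLabel WHERE {
      ?item wdt:P31 wd:Q146 .   # instances of 'house cat'
      SERVICE wikibase:label { bd:serviceParam wikibase:language "en". }
    }
    LIMIT 5
    """
)
endpoint.setReturnFormat(JSON)

# Parse the JSON result bindings returned by the endpoint.
for binding in endpoint.query().convert()["results"]["bindings"]:
    print(binding["itemLabel"]["value"])
```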
Posted 3 weeks ago
6.0 - 8.0 years
5 - 5 Lacs
Bengaluru
On-site
Hello Visionary! We empower our people to stay resilient and relevant in a constantly evolving world. We're looking for people who are always searching for creative ways to grow and learn. People who want to make a real impact, now and in the future. Does that sound like you? Then it seems like you'd make a great addition to our vibrant team. We are looking for a Semantic Web ETL Developer. Before our software developers write even a single line of code, they have to understand what drives our customers. What is the environment? What is the user story based on? Implementation means trying, testing, and improving outcomes until a final solution emerges. Knowledge means exchange: discussions with colleagues from all over the world. Join our Digitalization Technology and Services (DTS) team based in Bangalore. YOU'LL MAKE A DIFFERENCE BY: Implementing innovative product and solution development processes and tools by applying your expertise in the field of responsibility. JOB REQUIREMENTS / SKILLS: International experience with global projects and collaboration with intercultural teams is preferred. 6-8 years' experience developing software solutions with the Python language. Experience in research and development processes (software-based solutions and products), in commercial topics, and in the implementation of strategies and POCs. Manage end-to-end development of web applications and knowledge graph projects, ensuring best practices and high code quality. Provide technical guidance and mentorship to junior developers, fostering their growth and development. Design scalable and efficient architectures for web applications, knowledge graphs, and database models. Enforce code standards and perform code reviews, ensuring alignment with standard methodologies like PEP8, DRY, and SOLID principles. Collaborate with frontend developers, DevOps teams, and database administrators to deliver cohesive solutions. Expert-level proficiency in Python web frameworks (Django, Flask, FastAPI) and knowledge graph libraries. Experience in designing and developing complex RESTful APIs and microservices architectures. Strong understanding of standard security processes in web applications (e.g., authentication, authorization, and data protection). Extensive experience in building and querying knowledge graphs using Python libraries like RDFLib, Py2neo, or similar. Proficiency in SPARQL for advanced graph data querying. Experience with graph databases like Neo4j, GraphDB, Blazegraph, or AWS Neptune. Experience in expert functions like software development/architecture and software testing (unit testing, integration testing). Excellent in DevOps practices, including CI/CD pipelines, containerization (Docker), and orchestration (Kubernetes). Excellent in cloud technologies and architecture; should have exposure to S3, EKS, ECR, and AWS Neptune. Exposure to and working experience in the relevant Siemens sector domain (Industry, Energy, Healthcare, Infrastructure and Cities) is required. LEADERSHIP QUALITIES: Visionary Leadership: Ability to lead the team towards long-term technical goals while managing immediate priorities. Strong Communication: Good interpersonal skills to work effectively with both technical and non-technical stakeholders. Mentorship & Coaching: Foster a culture of continuous learning, skill development, and collaboration within the team. Conflict Resolution: Ability to manage team conflicts and provide constructive feedback to improve team dynamics. Create a better #TomorrowWithUs!
This role is in Bangalore, where you'll get the chance to work with teams impacting entire cities, countries, and the craft of things to come. We're Siemens. A collection of over 312,000 minds building the future, one day at a time in over 200 countries. We're dedicated to equality, and we encourage applications that reflect the diversity of the communities we work in. All employment decisions at Siemens are based on qualifications, merit and business need. Bring your curiosity and creativity and help us craft tomorrow. At Siemens, we are always challenging ourselves to build a better future. We need the most innovative and diverse Digital Minds to develop tomorrow's reality. Find out more about the Digital world of Siemens here: www.siemens.com/careers/digitalminds. (A FastAPI plus RDFLib sketch follows this listing.)
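Because this listing pairs Python web frameworks with knowledge-graph libraries, here is a hedged sketch of a FastAPI endpoint serving the results of a SPARQL query over an in-memory RDFLib graph. The namespace, triples, and route name are illustrative assumptions, not part of any real codebase.

```python
from fastapi import FastAPI  # pip install fastapi uvicorn
from rdflib import Graph, Literal, Namespace, RDFS

EX = Namespace("http://example.org/asset/")  # hypothetical namespace

# Tiny in-memory graph standing in for an enterprise knowledge graph.
graph = Graph()
graph.add((EX.Turbine, RDFS.label, Literal("Gas turbine")))
graph.add((EX.Sensor, RDFS.label, Literal("Vibration sensor")))

app = FastAPI(title="Knowledge graph lookup (sketch)")

@app.get("/labels")
def labels() -> dict:
    """Return all rdfs:label values via a SPARQL query over the graph."""
    rows = graph.query(
        "SELECT ?label WHERE { ?s <http://www.w3.org/2000/01/rdf-schema#label> ?label }"
    )
    return {"labels": [str(row[0]) for row in rows]}

# Run locally with: uvicorn app:app --reload   (assuming this file is app.py)
```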
Posted 3 weeks ago
3.0 - 6.0 years
5 - 5 Lacs
Bengaluru
On-site
Hello Visionary! We empower our people to stay resilient and relevant in a constantly evolving world. We're looking for people who are always searching for creative ways to grow and learn. People who want to make a real impact, now and in the future. Does that sound like you? Then it seems like you'd make a great addition to our vibrant team. We are looking for a Junior Semantic Web ETL Developer. Before our software developers write even a single line of code, they have to understand what drives our customers. What is the environment? What is the user story based on? Implementation means trying, testing, and improving outcomes until a final solution emerges. Knowledge means exchange: discussions with colleagues from all over the world. Join our Digitalization Technology and Services (DTS) team based in Bangalore. YOU'LL MAKE A DIFFERENCE BY: Implementing innovative product and solution development processes and tools by applying your expertise in the field of responsibility. JOB REQUIREMENTS / SKILLS: International experience with global projects and collaboration with intercultural teams is preferred. 3-6 years' experience developing software solutions with various application programming languages. Experience in research and development processes (software-based solutions and products), in commercial topics, and in the implementation of strategies and POCs. Experience in expert functions like software development/architecture. Strong knowledge of Python fundamentals, including object-oriented programming, data structures, and algorithms. Exposure to semantics, knowledge graphs, data modelling, and ontologies is preferred. Proficiency in Python-based web frameworks such as Flask and FastAPI. Experience in building and consuming RESTful APIs. Knowledge of web technologies like HTML, CSS, and JavaScript (basic understanding for integration purposes). Experience with libraries such as RDFLib or Py2neo for building and querying knowledge graphs. Familiarity with SPARQL for querying data from knowledge graphs. Understanding of graph databases like Neo4j, GraphDB, or Blazegraph. Experience with version control systems like Git. Familiarity with CI/CD pipelines and tools like Jenkins or GitLab CI. Experience with containerization (Docker) and orchestration (Kubernetes). Familiar with the basics of AWS services such as S3, and with AWS Neptune. Excellent command of written and spoken English and strong presentation skills. Good to have: Exposure to and working experience in the relevant Siemens sector domain (Industry, Energy, Healthcare, Infrastructure and Cities). Strong experience in data engineering and analytics (optional). Experience and exposure to testing frameworks and unit test cases. Expertise in data engineering, building data pipelines, and implementing algorithms in a distributed environment. Very good experience with data science and machine learning (optional). Experience with developing and deploying web applications on the cloud, with a solid understanding of one or more of the following: Django (optional). Drive adoption of cloud technology for data processing and warehousing. Experience in working with multiple databases, especially in the NoSQL world. Understanding of web servers, load balancers, and deployment processes/activities. Advanced knowledge of the software development life cycle. Advanced knowledge of the software engineering process. Experience in Jira and Confluence will be an added advantage.
Experience with Agile/Lean development methods using Scrum. Experience in rapid programming techniques and TDD (optional). Takes strong initiative and is highly results-oriented. Good at communicating within the team as well as with all stakeholders. Strong customer focus and a good learner. Highly proactive team player. Ready to travel for onsite job assignments (short to long term). Create a better #TomorrowWithUs! This role is in Bangalore, where you'll get the chance to work with teams impacting entire cities, countries, and the craft of things to come. We're Siemens. A collection of over 312,000 minds building the future, one day at a time in over 200 countries. We're dedicated to equality, and we encourage applications that reflect the diversity of the communities we work in. All employment decisions at Siemens are based on qualifications, merit and business need. Bring your curiosity and creativity and help us craft tomorrow. At Siemens, we are always challenging ourselves to build a better future. We need the most innovative and diverse Digital Minds to develop tomorrow's reality. Find out more about the Digital world of Siemens here: www.siemens.com/careers/digitalminds. (A short Py2neo sketch follows this listing.)
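As a junior-level illustration of the Py2neo and Cypher skills mentioned above, the sketch below creates a tiny graph in a local Neo4j instance and reads it back. The connection details, labels, and property names are placeholders; note also that Py2neo has been retired by its maintainers, so the official neo4j driver is a common alternative for new code.

```python
from py2neo import Graph, Node, Relationship  # pip install py2neo

# Connection details are placeholders for a local Neo4j instance.
graph = Graph("bolt://localhost:7687", auth=("neo4j", "password"))

# Create two nodes and a relationship: a minimal knowledge-graph fragment.
plant = Node("Site", name="Bangalore Plant")
line = Node("AssemblyLine", name="Line 1")
graph.create(Relationship(plant, "HAS_LINE", line))

# Query the fragment back with Cypher.
for record in graph.run("MATCH (s:Site)-[:HAS_LINE]->(l) RETURN s.name, l.name"):
    print(record["s.name"], "->", record["l.name"])
```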
Posted 3 weeks ago
6.0 - 8.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Hello Visionary! We empower our people to stay resilient and relevant in a constantly evolving world. We're looking for people who are always searching for creative ways to grow and learn. People who want to make a real impact, now and in the future. Does that sound like you? Then it seems like you'd make a great addition to our vibrant team. We are looking for a Semantic Web ETL Developer. Before our software developers write even a single line of code, they have to understand what drives our customers. What is the environment? What is the user story based on? Implementation means trying, testing, and improving outcomes until a final solution emerges. Knowledge means exchange: discussions with colleagues from all over the world. Join our Digitalization Technology and Services (DTS) team based in Bangalore. YOU'LL MAKE A DIFFERENCE BY: - Implementing innovative product and solution development processes and tools by applying your expertise in the field of responsibility. JOB REQUIREMENTS / SKILLS: - International experience with global projects and collaboration with intercultural teams is preferred. - 6-8 years' experience developing software solutions with the Python language. - Experience in research and development processes (software-based solutions and products), in commercial topics, and in the implementation of strategies and POCs. - Manage end-to-end development of web applications and knowledge graph projects, ensuring best practices and high code quality. - Provide technical guidance and mentorship to junior developers, fostering their growth and development. - Design scalable and efficient architectures for web applications, knowledge graphs, and database models. - Enforce code standards and perform code reviews, ensuring alignment with standard methodologies like PEP8, DRY, and SOLID principles. - Collaborate with frontend developers, DevOps teams, and database administrators to deliver cohesive solutions. - Expert-level proficiency in Python web frameworks (Django, Flask, FastAPI) and knowledge graph libraries. - Experience in designing and developing complex RESTful APIs and microservices architectures. - Strong understanding of standard security processes in web applications (e.g., authentication, authorization, and data protection). - Extensive experience in building and querying knowledge graphs using Python libraries like RDFLib, Py2neo, or similar. - Proficiency in SPARQL for advanced graph data querying. - Experience with graph databases like Neo4j, GraphDB, Blazegraph, or AWS Neptune. - Experience in expert functions like software development/architecture and software testing (unit testing, integration testing). - Excellent in DevOps practices, including CI/CD pipelines, containerization (Docker), and orchestration (Kubernetes). - Excellent in cloud technologies and architecture; should have exposure to S3, EKS, ECR, and AWS Neptune. - Exposure to and working experience in the relevant Siemens sector domain (Industry, Energy, Healthcare, Infrastructure and Cities) is required. LEADERSHIP QUALITIES - Visionary Leadership: Ability to lead the team towards long-term technical goals while managing immediate priorities. - Strong Communication: Good interpersonal skills to work effectively with both technical and non-technical stakeholders. - Mentorship & Coaching: Foster a culture of continuous learning, skill development, and collaboration within the team. - Conflict Resolution: Ability to manage team conflicts and provide constructive feedback to improve team dynamics. Create a better #TomorrowWithUs!
This role is in Bangalore, where you'll get the chance to work with teams impacting entire cities, countries, and the craft of things to come. We're Siemens. A collection of over 312,000 minds building the future, one day at a time in over 200 countries. All employment decisions at Siemens are based on qualifications, merit and business need. Bring your curiosity and creativity and help us craft tomorrow. At Siemens, we are always challenging ourselves to build a better future. We need the most innovative and diverse Digital Minds to develop tomorrow's reality. Find out more about the Digital world of Siemens here: www.siemens.com/careers/digitalminds
Posted 1 month ago