
111 SPARQL Jobs - Page 3

Set up a job alert
JobPe aggregates results for easy access, but you apply directly on the job portal itself.

8.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

Seeking a highly experienced and technically adept AI/ML Engineer to spearhead a strategic initiative focused on analyzing annual changes in IRS-published TRCs and identifying their downstream impact on codebases. The role demands deep expertise in machine learning, knowledge graph construction, and software engineering processes. The ideal candidate will have a proven track record of delivering production-grade AI solutions in complex enterprise environments. Key Responsibilities: Design and develop an AI/ML-based system to detect and analyze differences in IRS TRC publications year-over-year. Implement knowledge graphs to model relationships between TRC changes and impacted code modules. Collaborate with tax domain experts, software engineers, and DevOps teams to ensure seamless integration of the solution into existing workflows. Define and enforce engineering best practices, including CI/CD, version control, testing, and model governance. Drive the end-to-end lifecycle of the solution, from data ingestion and model training to deployment and monitoring. Ensure scalability, performance, and reliability of the deployed system in a production environment. Mentor junior engineers and contribute to a culture of technical excellence and innovation. Required Skills & Experience: 8+ years of experience in software engineering, with at least 5 years in AI/ML solution delivery. Strong understanding of tax-related data structures, especially IRS TRCs, is a plus. Expertise in building and deploying machine learning models using Python, TensorFlow/PyTorch, and MLOps frameworks. Hands-on experience with knowledge graph technologies (e.g., Neo4j, RDF, SPARQL, GraphQL). Deep familiarity with software architecture, microservices, and API design. Experience with NLP techniques for document comparison and semantic analysis. Proven ability to lead cross-functional teams and deliver complex projects on time. Strong communication and stakeholder management skills.
Preferred Qualifications: Experience working on regulatory or compliance-driven AI applications. Familiarity with code analysis tools and static/dynamic code mapping techniques. Exposure to cloud platforms (Azure, AWS, GCP) and containerization (Docker, Kubernetes). Contributions to open-source AI/ML or graph-based projects. Skill Set Required (Must Have): 1. AI/ML 2. Python, TensorFlow/PyTorch, and MLOps frameworks 3. Knowledge graph technologies 4. Data migration testing 5. Azure DevOps 6. Azure AI 7. US tax understanding

Posted 2 months ago

Apply

6.0 - 8.0 years

7 - 12 Lacs

Bengaluru, India

Work from Office

Hello Visionary! We empower our people to stay resilient and relevant in a constantly evolving world. We’re looking for people who are always searching for creative ways to grow and learn. People who want to make a real impact, now and in the future. Does that sound like you? Then it seems like you’d make a great addition to our vibrant team. We are looking for a Semantic Web ETL Developer. Before our software developers write even a single line of code, they have to understand what drives our customers. What is the environment? What is the user story based on? Implementation means – trying, testing, and improving outcomes until a final solution emerges. Knowledge means exchange – discussions with colleagues from all over the world. Join our Digitalization Technology and Services (DTS) team based in Bangalore. YOU’LL MAKE A DIFFERENCE BY: Implementing innovative Products and Solution Development processes and tools by applying your expertise in the field of responsibility. JOB REQUIREMENTS/SKILLS: International experience with global projects and collaboration with intercultural teams is preferred. 6 - 8 years’ experience in developing software solutions with the Python language. Experience in research and development processes (software-based solutions and products), in commercial topics, and in implementation of strategies and POCs. Manage end-to-end development of web applications and knowledge graph projects, ensuring best practices and high code quality. Provide technical guidance and mentorship to junior developers, fostering their growth and development. Design scalable and efficient architectures for web applications, knowledge graphs, and database models. Carry out code standards and perform code reviews, ensuring alignment with standard methodologies like PEP8, DRY, and SOLID principles. Collaborate with frontend developers, DevOps teams, and database administrators to deliver cohesive solutions.
Expert-level proficiency in Python web frameworks (Django, Flask, FastAPI) and knowledge graph libraries. Experience in designing and developing complex RESTful APIs and microservices architectures. Strong understanding of security standard processes in web applications (e.g., authentication, authorization, and data protection). Extensive experience in building and querying knowledge graphs using Python libraries like RDFLib, Py2neo, or similar. Proficiency in SPARQL for advanced graph data querying. Experience with graph databases like Neo4j, GraphDB, Blazegraph, or AWS Neptune. Experience in expert functions like Software Development / Architecture and Software Testing (Unit Testing, Integration Testing). Excellent in DevOps practices, including CI/CD pipelines, containerization (Docker), and orchestration (Kubernetes). Excellent in cloud technologies and architecture; should have exposure to S3, EKS, ECR, and AWS Neptune. Exposure to and working experience in the relevant Siemens sector domain (Industry, Energy, Healthcare, Infrastructure and Cities) required. LEADERSHIP QUALITIES Visionary Leadership: Ability to lead the team towards long-term technical goals while managing immediate priorities. Strong Communication: Good interpersonal skills to work effectively with both technical and non-technical stakeholders. Mentorship & Coaching: Foster a culture of continuous learning, skill development, and collaboration within the team. Conflict Resolution: Ability to manage team conflicts and provide constructive feedback to improve team dynamics. Create a better #TomorrowWithUs! This role is in Bangalore, where you’ll get the chance to work with teams impacting entire cities, countries – and the craft of things to come. We’re Siemens. A collection of over 312,000 minds building the future, one day at a time in over 200 countries. All employment decisions at Siemens are based on qualifications, merit and business need.
Bring your curiosity and creativity and help us craft tomorrow. At Siemens, we are always challenging ourselves to build a better future. We need the most innovative and diverse Digital Minds to develop tomorrow’s reality. Find out more about the Digital world of Siemens here: www.siemens.com/careers/digitalminds (http://www.siemens.com/careers/digitalminds)
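The requirements above center on building and querying knowledge graphs with Python libraries such as RDFLib. As a minimal, stdlib-only sketch of the underlying idea, here is a toy triple store: subject-predicate-object triples plus wildcard pattern matching, roughly what an RDF library's graph-and-pattern API provides. All class names and sample data below are invented for illustration; a real project would use RDFLib or a graph database rather than this.

```python
# Toy triple store: the data model behind RDF libraries like RDFLib.
# All names and sample data here are illustrative, not from the posting.

class TripleStore:
    def __init__(self):
        self.triples = set()

    def add(self, subject, predicate, obj):
        self.triples.add((subject, predicate, obj))

    def match(self, s=None, p=None, o=None):
        """Return triples matching a pattern; None acts as a wildcard,
        much like a variable in a SPARQL basic graph pattern."""
        return [
            t for t in self.triples
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)
        ]

store = TripleStore()
store.add("ex:Alice", "ex:worksOn", "ex:KnowledgeGraph")
store.add("ex:Bob", "ex:worksOn", "ex:ETLPipeline")
store.add("ex:KnowledgeGraph", "ex:storedIn", "ex:Neo4j")

# "Who works on what?" -- analogous to SELECT ?s ?o WHERE { ?s ex:worksOn ?o }
workers = store.match(p="ex:worksOn")
```

The wildcard-match pattern is the same mental model you apply when writing SPARQL basic graph patterns against Neo4j, GraphDB, Blazegraph, or AWS Neptune.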

Posted 2 months ago

Apply

6.0 - 8.0 years

5 - 9 Lacs

Bengaluru, India

Work from Office

Hello Visionary! We empower our people to stay resilient and relevant in a constantly evolving world. We’re looking for people who are always searching for creative ways to grow and learn. People who want to make a real impact, now and in the future. Does that sound like you? Then it seems like you’d make a great addition to our vibrant team. We are looking for a Semantic Web ETL Developer. Before our software developers write even a single line of code, they have to understand what drives our customers. What is the environment? What is the user story based on? Implementation means – trying, testing, and improving outcomes until a final solution emerges. Knowledge means exchange – discussions with colleagues from all over the world. Join our Digitalization Technology and Services (DTS) team based in Bangalore. YOU’LL MAKE A DIFFERENCE BY: Implementing innovative Products and Solution Development processes and tools by applying your expertise in the field of responsibility. JOB REQUIREMENTS/SKILLS: International experience with global projects and collaboration with intercultural teams is preferred. 6 - 8 years’ experience in developing software solutions with the Python language. Experience in research and development processes (software-based solutions and products), in commercial topics, and in implementation of strategies and POCs. Manage end-to-end development of web applications and knowledge graph projects, ensuring best practices and high code quality. Provide technical guidance and mentorship to junior developers, fostering their growth and development. Design scalable and efficient architectures for web applications, knowledge graphs, and database models. Carry out code standards and perform code reviews, ensuring alignment with standard methodologies like PEP8, DRY, and SOLID principles. Collaborate with frontend developers, DevOps teams, and database administrators to deliver cohesive solutions.
Expert-level proficiency in Python web frameworks (Django, Flask, FastAPI) and knowledge graph libraries. Experience in designing and developing complex RESTful APIs and microservices architectures. Strong understanding of security standard processes in web applications (e.g., authentication, authorization, and data protection). Extensive experience in building and querying knowledge graphs using Python libraries like RDFLib, Py2neo, or similar. Proficiency in SPARQL for advanced graph data querying. Experience with graph databases like Neo4j, GraphDB, Blazegraph, or AWS Neptune. Experience in expert functions like Software Development / Architecture and Software Testing (Unit Testing, Integration Testing). Excellent in DevOps practices, including CI/CD pipelines, containerization (Docker), and orchestration (Kubernetes). Excellent in cloud technologies and architecture; should have exposure to S3, EKS, ECR, and AWS Neptune. Exposure to and working experience in the relevant Siemens sector domain (Industry, Energy, Healthcare, Infrastructure and Cities) required. LEADERSHIP QUALITIES Visionary Leadership: Ability to lead the team towards long-term technical goals while managing immediate priorities. Strong Communication: Good interpersonal skills to work effectively with both technical and non-technical stakeholders. Mentorship & Coaching: Foster a culture of continuous learning, skill development, and collaboration within the team. Conflict Resolution: Ability to manage team conflicts and provide constructive feedback to improve team dynamics. Create a better #TomorrowWithUs! This role is in Bangalore, where you’ll get the chance to work with teams impacting entire cities, countries – and the craft of things to come. We’re Siemens. A collection of over 312,000 minds building the future, one day at a time in over 200 countries. All employment decisions at Siemens are based on qualifications, merit and business need.
Bring your curiosity and creativity and help us craft tomorrow. At Siemens, we are always challenging ourselves to build a better future. We need the most innovative and diverse Digital Minds to develop tomorrow’s reality. Find out more about the Digital world of Siemens here: www.siemens.com/careers/digitalminds (http://www.siemens.com/careers/digitalminds)

Posted 2 months ago

Apply

6.0 - 10.0 years

30 - 35 Lacs

Bengaluru

Work from Office

Job Title: Data Science Engineer, AVP. Location: Bangalore, India. Role Description: We are seeking a seasoned Data Science Engineer to spearhead the development of intelligent, autonomous AI systems. The ideal candidate will have a robust background in agentic AI, LLMs, SLMs, vector DBs, and knowledge graphs. This role involves designing and deploying AI solutions that leverage Retrieval-Augmented Generation (RAG), multi-agent frameworks, and hybrid search techniques to enhance enterprise applications. What we'll offer you: 100% reimbursement under the childcare assistance benefit (gender neutral); sponsorship for industry-relevant certifications and education; accident and term life insurance. Your key responsibilities: Design & Develop Agentic AI Applications: Utilise frameworks like LangChain, CrewAI, and AutoGen to build autonomous agents capable of complex task execution. Implement RAG Pipelines: Integrate LLMs with vector databases (e.g., Milvus, FAISS) and knowledge graphs (e.g., Neo4j) to create dynamic, context-aware retrieval systems. Fine-Tune Language Models: Customise LLMs (e.g., Gemini, ChatGPT, Llama) and SLM toolkits (e.g., spaCy, NLTK) using domain-specific data to improve performance and relevance in specialised applications. NER Models: Train OCR- and NLP-leveraged models to parse domain-specific details from documents (e.g., DocAI, Azure AI DIS, AWS IDP). Develop Knowledge Graphs: Construct and manage knowledge graphs to represent and query complex relationships within data, enhancing AI interpretability and reasoning. Collaborate Cross-Functionally: Work with data engineers, ML researchers, and product teams to align AI solutions with business objectives and technical requirements. Optimise AI Workflows: Employ MLOps practices to ensure scalable, maintainable, and efficient AI model deployment and monitoring. Your skills and experience: 8+ years of professional experience in AI/ML development, with a focus on agentic AI systems.
Proficient in Python, Python API frameworks, and SQL, and familiar with AI/ML frameworks such as TensorFlow or PyTorch. Experience in deploying AI models on cloud platforms (e.g., GCP, AWS). Experience with LLMs (e.g., GPT-4), SLM toolkits (e.g., spaCy), and prompt engineering. Understanding of semantic technologies, ontologies, and RDF/SPARQL. Familiarity with MLOps tools and practices for continuous integration and deployment. Skilled in building and querying knowledge graphs using tools like Neo4j. Hands-on experience with vector databases and embedding techniques. Familiarity with RAG architectures and hybrid search methodologies. Experience in developing AI solutions for specific industries such as healthcare, finance, or e-commerce. Strong problem-solving abilities and analytical thinking. Excellent communication skills for cross-functional collaboration. Ability to work independently and manage multiple projects simultaneously. How we'll support you: About us and our teams: Please visit our company website for further information: https://www.db.com/company/company.htm We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
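The RAG stack this listing describes (LLM plus vector database) rests on one retrieval primitive: embed the query, rank stored documents by vector similarity, and hand the top hits to the model as context. A stdlib-only sketch of that retrieval step, with toy 3-dimensional embeddings; in a real pipeline the vectors come from an embedding model and the ranking is done at scale by a vector DB such as Milvus or FAISS:

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def retrieve(query_vec, corpus, top_k=2):
    """Rank (text, embedding) pairs by similarity to the query vector.
    This is the retrieval step of a RAG pipeline; a vector database does
    the same job at scale with approximate nearest-neighbour indexes."""
    scored = sorted(
        corpus,
        key=lambda doc: cosine_similarity(query_vec, doc[1]),
        reverse=True,
    )
    return [text for text, _ in scored[:top_k]]

# Toy documents and embeddings, invented for the sketch.
corpus = [
    ("payments policy", [0.9, 0.1, 0.0]),
    ("trade finance FAQ", [0.2, 0.8, 0.1]),
    ("cafeteria menu", [0.0, 0.1, 0.9]),
]
hits = retrieve([1.0, 0.2, 0.0], corpus, top_k=2)  # query near "payments"
```

Hybrid search, also named in the listing, typically combines this dense ranking with a keyword score (e.g., BM25) before the final cut.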

Posted 2 months ago

Apply

5.0 - 9.0 years

10 - 16 Lacs

Pune, Greater Noida, Delhi / NCR

Work from Office

Responsibilities: Create and optimize complex SPARQL (SPARQL Protocol and RDF Query Language) queries to retrieve and analyze data from graph databases. Develop graph-based applications and models to solve real-world problems and extract valuable insights from data. Design, develop, and maintain scalable data pipelines that use Python REST APIs to get data from different cloud platforms. Study and understand the nodes, edges, and properties in graphs, to represent and store data in relational databases. Write clean, efficient, and well-documented code. Troubleshoot and fix bugs. Collaborate with other developers and stakeholders to deliver high-quality solutions. Stay up-to-date with the latest technologies and trends. Qualifications: Strong proficiency in SPARQL and the RDF query language. Strong proficiency in Python and REST APIs. Experience with database technologies (SQL and SPARQL). Strong problem-solving and debugging skills. Ability to work independently and as part of a team. Preferred Skills: Knowledge of cloud platforms like AWS, Azure, or GCP. Experience with version control systems like GitHub. Understanding of environments, deployment processes, and cloud infrastructure. Share your resume at Aarushi.Shukla@coforge.com if you are an early or immediate joiner.
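The pipeline responsibility above (pull data from cloud platforms over REST, then land it in a graph or relational store) usually reduces to a fetch step plus a normalization step. A sketch of the normalization half, using only the standard library; the payload shape and field names (`items`, `id`, `labels`) are invented for illustration, since the actual APIs are not named in the posting:

```python
import json

def normalize_records(raw_json):
    """Flatten one hypothetical REST API payload into rows ready for
    loading into a graph or relational store. The 'items'/'id'/'labels'
    fields are assumptions for this sketch, not a real API contract."""
    payload = json.loads(raw_json)
    rows = []
    for item in payload.get("items", []):
        rows.append({
            "id": item["id"],
            # Lower-case and de-duplicate labels so downstream joins are stable.
            "labels": sorted({lbl.lower() for lbl in item.get("labels", [])}),
        })
    return rows

raw = '{"items": [{"id": "n1", "labels": ["Person", "person"]}, {"id": "n2"}]}'
rows = normalize_records(raw)
```

In a real pipeline the `raw` string would come from an authenticated HTTP call (e.g., `requests.get(...).text`), and `rows` would feed a bulk loader for the target database.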

Posted 2 months ago

Apply

15.0 - 20.0 years

37 - 45 Lacs

Bengaluru

Work from Office

Job Title: Senior Data Science Engineer Lead. Location: Bangalore, India. Role Description: We are seeking a seasoned Data Science Engineer to spearhead the development of intelligent, autonomous AI systems. The ideal candidate will have a robust background in agentic AI, LLMs, SLMs, vector DBs, and knowledge graphs. This role involves designing and deploying AI solutions that leverage Retrieval-Augmented Generation (RAG), multi-agent frameworks, and hybrid search techniques to enhance enterprise applications. Deutsche Bank's Corporate Bank division is a leading provider of cash management, trade finance and securities finance. We complete green-field projects that deliver the best Corporate Bank - Securities Services products in the world. Our team is diverse, international, and driven by a shared focus on clean code and valued delivery. At every level, agile minds are rewarded with competitive pay, support, and opportunities to excel. You will work as part of a cross-functional agile delivery team. You will bring an innovative approach to software development, focusing on using the latest technologies and practices, as part of a relentless focus on business value. You will be someone who sees engineering as a team activity, with a predisposition to open code, open discussion and creating a supportive, collaborative environment. You will be ready to contribute to all stages of software delivery, from initial analysis right through to production support. What we'll offer you: 100% reimbursement under the childcare assistance benefit (gender neutral); sponsorship for industry-relevant certifications and education; accident and term life insurance. Your key responsibilities: Design & Develop Agentic AI Applications: Utilize frameworks like LangChain, CrewAI, and AutoGen to build autonomous agents capable of complex task execution.
Implement RAG Pipelines: Integrate LLMs with vector databases (e.g., Milvus, FAISS) and knowledge graphs (e.g., Neo4j) to create dynamic, context-aware retrieval systems. Fine-Tune Language Models: Customize LLMs and SLMs using domain-specific data to improve performance and relevance in specialized applications. NER Models: Train OCR- and NLP-leveraged models to parse domain-specific details from documents (e.g., DocAI, Azure AI DIS, AWS IDP). Develop Knowledge Graphs: Construct and manage knowledge graphs to represent and query complex relationships within data, enhancing AI interpretability and reasoning. Collaborate Cross-Functionally: Work with data engineers, ML researchers, and product teams to align AI solutions with business objectives and technical requirements. Optimize AI Workflows: Employ MLOps practices to ensure scalable, maintainable, and efficient AI model deployment and monitoring. Your skills and experience: 15+ years of professional experience in AI/ML development, with a focus on agentic AI systems. Proficient in Python, Python API frameworks, and SQL, and familiar with AI/ML frameworks such as TensorFlow or PyTorch. Experience in deploying AI models on cloud platforms (e.g., GCP, AWS). Experience with LLMs (e.g., GPT-4), SLMs, and prompt engineering. Understanding of semantic technologies, ontologies, and RDF/SPARQL. Familiarity with MLOps tools and practices for continuous integration and deployment. Skilled in building and querying knowledge graphs using tools like Neo4j. Hands-on experience with vector databases and embedding techniques. Familiarity with RAG architectures and hybrid search methodologies. Experience in developing AI solutions for specific industries such as healthcare, finance, or e-commerce. Strong problem-solving abilities and analytical thinking. Excellent communication skills for cross-functional collaboration. Ability to work independently and manage multiple projects simultaneously. How we'll support you

Posted 2 months ago

Apply

5.0 - 8.0 years

30 - 40 Lacs

Bengaluru

Work from Office

Data Engineer and Developer. We are seeking a Data Engineer who will define and build the foundational architecture for our data platform: the bedrock upon which our applications will thrive. You'll collaborate closely with application developers, translating their needs into platform capabilities that turbocharge development. From the start, you'll architect for scale, ensuring our data flows seamlessly through every stage of its lifecycle: collecting, modeling, cleansing, enriching, securing, and storing data in an optimal format. Think of yourself as the mastermind orchestrating an evolving data ecosystem, engineered to adapt and excel amid tomorrow's challenges. We are looking for a Data Engineer with 5+ years of experience who has: Database Versatility: Deep expertise working with relational databases (PostgreSQL, MS SQL, and beyond) as well as NoSQL systems (such as MongoDB, Cassandra, Elasticsearch). Graph Databases: Design and implement scalable graph databases to model complex relationships between entities for use in GenAI agent architectures, using Neo4j, Dgraph, or ArangoDB and query languages such as Cypher, SPARQL, and GraphQL. Data Lifecycle Expertise: Skilled in all aspects of data management: collection, storage, integration, quality, and pipeline design. Programming Proficiency: Adept in programming languages such as Python and Go. Collaborative Mindset: Experienced in partnering with GenAI Engineers and Data Scientists. Modern Data Paradigms: A strong grasp of Data Mesh, Data Products, and Data Fabric. Understanding of DataOps and Domain-Driven Design (DDD) is a plus.
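The graph-database requirement above, modeling relationships between entities and querying them with languages like Cypher or SPARQL, boils down to nodes connected by typed edges plus pattern walks over them. A stdlib-only sketch of that model with a breadth-first "find everything this entity is connected to" traversal; the entities and edge types are invented examples, and a real system would use Neo4j, Dgraph, or ArangoDB instead:

```python
from collections import deque

# Tiny property-graph sketch: nodes connected by typed edges, plus a
# breadth-first walk that mimics a Cypher-style variable-length path
# query. All entities and edge types here are invented examples.
edges = {
    ("customer:42", "PLACED"): ["order:7"],
    ("order:7", "CONTAINS"): ["product:ssd", "product:ram"],
    ("product:ssd", "SUPPLIED_BY"): ["vendor:acme"],
}

def neighbours(node):
    """All nodes directly reachable from `node`, across every edge type."""
    out = []
    for (src, _edge_type), targets in edges.items():
        if src == node:
            out.extend(targets)
    return out

def reachable(start):
    """Every node reachable from `start`, in BFS order, excluding `start`."""
    seen, queue, order = {start}, deque([start]), []
    while queue:
        node = queue.popleft()
        for nxt in neighbours(node):
            if nxt not in seen:
                seen.add(nxt)
                order.append(nxt)
                queue.append(nxt)
    return order

related = reachable("customer:42")
```

In Cypher this walk would read roughly `MATCH (c {id: "customer:42"})-[*]->(n) RETURN n`; a graph database also indexes edges by type so the traversal never scans the whole edge set as this sketch does.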

Posted 2 months ago

Apply

6.0 - 8.0 years

0 Lacs

Bengaluru

On-site

Hello Visionary! We empower our people to stay resilient and relevant in a constantly evolving world. We’re looking for people who are always searching for creative ways to grow and learn. People who want to make a real impact, now and in the future. Does that sound like you? Then it seems like you’d make a great addition to our vibrant team. We are looking for a Semantic Web ETL Developer. Before our software developers write even a single line of code, they have to understand what drives our customers. What is the environment? What is the user story based on? Implementation means – trying, testing, and improving outcomes until a final solution emerges. Knowledge means exchange – discussions with colleagues from all over the world. Join our Digitalization Technology and Services (DTS) team based in Bangalore. YOU’LL MAKE A DIFFERENCE BY: Implementing innovative Products and Solution Development processes and tools by applying your expertise in the field of responsibility. JOB REQUIREMENTS/SKILLS: International experience with global projects and collaboration with intercultural teams is preferred. 6 - 8 years’ experience in developing software solutions with the Python language. Experience in research and development processes (software-based solutions and products), in commercial topics, and in implementation of strategies and POCs. Manage end-to-end development of web applications and knowledge graph projects, ensuring best practices and high code quality. Provide technical guidance and mentorship to junior developers, fostering their growth and development. Design scalable and efficient architectures for web applications, knowledge graphs, and database models. Carry out code standards and perform code reviews, ensuring alignment with standard methodologies like PEP8, DRY, and SOLID principles. Collaborate with frontend developers, DevOps teams, and database administrators to deliver cohesive solutions.
Expert-level proficiency in Python web frameworks (Django, Flask, FastAPI) and knowledge graph libraries. Experience in designing and developing complex RESTful APIs and microservices architectures. Strong understanding of security standard processes in web applications (e.g., authentication, authorization, and data protection). Extensive experience in building and querying knowledge graphs using Python libraries like RDFLib, Py2neo, or similar. Proficiency in SPARQL for advanced graph data querying. Experience with graph databases like Neo4j, GraphDB, Blazegraph, or AWS Neptune. Experience in expert functions like Software Development / Architecture and Software Testing (Unit Testing, Integration Testing). Excellent in DevOps practices, including CI/CD pipelines, containerization (Docker), and orchestration (Kubernetes). Excellent in cloud technologies and architecture; should have exposure to S3, EKS, ECR, and AWS Neptune. Exposure to and working experience in the relevant Siemens sector domain (Industry, Energy, Healthcare, Infrastructure and Cities) required. LEADERSHIP QUALITIES Visionary Leadership: Ability to lead the team towards long-term technical goals while managing immediate priorities. Strong Communication: Good interpersonal skills to work effectively with both technical and non-technical stakeholders. Mentorship & Coaching: Foster a culture of continuous learning, skill development, and collaboration within the team. Conflict Resolution: Ability to manage team conflicts and provide constructive feedback to improve team dynamics. Create a better #TomorrowWithUs! This role is in Bangalore, where you’ll get the chance to work with teams impacting entire cities, countries – and the craft of things to come. We’re Siemens. A collection of over 312,000 minds building the future, one day at a time in over 200 countries. We're dedicated to equality, and we encourage applications that reflect the diversity of the communities we work in.
All employment decisions at Siemens are based on qualifications, merit and business need. Bring your curiosity and creativity and help us craft tomorrow. At Siemens, we are always challenging ourselves to build a better future. We need the most innovative and diverse Digital Minds to develop tomorrow’s reality. Find out more about the Digital world of Siemens here: www.siemens.com/careers/digitalminds (http://www.siemens.com/careers/digitalminds)

Posted 2 months ago

Apply

6.0 - 8.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Hello Visionary! We empower our people to stay resilient and relevant in a constantly evolving world. We’re looking for people who are always searching for creative ways to grow and learn. People who want to make a real impact, now and in the future. Does that sound like you? Then it seems like you’d make a great addition to our vibrant team. We are looking for a Semantic Web ETL Developer. Before our software developers write even a single line of code, they have to understand what drives our customers. What is the environment? What is the user story based on? Implementation means – trying, testing, and improving outcomes until a final solution emerges. Knowledge means exchange – discussions with colleagues from all over the world. Join our Digitalization Technology and Services (DTS) team based in Bangalore. YOU’LL MAKE A DIFFERENCE BY: Implementing innovative Products and Solution Development processes and tools by applying your expertise in the field of responsibility. JOB REQUIREMENTS/SKILLS: International experience with global projects and collaboration with intercultural teams is preferred. 6 - 8 years’ experience in developing software solutions with the Python language. Experience in research and development processes (software-based solutions and products), in commercial topics, and in implementation of strategies and POCs. Manage end-to-end development of web applications and knowledge graph projects, ensuring best practices and high code quality. Provide technical guidance and mentorship to junior developers, fostering their growth and development. Design scalable and efficient architectures for web applications, knowledge graphs, and database models. Carry out code standards and perform code reviews, ensuring alignment with standard methodologies like PEP8, DRY, and SOLID principles. Collaborate with frontend developers, DevOps teams, and database administrators to deliver cohesive solutions.
Expert-level proficiency in Python web frameworks (Django, Flask, FastAPI) and knowledge graph libraries. Experience in designing and developing complex RESTful APIs and microservices architectures. Strong understanding of security standard processes in web applications (e.g., authentication, authorization, and data protection). Extensive experience in building and querying knowledge graphs using Python libraries like RDFLib, Py2neo, or similar. Proficiency in SPARQL for advanced graph data querying. Experience with graph databases like Neo4j, GraphDB, Blazegraph, or AWS Neptune. Experience in expert functions like Software Development / Architecture and Software Testing (Unit Testing, Integration Testing). Excellent in DevOps practices, including CI/CD pipelines, containerization (Docker), and orchestration (Kubernetes). Excellent in cloud technologies and architecture; should have exposure to S3, EKS, ECR, and AWS Neptune. Exposure to and working experience in the relevant Siemens sector domain (Industry, Energy, Healthcare, Infrastructure and Cities) required. LEADERSHIP QUALITIES Visionary Leadership: Ability to lead the team towards long-term technical goals while managing immediate priorities. Strong Communication: Good interpersonal skills to work effectively with both technical and non-technical stakeholders. Mentorship & Coaching: Foster a culture of continuous learning, skill development, and collaboration within the team. Conflict Resolution: Ability to manage team conflicts and provide constructive feedback to improve team dynamics. Create a better #TomorrowWithUs! This role is in Bangalore, where you’ll get the chance to work with teams impacting entire cities, countries – and the craft of things to come. We’re Siemens. A collection of over 312,000 minds building the future, one day at a time in over 200 countries. We're dedicated to equality, and we encourage applications that reflect the diversity of the communities we work in.
All employment decisions at Siemens are based on qualifications, merit and business need. Bring your curiosity and creativity and help us craft tomorrow. At Siemens, we are always challenging ourselves to build a better future. We need the most innovative and diverse Digital Minds to develop tomorrow’s reality. Find out more about the Digital world of Siemens here: www.siemens.com/careers/digitalminds (http://www.siemens.com/careers/digitalminds)

Posted 2 months ago

Apply

4.0 - 9.0 years

10 - 19 Lacs

Pune, Greater Noida, Delhi / NCR

Work from Office

Responsibilities: Create and optimize complex SPARQL (SPARQL Protocol and RDF Query Language) queries to retrieve and analyze data from graph databases. Develop graph-based applications and models to solve real-world problems and extract valuable insights from data. Design, develop, and maintain scalable data pipelines that use Python REST APIs to get data from different cloud platforms. Study and understand the nodes, edges, and properties in graphs, to represent and store data in relational databases. Mandatory Skills (Must Have): Python, RDF, Neo4j, GraphDB, version control systems, API frameworks. Qualifications: Strong proficiency in SPARQL and the RDF query language. Strong proficiency in Python and REST APIs. Experience with database technologies (SQL and SPARQL). Preferred Skills: Knowledge of cloud platforms like AWS, Azure, or GCP. Experience with version control systems like GitHub. Understanding of environments, deployment processes, and cloud infrastructure.

Posted 2 months ago

Apply

6.0 - 8.0 years

5 - 5 Lacs

Bengaluru

On-site

Hello Visionary! We empower our people to stay resilient and relevant in a constantly evolving world. We’re looking for people who are always searching for creative ways to grow and learn. People who want to make a real impact, now and in the future. Does that sound like you? Then it seems like you’d make a great addition to our vibrant team. We are looking for a Semantic Web ETL Developer. Before our software developers write even a single line of code, they have to understand what drives our customers. What is the environment? What is the user story based on? Implementation means – trying, testing, and improving outcomes until a final solution emerges. Knowledge means exchange – discussions with colleagues from all over the world. Join our Digitalization Technology and Services (DTS) team based in Bangalore. YOU’LL MAKE A DIFFERENCE BY: Implementing innovative Products and Solution Development processes and tools by applying your expertise in the field of responsibility. JOB REQUIREMENTS/SKILLS: International experience with global projects and collaboration with intercultural teams is preferred. 6 - 8 years’ experience in developing software solutions with the Python language. Experience in research and development processes (software-based solutions and products), in commercial topics, and in implementation of strategies and POCs. Manage end-to-end development of web applications and knowledge graph projects, ensuring best practices and high code quality. Provide technical guidance and mentorship to junior developers, fostering their growth and development. Design scalable and efficient architectures for web applications, knowledge graphs, and database models. Carry out code standards and perform code reviews, ensuring alignment with standard methodologies like PEP8, DRY, and SOLID principles. Collaborate with frontend developers, DevOps teams, and database administrators to deliver cohesive solutions.
Strong, expert-level proficiency in Python web frameworks Django, Flask, FAST API, Knowledge Graph Libraries. Experience in designing and developing complex RESTful APIs and microservices architectures. Strong understanding of security standard processes in web applications (e.g., authentication, authorization, and data protection). Extensive experience in building and querying knowledge graphs using Python libraries like RDFLib, Py2neo, or similar. Proficiency in SPARQL for advanced graph data querying. Experience with graph databases like Neo4j, GraphDB, Blazegraph, or AWS Neptune. Experience in expert functions like Software Development / Architecture, Software Testing (Unit Testing, Integration Testing). Excellent in DevOps practices, including CI/CD pipelines, containerization (Docker), and orchestration (Kubernetes). Excellent in Cloud technologies and architecture. Should have exposure to S3, EKS, ECR, and AWS Neptune. Exposure to and working experience in the relevant Siemens sector domain (Industry, Energy, Healthcare, Infrastructure and Cities) required. LEADERSHIP QUALITIES Visionary Leadership: Ability to lead the team towards long-term technical goals while managing immediate priorities. Strong Communication: Good interpersonal skills to work effectively with both technical and non-technical stakeholders. Mentorship & Coaching: Foster a culture of continuous learning, skill development, and collaboration within the team. Conflict Resolution: Ability to manage team conflicts and provide constructive feedback to improve team dynamics. Create a better #TomorrowWithUs! This role is in Bangalore, where you’ll get the chance to work with teams impacting entire cities, countries – and the craft of things to come. We’re Siemens. A collection of over 312,000 minds building the future, one day at a time in over 200 countries. We're dedicated to equality, and we encourage applications that reflect the diversity of the communities we work in. 
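The SPARQL-based knowledge-graph querying these requirements call for reduces, at its core, to matching triple patterns against a set of (subject, predicate, object) statements. A dependency-free sketch of that idea follows; the data and the `ex:` prefix are invented for illustration, and real engines such as RDFLib or AWS Neptune implement this far more generally (joins, filters, indexing):

```python
# Minimal triple-pattern matching - the core operation behind a SPARQL
# SELECT query. Terms starting with '?' are variables; everything else
# must match literally. Data and prefixes are illustrative only.

graph = {
    ("ex:Alice", "ex:worksOn", "ex:KnowledgeGraph"),
    ("ex:Bob", "ex:worksOn", "ex:KnowledgeGraph"),
    ("ex:Alice", "ex:uses", "ex:SPARQL"),
}

def match(pattern, graph):
    """Yield one variable-binding dict per triple matching the pattern.

    Assumes each variable appears at most once in the pattern.
    """
    for triple in sorted(graph):  # sorted for deterministic output
        binding, ok = {}, True
        for term, value in zip(pattern, triple):
            if term.startswith("?"):
                binding[term] = value
            elif term != value:
                ok = False
                break
        if ok:
            yield binding

# Analogue of: SELECT ?s WHERE { ?s ex:worksOn ex:KnowledgeGraph }
hits = [b["?s"] for b in match(("?s", "ex:worksOn", "ex:KnowledgeGraph"), graph)]
print(hits)  # ['ex:Alice', 'ex:Bob']
```

A full SPARQL engine generalizes this single-pattern match into multi-pattern joins over shared variables, which is where query optimization becomes relevant.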
All employment decisions at Siemens are based on qualifications, merit and business need. Bring your curiosity and creativity and help us craft tomorrow. At Siemens, we are always challenging ourselves to build a better future. We need the most innovative and diverse Digital Minds to develop tomorrow's reality. Find out more about the Digital world of Siemens here: www.siemens.com/careers/digitalminds

Posted 2 months ago

Apply

3.0 - 6.0 years

5 - 5 Lacs

Bengaluru

On-site

Hello Visionary! We empower our people to stay resilient and relevant in a constantly evolving world. We’re looking for people who are always searching for creative ways to grow and learn. People who want to make a real impact, now and in the future. Does that sound like you? Then it seems like you’d make a great addition to our vibrant team. We are looking for Junior Semantic Web ETL Developer . Before our software developers write even a single line of code, they have to understand what drives our customers. What is the environment? What is the user story based on? Implementation means – trying, testing, and improving outcomes until a final solution emerges. Knowledge means exchange – discussions with colleagues from all over the world. Join our Digitalization Technology and Services (DTS) team based in Bangalore. YOU’LL MAKE A DIFFERENCE BY: Implementing innovative Products and Solution Development processes and tools by applying your expertise in the field of responsibility. JOB REQUIREMENTS/ SKILLS: International experience with global projects and collaboration with intercultural team is preferred 3 - 6 years’ experience on developing software solutions with various Application programming languages. Experience in research and development processes (Software based solutions and products) ; in commercial topics; in implementation of strategies, POC’s Experience in expert functions like Software Development / Architecture Strong knowledge of Python fundamentals, including object-oriented programming, data structures, and algorithms. Exposure in Semantics, Knowledge Graphs, Data modelling and Ontologies will be preferred up and above. Proficiency in Python-based web frameworks such as Flask, FAST API. Experience in building and consuming RESTful APIs. Knowledge of web technologies like HTML, CSS, and JavaScript (basic understanding for integration purposes). Experience with libraries such as RDFLib or Py2neo for building and querying knowledge graphs. 
Familiarity with SPARQL for querying data from knowledge graphs. Understanding of graph databases like Neo4j, GraphDB, or Blazegraph. Experience with version control systems like Git. Familiarity with CI/CD pipelines and tools like Jenkins or GitLab CI. Experience with containerization (Docker) and orchestration (Kubernetes). Familiar with basics of AWS Services – S3 Familiar with AWS Neptune Excellent command over English in written, spoken communication and strong presentation skills Good to have Exposure to and working experience in the relevant Siemens sector domain (Industry, Energy, Healthcare, Infrastructure and Cities) required. Strong experience in Data Engineering and Analytics (Optional) Experience and exposure to Testing Frameworks, Unit Test cases Expert in Data Engineering and building data pipelines, implementing Algorithms in a distributed environment Very good experience with data science and machine learning (Optional) Experience with developing and deploying web applications on the cloud with solid understanding of one or more of the following Django (Optional) Drive adoption of Cloud technology for data processing and warehousing Experience in working with multiple databases, especially with NoSQL world Understanding of Webserver, Load Balancer and deployment process / activities Advanced level knowledge of software development life cycle. Advanced level knowledge of software engineering process. Experience in Jira, Confluence will be an added advantage. Experience with Agile/Lean development methods using Scrum Experience in Rapid Programming techniques and TDD (Optional) Takes strong initiatives and highly result oriented Good at communicating within the team as well as with all the stake holders Strong customer focus and good learner. Highly proactive and team player Ready to travel for Onsite Job assignments (short to long term) Create a better #TomorrowWithUs! 
This role is in Bangalore, where you’ll get the chance to work with teams impacting entire cities, countries – and the craft of things to come. We’re Siemens. A collection of over 312,000 minds building the future, one day at a time in over 200 countries. We're dedicated to equality, and we encourage applications that reflect the diversity of the communities we work in. All employment decisions at Siemens are based on qualifications, merit and business need. Bring your curiosity and creativity and help us craft tomorrow. At Siemens, we are always challenging ourselves to build a better future. We need the most innovative and diverse Digital Minds to develop tomorrow's reality. Find out more about the Digital world of Siemens here: www.siemens.com/careers/digitalminds

Posted 2 months ago

Apply

3.0 - 6.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Hello Visionary! We empower our people to stay resilient and relevant in a constantly evolving world. We’re looking for people who are always searching for creative ways to grow and learn. People who want to make a real impact, now and in the future. Does that sound like you? Then it seems like you’d make a great addition to our vibrant team. We are looking for Junior Semantic Web ETL Developer . Before our software developers write even a single line of code, they have to understand what drives our customers. What is the environment? What is the user story based on? Implementation means – trying, testing, and improving outcomes until a final solution emerges. Knowledge means exchange – discussions with colleagues from all over the world. Join our Digitalization Technology and Services (DTS) team based in Bangalore. YOU’LL MAKE A DIFFERENCE BY: Implementing innovative Products and Solution Development processes and tools by applying your expertise in the field of responsibility. JOB REQUIREMENTS/ SKILLS: International experience with global projects and collaboration with intercultural team is preferred 3 - 6 years’ experience on developing software solutions with various Application programming languages. Experience in research and development processes (Software based solutions and products) ; in commercial topics; in implementation of strategies, POC’s Experience in expert functions like Software Development / Architecture Strong knowledge of Python fundamentals, including object-oriented programming, data structures, and algorithms. Exposure in Semantics, Knowledge Graphs, Data modelling and Ontologies will be preferred up and above. Proficiency in Python-based web frameworks such as Flask, FAST API. Experience in building and consuming RESTful APIs. Knowledge of web technologies like HTML, CSS, and JavaScript (basic understanding for integration purposes). Experience with libraries such as RDFLib or Py2neo for building and querying knowledge graphs. 
Familiarity with SPARQL for querying data from knowledge graphs. Understanding of graph databases like Neo4j, GraphDB, or Blazegraph. Experience with version control systems like Git. Familiarity with CI/CD pipelines and tools like Jenkins or GitLab CI. Experience with containerization (Docker) and orchestration (Kubernetes). Familiar with basics of AWS Services – S3 Familiar with AWS Neptune Excellent command over English in written, spoken communication and strong presentation skills Good to have Exposure to and working experience in the relevant Siemens sector domain (Industry, Energy, Healthcare, Infrastructure and Cities) required. Strong experience in Data Engineering and Analytics (Optional) Experience and exposure to Testing Frameworks, Unit Test cases Expert in Data Engineering and building data pipelines, implementing Algorithms in a distributed environment Very good experience with data science and machine learning (Optional) Experience with developing and deploying web applications on the cloud with solid understanding of one or more of the following Django (Optional) Drive adoption of Cloud technology for data processing and warehousing Experience in working with multiple databases, especially with NoSQL world Understanding of Webserver, Load Balancer and deployment process / activities Advanced level knowledge of software development life cycle. Advanced level knowledge of software engineering process. Experience in Jira, Confluence will be an added advantage. Experience with Agile/Lean development methods using Scrum Experience in Rapid Programming techniques and TDD (Optional) Takes strong initiatives and highly result oriented Good at communicating within the team as well as with all the stake holders Strong customer focus and good learner. Highly proactive and team player Ready to travel for Onsite Job assignments (short to long term) Create a better #TomorrowWithUs! 
This role is in Bangalore, where you’ll get the chance to work with teams impacting entire cities, countries – and the craft of things to come. We’re Siemens. A collection of over 312,000 minds building the future, one day at a time in over 200 countries. We're dedicated to equality, and we encourage applications that reflect the diversity of the communities we work in. All employment decisions at Siemens are based on qualifications, merit and business need. Bring your curiosity and creativity and help us craft tomorrow. At Siemens, we are always challenging ourselves to build a better future. We need the most innovative and diverse Digital Minds to develop tomorrow's reality. Find out more about the Digital world of Siemens here: www.siemens.com/careers/digitalminds

Posted 2 months ago

Apply

6.0 - 8.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Hello Visionary! We empower our people to stay resilient and relevant in a constantly evolving world. We're looking for people who are always searching for creative ways to grow and learn. People who want to make a real impact, now and in the future. Does that sound like you? Then it seems like you'd make a great addition to our vibrant team. We are looking for a Semantic Web ETL Developer. Before our software developers write even a single line of code, they have to understand what drives our customers. What is the environment? What is the user story based on? Implementation means - trying, testing, and improving outcomes until a final solution emerges. Knowledge means exchange - discussions with colleagues from all over the world. Join our Digitalization Technology and Services (DTS) team based in Bangalore. YOU'LL MAKE A DIFFERENCE BY: - Implementing innovative Products and Solution Development processes and tools by applying your expertise in the field of responsibility. JOB REQUIREMENTS/ SKILLS: - International experience with global projects and collaboration with intercultural teams is preferred - 6 - 8 years' experience in developing software solutions with the Python language. - Experience in research and development processes (Software based solutions and products); in commercial topics; in implementation of strategies, POCs - Manage end-to-end development of web applications and knowledge graph projects, ensuring best practices and high code quality. - Provide technical guidance and mentorship to junior developers, fostering their growth and development. - Design scalable and efficient architectures for web applications, knowledge graphs, and database models. - Carry out code standards and perform code reviews, ensuring alignment with standard methodologies like PEP8, DRY, and SOLID principles. - Collaborate with frontend developers, DevOps teams, and database administrators to deliver cohesive solutions. 
- Strong, expert-level proficiency in Python web frameworks Django, Flask, FAST API, Knowledge Graph Libraries. - Experience in designing and developing complex RESTful APIs and microservices architectures. - Strong understanding of security standard processes in web applications (e.g., authentication, authorization, and data protection). - Extensive experience in building and querying knowledge graphs using Python libraries like RDFLib, Py2neo, or similar. - Proficiency in SPARQL for advanced graph data querying. - Experience with graph databases like Neo4j, GraphDB, Blazegraph, or AWS Neptune - Experience in expert functions like Software Development / Architecture, Software Testing (Unit Testing, Integration Testing) - Excellent in DevOps practices, including CI/CD pipelines, containerization (Docker), and orchestration (Kubernetes). - Excellent in Cloud technologies and architecture. Should have exposure to S3, EKS, ECR, and AWS Neptune - Exposure to and working experience in the relevant Siemens sector domain (Industry, Energy, Healthcare, Infrastructure and Cities) required. LEADERSHIP QUALITIES - Visionary Leadership: Ability to lead the team towards long-term technical goals while managing immediate priorities. - Strong Communication: Good interpersonal skills to work effectively with both technical and non-technical stakeholders. - Mentorship & Coaching: Foster a culture of continuous learning, skill development, and collaboration within the team. - Conflict Resolution: Ability to manage team conflicts and provide constructive feedback to improve team dynamics. Create a better #TomorrowWithUs! This role is in Bangalore, where you'll get the chance to work with teams impacting entire cities, countries - and the craft of things to come. We're Siemens. A collection of over 312,000 minds building the future, one day at a time in over 200 countries. All employment decisions at Siemens are based on qualifications, merit and business need. 
Bring your curiosity and creativity and help us craft tomorrow. At Siemens, we are always challenging ourselves to build a better future. We need the most innovative and diverse Digital Minds to develop tomorrow's reality. Find out more about the Digital world of Siemens here: www.siemens.com/careers/digitalminds

Posted 2 months ago

Apply

10.0 - 15.0 years

35 - 40 Lacs

Bengaluru

Work from Office

Job Title: Data Science_AI Engineer, AVP. Location: Bangalore, India. Role Description: We are seeking a seasoned Data Science Engineer to spearhead the development of intelligent, autonomous AI systems. The ideal candidate will have a robust background in agentic AI, LLMs, SLMs, vector DB, and knowledge graphs. This role involves designing and deploying AI solutions that leverage Retrieval-Augmented Generation (RAG), multi-agent frameworks, and hybrid search techniques to enhance enterprise applications. Deutsche Bank's Corporate Bank division is a leading provider of cash management, trade finance and securities finance. We complete green-field projects that deliver the best Corporate Bank - Securities Services products in the world. Our team is diverse, international, and driven by a shared focus on clean code and valued delivery. At every level, agile minds are rewarded with competitive pay, support, and opportunities to excel. You will work as part of a cross-functional agile delivery team. You will bring an innovative approach to software development, focusing on using the latest technologies and practices, as part of a relentless focus on business value. You will be someone who sees engineering as a team activity, with a predisposition to open code, open discussion and creating a supportive, collaborative environment. You will be ready to contribute to all stages of software delivery, from initial analysis right through to production support. What we'll offer you: 100% reimbursement under childcare assistance benefit (gender neutral) Sponsorship for industry-relevant certifications and education Accident and Term life Insurance Your key responsibilities: Design & Develop Agentic AI Applications: Utilize frameworks like LangChain, CrewAI, and AutoGen to build autonomous agents capable of complex task execution. 
Implement RAG Pipelines: Integrate LLMs with vector databases (e.g., Milvus, FAISS) and knowledge graphs (e.g., Neo4j) to create dynamic, context-aware retrieval systems. Fine-Tune Language Models: Customize LLMs and SLMs using domain-specific data to improve performance and relevance in specialized applications. NER Models: Train OCR- and NLP-leveraged models to parse domain-specific details from documents (e.g., DocAI, Azure AI DIS, AWS IDP). Develop Knowledge Graphs: Construct and manage knowledge graphs to represent and query complex relationships within data, enhancing AI interpretability and reasoning. Collaborate Cross-Functionally: Work with data engineers, ML researchers, and product teams to align AI solutions with business objectives and technical requirements. Optimize AI Workflows: Employ MLOps practices to ensure scalable, maintainable, and efficient AI model deployment and monitoring. Your skills and experience: 10+ years of professional experience in AI/ML development, with a focus on agentic AI systems. Proficient in Python, Python API frameworks, SQL and familiar with AI/ML frameworks such as TensorFlow or PyTorch. Experience in deploying AI models on cloud platforms (e.g., GCP, AWS). Experience with LLMs (e.g., GPT-4), SLMs, and prompt engineering. Understanding of semantic technologies, ontologies, and RDF/SPARQL. Familiarity with MLOps tools and practices for continuous integration and deployment. Skilled in building and querying knowledge graphs using tools like Neo4j. Hands-on experience with vector databases and embedding techniques. Familiarity with RAG architectures and hybrid search methodologies. Experience in developing AI solutions for specific industries such as healthcare, finance, or e-commerce. Strong problem-solving abilities and analytical thinking. Excellent communication skills for cross-functional collaboration. Ability to work independently and manage multiple projects simultaneously. How we'll support you
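The retrieval step of the RAG pipelines described above can be illustrated without any ML dependencies: embed documents and the query as vectors, then rank by cosine similarity. A toy sketch, in which the hash-based "embedding" is only a stand-in for a real embedding model and the documents are invented; production systems would use a vector store such as Milvus or FAISS:

```python
# Toy sketch of RAG retrieval: embed, then rank by cosine similarity.
# The crc32-bucketed bag-of-words "embedding" is illustrative only.
import math
import zlib
from collections import Counter

def embed(text, dim=64):
    """Toy bag-of-words embedding: hash each token into a fixed-size vector."""
    vec = [0.0] * dim
    for token, count in Counter(text.lower().split()).items():
        vec[zlib.crc32(token.encode()) % dim] += count
    return vec

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, docs, k=2):
    """Rank documents by similarity to the query - the 'R' in RAG."""
    q = embed(query)
    return sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)[:k]

docs = [
    "Neo4j stores data as a property graph",
    "FAISS indexes dense vectors for similarity search",
    "Sourdough bread needs a long fermentation",
]
print(retrieve("vector similarity search index", docs, k=1))
```

The retrieved passages would then be spliced into the LLM prompt; hybrid search extends this by combining such dense scores with keyword or graph-based signals.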

Posted 2 months ago

Apply

6.0 years

15 - 17 Lacs

India

Remote

Note: This is a remote role with occasional office visits. Candidates from Mumbai or Pune will be preferred About The Company Operating at the intersection of Artificial Intelligence, Cloud Infrastructure, and Enterprise SaaS , we create data-driven products that power decision-making for Fortune 500 companies and high-growth tech firms. Our multidisciplinary teams ship production-grade generative-AI and Retrieval-Augmented Generation (RAG) solutions that transform telecom, finance, retail, and healthcare workflows—without compromising on scale, security, or speed. Role & Responsibilities Build & ship LLM/RAG solutions: design, train, and productionize advanced ML and generative-AI models (GPT-family, T5) that unlock new product capabilities. Own data architecture: craft schemas, ETL/ELT pipelines, and governance processes to guarantee high-quality, compliant training data on AWS. End-to-end MLOps: implement CI/CD, observability, and automated testing (Robot Framework, JMeter, XRAY) for reliable model releases. Optimize retrieval systems: engineer vector indices, semantic search, and knowledge-graph integrations that deliver low-latency, high-relevance results. Cross-functional leadership: translate business problems into measurable ML solutions, mentor junior scientists, and drive sprint ceremonies. Documentation & knowledge-sharing: publish best practices and lead internal workshops to scale AI literacy across the organization. Skills & Qualifications Must-Have – Technical Depth: 6 + years building ML pipelines in Python; expert in feature engineering, evaluation, and AWS services (SageMaker, Bedrock, Lambda). Must-Have – Generative AI & RAG: proven track record shipping LLM apps with LangChain or similar, vector databases, and synthetic-data augmentation. Must-Have – Data Governance: hands-on experience with metadata, lineage, data-cataloging, and knowledge-graph design (RDF/OWL/SPARQL). 
Must-Have – MLOps & QA: fluency in containerization, CI/CD, and performance testing; ability to embed automation within GitLab-based workflows. Preferred – Domain Expertise: background in telecom or large-scale B2B platforms where NLP and retrieval quality are mission-critical. Preferred – Full-Stack & Scripting: familiarity with Angular or modern JS for rapid prototyping plus shell scripting for orchestration. Benefits & Culture Highlights High-impact ownership: green-field autonomy to lead flagship generative-AI initiatives used by millions. Flex-first workplace: hybrid schedule, generous learning stipend, and dedicated cloud credits for experimentation. Inclusive, data-driven culture: celebrate research publications, OSS contributions, and diverse perspectives while solving hard problems together. Skills: data,modern javascript,cloud,vector databases,angular,pipelines,ci,containerization,ml,aws,langchain,shell scripting,mlops,performance testing,knowledge-graph design (rdf/owl/sparql),feature engineering,ci/cd,python,aws services (sagemaker, bedrock, lambda),synthetic-data augmentation,generative ai,data-cataloging,metadata management,lineage,data governance
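The data-lineage side of the governance requirements above is, at bottom, reachability over a directed graph of "derived from" edges. A minimal sketch with hypothetical dataset names (a real catalog would persist and query this at enterprise scale):

```python
# Minimal data-lineage sketch: datasets as nodes, "derived from" as edges.
# upstream() answers "which sources feed this dataset?" via breadth-first
# search. Dataset names are hypothetical.
from collections import deque

derived_from = {
    "reports.churn_dashboard": ["features.customer_features"],
    "features.customer_features": ["raw.crm_events", "raw.billing"],
    "raw.crm_events": [],
    "raw.billing": [],
}

def upstream(dataset):
    """Return all transitive upstream sources of a dataset, sorted."""
    seen, queue = set(), deque([dataset])
    while queue:
        node = queue.popleft()
        for parent in derived_from.get(node, []):
            if parent not in seen:
                seen.add(parent)
                queue.append(parent)
    return sorted(seen)

print(upstream("reports.churn_dashboard"))
# ['features.customer_features', 'raw.billing', 'raw.crm_events']
```

The same reachability query run in the opposite direction gives impact analysis: which downstream reports break if a raw source changes.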

Posted 2 months ago

Apply

7.0 years

0 Lacs

India

Remote

Job Title: Data Architect – Ontology & Palantir Foundry Location: Remote Employment Type: Full-time About Us Our client is a leading AWS Premier Consulting Partner with over 400 tech professionals across the US, Canada, Dubai, and Asia. We specialize in delivering cutting-edge cloud, data, and AI/ML solutions to global enterprise clients. Job Description We are seeking experienced Data Architects with a strong background in Ontology Modeling and hands-on expertise in Palantir Foundry . The ideal candidates will play a key role in architecting scalable data solutions, developing semantic data models, and driving enterprise-wide data integration and governance initiatives. Key Responsibilities Design and implement ontology-driven data models to support complex analytical and operational use cases Architect and manage large-scale data pipelines and transformations within Palantir Foundry Collaborate with stakeholders to translate business needs into semantic models and data solutions Work closely with data engineering and data science teams to ensure data quality, consistency, and accessibility Lead efforts in metadata management, data lineage, and knowledge graph modeling Required Skills & Experience 7+ years of experience in data architecture, data modeling, or semantic technologies Strong expertise in Ontology modeling (OWL, RDF, SPARQL, SKOS, etc.) Proven hands-on experience with Palantir Foundry, including ontology management, pipelines, and code workbooks Familiarity with knowledge graphs, semantic web standards, and enterprise data governance Excellent communication and stakeholder management skills Preferred Qualifications Experience working with cloud-native data platforms (AWS preferred) Prior work in healthcare, finance, or government domains is a plus AWS certifications or equivalent cloud certifications are advantageous Why Join Us? 
Work with a global leader in AWS data transformation projects Flexible remote work setup Opportunity to work on high-impact, mission-critical projects with Fortune 500 clients Skills: cloud-native data platforms,data architecture,palantir foundry,ontology modeling,semantic technologies,knowledge graphs,data governance,data,palantir,ontology,data modeling

Posted 2 months ago

Apply

0.0 - 5.0 years

18 - 24 Lacs

Bengaluru, Karnataka

Remote

Job Description: Are you an experienced Graph Database Data Engineer with a passion for designing and building cutting-edge data solutions? We're looking for a skilled professional to join our remote team. While this is a remote role, candidates must reside in Bangalore, Hyderabad, or Pune; a face-to-face (F2F) final discussion is mandatory. About the Role: We're seeking a talented Graph DB Data Engineer to design, implement, and maintain our graph database solutions. You'll play a crucial role in building robust data pipelines, ensuring data quality, and optimizing query performance, all while leveraging modern cloud and containerization technologies. Required Skills & Experience: *6 – 9 years of hands-on experience as a Data Engineer, with significant focus on graph databases. *Deep understanding of Graph Database Architecture, Data Structures, and Operations. *Expertise in graph query languages: SPARQL and Gremlin. *Proven experience with AWS Neptune. *Strong proficiency in Python for data engineering tasks. *Hands-on experience with EKS and Kubernetes (K8s) for deploying and managing applications. *Familiarity with AWS services including S3 and Lambda. *Experience with Terraform for Infrastructure as Code. *Strong analytical and problem-solving skills. *Excellent communication and collaboration abilities. Job Types: Full-time, Permanent Pay: ₹1,800,000.00 - ₹2,400,000.00 per year Schedule: Day shift Monday to Friday Morning shift Experience: Graph Database (Architecture, Data Structures, Operations): 6 years (Required) Query Languages (SPARQL and Gremlin): 6 years (Required) AWS Neptune, S3, Lambda, Terraform: 5 years (Required) Python: 5 years (Preferred) Location: Bangalore, Karnataka (Required) Work Location: Remote
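The Gremlin expertise this role calls for centers on chained traversal steps such as `out()` over a property graph. A dependency-free sketch of what a single labelled-edge hop computes; the edge data is invented, and real Gremlin (e.g. on AWS Neptune via the gremlinpython client) is lazy and far richer:

```python
# Sketch of what a Gremlin-style out() step computes over a property graph:
# follow outgoing edges with a given label from a set of vertices.
# Edge data is illustrative only.
edges = [
    ("alice", "knows", "bob"),
    ("bob", "knows", "carol"),
    ("alice", "created", "repo1"),
]

def out(vertices, label):
    """Follow outgoing edges with the given label; analogue of .out(label)."""
    return sorted({dst for src, lbl, dst in edges
                   if src in vertices and lbl == label})

# Analogue of: g.V('alice').out('knows').out('knows')
step1 = out({"alice"}, "knows")   # one hop from alice
step2 = out(set(step1), "knows")  # second hop
print(step2)  # ['carol']
```

Chaining such set-to-set steps is exactly how Gremlin traversals compose, which is why traversal depth and fan-out dominate their performance characteristics.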

Posted 2 months ago

Apply

0 years

0 Lacs

Greater Kolkata Area

Remote

Key Responsibilities Knowledge Graph Development : Design and develop scalable knowledge graph architectures using RDF and OWL standards. SPARQL Query Design & Optimization : Write complex SPARQL queries to extract and manipulate data from triplestores or graph databases. Python-based Application Development : Build modular, efficient, and testable Python applications that interact with semantic data and APIs. Data Integration : Integrate diverse data sources (structured, unstructured, linked data) into graph databases. Ontology Engineering Support : Collaborate with ontologists and domain experts to extend and refine existing data ontologies. System Integration & Testing : Ensure smooth integration of developed modules with other enterprise services and perform comprehensive unit and integration testing. Technical Skills : Programming Language : Advanced proficiency in Python. Semantic Web & Graph Technologies : Solid understanding of Knowledge Graphs and RDF (Resource Description Framework). Strong hands-on experience with SPARQL. Familiarity with OWL, SHACL, and Linked Data principles. Graph Databases : Experience with graph databases such as Apache Jena, GraphDB, Blazegraph, Virtuoso, Stardog, or Neo4j (with RDF plugins). API Integration : Experience building or consuming RESTful APIs. Familiarity with JSON-LD, Turtle, RDF/XML formats. Version Control : Proficiency in Git, GitHub/GitLab workflows. Testing & Debugging : Strong debugging skills, unit testing, and experience with testing frameworks in Python. Preferred Skills : Familiarity with Natural Language Processing (NLP) concepts and libraries. Experience with ETL tools, data pipelines, or data ingestion frameworks. Exposure to Docker, Kubernetes, or other cloud-native environments. Experience with CI/CD pipelines. Soft Skills & Qualities : Excellent analytical and problem-solving skills. Strong communication and collaboration abilities in a remote team setting. Attention to detail and a passion for writing clean, maintainable code. Ability to manage time efficiently and work independently. Education : Bachelor's or Master's degree in Computer Science, Information Technology, or a related field. (ref:hirist.tech)

Posted 2 months ago

Apply

4.0 - 9.0 years

5 - 9 Lacs

Bengaluru

Work from Office

We're building the technological foundation for our company's Semantic Layer - a common data language powered by Anzo / Altair Graph Studio. As a Senior Software Engineer, you'll play a critical role in setting up and managing this platform on AWS EKS, enabling scalable, secure, and governed access to knowledge graphs, parallel processing engines, and ontologies across multiple domains, including highly sensitive ones like clinical trials. You'll help design and implement a multi-tenant, cost-aware, access-controlled infrastructure that supports internal data product teams in securely building and using connected knowledge graphs. Key Responsibilities Implement a Semantic Layer on Anzo / Altair Graph Studio and Anzo Graph Lakehouse in a Kubernetes or ECS environment (EKS / ECS) Develop and manage Infrastructure as Code using Terraform and configuration management via Ansible Integrate platform authentication and authorization with Microsoft Entra ID (Azure AD) Design and implement multi-tenant infrastructure patterns that ensure domain-level isolation and secure data access Build mechanisms for cost attribution and usage visibility per domain and use case team Implement fine-grained access control, data governance, and monitoring for domains with varying sensitivity (e.g., clinical trials) Automate deployment pipelines and environment provisioning for dev, test, and production environments Collaborate with platform architects, domain engineers, and data governance teams to curate and standardize ontologies Minimum Requirements 4 - 9 years of experience in Software / Platform Engineering, DevOps, or Cloud Infrastructure roles Proficiency in Python for automation, tooling, or API integration Hands-on experience with AWS EKS / ECS and associated services (IAM, S3, CloudWatch, etc.) 
Strong skills in Terraform / Ansible / IaC for infrastructure provisioning and configuration Familiarity with RBAC, OIDC, and Microsoft Entra ID integration for enterprise IAM Understanding of Kubernetes multi-tenancy and security best practices Experience building secure and scalable platforms supporting multiple teams or domains Preferred Qualifications Experience deploying or managing Anzo, Altair Graph Studio, or other knowledge graph / semantic layer tools Familiarity with RDF, SPARQL, or ontologies in an enterprise context Knowledge of data governance, metadata management, or compliance frameworks Exposure to cost management tools like AWS Cost Explorer / Kubecost or custom chargeback systems Why Join Us Be part of a cutting-edge initiative shaping enterprise-wide data access and semantics Work in a cross-functional, highly collaborative team focused on responsible innovation Influence the architecture and strategy of a foundational platform from the ground up
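The per-domain cost attribution this platform requires can be prototyped as a simple aggregation over tagged usage records; real deployments would pull tagged billing data from tools like AWS Cost Explorer or Kubecost. A toy sketch in which the record fields and figures are hypothetical:

```python
# Toy chargeback sketch: aggregate usage records by a tenant/domain tag.
# Record fields, domain names, and costs are hypothetical.
from collections import defaultdict

usage = [
    {"domain": "clinical-trials", "service": "neptune", "cost": 120.0},
    {"domain": "clinical-trials", "service": "s3", "cost": 30.5},
    {"domain": "manufacturing", "service": "neptune", "cost": 80.0},
]

def cost_by_domain(records):
    """Sum costs per domain tag - the core of any chargeback report."""
    totals = defaultdict(float)
    for record in records:
        totals[record["domain"]] += record["cost"]
    return dict(totals)

print(cost_by_domain(usage))
# {'clinical-trials': 150.5, 'manufacturing': 80.0}
```

The hard part in practice is not the aggregation but enforcing that every resource is tagged with its owning domain, which is where IaC (Terraform-managed tag policies) comes in.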

Posted 2 months ago

Apply

0 years

0 Lacs

India

Remote

Are you passionate about transforming complex data into powerful knowledge graphs? We're looking for a Graph DB Data Engineer to join our team and shape the future of intelligent data systems.

What You'll Do:
Design and develop scalable graph database solutions, ideally using AWS Neptune
Build and manage robust data pipelines to ingest and transform structured and unstructured data
Write performant queries using SPARQL and Gremlin
Leverage Python for automation, data processing, and integration tasks
Work across AWS services including EKS, Kubernetes, Lambda, and S3

What We're Looking For:
Strong expertise in graph database architectures, structures, and query languages
Solid experience building end-to-end data pipelines
Proficiency in Python programming
Familiarity with the AWS cloud environment, especially Neptune, Lambda, S3, and EKS
A problem-solving mindset and a passion for clean, scalable data engineering

Work mode: Remote

If you thrive in dynamic environments and love working with cutting-edge technologies, we want to hear from you. Interested or know someone great? Let's connect!

#GraphDB #DataEngineer #AWSNeptune #Python #KnowledgeGraphs #Hiring #DataJobs #GraphDatabase
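For intuition about what a SPARQL basic graph pattern does, here is a toy, stdlib-only Python sketch over an in-memory triple set. The data and names are invented; a real system would send the query to a SPARQL endpoint such as AWS Neptune's rather than doing this in application code.

```python
# Toy triple store: (subject, predicate, object) tuples.
triples = {
    ("alice", "knows", "bob"),
    ("bob", "knows", "carol"),
    ("alice", "worksAt", "acme"),
}

def match(pattern):
    """Match one triple pattern against the store.

    None plays the role of a SPARQL variable: that position
    is unconstrained, fixed positions must match exactly.
    """
    s, p, o = pattern
    return [t for t in triples
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)]

# Analogous to: SELECT ?x WHERE { :alice :knows ?x }
print([o for (_, _, o) in match(("alice", "knows", None))])  # ['bob']
```

A real SPARQL engine additionally joins several such patterns on shared variables, which is where indexing and query planning matter.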

Posted 2 months ago

Apply

5.0 years

0 Lacs

Mumbai Metropolitan Region

On-site

P1,C3,STS

Required Skills
Hands-on experience with the Linux 8.x operating system for 5 years at an advanced level
Experience with Service-Oriented Architecture, Distributed Systems, scripting such as Python and shell, and relational databases (e.g., Sybase, DB2, SQL, Postgres)
Hands-on experience with web servers (Apache / Nginx) and application servers (Tomcat / JBoss), including application integration, configuration, and troubleshooting
Hands-on experience with Docker containers, Kubernetes, and SaaS platform integration
Exposure to and experience with messaging technologies like Kafka
Clear concept of load balancers, web proxies, and storage platforms like NAS / SAN, from an implementation perspective only
Familiar with basic security policies for secure hosting solutions, Kerberos, and standard encryption methodologies including SSL and TLS
Prior experience managing large web-based n-tier applications in secure environments on cloud
Strong knowledge of SRE principles, with a grasp of the tools and approaches to apply them
Strong infrastructure knowledge in Linux / Unix administration, storage, networking, and web technologies
Experience troubleshooting application issues and managing incidents
Exposure to tools like OpenTelemetry, Prometheus, Grafana, Splunk, Ansible
Excellent verbal and written communication skills

Desired / Nice-to-have Skills
Exposure to Big Data platforms like Hadoop / Cloudera and the ELK Stack
Working knowledge of a workflow orchestration tool like Airflow
Familiarity with caching databases like Redis, NoSQL databases, and SPARQL
Capacity planning and performance tuning exercises
Identity management protocols like OIDC / OAuth, SAML, LDAP integration
Cloud application and respective infrastructure knowledge is a plus
Working knowledge of GenAI, LLM models
Experience in Cloud / Distributed computing technology or certification is a plus

Experience
5 to 8 years in a similar role as a hands-on application / middleware specialist. Prior experience working in a global financial organization is an advantage.

Location
The candidate will be based at Morgan Stanley's office in Mumbai.

NFR Tech is looking to onboard an application support and SRE specialist for their Application and Data Engineering (ADE) group. ADE provides application engineering, tooling, automation, and elevated production support services conforming to company security blueprints and focused on performance, reliability, and scalability: understanding technical requirements from application owners and the business, participating in technical evaluation of vendors and vendor technologies, conducting proofs of concept, and packaging and deploying middleware products.

Skills
Linux
Python/Shell
Database: Sybase, DB2
Web Servers
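The "SRE principles" this posting asks for revolve around SLIs, SLOs, and error budgets. As a hedged, stdlib-only illustration (the SLO target and request counts below are invented, and real numbers would come from a monitoring stack like Prometheus):

```python
def availability(total_requests, failed_requests):
    """Fraction of successful requests — the core SLI behind an SLO."""
    return 1.0 - failed_requests / total_requests

def error_budget_remaining(slo, total_requests, failed_requests):
    """Remaining error budget as a fraction of allowed failures.

    slo: target availability, e.g. 0.999 for "three nines".
    """
    allowed_failures = (1.0 - slo) * total_requests
    return (allowed_failures - failed_requests) / allowed_failures

# With a 99.9% SLO over 1,000,000 requests, 1,000 failures are allowed;
# 250 observed failures leave roughly 75% of the budget.
print(error_budget_remaining(0.999, 1_000_000, 250))
```

When the remaining budget approaches zero, SRE practice is to slow feature rollouts and prioritize reliability work.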

Posted 3 months ago

Apply

0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Company Description
NXP Semiconductors enables secure connections and infrastructure for a smarter world, advancing solutions that make lives easier, better, and safer. As the world leader in secure connectivity solutions for embedded applications, we are driving innovation in the secure connected vehicle, end-to-end security & privacy, and smart connected solutions markets.

Organization Description
Do you feel challenged by being part of the IT department of NXP, the company with a mission of "Secure Connections for a Smarter World"? Do you perform best in a role representing IT in projects in a fast-moving, international environment? Within R&D IT Solutions, the Product Creation Applications (PCA) department is responsible for providing and supporting the R&D design community globally with best-in-class applications and support. The applications are used by over 6,000 designers.

Job Summary
As a Graph Engineer, you will:
Develop pipelines and code to support the ingress and egress of data to and from the knowledge graphs.
Perform basic and advanced graph querying and data modeling on the knowledge graphs that lie at the heart of the organization's Product Creation ecosystem.
Maintain the (ETL) pipelines, code, and knowledge graph to stay scalable, resilient, and performant in line with customers' requirements.
Work in an international and Agile DevOps environment.

This position offers an opportunity to work in a globally distributed team where you will get a unique opportunity for personal development in a multi-cultural environment. You will also get a challenging environment in which to develop expertise in technologies useful in the industry.

Primary Responsibilities
Translate requirements of business functions into "Graph Thinking".
Build and maintain graphs and related applications from data and information, using the latest graph technologies to leverage high-value use cases.
Support and manage graph databases.
Integrate graph data from various sources, internal and external.
Extract data from various sources, including databases, APIs, and flat files.
Load data into target systems, such as data warehouses and data lakes.
Develop code to move data (ETL) from the enterprise platform applications into the enterprise knowledge graphs.
Optimize ETL processes for performance and scalability.
Collaborate with data engineers, data scientists, and other stakeholders to model the graph environment to best represent the data coming from the multiple enterprise systems.

Skills / Experience
Semantic Web technologies: RDF, RDFS, OWL, SHACL
SPARQL
Serialization formats: JSON-LD, N-Triples/N-Quads, Turtle, RDF/XML, TriX
API-led architectures: REST, SOAP, microservices, API management
Graph databases such as Dydra, Amazon Neptune, Neo4j, Oracle Spatial & Graph (a plus)
Experience with other NoSQL databases, such as key-value databases and document-based databases (e.g., XML databases), is a plus
Experience with relational databases
Programming experience, preferably Java, JavaScript, Python, PL/SQL
Experience with web technologies: HTML, CSS, XML, XSLT, XPath
Experience with modelling languages such as UML
Understanding of CI/CD automation, version control, build automation, testing frameworks, static code analysis, IT service management, artifact management, and container management, with experience in related tools and platforms
Familiarity with cloud computing concepts (e.g., in AWS and Azure)

Education & Personal Skillsets
A master's or bachelor's degree in computer science, mathematics, electronics engineering, or a related discipline, with at least 10 years of experience in a similar role
Excellent problem-solving and analytical skills
A growth mindset with a curiosity to learn and improve
Team player with strong interpersonal, written, and verbal communication skills
Business consulting and technical consulting skills
An entrepreneurial spirit and the ability to foster a positive and energized culture
Fluent communication skills in English (spoken and written)
Experience working in Agile (Scrum knowledge appreciated) with a DevOps mindset
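The serialization formats listed in this posting (N-Triples, Turtle, JSON-LD, etc.) are all encodings of the same subject-predicate-object triples. A minimal stdlib-Python sketch emitting N-Triples-style lines makes that concrete; the example IRIs are invented, and a production pipeline would use a library such as rdflib instead.

```python
def to_ntriples(triples):
    """Serialize (subject, predicate, object) IRI triples as N-Triples lines.

    Assumes every term is an absolute IRI; literals, language tags,
    and blank nodes are deliberately out of scope for this sketch.
    """
    return "\n".join(f"<{s}> <{p}> <{o}> ." for s, p, o in triples)

# Hypothetical product-creation fact: a chip design attributed to a team.
data = [
    ("http://example.org/chipA", "http://example.org/designedBy",
     "http://example.org/teamX"),
]
print(to_ntriples(data))
```

Turtle adds prefixes and abbreviation on top of this line-per-triple form, which is why N-Triples is the common interchange format for bulk ETL loads.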

Posted 3 months ago

Apply

4.0 - 6.0 years

7 - 10 Lacs

Hyderabad

Work from Office

What you will do
In this vital role, you will join Research's Semantic Graph Team as a dedicated and skilled Semantic Data Engineer, building and optimizing knowledge graph-based software and data resources. The role primarily focuses on technologies such as RDF, SPARQL, and Python, and also involves semantic data integration and cloud-based data engineering. The ideal candidate should possess experience in the pharmaceutical or biotech industry, demonstrate deep technical skills, be proficient with big data technologies, and have experience in semantic modeling. A deep understanding of data architecture and ETL processes is also essential. You will be responsible for constructing semantic data pipelines, integrating both relational and graph-based data sources, ensuring seamless data interoperability, and leveraging cloud platforms to scale data solutions effectively.

Roles & Responsibilities:
Develop and maintain semantic data pipelines using Python, RDF, SPARQL, and linked data technologies.
Develop and maintain semantic data models for biopharma scientific data.
Integrate relational databases (SQL, PostgreSQL, MySQL, Oracle, etc.) with semantic frameworks.
Ensure interoperability across federated data sources, linking relational and graph-based data.
Implement and optimize CI/CD pipelines using GitLab and AWS.
Leverage cloud services (AWS Lambda, S3, Databricks, etc.) to support scalable knowledge graph solutions.
Collaborate with global multi-functional teams, including research scientists, data architects, business SMEs, software engineers, and data scientists, to understand data requirements, design solutions, and develop end-to-end data pipelines that meet fast-paced business needs across geographic regions.
Collaborate with data scientists, engineers, and domain experts to improve research data accessibility.
Adhere to standard processes for coding, testing, and designing reusable code/components.
Explore new tools and technologies to improve ETL platform performance.
Participate in sprint planning meetings and provide estimates for technical implementation.
Maintain comprehensive documentation of processes, systems, and solutions.
Harmonize research data to appropriate taxonomies, ontologies, and controlled vocabularies for context and reference knowledge.

Basic Qualifications and Experience:
Doctorate degree, OR
Master's degree with 4-6 years of experience in Computer Science, IT, Computational Chemistry, Computational Biology/Bioinformatics, or a related field, OR
Bachelor's degree with 6-8 years of experience in Computer Science, IT, Computational Chemistry, Computational Biology/Bioinformatics, or a related field, OR
Diploma with 10-12 years of experience in Computer Science, IT, Computational Chemistry, Computational Biology/Bioinformatics, or a related field

Preferred Qualifications and Experience:
6+ years of experience in designing and supporting biopharma scientific research data analytics (software platforms)

Functional Skills:
Must-Have Skills:
Advanced semantic and relational data skills: proficiency in Python, RDF, SPARQL, graph databases (e.g., AllegroGraph), SQL, relational databases, ETL pipelines, big data technologies (e.g., Databricks), semantic data standards (OWL, W3C, FAIR principles), ontology development, and semantic modeling practices.
Cloud and automation expertise: solid experience using cloud platforms (preferably AWS) for data engineering, along with Python for automation, data federation techniques, and model-driven architecture for scalable solutions.
Technical problem-solving: excellent problem-solving skills with hands-on experience in test automation frameworks (pytest), scripting tasks, and handling large, complex datasets.

Good-to-Have Skills:
Experience in biotech/drug discovery data engineering
Experience applying knowledge graph, taxonomy, and ontology concepts in the life sciences and chemistry domains
Experience with graph databases (AllegroGraph, Neo4j, GraphDB, Amazon Neptune)
Familiarity with Cypher, GraphQL, or other graph query languages
Experience with big data tools (e.g., Databricks)
Experience in biomedical or life sciences research data management

Soft Skills:
Excellent critical-thinking and problem-solving skills
Good communication and collaboration skills
Demonstrated awareness of how to function in a team setting
Demonstrated presentation skills
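Integrating relational sources into a semantic graph, as this role describes, usually reduces to a row-to-triples mapping step inside the ETL pipeline. The stdlib-only sketch below is illustrative: the table shape and ontology IRIs are hypothetical, and a real pipeline would map against a curated ontology and load the output into a triple store such as AllegroGraph.

```python
# Hypothetical relational rows, e.g. from a compounds table.
rows = [
    {"id": "C001", "name": "aspirin", "target": "COX-1"},
]

BASE = "http://example.org/"  # made-up namespace for the sketch

def row_to_triples(row):
    """Map one relational row to (subject, predicate, object) triples."""
    subj = BASE + "compound/" + row["id"]
    return [
        (subj, BASE + "name", row["name"]),
        (subj, BASE + "inhibits", BASE + "protein/" + row["target"]),
    ]

triples = [t for row in rows for t in row_to_triples(row)]
print(len(triples))  # 2 triples for the single sample row
```

Keeping the mapping in one pure function like this makes it easy to unit-test (e.g. with pytest) and to rerun idempotently when source tables change.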

Posted 3 months ago

Apply

5.0 - 10.0 years

15 - 25 Lacs

Pune, Chennai, Bengaluru

Work from Office

Role & Responsibilities
Expertise in graph databases: a deep understanding of graph database architecture, structures, and operations, and of query languages such as SPARQL and Gremlin. Experience with AWS Neptune is preferred.
Knowledge of data pipelines: proficiency in designing and managing data pipelines is crucial for ensuring the efficient flow and transformation of data into the knowledge graph.
High level of proficiency in Python programming
AWS services including EKS, K8s, S3, and Lambda

Secondary Skills
CI/CD, Kubernetes, Docker

Compulsory: expertise in graph databases and Python programming
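As a rough analogue of a Gremlin traversal step such as out('knows'), here is a stdlib-only Python sketch over a toy adjacency list. The graph and labels are invented; against AWS Neptune you would run real Gremlin (or SPARQL) via its endpoint instead.

```python
# Toy property-graph edges: node -> list of (edge_label, target) pairs.
edges = {
    "a": [("knows", "b"), ("knows", "c")],
    "b": [("knows", "c")],
}

def out(node, label):
    """Follow outgoing edges with the given label — akin to Gremlin's out()."""
    return [tgt for (lbl, tgt) in edges.get(node, []) if lbl == label]

# Analogous to: g.V('a').out('knows')
print(out("a", "knows"))  # ['b', 'c']
```

Chaining such steps (e.g. out("a", "knows") then out(..., "knows") on each result) is the mental model behind Gremlin's fluent traversal style.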

Posted 3 months ago

Apply