
26 RDF Jobs

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

6.0 - 8.0 years

11 - 15 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

Naukri logo

Employment type: Freelance, project based.

What this is about: At e2f, we offer an array of remote opportunities to work on compelling projects aimed at enhancing AI capabilities. As a key team member, you will help shape the future of AI-driven solutions. We value your skills and domain expertise, offering competitive compensation and flexible working arrangements. We are looking for an experienced Data Analyst with a strong background in SPARQL for a project-based position. The ideal candidate will be responsible for writing, reviewing, and optimizing queries to extract valuable insights from our knowledge base.

Qualifications: Bachelor's degree in Computer Science, Data Science, or a related field. Proven experience with SPARQL. Familiarity with the Cypher query language. Expertise in knowledge graphs. Strong analytical and problem-solving skills. Excellent communication and collaboration skills. Ability to prioritize and manage workload efficiently. Understanding of and adherence to project guidelines and policies.

Responsibilities: Commit a minimum of 4 hours per day on a flexible schedule (you can split your hours as you prefer). Participate in a training meeting. Adhere to deadlines and guideline standards.

What we offer: Engage in exciting generative AI development from the convenience of your home. Enjoy flexible work hours and availability.

If you're interested: Apply to this job advertisement. We'll review your profile and, if it aligns with our search, we will contact you as soon as possible to share rates and further details.

About us: e2f is dedicated to facilitating natural communication between people and machines across languages and cultures. With expertise in data science, we provide top-tier linguistic datasets for AI and NLP projects. Know more here: www.e2f.com
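For context on the SPARQL query work this listing describes, here is a minimal illustrative sketch using rdflib; the sample triples, property names, and query are hypothetical, and a real engagement would run against the company's own knowledge base or triple store.

```python
# Minimal sketch of writing and running a SPARQL query with rdflib.
# The sample data and the ex: vocabulary are invented for illustration.
from rdflib import Graph

TURTLE_DATA = """
@prefix ex: <http://example.org/> .
ex:alice ex:worksOn ex:projectA ; ex:role "Data Analyst" .
ex:bob   ex:worksOn ex:projectA ; ex:role "Engineer" .
"""

QUERY = """
PREFIX ex: <http://example.org/>
SELECT ?person ?role WHERE {
    ?person ex:worksOn ex:projectA ;
            ex:role ?role .
}
"""

g = Graph()
g.parse(data=TURTLE_DATA, format="turtle")   # load the sample triples
for person, role in g.query(QUERY):          # iterate over result rows
    print(person, role)
```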

Posted 1 week ago

Apply

1.0 - 4.0 years

2 - 6 Lacs

Mumbai, Pune, Chennai

Work from Office

Naukri logo

Graph Data Engineer required for a complex Supply Chain project.

Key required skills:
- Graph data modelling: experience with graph data models (LPG, RDF) and graph query languages (Cypher), with exposure to various graph data modelling techniques.
- Experience with Neo4j Aura and optimizing complex queries.
- Experience with GCP stacks such as BigQuery, GCS, and Dataproc.
- Experience in PySpark and SparkSQL is desirable.
- Experience exposing graph data to visualisation tools such as NeoDash, Tableau, and Power BI.

The expertise you have:
- Bachelor's or Master's degree in a technology-related field (e.g. Engineering, Computer Science).
- Demonstrable experience implementing data solutions in the graph database space.
- Hands-on experience with graph databases (Neo4j preferred, or any other).
- Experience tuning graph databases.
- Understanding of graph data model paradigms (LPG, RDF) and graph query languages; hands-on experience with Cypher is required.
- Solid understanding of graph data modelling, graph schema development, and graph data design.
- Relational database experience; hands-on SQL experience is required.

Desirable (optional) skills: data ingestion technologies (ETL/ELT), messaging/streaming technologies (GCP Data Fusion, Kinesis/Kafka), API and in-memory technologies. Understanding of developing highly scalable distributed systems using open-source technologies. Experience with supply chain data is desirable but not essential.

Location: Pune, Mumbai, Chennai, Bangalore, Hyderabad
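For context on the Cypher and Neo4j Aura skills this listing asks for, here is a minimal illustrative sketch using the official Neo4j Python driver; the connection URI, credentials, and the Supplier/Part data model are hypothetical placeholders, not details of the project.

```python
# Minimal sketch of running a Cypher query against a Neo4j (or Aura) instance.
# URI, credentials, and the Supplier/Part model are invented placeholders.
from neo4j import GraphDatabase

URI = "neo4j+s://<your-aura-instance>.databases.neo4j.io"  # placeholder
AUTH = ("neo4j", "password")                               # placeholder

CYPHER = """
MATCH (s:Supplier)-[:SUPPLIES]->(p:Part)
WHERE p.category = $category
RETURN s.name AS supplier, count(p) AS parts
ORDER BY parts DESC LIMIT 10
"""

driver = GraphDatabase.driver(URI, auth=AUTH)
with driver.session() as session:
    for record in session.run(CYPHER, category="fasteners"):
        print(record["supplier"], record["parts"])
driver.close()
```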

Posted 1 week ago

Apply

3.0 - 5.0 years

37 - 45 Lacs

Bengaluru

Work from Office

Naukri logo

Job Title: Senior Data Science Engineer Lead
Location: Bangalore, India

Role description: We are seeking a seasoned Data Science Engineer to spearhead the development of intelligent, autonomous AI systems. The ideal candidate will have a robust background in agentic AI, LLMs, SLMs, vector databases, and knowledge graphs. This role involves designing and deploying AI solutions that leverage Retrieval-Augmented Generation (RAG), multi-agent frameworks, and hybrid search techniques to enhance enterprise applications.

Deutsche Bank's Corporate Bank division is a leading provider of cash management, trade finance and securities finance. We complete green-field projects that deliver the best Corporate Bank - Securities Services products in the world. Our team is diverse, international, and driven by a shared focus on clean code and valued delivery. At every level, agile minds are rewarded with competitive pay, support, and opportunities to excel. You will work as part of a cross-functional agile delivery team. You will bring an innovative approach to software development, focusing on using the latest technologies and practices as part of a relentless focus on business value. You will be someone who sees engineering as a team activity, with a predisposition to open code, open discussion and creating a supportive, collaborative environment. You will be ready to contribute to all stages of software delivery, from initial analysis right through to production support.

What we'll offer you: As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
- Best-in-class leave policy
- Gender-neutral parental leave
- 100% reimbursement under the childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Employee Assistance Program for you and your family members
- Comprehensive hospitalization insurance for you and your dependents
- Accident and term life insurance
- Complimentary health screening for those aged 35 years and above

Your key responsibilities:
- Design and develop agentic AI applications: utilize frameworks like LangChain, CrewAI, and AutoGen to build autonomous agents capable of complex task execution.
- Implement RAG pipelines: integrate LLMs with vector databases (e.g., Milvus, FAISS) and knowledge graphs (e.g., Neo4j) to create dynamic, context-aware retrieval systems.
- Fine-tune language models: customize LLMs and SLMs using domain-specific data to improve performance and relevance in specialized applications.
- NER models: train OCR- and NLP-based models to parse domain-specific details from documents (e.g., DocAI, Azure AI DIS, AWS IDP).
- Develop knowledge graphs: construct and manage knowledge graphs to represent and query complex relationships within data, enhancing AI interpretability and reasoning.
- Collaborate cross-functionally: work with data engineers, ML researchers, and product teams to align AI solutions with business objectives and technical requirements.
- Optimize AI workflows: employ MLOps practices to ensure scalable, maintainable, and efficient AI model deployment and monitoring.

Your skills and experience:
- 15+ years of professional experience in AI/ML development, with a focus on agentic AI systems.
- Proficient in Python, Python API frameworks, and SQL; familiar with AI/ML frameworks such as TensorFlow or PyTorch.
- Experience deploying AI models on cloud platforms (e.g., GCP, AWS).
- Experience with LLMs (e.g., GPT-4), SLMs, and prompt engineering.
- Understanding of semantic technologies, ontologies, and RDF/SPARQL.
- Familiarity with MLOps tools and practices for continuous integration and deployment.
- Skilled in building and querying knowledge graphs using tools like Neo4j.
- Hands-on experience with vector databases and embedding techniques.
- Familiarity with RAG architectures and hybrid search methodologies.
- Experience developing AI solutions for specific industries such as healthcare, finance, or e-commerce.
- Strong problem-solving abilities and analytical thinking.
- Excellent communication skills for cross-functional collaboration.
- Ability to work independently and manage multiple projects simultaneously.

How we'll support you:
- Training and development to help you excel in your career
- Coaching and support from experts in your team
- A culture of continuous learning to aid progression
- A range of flexible benefits that you can tailor to suit your needs
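The RAG responsibilities above pair a vector index with an LLM. The sketch below illustrates only the retrieval step using FAISS; the embedding function, documents, and dimensionality are invented stand-ins, and a real system would use an actual embedding model and pass the retrieved passages to the LLM as context.

```python
# Minimal sketch of the retrieval step in a RAG pipeline using FAISS.
# embed() is a stand-in for a real embedding model; the documents and the
# 384-dimension size are hypothetical.
import numpy as np
import faiss

DIM = 384
documents = ["Payment settled via SWIFT.", "Custody account opened.", "Trade finance limits updated."]

def embed(texts):
    # Placeholder: deterministic pseudo-embeddings instead of a real model.
    rng = np.random.default_rng(abs(hash(tuple(texts))) % (2**32))
    return rng.random((len(texts), DIM), dtype=np.float32)

index = faiss.IndexFlatL2(DIM)        # exact L2 search over the corpus
index.add(embed(documents))           # index the document embeddings

query_vec = embed(["Which trades were settled?"])
distances, ids = index.search(query_vec, k=2)    # top-2 nearest documents
retrieved = [documents[i] for i in ids[0]]
print(retrieved)  # these passages would be supplied to the LLM as context
```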

Posted 1 week ago

Apply

5.0 - 8.0 years

2 - 6 Lacs

Mumbai

Work from Office

Naukri logo

Job information: Job Opening ID ZR_1963_JOB; Date opened: 17/05/2023; Industry: Technology; Work experience: 5-8 years; Job title: Neo4j GraphDB Developer; City: Mumbai; Province: Maharashtra; Country: India; Postal code: 400001; Number of positions: 5.

Graph Data Engineer required for a complex Supply Chain project.

Key required skills: graph data modelling (experience with graph data models (LPG, RDF) and graph query languages (Cypher), exposure to various graph data modelling techniques); experience with Neo4j Aura and optimizing complex queries; experience with GCP stacks such as BigQuery, GCS, and Dataproc; experience in PySpark and SparkSQL is desirable; experience exposing graph data to visualisation tools such as NeoDash, Tableau, and Power BI.

The expertise you have: Bachelor's or Master's degree in a technology-related field (e.g. Engineering, Computer Science). Demonstrable experience implementing data solutions in the graph database space. Hands-on experience with graph databases (Neo4j preferred, or any other). Experience tuning graph databases. Understanding of graph data model paradigms (LPG, RDF) and graph query languages; hands-on experience with Cypher is required. Solid understanding of graph data modelling, graph schema development, and graph data design. Relational database experience; hands-on SQL experience is required.

Desirable (optional) skills: data ingestion technologies (ETL/ELT), messaging/streaming technologies (GCP Data Fusion, Kinesis/Kafka), API and in-memory technologies. Understanding of developing highly scalable distributed systems using open-source technologies. Experience with supply chain data is desirable but not essential.

Location: Pune, Mumbai, Chennai, Bangalore, Hyderabad

Posted 1 week ago

Apply

4.0 - 9.0 years

12 - 16 Lacs

Gurugram

Work from Office

Naukri logo

ZS is a place where passion changes lives. As a management consulting and technology firm focused on improving life and how we live it, our most valuable asset is our people. Here you'll work side-by-side with a powerful collective of thinkers and experts shaping life-changing solutions for patients, caregivers and consumers, worldwide. ZSers drive impact by bringing a client-first mentality to each and every engagement. We partner collaboratively with our clients to develop custom solutions and technology products that create value and deliver company results across critical areas of their business. Bring your curiosity for learning, bold ideas, courage and passion to drive life-changing impact to ZS.

Our most valuable asset is our people. At ZS we honor the visible and invisible elements of our identities, personal experiences and belief systems: the ones that comprise us as individuals, shape who we are and make us unique. We believe your personal interests, identities, and desire to learn are part of your success here. Learn more about our diversity, equity, and inclusion efforts and the networks ZS supports to assist our ZSers in cultivating community spaces, obtaining the resources they need to thrive, and sharing the messages they are passionate about.

What you'll do: We are looking for experienced Knowledge Graph developers with the following technical skill set and experience.
- Take complete ownership of activities and assigned responsibilities across all phases of the project lifecycle to solve business problems across one or more client engagements.
- Apply appropriate development methodologies (e.g. agile, waterfall) and best practices (e.g. mid-development client reviews, embedded QA procedures, unit testing) to ensure successful and timely completion of assignments.
- Collaborate with other team members to leverage expertise and ensure seamless transitions; exhibit flexibility in undertaking new and challenging problems and demonstrate excellent task management.
- Assist in creating project outputs such as business case development, solution vision and design, user requirements, prototypes, technical architecture (if needed), test cases, and operations management.
- Bring transparency in driving assigned tasks to completion and report accurate status.
- Bring a consulting mindset to problem solving and innovation by leveraging technical and business knowledge/expertise, and collaborate across other teams.
- Assist senior team members and delivery leads with project management responsibilities.
- Build complex solutions using programming languages, ETL service platforms, etc.

What you'll bring:
- Bachelor's or master's degree in computer science, engineering, or a related field.
- 4+ years of professional experience in knowledge graph development in Neo4j, AWS Neptune, or the Anzo knowledge graph database.
- 3+ years of experience in RDF ontologies, data modelling and ontology development.
- Strong expertise in Python, PySpark, and SQL.
- Strong ability to identify data anomalies, design data validation rules, and perform data cleanup to ensure high-quality data.
- Project management and task planning experience, ensuring smooth execution of deliverables and timelines.
- Strong communication and interpersonal skills to collaborate with both technical and non-technical teams.
- Experience with automation testing.
- Performance optimization: knowledge of techniques to optimize knowledge graph operations such as data inserts.
- Data modeling: proficiency in designing effective data models within the knowledge graph, including relationships between tables and optimizing data for reporting.
- Motivation and willingness to learn new tools and technologies as per the team's requirements.

Additional skills:
- Strong communication skills, both verbal and written, with the ability to structure thoughts logically during discussions and presentations.
- Experience in pharma or life sciences data: familiarity with pharmaceutical datasets, including product, patient, or healthcare provider data, is a plus.
- Experience with manufacturing data is a plus.
- Capability to simplify complex concepts into easily understandable frameworks and presentations.
- Proficiency in working within a virtual global team environment, contributing to the timely delivery of multiple projects.
- Travel to other offices as required to collaborate with clients and internal project teams.

Perks & benefits: ZS offers a comprehensive total rewards package including health and well-being, financial planning, annual leave, personal growth and professional development. Our robust skills development programs, multiple career progression options, internal mobility paths, and collaborative culture empower you to thrive as an individual and global team member. We are committed to giving our employees a flexible and connected way of working. A flexible and connected ZS allows us to combine work from home and on-site presence at clients/ZS offices for the majority of our week. The magic of ZS culture and innovation thrives in both planned and spontaneous face-to-face connections.

Travel: Travel is a requirement at ZS for client-facing ZSers; the business needs of your project and client are the priority. While some projects may be local, all client-facing ZSers should be prepared to travel as needed. Travel provides opportunities to strengthen client relationships, gain diverse experiences, and enhance professional growth by working in different environments and cultures.

Considering applying? At ZS, we're building a diverse and inclusive company where people bring their passions to inspire life-changing impact and deliver better outcomes for all. We are most interested in finding the best candidate for the job and recognize the value that candidates with all backgrounds, including non-traditional ones, bring. If you are interested in joining us, we encourage you to apply even if you don't meet 100% of the requirements listed above. ZS is an equal opportunity employer and is committed to providing equal employment and advancement opportunities without regard to any class protected by applicable law.

To complete your application: Candidates must possess or be able to obtain work authorization for their intended country of employment. An online application, including a full set of transcripts (official or unofficial), is required to be considered. NO AGENCY CALLS, PLEASE. Find out more at www.zs.com
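For context on the RDF ontology and data modelling skills listed above, here is a minimal sketch using rdflib; the pharma-flavoured classes, properties, and instances are invented for illustration and do not reflect any ZS or client ontology.

```python
# Minimal sketch of defining a tiny RDF/OWL vocabulary and some instance data
# with rdflib. All names in the ex: namespace are hypothetical.
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF, RDFS, OWL

EX = Namespace("http://example.org/pharma#")
g = Graph()
g.bind("ex", EX)

# Declare two classes and a property linking them.
g.add((EX.Product, RDF.type, OWL.Class))
g.add((EX.HealthcareProvider, RDF.type, OWL.Class))
g.add((EX.prescribes, RDF.type, OWL.ObjectProperty))
g.add((EX.prescribes, RDFS.domain, EX.HealthcareProvider))
g.add((EX.prescribes, RDFS.range, EX.Product))

# Add a few instance-level triples.
g.add((EX.drSmith, RDF.type, EX.HealthcareProvider))
g.add((EX.drugA, RDF.type, EX.Product))
g.add((EX.drugA, RDFS.label, Literal("Drug A")))
g.add((EX.drSmith, EX.prescribes, EX.drugA))

print(g.serialize(format="turtle"))   # emit the model as Turtle
```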

Posted 1 week ago

Apply

4.0 - 6.0 years

15 - 25 Lacs

Pune

Work from Office

Naukri logo

Responsibilities:
- Create and optimize complex SPARQL (SPARQL Protocol and RDF Query Language) queries to retrieve and analyze data from graph databases.
- Develop graph-based applications and models to solve real-world problems and extract valuable insights from data.
- Design, develop, and maintain scalable data pipelines that use Python and REST APIs to get data from different cloud platforms.
- Study and understand the nodes, edges, and properties in graphs, and how to represent and store such data in relational databases.

Qualifications: Strong proficiency in SPARQL and the RDF data model, Python, and REST APIs. Experience with database technologies, both SQL and SPARQL.

Preferred skills: Knowledge of cloud platforms like AWS, Azure, or GCP. Experience with version control systems like GitHub. Understanding of environments, deployment processes, and cloud infrastructure.
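As an illustration of the SPARQL-plus-Python work described above, the sketch below queries a remote SPARQL endpoint over HTTP; Wikidata's public endpoint stands in here for whichever graph database the project actually exposes, and the agent string is a placeholder.

```python
# Minimal sketch of querying a remote SPARQL endpoint from Python.
# Wikidata is used only as a publicly available stand-in endpoint.
from SPARQLWrapper import SPARQLWrapper, JSON

sparql = SPARQLWrapper(
    "https://query.wikidata.org/sparql",
    agent="ExampleBot/0.1 (contact@example.org)",  # placeholder user agent
)
sparql.setReturnFormat(JSON)
sparql.setQuery("""
SELECT ?item ?itemLabel WHERE {
  ?item wdt:P31 wd:Q146 .                      # instances of 'house cat'
  SERVICE wikibase:label { bd:serviceParam wikibase:language "en". }
}
LIMIT 5
""")

results = sparql.query().convert()
for row in results["results"]["bindings"]:
    print(row["item"]["value"], row["itemLabel"]["value"])
```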

Posted 2 weeks ago

Apply

2.0 - 6.0 years

13 - 17 Lacs

Mumbai

Work from Office

Naukri logo

At Siemens Energy, we can. Our technology is key, but our people make the difference. Brilliant minds innovate. They connect, create, and keep us on track towards changing the world's energy systems. Their spirit fuels our mission. Our culture is defined by caring, agile, respectful, and accountable individuals. We value excellence of any kind. Sounds like you?

Software Developer - Data Integration Platform - Mumbai or Pune, Siemens Energy, full time.

Looking for a challenging role? If you really want to make a difference, make it with us. We make real what matters.

About the role

Technical skills (mandatory):
- Python (data ingestion pipelines): proficiency in building and maintaining data ingestion pipelines using Python.
- Blazegraph: experience with Blazegraph technology.
- Neptune: familiarity with Amazon Neptune, a fully managed graph database service.
- Knowledge graphs (RDF, triples): understanding of RDF (Resource Description Framework) and triple stores for knowledge graph management.
- AWS environment (S3): experience working with AWS services, particularly S3 for storage solutions.
- Git: proficiency in using Git for version control.

Optional and good-to-have skills:
- Azure DevOps (optional): experience with Azure DevOps for CI/CD pipelines and project management (optional but preferred).
- Metaphactory by Metaphacts (very optional): familiarity with Metaphactory, a platform for knowledge graph management.
- LLM / machine learning experience: experience with Large Language Models (LLMs) and machine learning techniques.
- Big data solutions (optional): experience with big data solutions is a plus.
- SnapLogic / Alteryx / ETL know-how (optional): familiarity with ETL tools like SnapLogic or Alteryx is optional but beneficial.

We don't need superheroes, just super minds. A degree in Computer Science, Engineering, or a related field is preferred. Professional software development: demonstrated experience in professional software development practices. Years of experience: 3-5 years of relevant experience in software development and related technologies.

Soft skills: Strong problem-solving skills. Excellent communication and teamwork abilities. Ability to work in a fast-paced and dynamic environment. Strong attention to detail and commitment to quality. Fluent in English (spoken and written).

We've got quite a lot to offer. How about you? This role is based in Pune or Mumbai, where you'll get the chance to work with teams impacting entire cities, countries, and the shape of things to come. We're Siemens. A collection of over 379,000 minds building the future, one day at a time in over 200 countries. We're dedicated to equality, and we welcome applications that reflect the diversity of the communities we work in. All employment decisions at Siemens are based on qualifications, merit and business need. Bring your curiosity and imagination and help us shape tomorrow. Find out more about Siemens careers at: www.siemens.com/careers
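For context on the Python ingestion, S3, and Blazegraph skills above, here is a minimal hypothetical sketch: the bucket, key, and endpoint URL are placeholders, and it assumes a Blazegraph-style store that accepts RDF posted to its SPARQL endpoint with the appropriate content type.

```python
# Minimal sketch of one ingestion step: read an RDF (Turtle) file from S3 and
# load it into a Blazegraph-style triple store. All names are placeholders.
import boto3
import requests

BUCKET = "example-bucket"                                  # placeholder
KEY = "knowledge-graph/assets.ttl"                         # placeholder
SPARQL_ENDPOINT = "http://localhost:9999/blazegraph/namespace/kb/sparql"  # placeholder

s3 = boto3.client("s3")
obj = s3.get_object(Bucket=BUCKET, Key=KEY)
turtle_bytes = obj["Body"].read()                          # the RDF payload

resp = requests.post(
    SPARQL_ENDPOINT,
    data=turtle_bytes,
    headers={"Content-Type": "text/turtle"},               # tell the store it is Turtle
    timeout=60,
)
resp.raise_for_status()
print("Loaded", len(turtle_bytes), "bytes of RDF")
```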

Posted 3 weeks ago

Apply

3 - 5 years

6 - 10 Lacs

Gurugram

Work from Office

Naukri logo

Position summary: A Data Engineer designs and maintains scalable data pipelines and storage systems, with a focus on integrating and processing knowledge graph data for semantic insights. They enable efficient data flow, ensure data quality, and support analytics and machine learning by leveraging advanced graph-based technologies.

How you'll make an impact (responsibilities of the role):
- Build and optimize ETL/ELT pipelines for knowledge graphs and other data sources.
- Design and manage graph databases (e.g., Neo4j, AWS Neptune, ArangoDB).
- Develop semantic data models using RDF, OWL, and SPARQL.
- Integrate structured, semi-structured, and unstructured data into knowledge graphs.
- Ensure data quality, security, and compliance with governance standards.
- Collaborate with data scientists and architects to support graph-based analytics.

What you bring (required qualifications and skills):
- Bachelor's or Master's in Computer Science, Data Science, or related fields.
- Experience: 3+ years of experience in data engineering, with knowledge graph expertise.
- Proficiency in Python, SQL, and graph query languages (SPARQL, Cypher).
- Experience with graph databases and frameworks (Neo4j, GraphQL, RDF).
- Knowledge of cloud platforms (AWS, Azure).
- Strong problem-solving and data modeling skills.
- Excellent communication skills, with the ability to convey complex concepts to non-technical stakeholders.
- The ability to work collaboratively in a dynamic team environment across the globe.

Posted 1 month ago

Apply

3 - 6 years

7 - 11 Lacs

Hyderabad

Work from Office

Naukri logo

Sr Semantic Engineer - Research Data and Analytics

What you will do: Let's do this. Let's change the world. In this vital role you will be part of Research's Semantic Graph Team, which is seeking a dedicated and skilled Semantic Data Engineer to build and optimize knowledge graph-based software and data resources. This role primarily focuses on working with technologies such as RDF, SPARQL, and Python. In addition, the position involves semantic data integration and cloud-based data engineering. The ideal candidate should possess experience in the pharmaceutical or biotech industry, demonstrate deep technical skills, be proficient with big data technologies, and have experience in semantic modeling. A deep understanding of data architecture and ETL processes is also essential for this role. You will be responsible for constructing semantic data pipelines, integrating both relational and graph-based data sources, ensuring seamless data interoperability, and leveraging cloud platforms to scale data solutions effectively.

Roles & responsibilities:
- Develop and maintain semantic data pipelines using Python, RDF, SPARQL, and linked data technologies.
- Develop and maintain semantic data models for biopharma scientific data.
- Integrate relational databases (SQL, PostgreSQL, MySQL, Oracle, etc.) with semantic frameworks.
- Ensure interoperability across federated data sources, linking relational and graph-based data.
- Implement and optimize CI/CD pipelines using GitLab and AWS.
- Leverage cloud services (AWS Lambda, S3, Databricks, etc.) to support scalable knowledge graph solutions.
- Collaborate with global multi-functional teams, including research scientists, data architects, business SMEs, software engineers, and data scientists, to understand data requirements, design solutions, and develop end-to-end data pipelines to meet fast-paced business needs across geographic regions.
- Collaborate with data scientists, engineers, and domain experts to improve research data accessibility.
- Adhere to standard processes for coding, testing, and designing reusable code/components.
- Explore new tools and technologies to improve ETL platform performance.
- Participate in sprint planning meetings and provide estimations on technical implementation.
- Maintain comprehensive documentation of processes, systems, and solutions.
- Harmonize research data to appropriate taxonomies, ontologies, and controlled vocabularies for context and reference knowledge.

What we expect of you: We are all different, yet we all use our unique contributions to serve patients.

Basic qualifications and experience:
- Doctorate degree, OR
- Master's degree with 4-6 years of experience in Computer Science, IT, Computational Chemistry, Computational Biology/Bioinformatics or a related field, OR
- Bachelor's degree with 6-8 years of experience in Computer Science, IT, Computational Chemistry, Computational Biology/Bioinformatics or a related field, OR
- Diploma with 10-12 years of experience in Computer Science, IT, Computational Chemistry, Computational Biology/Bioinformatics or a related field.

Preferred qualifications and experience: 6+ years of experience in designing and supporting biopharma scientific research data analytics (software platforms).

Functional skills (must-have):
- Advanced semantic and relational data skills: proficiency in Python, RDF, SPARQL, graph databases (e.g. AllegroGraph), SQL, relational databases, ETL pipelines, big data technologies (e.g. Databricks), semantic data standards (OWL, W3C, FAIR principles), ontology development, and semantic modeling practices.
- Cloud and automation expertise: good experience using cloud platforms (preferably AWS) for data engineering, along with Python for automation, data federation techniques, and model-driven architecture for scalable solutions.
- Technical problem-solving: excellent problem-solving skills with hands-on experience in test automation frameworks (pytest), scripting tasks, and handling large, complex datasets.

Good-to-have skills:
- Experience in biotech/drug discovery data engineering.
- Experience applying knowledge graphs, taxonomy, and ontology concepts in the life sciences and chemistry domains.
- Experience with graph databases (AllegroGraph, Neo4j, GraphDB, Amazon Neptune).
- Familiarity with Cypher, GraphQL, or other graph query languages.
- Experience with big data tools (e.g. Databricks).
- Experience in biomedical or life sciences research data management.

Soft skills: Excellent critical-thinking and problem-solving skills. Good communication and collaboration skills. Demonstrated awareness of how to function in a team setting. Demonstrated presentation skills.

What you can expect of us: As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now and make a lasting impact with the Amgen team. careers.amgen.com

As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
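The must-have skills above mention test automation with pytest for semantic pipelines. Below is a minimal hypothetical sketch: to_triples() is an invented helper that maps a flat record to RDF, and the ontology URIs are placeholders rather than any real biopharma vocabulary.

```python
# Minimal sketch of pytest-style test automation for a semantic pipeline step.
# to_triples() and the ex: vocabulary are invented for illustration.
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF

EX = Namespace("http://example.org/research#")

def to_triples(record: dict) -> Graph:
    """Map a flat record {'id', 'gene', 'assay'} to RDF triples."""
    g = Graph()
    subject = EX[f"result/{record['id']}"]
    g.add((subject, RDF.type, EX.AssayResult))
    g.add((subject, EX.measuresGene, EX[record["gene"]]))
    g.add((subject, EX.assayName, Literal(record["assay"])))
    return g

def test_to_triples_produces_expected_shape():
    g = to_triples({"id": "r1", "gene": "TP53", "assay": "binding"})
    assert len(g) == 3                      # exactly three triples emitted
    rows = list(
        g.query(
            "PREFIX ex: <http://example.org/research#> "
            "SELECT ?gene WHERE { ?s ex:measuresGene ?gene }"
        )
    )
    assert rows[0][0] == EX["TP53"]         # the gene URI round-trips
```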

Posted 1 month ago

Apply

2 - 4 years

4 - 7 Lacs

Hyderabad

Work from Office

Naukri logo

Associate Data Engineer, Graph - Research Data and Analytics

What you will do: Let's do this. Let's change the world. In this vital role you will be part of Research's Semantic Graph Team, which is seeking a dedicated and skilled Data Engineer to design, build and maintain solutions for scientific data that drive business decisions for Research. You will build scalable, high-performance, graph-based data engineering solutions for large scientific datasets and collaborate with Research partners. The ideal candidate possesses experience in the pharmaceutical or biotech industry, demonstrates deep technical skills, has experience with semantic data modeling and graph databases, and understands data architecture and ETL processes.

Roles & responsibilities:
- Design, develop, and implement data pipelines, ETL/ELT processes, and data integration solutions.
- Contribute to data pipeline projects from inception to deployment; manage scope, timelines, and risks.
- Contribute to data models for biopharma scientific data, data dictionaries, and other documentation to ensure data accuracy and consistency.
- Optimize large datasets for query performance.
- Collaborate with global multi-functional teams, including research scientists, to understand data requirements and design solutions that meet business needs.
- Implement data security and privacy measures to protect sensitive data.
- Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions.
- Collaborate with data architects, business SMEs, software engineers and data scientists to design and develop end-to-end data pipelines to meet fast-paced business needs across geographic regions.
- Identify and resolve data-related challenges.
- Adhere to best practices for coding, testing, and designing reusable code/components.
- Explore new tools and technologies that will help to improve ETL platform performance.
- Participate in sprint planning meetings and provide estimations on technical implementation.
- Maintain documentation of processes, systems, and solutions.

What we expect of you: We are all different, yet we all use our unique contributions to serve patients.

Basic qualifications and experience:
- Bachelor's degree and 1 to 3 years of Computer Science, IT, Computational Chemistry, Computational Biology/Bioinformatics or related field experience, OR
- Diploma and 4 to 7 years of Computer Science, IT, Computational Chemistry, Computational Biology/Bioinformatics or related field experience.

Functional skills (must-have):
- Advanced semantic and relational data skills: proficiency in Python, RDF, SPARQL, graph databases (e.g. AllegroGraph), SQL, relational databases, ETL pipelines, big data technologies (e.g. Databricks), semantic data standards (OWL, W3C, FAIR principles), ontology development, and semantic modeling practices.
- Hands-on experience with big data technologies and platforms, such as Databricks, workflow orchestration, and performance tuning on data processing.
- Excellent problem-solving skills and the ability to work with large, complex datasets.

Good-to-have skills:
- A passion for tackling complex challenges in drug discovery with technology and data.
- System administration skills, such as managing Linux and Windows servers, configuring network infrastructure, and automating tasks with shell scripting. Examples include setting up and maintaining virtual machines, troubleshooting server issues, and ensuring data security through regular updates and backups.
- Solid understanding of data modeling, data warehousing, and data integration concepts.
- Solid experience using RDBMS (e.g. Oracle, MySQL, SQL Server, PostgreSQL).
- Knowledge of cloud data platforms (AWS preferred).
- Experience with data visualization tools (e.g. Dash, Plotly, Spotfire).
- Experience with diagramming and collaboration tools such as Miro, Lucidchart or similar tools for process mapping and brainstorming.
- Experience writing and maintaining user documentation in Confluence.

Professional certifications: Databricks Certified Data Engineer Professional preferred.

Soft skills: Excellent critical-thinking and problem-solving skills. Good communication and collaboration skills. Demonstrated awareness of how to function in a team setting. Demonstrated presentation skills.

What you can expect of us: As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now and make a lasting impact with the Amgen team. careers.amgen.com

As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.

Posted 1 month ago

Apply

2 - 7 years

5 - 9 Lacs

Bengaluru

Work from Office

Naukri logo

We are looking for a Senior Ontology Engineer - Semantic Data Modelling.

Business area: Siemens Energy business - partner and driver of the energy transition. Building innovative technologies, extensive energy experience and an ambitious strategy to decarbonize global energy systems are all central to our efforts to be the partner and driver of the energy transition. Technology advances every day. Information Technology is crucial to our future. As part of our IT R&D team, you'll build extensive IT digital systems, develop plans for using cloud technologies, and create industrial data analytics platforms. Your work will benefit everyone at Siemens business entities. If you are determined to take us further with innovative IT solutions that keep our world connected and businesses moving forward, then there's a place for you at Siemens Energy. Join our Digitalization Technology and Services (DTS) team based in Bangalore.

What digital solutions/platforms we are building: a knowledge graph enabling data-intelligent applications using semantic data modelling, and data for Digital Twin and BIM. The knowledge graph represents a collection of interlinked descriptions of real-world entities such as objects and events. The graph is processable by machines as well as humans.

Job requirements / skills:
- Experience in data engineering and ontology.
- Expert in data modelling, high-level architecture designs and concepts, with 2-7 years of experience.
- Strong Python programming skills.
- Good experience in ETL (Extract, Transform, Load) operations using Python and loading into a graph DB.
- Expert in converting data into RDF.
- Expert understanding of triple store database concepts and their implementation in a graph DB (such as Blazegraph).
- Supports new developments in digitalization based on 3D, 4D and 5D data.
- Expert in converting data from different formats such as XML and Excel into RDF (Resource Description Framework).
- Expert in various relational databases and NoSQL databases such as MongoDB, semantics, triple stores, etc.
- Hands-on working experience with Semantic Web standards from W3C: RDF, RDFS, OWL, SPARQL, etc.
- Large-scale data visualizations along the lines of d3.js and Graphviz.
- Advanced-level knowledge of data integration and complex data workflows.

Mandatory skills: RDF, RDFS, OWL, SPARQL, data modeling. Minimum 1 year of relevant experience on the above mandatory technology stack (only hands-on experience counts, not just knowledge or awareness). Total experience should be about 2-7 years.

Create a better #TomorrowWithUs! This role is in Bangalore, where you'll get the chance to work with teams impacting entire cities, countries, and the craft of things to come.
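As an illustration of the "convert Excel/XML into RDF" requirement above, here is a minimal sketch using pandas and rdflib; the file name, column names, and the ex: vocabulary are assumptions made for the example, not part of the actual platform.

```python
# Minimal sketch of converting tabular data (e.g. an Excel export) into RDF
# with pandas and rdflib. File name, columns, and ontology are hypothetical.
import pandas as pd
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF, XSD

EX = Namespace("http://example.org/plant#")

df = pd.read_excel("assets.xlsx")       # assumed columns: asset_id, name, capacity_mw
g = Graph()
g.bind("ex", EX)

for _, row in df.iterrows():
    asset = EX[f"asset/{row['asset_id']}"]
    g.add((asset, RDF.type, EX.Asset))
    g.add((asset, EX.name, Literal(row["name"])))
    g.add((asset, EX.capacityMW, Literal(float(row["capacity_mw"]), datatype=XSD.decimal)))

g.serialize(destination="assets.ttl", format="turtle")   # ready to load into a triple store
```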

Posted 2 months ago

Apply

8 - 12 years

12 - 17 Lacs

Navi Mumbai

Work from Office

Naukri logo

1. Hands-on experience in Oracle PL/SQL, AOL, and RICE/CEMLI components. 2. Hands-on experience in Oracle/BI Reports and Oracle Forms. 3. Oracle EBS knowledge of either Finance or SCM. 4. WebADI, OAF, or Workflow knowledge will be an advantage.

Posted 2 months ago

Apply

8 - 12 years

12 - 17 Lacs

Ahmedabad

Work from Office

Naukri logo

1. Hands-on experience in Oracle PL/SQL, AOL, and RICE/CEMLI components. 2. Hands-on experience in Oracle/BI Reports and Oracle Forms. 3. Oracle EBS knowledge of either Finance or SCM. 4. WebADI, OAF, or Workflow knowledge will be an advantage.

Posted 2 months ago

Apply

8 - 12 years

12 - 17 Lacs

Mumbai

Work from Office

Naukri logo

1. Hands-on experience in Oracle PL/SQL, AOL, and RICE/CEMLI components. 2. Hands-on experience in Oracle/BI Reports and Oracle Forms. 3. Oracle EBS knowledge of either Finance or SCM. 4. WebADI, OAF, or Workflow knowledge will be an advantage.

Posted 2 months ago

Apply

8 - 12 years

12 - 17 Lacs

Pune

Work from Office

Naukri logo

1. Hands-on experience in Oracle PL/SQL, AOL, and RICE/CEMLI components. 2. Hands-on experience in Oracle/BI Reports and Oracle Forms. 3. Oracle EBS knowledge of either Finance or SCM. 4. WebADI, OAF, or Workflow knowledge will be an advantage.

Posted 2 months ago

Apply

8 - 12 years

12 - 17 Lacs

Noida

Work from Office

Naukri logo

1. Hands-on experience in Oracle PL/SQL, AOL, and RICE/CEMLI components. 2. Hands-on experience in Oracle/BI Reports and Oracle Forms. 3. Oracle EBS knowledge of either Finance or SCM. 4. WebADI, OAF, or Workflow knowledge will be an advantage.

Posted 2 months ago

Apply

8 - 12 years

12 - 17 Lacs

Bengaluru

Work from Office

Naukri logo

1. Hands-on experience in Oracle PL/SQL, AOL, and RICE/CEMLI components. 2. Hands-on experience in Oracle/BI Reports and Oracle Forms. 3. Oracle EBS knowledge of either Finance or SCM. 4. WebADI, OAF, or Workflow knowledge will be an advantage.

Posted 2 months ago

Apply

8 - 12 years

12 - 17 Lacs

Kolkata

Work from Office

Naukri logo

1. Hands-on experience in Oracle PL/SQL, AOL, and RICE/CEMLI components. 2. Hands-on experience in Oracle/BI Reports and Oracle Forms. 3. Oracle EBS knowledge of either Finance or SCM. 4. WebADI, OAF, or Workflow knowledge will be an advantage.

Posted 2 months ago

Apply

8 - 12 years

12 - 17 Lacs

Hyderabad

Work from Office

Naukri logo

1. Hands-on experience in Oracle PL/SQL, AOL, and RICE/CEMLI components. 2. Hands-on experience in Oracle/BI Reports and Oracle Forms. 3. Oracle EBS knowledge of either Finance or SCM. 4. WebADI, OAF, or Workflow knowledge will be an advantage.

Posted 2 months ago

Apply

8 - 12 years

12 - 17 Lacs

Gurgaon

Work from Office

Naukri logo

1. Hands-on experience in Oracle PL/SQL, AOL, and RICE/CEMLI components. 2. Hands-on experience in Oracle/BI Reports and Oracle Forms. 3. Oracle EBS knowledge of either Finance or SCM. 4. WebADI, OAF, or Workflow knowledge will be an advantage.

Posted 2 months ago

Apply

3 - 5 years

9 - 13 Lacs

Pune

Work from Office

Naukri logo

Project role: Data Platform Engineer
Project role description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must-have skills: Neo4j. Good-to-have skills: database management.
Minimum 3 year(s) of experience is required. Educational qualification: minimum 15 years of full-time education.

Roles & responsibilities:
1. Implementation of new enhancements to the existing UI.
2. Migration of code to updated frameworks from unsupported frameworks planned to be retired.
3. Refactoring code for maintenance and stability.
4. Migration to GraphDB.
5. Bug fixing of defects.

Professional & technical skills:
1. 6+ years of IT experience.
2. Strong and proven development skills in RDF triple stores: Apache Jena TDB and Fuseki, query languages (SPARQL, SPASQL), Ontotext GraphDB, knowledge graphs, dictionaries/ontologies (RDF, RDFS, OWL, SKOS, Schema.org), and logical rules (SWRL, SPIN, R2RML, SHACL).
3. Excellent analytical and problem-solving skills.
4. Prior experience working in an Agile environment.
5. Good to have experience with NoSQL databases like MongoDB, Neo4j, etc.
6. Strong communication skills.

Additional information: The candidate should have a minimum of 3 years of experience in Neo4j. The ideal candidate will possess a strong educational background in computer science or a related field, along with a proven track record of delivering impactful data-driven solutions. This position is based at our Pune office.

Qualification: Minimum 15 years of full-time education.
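For context on the SHACL item in the skills list above, here is a minimal sketch using the pyshacl library; the shapes and data are invented for illustration, not taken from the project.

```python
# Minimal sketch of SHACL validation with pyshacl: a shape requiring ex:price
# to be an xsd:decimal catches a non-numeric value. Data and shapes are invented.
from pyshacl import validate
from rdflib import Graph

DATA = """
@prefix ex: <http://example.org/> .
ex:item1 a ex:Product ; ex:price "ten" .
"""

SHAPES = """
@prefix ex: <http://example.org/> .
@prefix sh: <http://www.w3.org/ns/shacl#> .
@prefix xsd: <http://www.w3.org/2001/XMLSchema#> .
ex:ProductShape a sh:NodeShape ;
    sh:targetClass ex:Product ;
    sh:property [ sh:path ex:price ; sh:datatype xsd:decimal ] .
"""

data_graph = Graph().parse(data=DATA, format="turtle")
shapes_graph = Graph().parse(data=SHAPES, format="turtle")

conforms, _, report_text = validate(data_graph, shacl_graph=shapes_graph)
print(conforms)      # False: "ten" is not an xsd:decimal
print(report_text)   # human-readable validation report
```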

Posted 2 months ago

Apply

6 - 11 years

10 - 20 Lacs

Chennai

Work from Office

Naukri logo

Role & responsibilities

Experience: 7+ years. Location: remote.

Position overview: Oracle Retail consultant with experience in MFP/IP with the RDF (Retail Demand Forecasting) and RDS modules. Kindly share new profiles as soon as possible.

Experience in Oracle RDF:
- Comprehensive understanding of Oracle RDF architecture and functionalities.
- Expertise in configuring, customizing, and optimizing RDF solutions to meet business requirements.
- Experience with RDF data modeling and demand forecasting algorithms.
- Proficiency in utilizing RDF's user interface for analyzing forecast data and reports.
- Ability to troubleshoot and resolve issues within RDF applications efficiently.
- Understanding of RDF batch processing and job scheduling.

Oracle-specific knowledge:
- Extensive knowledge of Oracle databases, including SQL and PL/SQL programming skills.
- Proficiency in the Oracle Retail Suite, including other modules that complement RDF, such as Oracle Retail Merchandising System (RMS) and Oracle Retail Price Management (RPM).
- Experience with Oracle Cloud Infrastructure (OCI) and understanding of Oracle Cloud services that enhance RDF capabilities.
- Familiarity with Oracle E-Business Suite or Oracle Fusion applications related to supply chain management.
- Knowledge of Oracle's middleware tools, such as Oracle GoldenGate and Oracle Data Integrator (ODI), for data integration and synchronization.

System integration:
- Proficient in designing and implementing integration solutions between Oracle RDF, Blue Yonder, and other enterprise systems.
- Familiarity with enterprise service buses (ESB), APIs, and data exchange formats (e.g., XML, JSON).
- Experience with tools for data extraction, transformation, and loading (ETL) to facilitate seamless data flow.

Experience in Oracle RDS (retail data storage expertise):
- In-depth understanding of retail-specific data storage requirements and best practices.
- Experience in designing and managing data warehouses and data lakes tailored for retail applications.
- Proficiency in data modeling and schema design to support retail analytics and reporting.
- Extensive knowledge of Oracle databases, including performance tuning and optimization for large datasets.
- Experience with Oracle Exadata or Oracle Autonomous Data Warehouse as it pertains to retail environments.
- Familiarity with Oracle Retail Suite data storage practices and integration with other Oracle products.

Data integration and ETL:
- Expertise in designing and implementing ETL processes specific to retail data, using tools like Oracle Data Integrator (ODI) or Informatica.
- Knowledge of integrating various data sources, including point-of-sale (POS) systems, ERP systems, and online retail platforms.
- Experience with data migration and transformation to ensure data consistency and quality.

Posted 2 months ago

Apply

4 - 9 years

10 - 15 Lacs

Delhi NCR, Gurgaon

Work from Office

Naukri logo

EBS SCM module (cycle, tables, inventory -> purchasing, GL, AP, AR, OM etc.), EBS D2K, RDF & XML Reports, PL/SQL, Oracle Forms, Workflow, Oracle EBS 12i; techno-functional in Oracle EBS (R12). Good communication skills and sound MS Office knowledge. Location: Manesar, Gurgaon.

Required candidate profile: Gurgaon - Manesar - Delhi NCR. Salary budget is about 15 lakhs but is not a constraint for the right candidate. MCA or B.Tech (CS & IT) given preference. Send your resume or call Johra at 9398664031.

Perks and benefits: Urgent joining needed; experience less than 6 years.

Posted 3 months ago

Apply

5 - 8 years

7 - 10 Lacs

Bengaluru

Work from Office

Naukri logo

Responsibilities: As a consultant, you will serve as a client-facing practitioner who sells, leads and implements expert services utilizing the breadth of IBM's offerings and technologies. A successful consultant is regarded by clients as a trusted business advisor who collaborates to provide innovative solutions used to solve the most challenging business problems. You will develop solutions that excel at user experience, style, performance, reliability and scalability to reduce costs and improve profit and shareholder value.

Your primary responsibilities include:
- Build, automate and release solutions based on clients' priorities and requirements.
- Explore and discover risks, resolve issues that affect release scope, schedule and quality, and bring potential solutions to the table.
- Make sure that all integration solutions meet the client specifications and are delivered on time.

Required education: Bachelor's degree. Preferred education: Master's degree.

Required technical and professional expertise:
- Minimum 5-8 years of experience with software development; at least 5 years of working experience in the related field is required for this position.
- Technical and functional experience in Oracle Retail (RETEK), Oracle Retail Merchandising System (RMS), Oracle Retail Data Warehousing (RDW), Retail Predictive Application Server (RPAS), Retail Demand Forecasting (RDF), Retail Price Management (RPM), Retail Integration Bus (RIB), Retail Merchandise Financial Planning (MFP), Oracle Retail Allocations, Oracle Retail Optimization (RO), RA Merchandise (Oracle Retail Merchandising Analytics), RA Customer (Oracle Retail Customer Analytics), and WMS.
- Oracle Retail consultants provide extensive functional expertise across RPAS (Retail Predictive Application Server), SIM (Store Inventory Management), AIP (Advanced Inventory Planning), WMS, etc.
- Ability to work within an onshore-offshore resourcing model.
- Respond to technical queries/requests from team members and customers.

Preferred technical and professional experience: Good experience in the application of standard software development principles. In-depth knowledge of at least one development technology/programming language. Understanding of design patterns.

Posted 3 months ago

Apply

5 - 8 years

7 - 10 Lacs

Bengaluru

Work from Office

Naukri logo

Responsibilities: As a consultant, you will serve as a client-facing practitioner who sells, leads and implements expert services utilizing the breadth of IBM's offerings and technologies. A successful consultant is regarded by clients as a trusted business advisor who collaborates to provide innovative solutions used to solve the most challenging business problems. You will develop solutions that excel at user experience, style, performance, reliability and scalability to reduce costs and improve profit and shareholder value.

Your primary responsibilities include:
- Build, automate and release solutions based on clients' priorities and requirements.
- Explore and discover risks, resolve issues that affect release scope, schedule and quality, and bring potential solutions to the table.
- Make sure that all integration solutions meet the client specifications and are delivered on time.

Required education: Bachelor's degree. Preferred education: Master's degree.

Required technical and professional expertise:
- Minimum 5-8 years of experience with software development; at least 5 years of working experience in the related field is required for this position.
- Technical and functional experience in Oracle Retail (RETEK), Oracle Retail Merchandising System (RMS), Oracle Retail Data Warehousing (RDW), Retail Predictive Application Server (RPAS), Retail Demand Forecasting (RDF), Retail Price Management (RPM), Retail Integration Bus (RIB), Retail Merchandise Financial Planning (MFP), Oracle Retail Allocations, Oracle Retail Optimization (RO), RA Merchandise (Oracle Retail Merchandising Analytics), RA Customer (Oracle Retail Customer Analytics), and WMS.
- Oracle Retail consultants provide extensive functional expertise across RPAS (Retail Predictive Application Server), SIM (Store Inventory Management), AIP (Advanced Inventory Planning), WMS, etc.
- Ability to work within an onshore-offshore resourcing model.
- Respond to technical queries/requests from team members and customers.

Preferred technical and professional experience: Good experience in the application of standard software development principles. In-depth knowledge of at least one development technology/programming language. Understanding of design patterns.

Posted 3 months ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
