Home
Jobs

356 Neo4j Jobs - Page 14

Set up a job alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

6 - 8 years

25 - 27 Lacs

Chennai

Work from Office

Naukri logo

Experience: 6 - 8 years. Skills: Strong proficiency in Python programming. Practical, hands-on knowledge of Statistics and Operations Research methods. Working experience with tools and frameworks such as Flask, PySpark, PyTorch, TensorFlow, Keras, Databricks, OpenCV, Pillow/PIL, Streamlit, D3.js, Dash/Plotly, and Neo4j. Hands-on experience with Analytics/AI-ML AWS services such as SageMaker, Canvas, and Bedrock. Good understanding of how to apply predictive and machine learning techniques such as regression models, XGBoost, random forest, GBM, neural nets, and SVM. Proficient with NLP techniques like RNN, LSTM, and attention-based models, and able to work effectively with readily available Stanford, IBM, Azure, and OpenAI NLP models. Good understanding of SQL, particularly how to write efficient queries for pulling data from a database. Hands-on experience with a version control tool (GitHub, Bitbucket). Experience deploying ML models to production (MLOps) on at least one cloud platform such as Azure or AWS. Able to understand business needs and map them to business processes. Hands-on experience in agile project delivery. Good at conceptualizing and visualizing end-to-end business needs at both a high level and in detail, and at articulating those needs. Good analytical and problem-solving skills. Good communication, listening, and probing skills. Strong interpersonal skills; should collaborate with other team members and work as a team. Job Description: Understand business issues and propose valuable business solutions. Design statistical or AI/deep learning models to address business issues. Design statistical/ML/DL models and deploy them to production. Determine what information is available, where it resides, and how to augment it. Develop innovative graphs for data comprehension using D3.js, Dash/Plotly, and Neo4j. Preferred Certification (good to have): AWS Specialty Certification in Data Analytics or Machine Learning
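As a flavour of the Python-plus-Neo4j work described above, here is a minimal sketch (the URI, credentials, and the Customer/Product schema are illustrative assumptions, not taken from the posting) that pulls graph data into pandas for downstream modelling or plotting:

```python
import pandas as pd
from neo4j import GraphDatabase  # official Neo4j Python driver

# Illustrative connection details; replace with your own instance and credentials.
URI = "bolt://localhost:7687"
AUTH = ("neo4j", "password")

def purchases_by_category() -> pd.DataFrame:
    """Run a Cypher aggregation and return the result as a DataFrame."""
    query = """
    MATCH (c:Customer)-[:PURCHASED]->(p:Product)
    RETURN c.id AS customer_id, p.category AS category, count(*) AS purchases
    """
    with GraphDatabase.driver(URI, auth=AUTH) as driver:
        with driver.session() as session:
            return pd.DataFrame(session.run(query).data())

if __name__ == "__main__":
    print(purchases_by_category().head())
```

The resulting DataFrame can then feed the scikit-learn/XGBoost or Plotly work the posting mentions.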

Posted 2 months ago

Apply

3 - 5 years

6 - 16 Lacs

Chennai

Work from Office

Naukri logo

Greetings from SwaaS! Senior AI/ML Developer (Lead Role) Location: Guindy, Chennai [Work from office] Experience: 3 - 5 years Tech Stack: Python, Node.js (JavaScript), LangChain, LlamaIndex, OpenAI API, Perplexity.ai API, Neo4j, Docker, Kubernetes Roles & Responsibilities: Lead the integration of Perplexity.ai and ChatGPT APIs for real-time chatbot interactions. Design and implement intent extraction and NLP models for conversational AI. Design and implement RAG (Retrieval-Augmented Generation) pipelines for context-aware applications. Guide junior developers and ensure best practices in LLM application development. Set up Python API security and performance monitoring. Requirements: Deep knowledge of NLP and LLMs (ChatGPT, GPT-4, Perplexity.ai, LangChain). Experience with graph databases (Neo4j) and RAG techniques. Familiarity with vector databases (Pinecone, Weaviate, FAISS). Strong expertise in Python (FastAPI, Flask) and/or Node.js (Express, NestJS). Good to have: experience with containerization (Docker, Kubernetes). Excellent problem-solving skills and experience leading a small AI/ML team.
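A hedged sketch of the RAG pattern this posting refers to, combining a Neo4j lookup with an OpenAI chat completion (the Topic/Document schema, model name, and connection details are assumptions for illustration only):

```python
from openai import OpenAI
from neo4j import GraphDatabase

client = OpenAI()  # reads OPENAI_API_KEY from the environment
driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

def answer_with_graph_context(question: str, topic: str) -> str:
    """Fetch related facts from the graph, then ask the LLM to answer grounded in them."""
    cypher = """
    MATCH (t:Topic {name: $topic})-[:MENTIONED_IN]->(d:Document)
    RETURN d.text AS text LIMIT 5
    """
    with driver.session() as session:
        context = "\n".join(r["text"] for r in session.run(cypher, topic=topic))

    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system", "content": "Answer using only the provided context."},
            {"role": "user", "content": f"Context:\n{context}\n\nQuestion: {question}"},
        ],
    )
    return response.choices[0].message.content
```

In a production pipeline the retrieval step would typically be wrapped in a framework such as LangChain or LlamaIndex, as the stack above suggests.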

Posted 2 months ago

Apply

6 - 9 years

5 - 10 Lacs

Bengaluru

Work from Office

Naukri logo

About Neo4j: Neo4j is the leader in Graph Database & Analytics, helping organizations uncover hidden patterns and relationships across billions of data connections deeply, easily, and quickly. Customers use Neo4j to gain a deeper understanding of their business and reveal new ways of solving their most pressing problems. Over 84% of Fortune 100 companies use Neo4j, along with a vibrant community of 250,000+ developers, data scientists, and architects across the globe. At Neo4j, we're proud to build the technology that powers breakthrough solutions for our customers. These solutions have helped NASA get to Mars two years earlier, helped break the Panama Papers story for the ICIJ, and are helping Transport for London cut congestion by 10% and save $750M a year. Some of our other notable customers include Intuit, Lockheed Martin, Novartis, UBS, and Walmart. Neo4j experienced rapid growth this year as organizations looking to deploy generative AI (GenAI) recognized graph databases as essential for improving its accuracy, transparency, and explainability. Growth was further fueled by enterprise demand for Neo4j's cloud offering and partnerships with leading cloud hyperscalers and ecosystem leaders. Learn more at neo4j.com and follow us on LinkedIn. Our Vision: At Neo4j, we have always strived to help the world make sense of data. As business, society, and knowledge become increasingly connected, our technology promotes innovation by helping organizations find and understand data relationships. We created, drive, and lead the graph database category, and we're disrupting how organizations leverage their data to innovate and stay competitive. Job Overview: As a Regional Partner Manager, you will be crucial in developing and managing relationships with key strategic partners within your assigned region. You will be responsible for driving partner engagement, enhancing the partner ecosystem, and maximizing the revenue potential of partnerships. Your deep understanding of the Data and AI ecosystem and the regional market dynamics will be essential to success in this role. Responsibilities will include working with cloud service providers and systems integrators. The position will report to the Partner leader for the APJ region. Key Responsibilities: Build and grow the partner ecosystem for the Indian market. Work very closely with the hyperscalers and their ecosystem partners to grow our business. Identify, recruit, and onboard new partners; nurture existing partnerships to maximize their potential. Work with partner marketing, field marketing, Cloud Sales directors, and the regional sales leader to develop and execute a regional partner strategy to drive new revenue growth and expand partner relationships. Collaborate with internal sales, marketing, and product teams to create partner programs, co-marketing initiatives, and sales enablement tools. Conduct regular business reviews with partners to assess performance, set goals, and develop action plans for improvement. Provide training and support to partners on product offerings, sales strategies, and market trends. Monitor market trends and competitor activities to inform partnership strategies and opportunities. Represent the company at industry events, conferences, and partner meetings to enhance visibility and build relationships. Track and report on partnership performance metrics, providing insights for continuous improvement. Lead contract negotiations and finalize partnership agreements.
Collaborate with legal and finance teams to ensure compliance with legal and financial requirements. Maintain accurate documentation for auditing and compliance purposes. Qualifications: BA/BS required. 8+ years of quota-carrying experience in partner management, channel sales, or business development roles. Proven experience working with hyperscalers, technology alliances (Snowflake/Databricks will be a plus), and system integrators to exceed pipeline and ARR goals. Demonstrated ability to articulate the business value of complex enterprise technology. A track record of overachievement and hitting sales targets. Skilled in building business champions and running a complex sales process. Previous sales methodology training (e.g. MEDDIC, SPIN, Challenger Sales). Familiarity with databases and AI technologies is desired. Driven and competitive: possess a strong desire to be successful. Skilled in managing time and resources; sound approach to qualifying opportunities. Possess the aptitude to learn quickly and establish credibility. High EQ and self-aware. Passionate about growing your career in the largest market in software (database) and developing and maintaining an in-depth understanding of Neo4j products. Strong understanding of the Data and AI market dynamics and partner ecosystems. Excellent communication, negotiation, and relationship-building skills. Ability to work independently and drive results in a fast-paced environment. Strong analytical and strategic thinking capabilities. Willingness to travel as required to meet with partners and attend events. What We Offer: Competitive salary and commission structure. Comprehensive benefits package, including health, dental, and retirement plans. Opportunities for professional development and career advancement. A collaborative and dynamic work culture. The chance to work with cutting-edge technology and make a substantial impact. Why Join Neo4j? Neo4j is, without question, the most popular graph database in the world. We have customers in every industry globally, and our products are a proven product/market fit. Joining our team is an opportunity to shape the future of data and analytics. Below are just a few exciting facts about Neo4j. Neo4j is one of the fastest-scaling technology companies in this industry. It recently surpassed $200M in annual recurring revenue (ARR), doubling its ARR over the past three years. Raised the biggest funding round in database history ($325M Series F). Backed by world-class investors like Eurazeo, GV (formerly Google Ventures), and Inovia Capital, Neo4j has raised over $600M in funding and is currently valued at over $2Bn. This puts Neo4j among the most well-funded database companies in history. 84% of the Fortune 100 and 58% of the Fortune 500 use Neo4j. Examples include Boston Scientific, BT Group, Caterpillar, Cisco, Comcast, Department for Education UK, eBay, NBC News, Novo Nordisk, Worldline, and others. Co-founder and CEO Emil Eifrem has built an amazing culture that prides itself on relationships, inclusiveness, innovation, and customer success. Countless industry awards. Massive enterprises and individual developers/data scientists love Neo4j. A strong sense of community and ecosystem is built around the platform. A recent Forrester Total Economic Impact study cited Neo4j as delivering 417% ROI to customers. Neo4j was named a Visionary in the 2023 Gartner Magic Quadrant for Cloud Database Management Systems among 19 other recognized global DBMS vendors.
Neo4j was also ranked as a Strong Performer among 14 top vendors in The Forrester Wave: Vector Databases, Q3 2024. Research shows that members of underrepresented communities are less likely to apply for jobs when they don't meet all the qualifications. If this is part of the reason you hesitate to apply, we'd encourage you to reconsider and give us the opportunity to review your application. At Neo4j, we are committed to building awareness and helping to improve these issues. One of our central objectives is to provide an inclusive, diverse, and equitable workplace for everyone to develop their potential and have a positive, career-defining experience. We look forward to receiving your application. Neo4j Values: Neo4j is a Silicon Valley company with a Swedish soul. We foster collaboration and each of us is empowered to contribute and put our innovative stamp on projects. We hire candidates who reflect the following Neo4j core values: (we)-[:VALUE]->(relationships) (we)-[:FOCUS_ON]->(userSuccess) (we)-[:THRIVE_IN]->(:Culture {type: ['Open', 'Inclusive']}) (we)-[:ASSUME]->(:Intent {direction: 'Positive'}) (we)-[:WELCOME]->(:Discussions {nature: 'IntellectuallyHonest'}) (we)-[:DELIVER_ON]->(ourCommitments) Neo4j is committed to protecting and respecting your privacy. Please read the privacy notice regarding Neo4j's recruitment process to understand how we will handle the personal data that you provide. More information at www.neo4j.com.

Posted 2 months ago

Apply

7 - 11 years

7 - 17 Lacs

Bengaluru

Work from Office

Naukri logo

About this role: Wells Fargo is seeking a Principal Engineer. We believe in the power of working together because great ideas can come from anyone. Through collaboration, any employee can have an impact and make a difference for the entire company. Explore opportunities with us for a career in a supportive environment where you can learn and grow. In this role, you will: Act as an advisor to leadership to develop or influence applications, network, information security, database, operating systems, or web technologies for highly complex business and technical needs across multiple groups. Lead the strategy and resolution of highly complex and unique challenges requiring in-depth evaluation across multiple areas or the enterprise, delivering solutions that are long-term, large-scale and require vision, creativity, innovation, and advanced analytical and inductive thinking. Translate advanced technology experience, an in-depth knowledge of the organization's tactical and strategic business objectives, the enterprise technological environment, the organization structure, and strategic technological opportunities and requirements into technical engineering solutions. Provide vision, direction and expertise to leadership on implementing innovative and significant business solutions. Maintain knowledge of industry best practices and new technologies and recommend innovations that enhance operations or provide a competitive advantage to the organization. Strategically engage with all levels of professionals and managers across the enterprise and serve as an expert advisor to leadership. Required Qualifications: 7+ years of Engineering experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education. Desired Qualifications: 5+ years of experience in Cloud Native Application development. 3+ years of experience in OCP. 5+ years of User Interface (UI) experience such as Angular or React. 5+ years of event-driven development, microservice architecture, and API design. 5+ years of experience using a development toolkit including Jenkins, GitHub, and Bitbucket. 3+ years working with public cloud platforms (e.g., Google Cloud/Vertex AI, AWS, and Azure) and hands-on experience with prompt engineering and AI/ML frameworks (e.g., TensorFlow, PyTorch). Knowledge and understanding of relational and non-relational databases, such as PostgreSQL, SQL Server, MongoDB, Neo4j, etc.

Posted 2 months ago

Apply

6 - 9 years

8 - 11 Lacs

Bengaluru

Work from Office

Naukri logo

Job Description: We are looking for an experienced Lead Python Developer with a strong foundation in Python, AI services, and Kubernetes Mesh. The ideal candidate should possess excellent problem-solving skills, a deep understanding of software architecture, and the ability to lead a team of technical experts. The role involves staying abreast of technological advancements, mentoring team members, and ensuring the successful delivery of projects using emerging technologies. Overall Responsibilities: Lead the development and implementation of projects using emerging technologies. Mentor and guide team members to ensure the successful delivery of projects. Identify and evaluate new technology solutions to improve business processes. Collaborate with cross-functional teams to ensure alignment with the organization's overall strategy. Stay up-to-date with the latest technological advancements and industry trends. Technical Skills: Primary Skills: Expert in Python and its core fundamentals. Experience with AI services and Kubernetes Mesh. Proficient in Event-Driven Programming. Ability to create Knowledge Graphs and Concept Graphs. Experience with Neo4J. Strong technical knowledge of the software development lifecycle. Secondary Skills: Good understanding of software architecture and design patterns. Ability to lead and manage a team of technical experts. Experience: At least 6-9 years of experience in software development and leading technology projects. Proven track record of delivering projects using emerging technologies. Experience in mentoring and guiding junior team members. Experience working with cross-functional teams. Day-to-Day Activities: Manage the development and delivery of projects using emerging technologies. Provide technical guidance and mentorship to junior team members. Collaborate with cross-functional teams to ensure alignment with the organization's overall strategy. Evaluate and recommend new technology solutions to improve business processes. Stay up-to-date with the latest technological advancements and industry trends. Qualifications: Bachelor's or Master's degree in Computer Science, Information Technology, or a related field. Relevant certifications in emerging technologies (preferred). Soft Skills: Strong communication and leadership skills. Ability to work well under pressure and meet tight deadlines. Excellent interpersonal and team-working skills. Ability to effectively communicate technical information to non-technical stakeholders. Passionate about technology and a desire to stay up-to-date with the latest advancements.
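A minimal sketch of the knowledge-graph construction this role mentions, assuming the official Neo4j Python driver and an invented Concept/RELATED schema (not part of the posting):

```python
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

def add_fact(subject: str, relation: str, obj: str) -> None:
    """Upsert two concept nodes and a typed relationship between them."""
    cypher = (
        "MERGE (a:Concept {name: $subject}) "
        "MERGE (b:Concept {name: $object}) "
        "MERGE (a)-[:RELATED {type: $relation}]->(b)"
    )
    with driver.session() as session:
        session.run(cypher, subject=subject, relation=relation, object=obj)

# Example usage: a tiny concept graph for an enterprise domain.
add_fact("Pump", "PART_OF", "Cooling System")
add_fact("Sensor-42", "MONITORS", "Pump")
```

Using MERGE rather than CREATE keeps the graph idempotent when the same fact is ingested more than once.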

Posted 2 months ago

Apply

7 - 12 years

8 - 14 Lacs

Delhi NCR, Mumbai, Bengaluru

Work from Office

Naukri logo

Job Summary: We are seeking an experienced Neo4j Engineer with deep expertise in graph databases to join our team. The ideal candidate will design, develop, and deploy applications using Neo4j as the primary backend, while also working on the architecture of large-scale data environments. This role involves working with containerized microservices, leveraging AWS, and optimizing performance to deliver robust and scalable solutions. Key Responsibilities: - Neo4j Application Development - Design and develop applications that utilize Neo4j as the primary backend database. - Build and optimize graph database models for efficient querying and data representation. - Microservices Architecture - Develop and deploy containerized microservices using Java, Docker, and Kubernetes to enhance scalability and maintainability. - Contribute to the development of cloud-native applications with a focus on Python and Java. - AWS Deployment and Management - Utilize AWS services (e.g., EC2, ECS) to manage and deploy applications in the cloud, ensuring high availability and performance. - Implement best practices for secure, scalable, and resilient cloud environments. - Performance Optimization and Troubleshooting - Optimize Neo4j queries and configurations for handling large-scale data environments, ensuring efficiency and speed. - Monitor and troubleshoot Neo4j databases, performing migrations and ensuring data integrity across environments. - Data Architecture and Modeling - Contribute to the architecture and design of graph data models to support application needs. - Stay updated on best practices, tools, and advancements in graph database technology. - Cross-functional Collaboration - Collaborate with data scientists, engineers, and stakeholders to align Neo4j data models with application requirements. Required Skills and Experience: - 10+ years of experience in software engineering, with a strong focus on Neo4j and graph databases. - Expertise in Neo4j database design, data modeling, and graph querying. - Proficient in Java and Python programming for developing cloud-native applications. - Strong experience with containerization tools like Docker and orchestration platforms like Kubernetes. - Experience deploying and managing applications on AWS (EC2, ECS, RDS, etc.). - Demonstrated ability to optimize and troubleshoot Neo4j databases in large-scale environments. Preferred Qualifications: - Neo4j Certification is highly desirable. - Familiarity with CI/CD processes, automation tools, and DevOps best practices. - Knowledge of additional cloud platforms like GCP or Azure. Location: Delhi NCR, Bangalore, Chennai, Pune, Kolkata, Ahmedabad, Mumbai, Hyderabad
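As an illustrative sketch of the query-optimization work mentioned above (connection details, the Account label, and the TRANSFERRED_TO relationship are assumptions), creating an index and running a parameterized path query with the official Neo4j Python driver:

```python
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

# An index on the lookup property keeps MATCH-by-id fast as the graph grows.
CREATE_INDEX = "CREATE INDEX account_id IF NOT EXISTS FOR (a:Account) ON (a.id)"

# Parameterized queries let Neo4j cache the execution plan across calls.
SHORTEST_PATH = """
MATCH p = shortestPath((a:Account {id: $src})-[:TRANSFERRED_TO*..6]->(b:Account {id: $dst}))
RETURN [n IN nodes(p) | n.id] AS hops
"""

with driver.session() as session:
    session.run(CREATE_INDEX)
    for record in session.run(SHORTEST_PATH, src="A-100", dst="A-914"):
        print(record["hops"])
```

Profiling such queries with Cypher's PROFILE clause is the usual next step when tuning large-scale graphs.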

Posted 3 months ago

Apply

5 - 8 years

8 - 12 Lacs

Bengaluru

Work from Office

Naukri logo

Responsibilities: - Design, develop, and maintain ontologies using Anzo, ensuring data consistency and integrity - Collaborate with data architects, engineers, and stakeholders to identify and prioritize ontology requirements - Develop and implement data governance policies and standards for ontology management - Perform data modeling, data mapping, and data transformation to integrate data from various sources - Ensure ontology alignment with industry standards and best practices - Troubleshoot ontology-related issues and perform quality assurance - Document and maintain technical documentation for ontologies and related processes Requirements: - Bachelor's degree in Computer Science, Information Systems, or related field - 3+ years of experience in ontology development, data modeling, or a related field - Strong understanding of knowledge graph technologies, data modeling, and semantic web standards - Ontology (designing, developing, and maintaining) - Experience with Anzo or similar ontology development platforms - Excellent analytical, problem-solving, and communication skills - Ability to work collaboratively in a fast-paced environment - Experience with any of the following graph databases: Anzo, Neo4j, Azure Cosmos DB, Stardog, GraphDB Nice to Have: - Master's degree in Computer Science, Information Systems, or related field - Experience with data governance, data quality, and data integration - Familiarity with agile development methodologies - Certification in ontology development or related field
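For a flavour of the ontology work described above, here is a small sketch using the open-source rdflib library as a stand-in (Anzo-specific APIs are not shown; the namespace, classes, and properties are invented for the example):

```python
from rdflib import Graph, Literal, Namespace, RDF, RDFS

EX = Namespace("http://example.org/ontology#")
g = Graph()
g.bind("ex", EX)

# A tiny class hierarchy plus one instance.
g.add((EX.Equipment, RDF.type, RDFS.Class))
g.add((EX.Pump, RDFS.subClassOf, EX.Equipment))
g.add((EX.P101, RDF.type, EX.Pump))
g.add((EX.P101, RDFS.label, Literal("Primary coolant pump")))

# SPARQL over the ontology: find every instance of Equipment or its subclasses.
results = g.query("""
    PREFIX ex: <http://example.org/ontology#>
    PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
    SELECT ?thing ?label WHERE {
        ?cls rdfs:subClassOf* ex:Equipment .
        ?thing a ?cls ; rdfs:label ?label .
    }
""")
for row in results:
    print(row.thing, row.label)
```

The same subclass-aware reasoning is what an ontology platform like Anzo or a labeled-property-graph model in Neo4j would be expected to support.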

Posted 3 months ago

Apply

7 - 11 years

30 - 40 Lacs

Chennai, Bengaluru, Hyderabad

Hybrid

Naukri logo

Graph Architect: We are seeking a highly skilled and experienced Graph Architect with 7+ years of overall experience in data architecture, including 4+ years of hands-on expertise in graph technologies like Neo4j, Amazon Neptune, and related tools. The ideal candidate will design and implement graph-based solutions, support pre-sales activities, drive proposal development, and lead teams while engaging with stakeholders. Hands-on experience in Generative AI (GenAI) and Graph RAG (Retrieval-Augmented Generation) is highly preferred, enabling the creation of intelligent and context-aware graph-driven solutions. Key Responsibilities: 1. Graph Data Modeling & Architecture: Design and implement graph data models tailored for complex business use cases using Neo4j, Amazon Neptune, and other graph databases. Develop schema-less or hybrid graph solutions optimized for scalability and performance. Architect and manage graph pipelines that enable advanced AI-powered insights, including RAG workflows. 2. Pre-sales & Proposal Support: Collaborate with sales and business development teams to understand customer challenges and propose graph-based solutions, incorporating GenAI and RAG approaches where relevant. Deliver impactful technical presentations, Proof of Concepts (PoCs), and prototypes showcasing graph-powered AI capabilities. Write detailed proposals, including architecture diagrams, AI integration strategies, and ROI-focused solutions. 3. GenAI & Graph RAG Enablement: Integrate Generative AI models with graph databases to enable contextual retrieval and augmented insights. Design workflows where graph structures enhance LLM (Large Language Model) capabilities in RAG-based applications. Leverage knowledge graphs to improve accuracy, reasoning, and real-time contextual understanding in AI models. 4. Team Leadership & Stakeholder Management: Lead and mentor technical teams, ensuring successful delivery of graph and AI-powered projects. Act as a liaison between technical teams and business stakeholders, ensuring project alignment with organizational goals. Oversee project timelines, budgets, and risks while ensuring optimal resource utilization. 5. Development & Integration: Design and optimize graph queries using Cypher, Gremlin, or SPARQL for advanced analytics. Integrate graph databases with visualization platforms (e.g., Bloom, Tableau) and GenAI pipelines. Enhance system performance and scalability in graph-augmented applications. 6. Contextual Insights & Innovation: Collaborate with data architects, analysts, and AI teams to uncover hidden patterns, insights, and contextual relationships in data. Stay abreast of emerging technologies in graph databases, GenAI, and RAG techniques to drive innovation and value creation. Required Skills & Qualifications: • Experience: 7+ years of overall experience in data architecture, with at least 4+ years of hands-on experience in graph database technologies like Neo4j and Amazon Neptune. Proven experience working on GenAI projects or Graph RAG pipelines. • Graph Database Expertise: Strong skills in graph modeling, schema design, and optimization using Neo4j, Amazon Neptune, or similar platforms. • AI Integration: Familiarity with integrating graph databases into Generative AI and RAG workflows. • Query Languages: Proficiency in Cypher, Gremlin, and SPARQL. • Pre-sales Experience: Demonstrated ability to deliver technical presentations, create PoCs, and write proposals. • Team Management: Strong experience leading teams and managing stakeholder relationships.
• Soft Skills: Exceptional communication, problem-solving, and collaboration abilities. Preferred Qualifications: • Certification in Neo4j or equivalent graph database technologies. • Experience with cloud platforms (Azure, AWS, GCP) for deploying graph and AI solutions. • Hands-on knowledge of GenAI frameworks (e.g., OpenAI, Hugging Face) and RAG workflows. • Familiarity with knowledge graphs and their role in AI-enhanced applications. Why Join Us? • Be at the forefront of innovation by combining graph technology with AI-driven applications. • Work on cutting-edge projects that shape the future of intelligent and contextual business solutions. • Collaborate with a dynamic team and enjoy opportunities for continuous learning and career growth.
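To make the Graph RAG retrieval step in this posting more concrete, here is a minimal, illustrative sketch (node labels, relationship depth, and connection details are assumptions, not taken from the posting) that serializes an entity's neighbourhood from Neo4j into plain-text triples for use as LLM prompt context:

```python
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

def neighbourhood_as_text(entity: str) -> str:
    """Serialize an entity's 2-hop neighbourhood into plain-text triples
    that can be injected into an LLM prompt as retrieval context."""
    cypher = """
    MATCH (e:Entity {name: $entity})-[r*1..2]-(n)
    UNWIND r AS rel
    RETURN DISTINCT startNode(rel).name AS s, type(rel) AS p, endNode(rel).name AS o
    LIMIT 50
    """
    with driver.session() as session:
        rows = session.run(cypher, entity=entity)
        return "\n".join(f"{row['s']} -{row['p']}-> {row['o']}" for row in rows)
```

The returned triple text would then be combined with a vector-search result and passed to the LLM, which is the essence of the graph-augmented RAG workflows mentioned above.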

Posted 3 months ago

Apply

5 - 10 years

27 - 40 Lacs

Pune, Bengaluru, Hyderabad

Hybrid

Naukri logo

Job Title: Graph DB (Anzo/Neo4j) + Data Modeling + Data Engineer. Experience: 4 to 10 years. Job Location: Pan India. Notice Period: 0-60 days only. Key Responsibilities: Design, develop, and maintain ontologies using Anzo/Neo4j, ensuring data consistency and integrity. Collaborate with data architects, engineers, and stakeholders to identify and prioritize ontology requirements. Develop and implement data governance policies and standards for effective ontology management. Perform data modeling, data mapping, and data transformation to integrate data from various sources. Ensure ontologies align with industry standards and best practices. Troubleshoot ontology-related issues and carry out quality assurance procedures. Document and maintain technical documentation for ontologies and related processes. Key Requirements: Bachelor's degree in Computer Science, Information Systems, or related field. 3+ years of experience in ontology development, data modeling, or a related field. Strong understanding of knowledge graph technologies, data modeling, and semantic web standards. Proven experience in ontology design, development, and maintenance. Hands-on experience with Anzo or similar ontology development platforms (Neo4j, Azure Cosmos DB, Stardog, GraphDB). Excellent analytical, problem-solving, and communication skills. Ability to work collaboratively in a fast-paced environment. Nice to Have: Master's degree in Computer Science, Information Systems, or related field. Experience with data governance, data quality, and data integration. Familiarity with agile development methodologies. Certification in ontology development or related field. Share updated CVs at sandeep.a@talent21.in

Posted 3 months ago

Apply

7 - 11 years

9 - 14 Lacs

Bengaluru

Work from Office

Naukri logo

Skillset (Fullstack Java Engineer) Must haves - Java 8 or above, MySQL, React JS. Frameworks / Libraries / Tools - Dropwizard, Spring Boot, Linux commands, Microservices, Kubernetes, Gradle. Knowledge of MongoDB, Elasticsearch and Neo4j is a big plus. Good to have - shell scripting, Python, React Native. Knowledge of Google Cloud Platform, K8s, helm charts. Proficient in microservices architecture and design patterns. Well conversant with DevOps, CI/CD, helm charts, shell scripting, and K8s deployment. You are responsible for: Work with senior engineers, product team and architects on implementing capabilities. Own complete module(s) and/or service(s). Apply best practices of SDLC (e.g. writing unit tests, ensuring Sonar quality checks pass, dead code removal, refactoring). Document flows and playbooks. Guide and do design and PR reviews for team members. Provide on-call support and provide RCAs wherever applicable. Optimize resource utilization. Look into what parts can be automated to reduce on-call load. Document HLDs and LLDs.

Posted 3 months ago

Apply

2 - 4 years

8 - 12 Lacs

Bengaluru

Work from Office

Naukri logo

About the Role: We are seeking a highly motivated and curious AI Engineer to join our team at Art Garage, IISc. This role offers a unique opportunity to work on cutting-edge Enterprise AI applications for Industry 5.0 and business process automation. You will be working alongside faculty and researchers at IISc, pushing the boundaries of Knowledge Graphs and Semantic Digital Twin modeling for enterprises, in particular manufacturing enterprises. Key Responsibilities: Design, develop, and optimize Semantic Digital Twin (SDT) models for enterprise use cases, building on Knowledge Graph models. Experiment with GenAI techniques to interface the SDT models with LLM-based interfaces and applications. Work with Python or Go to build scalable and production-ready Knowledge Graph implementations on the graph database Neo4j. Rapidly prototype and iterate on Semantic Digital Twin models to improve accuracy and performance. Who You Are: A self-driven problem solver with a passion for AI and emerging technologies. Comfortable working in an experimental, research-driven environment. Willing to take risks, learn from failures, and push the boundaries of what Knowledge Graphs and Semantic AI can do for enterprises. Requirements: Strong programming skills in Python or Go. Experience with cloud AI services and APIs. Experience with LLMs, GenAI models (GPT, LLaMA, etc.), and agentic frameworks is preferable. Understanding of AI model fine-tuning, inference optimization, and deployment is a bonus. Prior experience with NLP, deep learning, or AI-based automation solutions is a plus. What's in It for You: Work on leading-edge AI technology in an academic and industry collaborative environment. Direct exposure to top-tier AI researchers and faculty at IISc. A chance to explore, experiment, and innovate in an open and supportive ecosystem. Competitive remuneration along with the opportunity to impact real-world AI adoption. ARTPARK @ IISc (AI & Robotics Technology Park) is a unique non-profit (Section 8) organization promoted by the Indian Institute of Science (IISc) to foster innovations in AI and robotics by bringing together the best of the startup, industry, research, and government ecosystem. It is seed funded by the Department of Science & Technology (DST), Govt. of India, under the National Mission on Interdisciplinary Cyber-Physical Systems (NM-ICPS) and the Govt. of Karnataka. ARTPARK @ IISc is driving advances in robotics, autonomous systems and AI through translational R&D in the areas of intelligent healthcare, automation for logistics, and skilling for the AI age. Our work spans open tools, standards, IP, technologies, databanks and path-breaking companies. The registered company name is I-Hub for Robotics and Autonomous Systems Innovation Foundation. Learn more: www.artpark.in

Posted 3 months ago

Apply

4 - 9 years

8 - 18 Lacs

Pune, Nagpur, Bengaluru

Work from Office

Naukri logo

Sr. Java Developer. Key Responsibilities: Develop and maintain Java/J2EE applications using the Spring Boot framework. Write efficient SQL queries and optimize database performance. Containerize applications using Docker for deployment and scalability. Collaborate with cross-functional teams to design, develop, and implement solutions. Troubleshoot and debug issues, ensuring smooth operation of applications. Stay updated with the latest technologies and industry trends. Exposure to graph databases like Neo4j. Requirements: Bachelor's degree in Computer Science or related field. 4-8 years of hands-on experience in Java/J2EE development. Proficiency in the Spring Boot framework, microservices, and related technologies. Strong knowledge of SQL and database design principles. Experience with Docker or other containerization technologies. Excellent communication skills and ability to collaborate effectively with team members. Understanding of YAML configuration files. Telecom OSS knowledge is a plus. Please share your CV on 9274416061 (WhatsApp only)

Posted 3 months ago

Apply

4 - 8 years

6 - 10 Lacs

Bengaluru

Work from Office

Naukri logo

Job Description: Senior Python Developer. Role: Senior Python Developer. Location: Bangalore. Employment Type: Full-Time. Experience Level: Senior, 4+ years. Job Overview: We're looking for a Senior Python Developer with strong experience in Flask and FastAPI who's excited about working on projects involving Generative AI (GenAI). You'll be responsible for building scalable backend systems, integrating third-party APIs, and working extensively with PostgreSQL. Experience with vector databases is a plus. Familiarity with tools like LangChain, Langflow, or Airflow will also make you stand out. If you have some Node.js knowledge or have worked with Neo4j or other knowledge graph databases, even better! Key Responsibilities: Develop and maintain backend services using Python frameworks like Flask and FastAPI. Integrate and manage third-party APIs to enhance application functionality. Work with PostgreSQL for database management, ensuring optimal performance. Implement solutions involving Generative AI (GenAI) for innovative projects. Work with vector databases (experience with them is a plus). Use knowledge graph databases such as Neo4j for advanced data modeling. Optimize applications for scalability and performance. Collaborate with cross-functional teams including engineering, data science, and AI specialists. Use tools like LangChain, Langflow, and Airflow for orchestrating AI workflows and task automation. Required Qualifications: - Python Expertise: Solid experience in Flask and FastAPI development. - API Integration: Strong experience with integrating third-party APIs. - Database Management: Deep expertise in PostgreSQL is required. - Generative AI: Familiarity with Generative AI technologies and their application in real-world solutions. Preferred Qualifications: - Node.js: Familiarity with Node.js is an advantage. - Vector Databases: Experience with vector databases is a bonus. - LangChain & Langflow: Experience with LangChain for AI workflows and with Langflow is beneficial. - Airflow: Experience with Airflow for task automation and orchestration. - Knowledge Graphs: Hands-on experience with Neo4j or similar knowledge graph databases. Soft Skills: - Strong communication and collaboration abilities. - Problem-solving skills with an analytical and creative mindset. - Adaptability to work with new technologies and learn on the go. Educational Requirements: - Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent experience).
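For illustration of the FastAPI-plus-Neo4j stack this posting describes, a small sketch (the endpoint, the Product/Order schema, and the connection details are invented for the example):

```python
from fastapi import FastAPI, HTTPException
from neo4j import GraphDatabase

app = FastAPI(title="Recommendations API")
driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

@app.get("/related/{product_id}")
def related_products(product_id: str, limit: int = 5) -> dict:
    """Return products most often co-purchased with the given product."""
    cypher = """
    MATCH (:Product {id: $pid})<-[:CONTAINS]-(:Order)-[:CONTAINS]->(other:Product)
    RETURN other.id AS id, count(*) AS score
    ORDER BY score DESC LIMIT $limit
    """
    with driver.session() as session:
        rows = [r.data() for r in session.run(cypher, pid=product_id, limit=limit)]
    if not rows:
        raise HTTPException(status_code=404, detail="No related products found")
    return {"product_id": product_id, "related": rows}
```

Run with `uvicorn app:app --reload` and the endpoint is available at `/related/<id>`; a production version would add connection pooling configuration, auth, and monitoring as the posting requires.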

Posted 3 months ago

Apply

3 - 5 years

5 - 7 Lacs

Uttar Pradesh

Work from Office

Naukri logo

Skill Name: Graph Database Ontologist with Anzo/Neo4j. Experience: 4-8 yrs. Job Location: Any Tech Mahindra location, Pan India. As a Graph Database Ontologist, you will play a crucial role in creating and maintaining ontologies that enable data integration, search, and analytics across various data sources. Your expertise will help us unlock insights and drive business decisions. Responsibilities: - Design, develop, and maintain ontologies using Anzo, ensuring data consistency and integrity - Collaborate with data architects, engineers, and stakeholders to identify and prioritize ontology requirements - Develop and implement data governance policies and standards for ontology management - Perform data modeling, data mapping, and data transformation to integrate data from various sources - Ensure ontology alignment with industry standards and best practices - Troubleshoot ontology-related issues and perform quality assurance - Document and maintain technical documentation for ontologies and related processes Requirements: - Bachelor's degree in Computer Science, Information Systems, or related field - 3+ years of experience in ontology development, data modeling, or related field - Strong understanding of knowledge graph technologies, data modeling, and semantic web standards - Ontology (designing, developing, and maintaining) - Experience with Anzo or similar ontology development platforms - Excellent analytical, problem-solving, and communication skills - Ability to work collaboratively in a fast-paced environment - Experience with any of the following graph databases: Anzo, Neo4j, Azure Cosmos DB, Stardog, GraphDB Nice to Have: - Master's degree in Computer Science, Information Systems, or related field - Experience with data governance, data quality, and data integration - Familiarity with agile development methodologies - Certification in ontology development or related field

Posted 3 months ago

Apply

8 - 13 years

10 - 15 Lacs

Bengaluru

Work from Office

Naukri logo

As a key member of our dynamic team, you will play a vital role in crafting exceptional software experiences. Your responsibilities will encompass the design and implementation of innovative features, fine-tuning and sustaining existing code for optimal performance, and guaranteeing top-notch quality through rigorous testing and debugging. Collaboration is at the heart of what we do, and you'll be working closely with fellow developers, designers, and product managers to ensure our software aligns seamlessly with user expectations. The role seeks good levels of personal organisation and the ability to work well within a distributed global team in a fast-paced and exciting environment. You will be office based, working with senior software engineers who will help you integrate into the team, the department and wider IBM. You will be joining a development squad following Design Thinking and Agile principles where you are expected to collaboratively develop creative solutions. The work can be varied; flexibility to learn new technologies and skills is key as we look to help grow your career within IBM. A positive attitude and a passion to succeed are essential for joining a high-performing software development team at IBM. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Total experience of 10+ years in SaaS software development. Be involved and take ownership of the end-to-end delivery - from solution design, estimation, development, testing and deployment. 8+ years of experience developing, deploying, and debugging Java / J2EE applications. 8+ years of experience developing with Spring Boot. 7+ years of experience developing micro-services using REST or GraphQL APIs. 7+ years of experience with database systems including SQL or NoSQL data stores. 6+ years of experience in development and design of CI/CD and other automation frameworks. Experience in unit test and API & UI automation test development and execution. Ability to write high-performance, reusable code, including appropriate testing. Uphold quality standards including reliability, efficiency, security, maintainability and usability by applying best practice processes, methodologies and tools. Proven oral/written communication and organizational skills. Ability to multi-task and re-prioritize under pressure. Using Agile development principles and practices. Preferred technical and professional experience: Experience with cloud-native technologies including Docker containers, Kubernetes orchestration, or OpenShift. Familiarity with Cassandra, Mongo, ElasticSearch, Flink and/or Kafka. Experience with technologies like ES, Neo4j, Git, Jenkins and build tools like Maven/Gradle is much preferred.

Posted 3 months ago

Apply

3 - 8 years

5 - 10 Lacs

Bengaluru

Work from Office

Naukri logo

Project Role: Application Developer. Project Role Description: Design, build and configure applications to meet business process and application requirements. Must have skills: Neo4j. Good to have skills: NA. Minimum 3 year(s) of experience is required. Educational Qualification: 15 years full time education. Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. You will work closely with the team to ensure the successful delivery of high-quality solutions. Roles & Responsibilities: Expected to perform independently and become an SME. Required active participation/contribution in team discussions. Contribute to providing solutions to work-related problems. Collaborate with cross-functional teams to gather and analyze requirements. Design, develop, and test applications based on business requirements. Troubleshoot and debug applications to ensure optimal performance. Implement best practices and coding standards to ensure high-quality deliverables. Provide technical guidance and support to junior team members. Professional & Technical Skills: Must-Have Skills: Proficiency in Neo4j. Strong understanding of database concepts and data modeling. Experience in designing and developing applications using Neo4j. Knowledge of the Cypher query language. Familiarity with graph database technologies. Good-To-Have Skills: Experience with SQL and relational databases. Additional Information: The candidate should have a minimum of 3 years of experience in Neo4j. This position is based at our Bengaluru office. A 15 years full-time education is required. Qualifications: 15 years full time education

Posted 3 months ago

Apply

7 - 9 years

9 - 11 Lacs

Chennai, Pune, Delhi

Work from Office

Naukri logo

As a Senior Solutions Engineer, you will be a key driver of Neo4j's growth by articulating business value, shaping solutions that address customer needs, and accelerating adoption within enterprises. You will engage in deep technical and business discovery, support the sales process through value-based selling, and collaborate with key partners, including hyperscalers (AWS, Azure, GCP) and strategic ISVs, to drive joint go-to-market (GTM) initiatives. Responsibilities: Sales Process & Customer Engagement: Lead technical discovery to understand customer challenges, pain points, and success criteria, ensuring solutions align with business objectives. Drive value-based selling, clearly articulating Neo4j's unique differentiators and ROI to both technical and business stakeholders. Own the pre-sales engagement, delivering impactful presentations, live demonstrations, and competitive positioning. Map customer requirements to relevant Neo4j product capabilities, using whiteboarding, workshops, and tailored solution architectures. Develop proof-of-value (PoV) demonstrations and prototypes that showcase tangible business outcomes. Work closely with the sales team to support deal qualification, opportunity progression, and conversion strategies. Ensure smooth handoff to Professional Services, providing knowledge transfer to consulting teams and industry partners for successful implementation. Partner & Ecosystem Collaboration: Work closely with key partners, including hyperscalers (AWS, Azure, GCP), ISVs, and system integrators, to drive joint solutions and co-sell initiatives. Develop and deliver enablement programs for Neo4j's strategic partners, ensuring they have the technical skills and sales acumen to position Neo4j effectively. Collaborate on GTM strategies, helping design solutions that leverage hyperscaler services (e.g., AWS Neptune, Azure Cosmos DB, Google BigQuery) in combination with Neo4j to meet customer needs. Engage with partner sales and technical teams to drive adoption, influence deal cycles, and execute joint marketing initiatives. Community & Thought Leadership: Support local marketing events, user groups, and Neo4j Community initiatives. Contribute to Neo4j's thought leadership, delivering webinars, writing technical blogs, and speaking at industry conferences. Relevant Skills and Experience: Deep expertise in value selling, with a strong understanding of the sales process and how to align solutions with customer business goals. Experience in technical discovery, consultative solutioning, and mapping customer needs to technology capabilities. Graph database expertise (Neo4j, OrientDB, JanusGraph, etc.), along with knowledge of relational/NoSQL databases, analytics, and data integration. Strong experience working with hyperscalers (AWS, Azure, GCP), including understanding of their data services and GTM strategies. Proficiency in Java, Python, JavaScript, Docker, Kubernetes, and cloud infrastructure technologies. Knowledge of data visualization tools (D3.js, Sigma.js, Linkurious, etc.). Excellent presentation and communication skills, capable of engaging technical teams, C-level executives, and partner organizations. Ability to work in fast-paced, sales-driven environments, managing multiple customer and partner engagements. Industry experience in Financial Services, Retail, Government, Security, or Telecommunications is a plus. Qualifications: 7+ years of pre-sales, professional services, or customer-facing technical experience.
Bachelor's or Master's degree in a relevant field. Experience working with partners, including hyperscalers, ISVs, and system integrators. Ability to work independently in a cross-functional, global organization. Willingness to travel as needed for customer and partner engagements.

Posted 3 months ago

Apply

8 - 10 years

25 - 30 Lacs

Chennai, Pune, Delhi

Work from Office

Naukri logo

The Role: Develop and execute a territory plan based on target agencies and applicable use cases, resulting in a pipeline of opportunities in the target market that will help you achieve quarterly and annual sales metrics. Develop expert knowledge of Neo4j solutions and their applicability in the target market, covering Government and Enterprise accounts. Develop and present to customers a strong understanding of the benefits and advantages of graph technology. Execute sales cycles that employ Strategic Selling strategies and tactics. Build and present proposals for Neo4j solutions that involve Neo4j products and services. Work with Pre-Sales Engineering resources to scope and deliver on customer needs. Land & Expand - grow the existing account base with a strategic, customer-first methodology. Provide guidance, direction, and support to your assigned SDR in their efforts to support your pipeline development. Ensure the execution of strategies for assigned key accounts to drive plans to increase revenue potential and growth. Collaborate with Field Marketing resources, targeting programs to increase awareness in the existing customer base resulting in revenue growth. Maintain the Neo4j Salesforce.com CRM system with accurate information about your pipeline, in accordance with Neo4j forecasting guidelines. Ideally, you should have: 8-10 years of consistent success meeting or exceeding sales objectives selling technical solutions and software products into Government and Enterprise accounts. Demonstrable experience executing complex enterprise sales strategies and tactics. Experience with the commercial open-source business model, selling subscriptions for on-premise deployments and/or hybrid on-prem/cloud deployments. Previous experience in, and the ability to thrive at, a smaller, high-growth software company, where you have leveraged dedicated SDR resources, Field Marketing resources, and Pre-Sales Engineering to help build the business. Strong conviction and approach to how and where graph solutions fit into the enterprise marketplace. Demonstrate attention to detail, ensuring accurate entry and management of lead data in our Salesforce.com CRM system. Be proficient with standard corporate productivity tools (e.g., Google Docs, MS Office, Salesforce.com, web conferencing). Be a team player with the highest level of integrity.

Posted 3 months ago

Apply

6 - 10 years

12 - 22 Lacs

Chennai, Pune, Coimbatore

Work from Office

Naukri logo

Job Description Python Developer (Senior Level - 6+ years experience) Saama Technologies Key Responsibilities: Develop scalable and secure APIs for the Graph RAG system. Collaborate with ML and Data Engineers to integrate APIs with machine learning models and data pipelines. Optimize API performance to ensure fast and reliable responses. Implement security best practices to protect sensitive data and prevent unauthorized access. Skills Required: Strong expertise in Python programming language. Experience with Python web frameworks like FastAPI or Flask. Deep understanding of RESTful and GraphQL API design principles. Proficiency in working with cloud platforms (AWS, GCP, or Azure). Hands-on experience with containerization (Docker) and orchestration (Kubernetes). Ability to integrate APIs with databases and knowledge graphs. Knowledge of API security standards and best practices. Additional Desired Skills: Experience with API testing and documentation tools. Familiarity with CI/CD pipelines for automated deployment and testing. Knowledge of graph databases and LLM-based systems.

Posted 3 months ago

Apply

8 - 11 years

16 - 31 Lacs

Chennai, Pune, Coimbatore

Hybrid

Naukri logo

Job Description: Technical Delivery Manager, Saama Technologies. Responsibilities: Oversee the end-to-end development and delivery of the Graph RAG system. Manage project timelines, ensuring timely delivery and adherence to milestones. Establish and maintain strong communication with client technical leads, providing regular updates and addressing technical concerns. Offer technical leadership and expertise in graph databases (e.g., Neo4j) and LLM-based applications. Collaborate with the team on architectural decisions, ensuring solutions are scalable, robust, and aligned with client requirements. Mitigate technical risks and address challenges proactively. Qualifications: Proven experience in technical project management and delivery, ideally within the AI/ML or data science domain. Strong understanding of graph databases and LLM-based systems. Experience with cloud-based development and deployment (AWS, GCP, or Azure). Excellent communication and interpersonal skills, with the ability to bridge the gap between technical and non-technical stakeholders. Ability to work independently and lead a team in a fast-paced environment. Experience with Agile methodologies. Required Skills: Knowledge of graph databases (Neo4j) - Experience with LLM-based systems - Proficiency in LangChain - API development and cloud deployment expertise - Experience managing engineering teams and Agile methodologies. Desired Skills: Familiarity with LangChain and API development. Knowledge of MLOps and CI/CD practices.

Posted 3 months ago

Apply

3 - 8 years

12 - 22 Lacs

Chennai, Bengaluru, Hyderabad

Hybrid

Naukri logo

Project Description: Grid Dynamics aims to build an enterprise generative AI framework to deliver innovative, scalable, and efficient AI-driven solutions across business functions. Due to constant scaling of digital capabilities, the platform requires enhancements to incorporate cutting-edge generative AI features and meet emerging business demands. The platform should onboard brand-new capabilities such as similarity search (image, video, and voice); ontology and entity management; voice and file management (text to speech and vice versa, metadata tagging, multi-media file support); advanced RAG; and multi-modal capabilities. Responsibilities: As an LLMOps Engineer, you will be responsible for providing expertise on overseeing the complete lifecycle management of large language models (LLMs). This includes the development of strategies for deployment, continuous integration and delivery (CI/CD) processes, performance tuning, and ensuring high availability of our LLM services. You will collaborate closely with data scientists, AI/ML engineers, and IT teams to define and align LLM operations with business goals, ensuring a seamless and efficient operating model. In this role, you will: Define and disseminate LLMOps best practices. Evaluate and compare different LLMOps tools to incorporate the best practices. Stay updated on industry trends and advancements in LLM technologies and operational methodologies. Participate in architecture design/validation sessions for Generative AI use cases with entities. Contribute to the development and expansion of GenAI use cases, including standard processes, frameworks, templates, libraries, and best practices around GenAI. Design, implement, and oversee the infrastructure required for the efficient operation of large language models in collaboration with client entities. Provide expertise and guidance to client entities in the development and scaling of GenAI use cases, including standard processes, frameworks, templates, libraries, and best practices around GenAI. Serve as the expert and representative on LLMOps practices, including: (1) developing and maintaining CI/CD pipelines for LLM deployment and updates; (2) monitoring LLM performance, identifying and resolving bottlenecks, and implementing optimizations; (3) ensuring the security of LLM operations through comprehensive risk assessments and the implementation of robust security measures. Collaborate with data and IT teams to facilitate data collection, preparation, and model training processes. Practical experience with training, tuning, and utilizing LLMs/SLMs. Strong experience with GenAI/LLM frameworks and techniques, like guardrails, LangChain, etc. Knowledge of LLM security and observability principles. Experience of using Azure cloud services for ML. Min requirements: Programming languages: Python. Public cloud: Azure. Frameworks: K8s, Terraform, Arize or any other ML/LLM observability tool. Experience: Experience with public services like OpenAI, Anthropic, and similar; experience deploying open-source LLMs will be a plus. Tools: LangSmith/LangChain, guardrails. Would be a plus: Knowledge of LLMOps best practices. Experience with monitoring/logging for production models (e.g. Prometheus, Grafana, ELK stack). We offer: Opportunity to work on bleeding-edge projects. Work with a highly motivated and dedicated team. Competitive salary. Flexible schedule. Benefits package - medical insurance, sports. Corporate social events. Professional development opportunities. Well-equipped office.

Posted 3 months ago

Apply

5 - 9 years

7 - 11 Lacs

Trivandrum, Kochi, Coimbatore

Work from Office

Naukri logo

Job Title: Senior Data Engineer (Graph DB Specialist) + Global Song. Management Level: 9, Specialist. Location: Kochi, Coimbatore. Must have skills: Data Modeling Techniques and Methodologies. Good to have skills: Proficiency in Python and PySpark programming. Job Summary: We are seeking a highly skilled Data Engineer with expertise in graph databases to join our dynamic team. The ideal candidate will have a strong background in data engineering, graph querying languages, and data modeling, with a keen interest in leveraging cutting-edge technologies like vector databases and LLMs to drive functional objectives. Your responsibilities will include: Design, implement, and maintain ETL pipelines to prepare data for graph-based structures. Develop and optimize graph database solutions using querying languages such as Cypher, SPARQL, or GQL; Neo4j DB experience is preferred. Build and maintain ontologies and knowledge graphs, ensuring efficient and scalable data modeling. Integrate vector databases and implement similarity search techniques, with a focus on Retrieval-Augmented Generation (RAG) methodologies and GraphRAG. Collaborate with data scientists and engineers to operationalize machine learning models and integrate them with graph databases. Work with Large Language Models (LLMs) to achieve functional and business objectives. Ensure data quality, integrity, and security while delivering robust and scalable solutions. Communicate effectively with stakeholders to understand business requirements and deliver solutions that meet objectives. Roles & Responsibilities: Experience: At least 5 years of hands-on experience in data engineering, with 2 years of experience working with graph databases. Querying: Advanced knowledge of Cypher, SPARQL, or GQL querying languages. ETL Processes: Expertise in designing and optimizing ETL processes for graph structures. Data Modeling: Strong skills in creating ontologies and knowledge graphs; presenting data for Graph RAG based solutions. Vector Databases: Understanding of similarity search techniques and RAG implementations. LLMs: Experience working with Large Language Models for functional objectives. Communication: Excellent verbal and written communication skills. Cloud Platforms: Experience with Azure analytics platforms, including Function Apps, Logic Apps, and Azure Data Lake Storage (ADLS). Graph Analytics: Familiarity with graph algorithms and analytics. Agile Methodology: Hands-on experience working in Agile teams and processes. Machine Learning: Understanding of machine learning models and their implementation. Qualifications: Experience: Minimum 5-10 years of experience is required. Educational Qualification: Any graduation / BE / B Tech
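As a sketch of the ETL-to-graph pipelines described above (the CSV columns, node labels, and connection details are assumptions for illustration), batched loading via UNWIND with pandas and the Neo4j Python driver:

```python
import pandas as pd
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

# UNWIND over a parameter list writes many rows per transaction,
# which is far faster than issuing one CREATE/MERGE per row.
LOAD_BATCH = """
UNWIND $rows AS row
MERGE (p:Person {id: row.person_id})
MERGE (c:Company {name: row.company})
MERGE (p)-[:WORKS_AT {since: row.since}]->(c)
"""

def load_employments(csv_path: str, batch_size: int = 1000) -> None:
    """Read tabular source data and upsert it into the graph in batches."""
    rows = pd.read_csv(csv_path).to_dict("records")
    with driver.session() as session:
        for start in range(0, len(rows), batch_size):
            session.run(LOAD_BATCH, rows=rows[start:start + batch_size])
```

The same batching pattern applies whether the upstream source is a CSV, a PySpark DataFrame collected to the driver, or an ADLS extract.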

Posted 3 months ago

Apply

5 - 10 years

7 - 12 Lacs

Bengaluru

Work from Office

Naukri logo

Job Purpose and Impact: The Sr. Generative AI Engineer will architect, design and develop new and existing GenAI solutions for the organization. As a Generative AI Engineer, you will be responsible for developing and implementing products using cutting-edge generative AI and RAG to solve complex problems and drive innovation across our organization. You will work closely with data scientists, software engineers, and product managers to design, build, and deploy AI-powered solutions that enhance our products and services in Cargill. You will bring order to ambiguous scenarios and apply in-depth and broad knowledge of architectural, engineering and security practices to ensure your solutions are scalable, resilient and robust, and will share knowledge on modern practices and technologies with the shared engineering community. Key Accountabilities: Apply software and AI engineering patterns and principles to design, develop, test, integrate, maintain and troubleshoot complex and varied Generative AI software solutions and incorporate security practices in newly developed and maintained applications. Collaborate with cross-functional teams to define AI project requirements and objectives, ensuring alignment with overall business goals. Conduct research to stay up to date with the latest advancements in generative AI, machine learning, and deep learning techniques and identify opportunities to integrate them into our products and services; optimize existing generative AI models and RAG for improved performance, scalability, and efficiency; develop and maintain pipelines and RAG solutions including data preprocessing, prompt engineering, benchmarking and fine-tuning. Develop clear and concise documentation, including technical specifications, user guides and presentations, to communicate complex AI concepts to both technical and non-technical stakeholders. Participate in the engineering community by maintaining and sharing relevant technical approaches and modern skills in AI. Contribute to the establishment of best practices and standards for generative AI development within the organization. Independently handle complex issues with minimal supervision, while escalating only the most complex issues to appropriate staff. Other duties as assigned. Minimum Qualifications: Bachelor's degree in a related field or equivalent experience. Minimum of five years of related work experience. You are proficient in Python and have experience with machine learning libraries and frameworks. Deep understanding of industry-leading Foundation Model capabilities and their application. You are familiar with cloud-based Generative AI platforms and services. Full-stack software engineering experience to build products using Foundation Models. Confirmed experience architecting applications, databases, services or integrations. Keywords: Front-End Frameworks: React. Core Web Technologies: HTML5, CSS3, JavaScript, TypeScript. Back-End Frameworks / Libraries: Node.js (Express.js, NestJS) or Python (Flask, Django) or Java (Spring Boot). Databases: SQL (MySQL, PostgreSQL), NoSQL (MongoDB, Cassandra), Graph Databases (Neo4j). Cloud & DevOps: pipelines. APIs: REST, GraphQL, gRPC. Microservices Architecture. Authentication & Authorization (OAuth, JWT). Generative AI keywords: Generative AI Engineer, Agentic Automation, Self-Service AI Platform, LLM (Large Language Models), Prompt Engineering, Autonomous Agents, AI Workflow Orchestration, LangChain / AutoGPT, Generative AI Infrastructure, MLOps for Generative AI

Posted 3 months ago

Apply

5 - 8 years

18 - 20 Lacs

Pune

Work from Office

Naukri logo

5+ years of experience in a technical support role on a data-based software product, at least at L3 level. Respond to customer inquiries and provide in-depth technical support. Candidate to work during the EMEA time zone (2 PM to 10 PM shift).

Posted 3 months ago

Apply

3 - 8 years

5 - 10 Lacs

Chennai

Work from Office

Naukri logo

As a Production Support Engineer at Chola MS General Insurance, you will be responsible for supporting, maintaining, documenting, expanding, and optimizing our data lake, data warehouse, data pipelines, and data products. Required candidate profile: Should have a minimum of 6+ years in Data Engineering / Data Analytics platforms. Conduct root-cause analysis as and when needed and propose a corrective action plan. Follow the established set of processes while handling issues.

Posted 3 months ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies