
356 Neo4J Jobs - Page 10

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

5.0 - 7.0 years

0 Lacs

India

On-site


PLEASE NOTE: THIS ROLE IS ONLY FOR CANDIDATES WITH 5 TO 7 YEARS OF EXPERIENCE

About PharmSight
PharmSight is a leading innovator in bio-pharma analytics, providing cutting-edge AI-powered solutions that transform product research, market intelligence, and healthcare decision-making. We are dedicated to improving patient outcomes and driving advancements in the pharmaceutical industry through the application of advanced artificial intelligence.

Why join PharmSight?
- Competitive Compensation: best-in-class salary with structured career progression
- Flexible Work Environment: option to work from anywhere, at any time
- Global Client Exposure: collaborate with leading pharmaceutical companies on impactful projects
- Career Growth & Recognition: a flat hierarchy with ample opportunities for leadership and professional development

Role Overview
As an AI Developer/Engineer (LLM) at PharmSight, you will be at the forefront of designing, developing, and deploying generative AI applications using state-of-the-art large language models (LLMs). You will craft innovative AI solutions that solve complex challenges in bio-pharma analytics, product research, and market intelligence, directly impacting our clients' ability to make data-driven decisions. This role demands a unique combination of deep technical expertise, creative problem-solving, and a passion for advancing AI technologies within the healthcare and pharmaceutical domains.

Key Responsibilities
- Architect, implement, and optimize large language models (LLMs) such as GPT, LLaMA, and BERT, tailoring them to the specific needs of bio-pharma analytics, product research, and market intelligence.
- Experiment with diverse model architectures, hyperparameters, and training methodologies to maximize performance for targeted healthcare and pharmaceutical applications.
- Fine-tune pre-trained models to address domain-specific challenges, ensuring exceptional accuracy, relevance, and contextual understanding.
- Design and refine prompts to optimize LLM performance in generating accurate, insightful, and actionable outputs.
- Develop instruction-tuning pipelines that align model behavior with specific business objectives and user requirements.
- Continuously iterate on prompt strategies to enhance model interpretability and mitigate the risk of hallucinations or irrelevant outputs.
- Conduct rigorous evaluations of LLMs using industry-standard metrics such as perplexity, BLEU, ROUGE, and domain-specific accuracy scores.
- Perform in-depth error analysis, bias detection, and fairness audits to ensure models meet the highest ethical and regulatory standards.
- Benchmark model performance against industry best practices and competitor solutions to maintain a competitive edge and drive continuous improvement.
- Deploy LLMs into production environments, ensuring scalability, reliability, and low-latency performance for real-world applications.
- Optimize models for inference speed and resource efficiency through techniques like quantization, distillation, and pruning.
- Implement robust monitoring systems to track model performance in real time and deploy timely updates to address drift or degradation in output quality.
- Collaborate closely with data engineers and analysts to integrate LLM outputs into PharmSight's analytics platforms.
- Leverage graph databases (e.g., vector graphs, hybrid graphs) to enhance structured knowledge extraction from unstructured text.
- Develop APIs and intuitive interfaces that facilitate seamless interaction between LLMs and other critical system components.
- Remain at the forefront of LLM research, actively exploring advancements such as few-shot learning, reinforcement learning from human feedback (RLHF), and multimodal models.
- Prototype and rigorously test emerging techniques to enhance model capabilities and address novel challenges in the bio-pharma domain.
- Contribute findings to open-source projects, publish research insights, and represent PharmSight in AI research communities.
- Work collaboratively with cross-functional teams, including data scientists, product managers, and domain experts, ensuring LLM development is aligned with critical business goals.
- Mentor junior developers and analysts on LLM techniques, coding best practices, and emerging trends in AI.

Requirements
- Educational Background: Bachelor's or Master's degree in Computer Science, Data Science, Artificial Intelligence, or a related field.
- AI & ML Experience: 5-7 years of hands-on experience in AI/ML development, with a strong focus on large language models (LLMs).
- Expertise in Python and deep learning frameworks (e.g., TensorFlow, PyTorch).
- Solid understanding of prompt engineering, model optimization, and NLP techniques.
- Healthcare/Pharma Knowledge: a solid understanding of healthcare data, bio-pharma industry dynamics, and regulatory requirements.
- Analytical Mindset: exceptional problem-solving skills, with the ability to translate business needs into innovative AI-driven solutions.
- Communication Skills: excellent written and verbal communication, with the ability to collaborate effectively with cross-functional teams and explain complex AI concepts to non-technical stakeholders.
- (Bonus) Experience in MLOps (e.g., Docker, Kubernetes, CI/CD pipelines, model monitoring).
- (Bonus) Proficiency in cloud platforms (AWS, Azure, or GCP) for scalable AI deployment.
- (Bonus) Experience with knowledge graph construction and multimodal data integration (e.g., Neo4j, entity and node extraction).

Join Us
PharmSight offers a competitive salary, a comprehensive benefits package, and the opportunity to work on cutting-edge AI projects that are transforming the pharmaceutical industry. We are committed to fostering a collaborative and innovative work environment where you can grow your skills and make a real impact.

Interested? Send your CV/resume to Careers@pharmsight.com and we'll get back to you soon!
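The posting lists quantization among its inference-optimization techniques. As a rough, self-contained sketch of the idea (illustrative only; real LLM pipelines use framework tooling such as PyTorch or ONNX quantizers, not hand-rolled code), symmetric int8 quantization maps each float weight to an 8-bit integer plus one shared scale:

```python
# Minimal sketch of symmetric int8 weight quantization, one of the
# inference-optimization techniques named in the posting. Pure Python
# for illustration; the numbers and API are not from any real pipeline.

def quantize_int8(weights):
    """Map floats to int8 range [-127, 127] with a single symmetric scale."""
    scale = max(abs(w) for w in weights) / 127.0
    if scale == 0:
        return [0] * len(weights), 1.0
    q = [max(-127, min(127, round(w / scale))) for w in weights]
    return q, scale

def dequantize(q, scale):
    """Recover approximate floats; error is bounded by half a step (scale/2)."""
    return [qi * scale for qi in q]

weights = [0.02, -1.27, 0.5, 0.0]
q, scale = quantize_int8(weights)
restored = dequantize(q, scale)
max_err = max(abs(a - b) for a, b in zip(weights, restored))
print(q, max_err <= scale / 2)
```

The trade-off the posting alludes to is visible here: storage drops from 32 bits to 8 bits per weight at the cost of a bounded rounding error.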

Posted 3 weeks ago

Apply

4.0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site


Sr. Fullstack Developer
Experience: 4 - 8 years
Salary: Competitive
Preferred Notice Period: within 30 days
Shift: 10:00 AM to 7:00 PM IST
Opportunity Type: Onsite (Ahmedabad)
Placement Type: Permanent
(*Note: this is a requirement for one of Uplers' clients.)
Must-have skills: Python, Python programming

Attri (one of Uplers' clients) is looking for a Senior DevOps Engineer who is passionate about their work, eager to learn and grow, and committed to delivering exceptional results. If you are a team player with a positive attitude and a desire to make a difference, we want to hear from you.

About Attri
Attri is an AI organization that helps businesses initiate and accelerate their AI efforts. We offer the industry's first end-to-end enterprise machine learning platform, empowering teams to focus on ML development rather than infrastructure. From ideation to execution, our global team of AI experts supports organizations in building scalable, state-of-the-art ML solutions. Our mission is to redefine businesses by harnessing cutting-edge technology and a unique, value-driven approach. With team members across continents, we celebrate diversity, curiosity, and innovation.

About the Role
We are a global team with our people spread out across different countries. We strive to build a diverse team of passionate people who believe in bringing change through their work. At Attri, we are seeking a talented Frontend Engineer to join our dynamic team. We are a cutting-edge company, and we're looking for an individual who is passionate, inquisitive, and a self-learner to contribute to the success of our projects.

Responsibilities
- Modern web development: proficiency in HTML5, CSS3, ES6+, TypeScript, and Node.js, with a strong emphasis on staying up to date with the latest technologies.
- TypeScript: hands-on with generics, template literals, mapped types, and conditional types.
- Flexible approach: apply the appropriate solution to the problem at hand while considering all the risks.

Frontend
- React.js and Flux architecture: extensive experience with React.js, Flux architecture, and external state management to build robust and performant web applications.
- JS event loop: understanding of the event loop, the criticality of not blocking the main thread, and cooperative scheduling in React.
- State management: hands-on with more than one state management library.
- Ecosystem: ability to leverage the vast JS ecosystem; hands-on with non-typical libraries.

Backend
- SQL: extensive hands-on experience with Postgres; comfortable with json_agg, json_build_object, WITH clauses, CTEs, views/materialized views, and transactions.
- Redis: hands-on with different data structures and their usage.
- Architectural patterns: Backend for Frontend, background workers, CQRS, event sourcing, orchestration/choreography, etc.
- Transport protocols such as HTTP(S), SSE, and WS(S), to optimize data transfer and enhance application performance.
- Serialization protocols: JSON and at least one more protocol.
- Authentication/authorization: comfortable with OAuth, JWT, and other mechanisms for different use cases.
- Comfortable reading the open-source code of libraries in use and understanding their internals; able to fork a library to improve it, fix a bug, or redesign it.
- Tooling: knowledge of essential frontend tools like Prettier, ESLint, and Conventional Commits to maintain code quality and consistency; dependency management and versioning; familiarity with CI/CD.
- Testing: use Jest/Vitest and React Testing Library for comprehensive testing, ensuring high code quality and reliability.
- Collaboration: work closely with our design team to craft responsive and themable components for data-intensive applications, ensuring a seamless user experience.
- Programming paradigms: solid grasp of both object-oriented and functional programming concepts to create clean and maintainable code.
- Design/architectural patterns: identify suitable design and architectural patterns for the problem at hand; comfortable tailoring a pattern to fit the problem optimally.
- Modular and reusable code: write modular, reusable, and testable code that enhances codebase maintainability.
- DSA: basic understanding of data structures and algorithms, applied when optimizing hot paths.

Good to Have
- Python: Django REST Framework, Celery, Pandas/NumPy, LangChain, Ollama.
- Storybook: develop components in isolation, streamlining the UI design and development process.
- Charting and visualization: experience with charting and visualization libraries, especially Apache ECharts, to create compelling data representations.
- Tailwind CSS: understanding of Tailwind CSS for efficient and responsive UI development.
- NoSQL stores: Elasticsearch, Neo4j, Cassandra, Qdrant, etc.
- Functional reactive programming.
- RabbitMQ/Kafka.

Great to Have
- Open-source contribution: experience contributing to open-source projects (not limited to personal projects or forks) that showcases your commitment to the development community.
- Renderless/headless React components: developing renderless or headless React components to provide flexible and reusable UI solutions.
- End-to-end testing: experience with Cypress or another end-to-end (E2E) testing framework, ensuring the robustness and quality of the entire application.
- Deployment: being target-agnostic and understanding the nuances of an application in operation.

What You Bring
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 5+ years of relevant experience in frontend web development, including proficiency in HTML5, CSS3, ES6+, TypeScript, React.js, and related technologies.
- Solid understanding of object-oriented programming, functional programming, SOLID principles, and design patterns.
- Proven experience developing modular, reusable, and testable code.
- Prior work on data-intensive applications and collaboration with design teams to create responsive and themable components.
- Experience with testing frameworks like Jest/Vitest and React Testing Library.

Benefits
- Competitive salary 💸
- Support for continual learning (free books and online courses) 📚
- Leveling-up opportunities 🌱
- Diverse team environment 🌍

How to apply for this opportunity (easy 3-step process):
1. Click on Apply! and register or log in on our portal.
2. Upload your updated resume and complete the screening form.
3. Increase your chances of getting shortlisted and meet the client for the interview!

About Our Client
Attri, an AI organization, leads the way in enterprise AI, offering advanced solutions and services driven by AI agents and powered by foundation models. Our comprehensive suite of AI-enabled tools drives business impact, enhances quality, mitigates risk, and helps unlock growth opportunities.

About Uplers
Our goal is to make hiring and getting hired reliable, simple, and fast. Our role is to help talent find and apply for relevant product and engineering job opportunities and progress in their careers. (Note: there are many more opportunities apart from this one on the portal.) If you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!

Posted 3 weeks ago

Apply

13.0 years

0 Lacs

Andhra Pradesh, India

On-site


Summary about Organization
A career in our Advisory Acceleration Center is the natural extension of PwC's leading global delivery capabilities. The team consists of highly skilled resources that assist clients in transforming their business by adopting technology, using bespoke strategy, operating models, processes, and planning. You'll be at the forefront of helping organizations around the globe adopt innovative technology solutions that optimize business processes or enable scalable technology. Our team helps organizations transform their IT infrastructure and modernize applications and data management to help shape the future of business. An essential and strategic part of Advisory's multi-sourced, multi-geography Global Delivery Model, the Acceleration Centers are a dynamic, rapidly growing component of our business. The teams in these Centers have achieved remarkable results in process quality and delivery capability, resulting in a loyal customer base and a reputation for excellence.

Job Description
Senior Data Architect with experience in the design, build, and optimization of complex data landscapes and legacy modernization projects. The ideal candidate will have deep expertise in database management, data modeling, cloud data solutions, and ETL (Extract, Transform, Load) processes. This role requires a strong leader capable of guiding data teams and driving the design and implementation of scalable data architectures.

Key areas of expertise include:
- Design and implement scalable and efficient data architectures to support business needs.
- Develop data models (conceptual, logical, and physical) that align with organizational goals.
- Lead database design and optimization efforts for structured and unstructured data.
- Establish ETL pipelines and data integration strategies for seamless data flow.
- Define data governance policies, including data quality, security, privacy, and compliance.
- Work closely with engineering, analytics, and business teams to understand requirements and deliver data solutions.
- Oversee cloud-based data solutions (AWS, Azure, GCP) and modern data warehouses (Snowflake, BigQuery, Redshift).
- Ensure high availability, disaster recovery, and backup strategies for critical databases.
- Evaluate and implement emerging data technologies, tools, and frameworks to improve efficiency.
- Conduct data audits, performance tuning, and troubleshooting to maintain optimal performance.

Qualifications
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- 13+ years of experience in data modeling, including conceptual, logical, and physical data design.
- 5-8 years of experience with cloud data lake platforms such as AWS Lake Formation, Delta Lake, Snowflake, or Google BigQuery.
- Proven experience with NoSQL databases and data modeling techniques for non-relational data.
- Experience with data warehousing concepts, ETL/ELT processes, and big data frameworks (e.g., Hadoop, Spark).
- Hands-on experience delivering complex, multi-module projects in diverse technology ecosystems.
- Strong understanding of data governance, data security, and compliance best practices.
- Proficiency with data modeling tools (e.g., ER/Studio, ERwin, PowerDesigner).
- Excellent leadership and communication skills, with a proven ability to manage teams and collaborate with stakeholders.

Preferred Skills
- Experience with modern data architectures, such as data fabric or data mesh.
- Knowledge of graph databases and modeling for technologies like Neo4j.
- Proficiency with programming languages like Python, Scala, or Java.
- Understanding of CI/CD pipelines and DevOps practices in data engineering.

Posted 3 weeks ago

Apply

7.0 years

0 Lacs

India

Remote


Job Title: Graph Data Engineer
Experience: 5-7 years
Location: Remote

Job Summary
We are hiring a Graph Data Engineer to design and maintain data pipelines using Neo4j and Azure services. This role requires strong skills in SQL, PySpark, and Cypher, along with experience in Graph Data Science (GDS) and cloud-based data engineering tools. The ideal candidate will replace a previous team member and must be capable of delivering end-to-end data solutions with graph integration.

Key Requirements
- Hands-on experience with Neo4j and Cypher queries.
- Strong SQL and PySpark skills.
- Experience with the Graph Data Science (GDS) library.
- Practical knowledge of Azure Data Factory and Azure Databricks.
- Ability to build and manage end-to-end data pipelines.
- Experience implementing graph-based solutions for business use cases.
- Neo4j performance tuning and process optimization.
- Nice to have: familiarity with other graph database tools or ML model integration.
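To give a flavor of the Cypher skills this role asks for, here is a minimal sketch that builds a parameterized MERGE statement for loading rows into Neo4j. The (:Person)-[:WORKS_AT]->(:Company) schema is invented for illustration; in practice you would execute the statement with the official neo4j Python driver rather than print it.

```python
# Sketch: a parameterized Cypher MERGE for upserting rows into Neo4j.
# The schema (Person, Company, WORKS_AT) is hypothetical. Parameters
# ($name, $company) are passed separately from the query text so the
# database can cache the plan and values are never string-interpolated.

def works_at_merge() -> str:
    """Cypher that upserts a person, a company, and the edge between them."""
    return (
        "MERGE (p:Person {name: $name}) "
        "MERGE (c:Company {name: $company}) "
        "MERGE (p)-[:WORKS_AT]->(c)"
    )

rows = [
    {"name": "Ada", "company": "Acme"},
    {"name": "Grace", "company": "Acme"},
]
query = works_at_merge()
for row in rows:
    # With the real driver this would be: session.run(query, **row)
    print(query, row)
```

MERGE (rather than CREATE) is what makes the load idempotent: re-running the same rows matches the existing nodes and relationships instead of duplicating them.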

Posted 3 weeks ago

Apply

0.0 - 3.0 years

2 - 5 Lacs

Bengaluru

Work from Office


Key Responsibilities:
- Deliver engaging and interactive training sessions (24 hours total) based on structured modules.
- Teach integration of monitoring, logging, and observability tools with machine learning.
- Guide learners in real-time anomaly detection, incident management, root cause analysis, and predictive scaling.
- Support learners in deploying tools like Prometheus, Grafana, OpenTelemetry, Neo4j, Falco, and KEDA.
- Conduct hands-on labs using LangChain, Ollama, Prophet, and other AI/ML frameworks.
- Help participants set up smart workflows for alert classification and routing using open-source stacks.
- Prepare learners to handle security, threat detection, and runtime anomaly classification using LLMs.
- Provide post-training support and mentorship when necessary.
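A lab on real-time anomaly detection might start from the simplest possible detector before reaching tools like Prophet: flag metric samples whose z-score exceeds a threshold. The following stdlib-only sketch is illustrative and not part of the posting; the 2.5-sigma threshold and latency numbers are made up.

```python
# Illustrative sketch of the simplest metric anomaly detector: flag
# samples more than `threshold` standard deviations from the mean.
# Real observability stacks feed Prometheus metrics into richer models.
from statistics import mean, stdev

def zscore_anomalies(samples, threshold=2.5):
    """Return indices of samples whose z-score exceeds the threshold."""
    if len(samples) < 2:
        return []
    mu, sigma = mean(samples), stdev(samples)
    if sigma == 0:
        return []
    return [i for i, x in enumerate(samples)
            if abs(x - mu) / sigma > threshold]

latency_ms = [12, 11, 13, 12, 11, 12, 13, 11, 12, 95]  # one obvious spike
print(zscore_anomalies(latency_ms))  # index of the spike
```

A natural classroom follow-up is to show why this breaks (the outlier inflates the standard deviation itself) and motivate robust statistics or forecasting-based detectors.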

Posted 3 weeks ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site


About Us:
Our mission at micro1 is to match the most talented people in the world with their dream jobs. If you are looking to be at the forefront of AI innovation and work with some of the fastest-growing companies in Silicon Valley, we invite you to apply for a role. By joining the micro1 community, your resume will become visible to top industry leaders, unlocking access to the best career opportunities on the market.

Role Overview:
Joining one of our top customers as a skilled AI Engineer, you will design, develop, and deploy machine learning models and systems that drive our products and enhance user experiences. You will work closely with cross-functional teams to implement cutting-edge AI solutions, including recommendation engines and large language models.

Key Responsibilities:
- Design and implement robust machine learning models and algorithms, focusing on recommendation systems.
- Conduct data analysis to identify trends, insights, and opportunities for model improvement.
- Collaborate with data scientists and software engineers to build and integrate end-to-end machine learning systems.
- Optimize and fine-tune models for performance and scalability, ensuring seamless deployment.
- Work with large datasets using SQL and Postgres to support model training and evaluation.
- Implement and refine prompt engineering techniques for large language models (LLMs).
- Stay current with advancements in AI/ML technologies, particularly core ML algorithms like clustering and community detection.
- Monitor model performance, conduct regular evaluations, and retrain models as needed.
- Document processes, model performance metrics, and technical specifications.

Required Skills and Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Science, or a related field.
- Strong expertise in Python and experience with machine learning libraries (e.g., TensorFlow, PyTorch, scikit-learn).
- Proven experience with SQL and Postgres for data manipulation and analysis.
- Demonstrated experience building and deploying recommendation engines.
- Solid understanding of core machine learning algorithms, including clustering and community detection.
- Prior experience building end-to-end machine learning systems.
- Familiarity with prompt engineering and working with large language models (LLMs).
- Experience with near-real-time recommendation systems.
- Hands-on experience with graph databases such as Neo4j or Neptune.
- Experience with the Flask or FastAPI frameworks.
- Experience writing, modifying, and understanding existing SQL queries and optimizing DB connections.
- Experience with AWS services like ECS, EC2, S3, and CloudWatch.

Preferred Qualifications:
- Experience with graph databases (specifically Neo4j and the Cypher query language).
- Knowledge of large-scale data handling and optimization techniques.
- Experience improving models with RLHF.
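The recommendation-engine experience this posting asks for can be boiled down to a toy item-based example. The sketch below is entirely illustrative (made-up users, items, and ratings; stdlib only): it scores a user's unrated items by cosine similarity between item rating vectors.

```python
# Toy item-based collaborative filtering: recommend the unrated item whose
# rating vector is most similar (cosine) to the user's favourite item.
# Illustrative only; production systems use far richer signals and ANN search.
from math import sqrt

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu, nv = sqrt(sum(a * a for a in u)), sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

# Rows: users; columns: items A, B, C (0 = not rated). All values invented.
ratings = {
    "alice": [5, 3, 0],
    "bob":   [4, 0, 4],
    "carol": [1, 1, 5],
    "dave":  [0, 5, 0],
}

def recommend(user, ratings):
    """Pick the user's best unrated item by similarity to their top-rated one."""
    mine = ratings[user]
    items = list(zip(*ratings.values()))  # one column vector per item
    liked = mine.index(max(mine))         # user's favourite item
    unseen = [i for i, r in enumerate(mine) if r == 0]
    return max(unseen, key=lambda i: cosine(items[i], items[liked]))

print(recommend("dave", ratings))  # item index most similar to dave's favourite
```

In a production setting the same idea scales via precomputed item-item similarity matrices or approximate nearest-neighbor indexes, which is where the posting's graph-database and near-real-time requirements come in.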

Posted 3 weeks ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site


About Us:
Our mission at micro1 is to match the most talented people in the world with their dream jobs. If you are looking to be at the forefront of AI innovation and work with some of the fastest-growing companies in Silicon Valley, we invite you to apply for a role. By joining the micro1 community, your resume will become visible to top industry leaders, unlocking access to the best career opportunities on the market.

Role Overview:
Joining one of our top customers as a skilled AI Engineer, you will design, develop, and deploy machine learning models and systems that drive our products and enhance user experiences. You will work closely with cross-functional teams to implement cutting-edge AI solutions, including recommendation engines and large language models.

Key Responsibilities:
- Design and implement robust machine learning models and algorithms, focusing on recommendation systems.
- Conduct data analysis to identify trends, insights, and opportunities for model improvement.
- Collaborate with data scientists and software engineers to build and integrate end-to-end machine learning systems.
- Optimize and fine-tune models for performance and scalability, ensuring seamless deployment.
- Work with large datasets using SQL and Postgres to support model training and evaluation.
- Implement and refine prompt engineering techniques for large language models (LLMs).
- Stay current with advancements in AI/ML technologies, particularly core ML algorithms like clustering and community detection.
- Monitor model performance, conduct regular evaluations, and retrain models as needed.
- Document processes, model performance metrics, and technical specifications.

Required Skills and Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Science, or a related field.
- Strong expertise in Python and experience with machine learning libraries (e.g., TensorFlow, PyTorch, scikit-learn).
- Proven experience with SQL and Postgres for data manipulation and analysis.
- Demonstrated experience building and deploying recommendation engines.
- Solid understanding of core machine learning algorithms, including clustering and community detection.
- Prior experience building end-to-end machine learning systems.
- Familiarity with prompt engineering and working with large language models (LLMs).
- Experience with near-real-time recommendation systems.
- Hands-on experience with graph databases such as Neo4j or Neptune.
- Experience with the Flask or FastAPI frameworks.
- Experience writing, modifying, and understanding existing SQL queries and optimizing DB connections.
- Experience with AWS services like ECS, EC2, S3, and CloudWatch.

Preferred Qualifications:
- Experience with graph databases (specifically Neo4j and the Cypher query language).
- Knowledge of large-scale data handling and optimization techniques.
- Experience improving models with RLHF.

Posted 3 weeks ago

Apply

8.0 years

0 Lacs

Delhi, India

On-site


About Neo4j
Neo4j is the leader in Graph Database & Analytics, helping organizations uncover hidden patterns and relationships across billions of data connections deeply, easily, and quickly. Customers use Neo4j to gain a deeper understanding of their business and reveal new ways of solving their most pressing problems. Over 84% of Fortune 100 companies use Neo4j, along with a vibrant community of 250,000+ developers, data scientists, and architects across the globe.

At Neo4j, we're proud to build the technology that powers breakthrough solutions for our customers. These solutions have helped NASA get to Mars two years earlier, broke the Panama Papers for the ICIJ, and are helping Transport for London cut congestion by 10% and save $750M a year. Some of our other notable customers include Intuit, Lockheed Martin, Novartis, UBS, and Walmart.

Neo4j experienced rapid growth this year as organizations looking to deploy generative AI (GenAI) recognized graph databases as essential for improving its accuracy, transparency, and explainability. Growth was further fueled by enterprise demand for Neo4j's cloud offering and partnerships with leading cloud hyperscalers and ecosystem leaders. Learn more at neo4j.com and follow us on LinkedIn.

Our Vision
At Neo4j, we have always strived to help the world make sense of data. As business, society, and knowledge become increasingly connected, our technology promotes innovation by helping organizations find and understand data relationships. We created, drive, and lead the graph database category, and we're disrupting how organizations leverage their data to innovate and stay competitive.

The Role
- Develop and execute a territory plan based on target agencies and applicable use cases, resulting in a pipeline of opportunities in the target market that will help you achieve quarterly and annual sales metrics.
- Develop expert knowledge of Neo4j solutions and their applicability in the target market, covering Government and Enterprise accounts.
- Develop and present to customers a strong understanding of the benefits and advantages of graph technology.
- Execute sales cycles that employ Strategic Selling strategies and tactics.
- Build and present proposals for Neo4j solutions that involve Neo4j products and services.
- Work with pre-sales engineering resources to scope and deliver on customer needs.
- "Land & expand": grow the existing account base with a strategic, customer-first methodology.
- Provide guidance, direction, and support to your assigned SDR in their efforts to support your pipeline development.
- Ensure the execution of strategies for assigned key accounts to drive plans that increase revenue potential and growth.
- Collaborate with Field Marketing on targeted programs to increase awareness in the existing customer base, resulting in revenue growth.
- Maintain the Neo4j Salesforce.com CRM system with accurate information about your pipeline, in accordance with Neo4j forecasting guidelines.

Ideally, You Should Have
- 8-10 years of consistent success meeting or exceeding sales objectives selling technical solutions and software products into Government and Enterprise accounts.
- Demonstrable experience executing complex enterprise sales strategies and tactics.
- Experience with the commercial open-source business model, selling subscriptions for on-premise and/or hybrid on-prem/cloud deployments.
- Previous experience thriving in a smaller, high-growth software company, where you leveraged dedicated SDR, Field Marketing, and pre-sales engineering resources to help build the business.
- Strong conviction about how and where graph solutions fit into the enterprise marketplace.
- Attention to detail, ensuring accurate entry and management of lead data in the Salesforce.com CRM system.
- Proficiency with standard corporate productivity tools (e.g., Google Docs, MS Office, Salesforce.com, web conferencing).
- A team-player mindset with the highest level of integrity.

Posted 3 weeks ago

Apply

4.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Role Description
Job Title: Lead I - Software Engineering
Hiring Location: Mumbai/Chennai/Gurgaon

Job Summary
We are seeking a Lead I in Software Engineering with 4 to 7 years of experience in software development or software architecture. The ideal candidate will possess a strong background in Angular and Java, with the ability to lead a team and drive technical projects. A Bachelor's degree in Engineering or Computer Science, or equivalent experience, is required.

Responsibilities
- Interact with technical personnel and team members to finalize requirements.
- Write and review detailed specifications for the development of system components of moderate complexity.
- Collaborate with QA and development team members to translate product requirements into software designs.
- Implement development processes and coding best practices, and conduct code reviews.
- Operate in various development environments (Agile, Waterfall) while collaborating with key stakeholders.
- Resolve technical issues as necessary.
- Perform all other duties as assigned.

Must-Have Skills
- Strong proficiency in Angular 1.x (70% Angular and 30% Java, or 50% Angular and 50% Java).
- Java/J2EE; familiarity with the Singleton and MVC design patterns.
- Strong proficiency in SQL and/or MySQL, including optimization techniques (at least MySQL).
- Experience using tools such as Eclipse, Git, Postman, JIRA, and Confluence.
- Knowledge of test-driven development.
- Solid understanding of object-oriented programming.

Good-to-Have Skills
- Expertise in Spring Boot, microservices, and API development.
- Familiarity with OAuth 2.0 patterns (experience with at least 2 patterns).
- Knowledge of graph databases (e.g., Neo4j, Apache TinkerPop, Gremlin).
- Experience with Kafka messaging.
- Familiarity with Docker, Kubernetes, and cloud development.
- Experience with CI/CD tools like Jenkins and GitHub Actions.
- Knowledge of industry-wide technology trends and best practices.

Experience Range
4 to 7 years of relevant experience in software development or software architecture.

Education
Bachelor's degree in Engineering, Computer Science, or equivalent experience.

Additional Information
- Strong communication skills, both oral and written.
- Ability to interface competently with internal and external technology resources.
- Advanced knowledge of software development methodologies (Agile, etc.).
- Experience setting up and maintaining distributed applications in Unix/Linux environments.
- Ability to complete complex bug fixes and support production issues.

Skills: Angular 1.x, Java 11+, SQL

Posted 3 weeks ago


3.0 years

0 Lacs

Pune, Maharashtra, India

Remote


Our team members are at the heart of everything we do. At Cencora, we are united in our responsibility to create healthier futures, and every person here is essential to us being able to deliver on that purpose. If you want to make a difference at the center of health, come join our innovative company and help us improve the lives of people and animals everywhere. Apply today!

Job Details

Primary Duties & Responsibilities
- Works with cross-functional stakeholders to finalize desired technical specifications and application design.
- Codes, tests, debugs, and documents complex programs, and enhances existing programs to ensure that data processing production systems continue to meet user requirements.
- Develops and maintains application design, program specification documents, and proprietary web applications.
- Contributes effectively as a member of the team; takes ownership of individual assignments and projects with moderate oversight.
- Manages and updates the issue-tracking system when gaps in code and documentation are discovered.
- May design and develop software for external clients.
- Works with the project lead and internal stakeholders to formulate the product and sprint backlog.
- Develops detailed system design specifications to serve as a guide for system/program development.
- Identifies and resolves system operating problems in order to provide continuous business operations.
- Interacts with user management regarding project status and user requirements to promote an environment with improved productivity and satisfaction.
- Provides technical leadership and training to Software Engineers I.
- Assists in scheduling, determining manpower requirements, and estimating costs to project completion in order to meet user requirements.
- Develops new control applications from a set of specifications, and tests new and modified control applications.
- Provides remote support for field personnel as they install and troubleshoot new applications.
- Provides on-site support for some scheduled installations and upgrades, and end-user support, primarily concerning application issues.
- Creates documentation for configurations and for how to implement and test the applications.

Skills And Experience
- Full-stack developer proficient in React, Vue.js, .NET, and ASP.NET, including building and supporting APIs (.NET/C#).
- Experience in UX design and development, including best practices such as WCAG.
- Must demonstrate proficiency in secure coding practices.
- Proficiency in SQL is highly desired.
- Familiarity with Sitecore is a plus.
- Must be comfortable working with a global team of IT members, business stakeholders, contractors, and vendor partners.
- Prior experience delivering software solutions for health, transportation, or other regulated industries is a plus.

Experience & Educational Requirements
- Bachelor's degree in Computer Science, Information Technology, or any other related discipline, or equivalent related experience.
- 3+ years of directly related or relevant experience, preferably in software design and development.

Preferred Certifications
- Android Development Certification
- Microsoft ASP.NET Certification
- Microsoft Certified Engineer
- Application/Infrastructure/Enterprise Architect training and certification, e.g., TOGAF
- Certified Scrum Master
- SAFe Agile Certification
- DevOps certifications such as AWS Certified DevOps Engineer

Skills & Knowledge

Behavioral Skills: Critical Thinking, Detail Oriented, Interpersonal Communication, Learning Agility, Problem Solving, Time Management

Technical Skills: API Design, Cloud Computing Methodologies, Integration Testing & Validation, Programming/Coding, Database Management, Software Development Life Cycle (SDLC), Technical Documentation, Web Application Infrastructure, Web Development Frameworks

Tools Knowledge:
- Cloud computing tools like AWS, Azure, Google Cloud
- Container management and orchestration tools
- Big data frameworks like Hadoop
- Java frameworks like JDBC, Spring, ORM solutions, JPA, JEE, JMS, Gradle; object-oriented design
- Microsoft Office Suite
- NoSQL database platforms like MongoDB, BigTable, Redis, RavenDB, Cassandra, HBase, Neo4j, and CouchDB
- Programming languages like JavaScript, HTML/CSS, Python, SQL
- Operating systems & servers like Windows, Linux, Citrix, IBM, Oracle, SQL

What Cencora offers
Benefit offerings outside the US may vary by country and will be aligned to local market practice. The eligibility and effective date may differ for some benefits and for team members covered under collective bargaining agreements.

Full time

Affiliated Companies: AmerisourceBergen Services Corporation

Equal Employment Opportunity
Cencora is committed to providing equal employment opportunity without regard to race, color, religion, sex, sexual orientation, gender identity, genetic information, national origin, age, disability, veteran status, or membership in any other class protected by federal, state, or local law. The company's continued success depends on the full and effective utilization of qualified individuals. Therefore, harassment is prohibited, and all matters related to recruiting, training, compensation, benefits, promotions, and transfers comply with equal opportunity principles and are non-discriminatory.

Cencora is committed to providing reasonable accommodations to individuals with disabilities during the employment process, consistent with legal requirements. If you wish to request an accommodation while seeking employment, please call 888.692.2272 or email hrsc@cencora.com. We will make accommodation determinations on a request-by-request basis. Messages and emails regarding anything other than accommodation requests will not be returned.

Posted 3 weeks ago


5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Description

ABOUT CLOUDBEES
CloudBees provides the leading software delivery platform for enterprises, enabling them to continuously innovate, compete, and win in a world powered by the digital experience. Designed for the world's largest organizations with the most complex requirements, CloudBees enables software development organizations to deliver scalable, compliant, governed, and secure software from the code a developer writes to the people who use it. The platform connects with other best-of-breed tools, improves the developer experience, and enables organizations to bring digital innovation to life continuously, adapt quickly, and unlock business outcomes that create market leaders and disruptors.

CloudBees was founded in 2010 and is backed by Goldman Sachs, Morgan Stanley, Bridgepoint Credit, HSBC, Golub Capital, Delta-v Capital, Matrix Partners, and Lightspeed Venture Partners. Visit www.cloudbees.com and follow us on Twitter, LinkedIn, and Facebook.

WHAT YOU'LL DO!
These are some of the tasks that you'll be engaged on:
- Design, develop, and maintain automated test scripts using Playwright with TypeScript/JavaScript, as well as Selenium with Java, to ensure comprehensive test coverage across applications.
- Enhance the existing Playwright framework by implementing modular test design and optimizing performance, while also utilizing Cucumber for Behavior-Driven Development (BDD) scenarios.
- Execute functional, regression, integration, performance, and security testing of web applications, APIs, and microservices.
- Collaborate in an Agile environment, participating in daily stand-ups, sprint planning, and retrospectives to ensure alignment on testing strategies and workflows.
- Troubleshoot and analyze test failures and defects using debugging tools and techniques, including logging and tracing within Playwright, Selenium, Postman, Grafana, etc.
- Document and report test results, defects, and issues using Jira and Confluence, ensuring clarity and traceability for all test activities.
- Implement page object models and reusable test components in both Playwright and Selenium to promote code reusability and maintainability.
- Integrate automated tests into CI/CD pipelines using Jenkins and GitHub Actions, ensuring seamless deployment and testing processes.
- Collaborate on Git for version control, managing branches and pull requests to maintain code quality and facilitate teamwork.
- Mentor and coach junior QA engineers on best practices for test automation, Playwright and Selenium usage, and CI/CD workflows.
- Research and evaluate new tools and technologies to enhance testing processes and coverage.

WHAT DO YOU NEED TO SHINE IN THIS ROLE?
- Bachelor's degree in Computer Science, Engineering, or a related field, or equivalent work experience.
- At least 5 years of experience in software testing, with at least 3 years of experience in test automation.
- Ability to write functional tests, test plans, and test strategies.
- Ability to configure test environments and test data using automation tools.
- Experience creating an automated regression/CI test suite using Cucumber with Playwright (preferred) or Selenium, and REST APIs.
- Proficiency in one or more programming languages: Java, JavaScript, or TypeScript.
- Experience in testing web applications, APIs, and microservices using various tools and frameworks such as Selenium, Cucumber, etc.
- Experience in testing with SAST/DAST tools (preferred).
- Experience working with cloud platforms such as AWS, Azure, GCP, etc.
- Experience working with CI/CD tools such as Jenkins, GitLab, GitHub, etc.
- Experience writing queries and working with databases such as MySQL, MongoDB, Neo4j, Cassandra, etc.
- Experience working with tools such as Postman, JMeter, Grafana, etc.
- Exposure to security standards and compliance.
- Experience working with Agile methodologies such as Scrum, Kanban, etc.
- Ability to work independently and as part of a team.
- Ability to learn new technologies and tools quickly and adapt to changing requirements.
- Highly analytical mindset, with a logical approach to finding solutions and performing root cause analysis.
- Able to prioritize between critical and non-critical path items.
- Excellent communication skills, with the ability to communicate test results to stakeholders in terms of the functional aspect of the system and its impact.

What You'll Get
- Highly competitive compensation, benefits, and vacation package
- Ability to work for one of the fastest growing companies with some of the most talented people in the industry
- Team outings
- Fun, hardworking, and casual environment
- Endless growth opportunities

We have a culture of movers and shakers and are leading the way for everyone else with a vision to transform the industry. We are authentic in who we are. We believe in our abilities and strengths to change the world for the better. Being inclusive and working together is at the heart of everything we do. We are naturally curious. We ask the right questions, challenge what can be done differently, and come up with intelligent solutions to the problems we find. If that's you, get ready to bee impactful and join the hive.

Scam Notice
Please be aware that there are individuals and organizations that may attempt to scam job seekers by offering fraudulent employment opportunities in the name of CloudBees. These scams may involve fake job postings, unsolicited emails, or messages claiming to be from our recruiters or hiring managers. Please note that CloudBees will never ask for any personal account information, such as cell phone, credit card details, or bank account numbers, during the recruitment process. Additionally, CloudBees will never send you a check for any equipment prior to employment.

All communication from our recruiters and hiring managers will come from official company email addresses (@cloudbees.com) or from Paylocity, and will never ask for any payment or fee to be paid, or purchases to be made, by the job seeker. If you are contacted by anyone claiming to represent CloudBees and you are unsure of their authenticity, please do not provide any personal/financial information, and contact us immediately at tahelp@cloudbees.com. We take these matters very seriously and will work to ensure that any fraudulent activity is reported and dealt with appropriately.

If you feel you have been scammed in the US, please report it to the Federal Trade Commission at https://reportfraud.ftc.gov/#/. In Europe, please contact the European Anti-Fraud Office at https://anti-fraud.ec.europa.eu/olaf-and-you/report-fraud_en

Signs of a Recruitment Scam
- Ensure there are no other domains before or after @cloudbees.com. For example: "name.dr.cloudbees.com".
- Check any documents for poor spelling and grammar; this is often a sign that fraudsters are at work.
- They provide a generic email address such as @Yahoo or @Hotmail as a point of contact.
- You are asked for money, an "administration fee", "security fee", or an "accreditation fee".
- You are asked for cell phone account information.
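The responsibilities above call for page object models in Playwright and Selenium. As a framework-agnostic sketch (the `LoginPage` selectors and the `FakeDriver` stub are hypothetical, standing in for a real browser driver so the example runs anywhere), the pattern looks like:

```python
class FakeDriver:
    """Stand-in for a Selenium/Playwright driver; records actions instead
    of touching a real browser."""
    def __init__(self):
        self.actions = []

    def fill(self, selector, value):
        self.actions.append(("fill", selector, value))

    def click(self, selector):
        self.actions.append(("click", selector))

class LoginPage:
    """Page object: selectors and user flows live here, not in the tests,
    so a UI change is fixed in one place."""
    USER_INPUT = "#username"
    PASS_INPUT = "#password"
    SUBMIT_BTN = "button[type=submit]"

    def __init__(self, driver):
        self.driver = driver

    def login(self, username, password):
        self.driver.fill(self.USER_INPUT, username)
        self.driver.fill(self.PASS_INPUT, password)
        self.driver.click(self.SUBMIT_BTN)

driver = FakeDriver()
LoginPage(driver).login("qa-user", "secret")
assert ("click", "button[type=submit]") in driver.actions
```

With a real Playwright `Page` or Selenium `WebDriver` injected in place of `FakeDriver`, the same `LoginPage` class becomes a reusable test component across suites.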

Posted 3 weeks ago


5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Description

ABOUT CLOUDBEES
CloudBees provides the leading software delivery platform for enterprises, enabling them to continuously innovate, compete, and win in a world powered by the digital experience. Designed for the world's largest organizations with the most complex requirements, CloudBees enables software development organizations to deliver scalable, compliant, governed, and secure software from the code a developer writes to the people who use it. The platform connects with other best-of-breed tools, improves the developer experience, and enables organizations to bring digital innovation to life continuously, adapt quickly, and unlock business outcomes that create market leaders and disruptors.

CloudBees was founded in 2010 and is backed by Goldman Sachs, Morgan Stanley, Bridgepoint Credit, HSBC, Golub Capital, Delta-v Capital, Matrix Partners, and Lightspeed Venture Partners. Visit www.cloudbees.com and follow us on Twitter, LinkedIn, and Facebook.

WHAT YOU'LL DO!
These are some of the tasks that you'll be engaged on:
- Conceptualize product features for the compliance capability that will enable organizations to streamline their software development and delivery processes by providing the 'Sec' element in DevSecOps. This includes creating features like tools, plugins, and integrations that enhance the capabilities of the CloudBees product suite.
- Work with the product manager to understand business objectives, align product vision and strategy with those objectives, and align the engineering team with the product vision and strategy.
- Work with product owners across capabilities to align with strategic and tactical product roadmap objectives across the board.
- Understand business risk management and regulatory and security compliance frameworks such as SOC 2, NIST, SOX, CIS, PCI DSS, and others.
- Have a customer-centric focus and act as a customer advocate.
- Own and drive the team's product backlog:
  - Create: based on customer needs and market research.
  - Own: be accountable for driving the prioritization and delivery of the backlog.
  - Manage: keep the backlog up to date, reflecting inputs from stakeholders.
  - Prioritize: ensure the team is always focused on the top-priority items in the backlog.
  - Drive: set focused and achievable goals for each sprint.
  - Monitor: continuously track the progress of the product through each stage of development.
- Feed back the acceptability of product backlog features to the development team.
- Determine and approve that the final deliverable meets stakeholder expectations.
- Make the work visible, transparent, and clear to all.
- Inform and involve internal stakeholders of priority changes, risks, and progress.
- Work with research and design to create best-in-class customer experiences.
- Collaborate with engineering to validate technical feasibility and effort estimates.
- Work with the team to refine and improve the development process.
- Drive the sprint review to celebrate achievements.

You would have previously worked with exposure to:
- Agile methodology
- Jira, Confluence, Git, and other SDLC tooling
- System analyst or business analyst roles on projects in AWS, GCP, Azure, and others
- Cloud and container technologies

WHAT DO YOU NEED TO SHINE IN THIS ROLE?
- Bachelor's or master's degree in computer science or a related technical field.
- 5+ years of experience working with Scrum and Agile software development methodologies.
- Working knowledge of the software development lifecycle.
- Working knowledge of and/or previous experience in security compliance and cybersecurity.
- Exposure to vulnerability triage and remediation.
- Experience coordinating work across multiple teams.
- Ability to empathize with end users on the challenges they face and to understand user-product interaction.
- Excellent communication skills, with the ability to communicate results to stakeholders in terms of the functional aspect of the system and its impact.
- Experience writing queries and working with databases such as MySQL, MongoDB, Neo4j, Cassandra, etc.
- Experience working with tools such as Postman, JMeter, Grafana, etc.
- Experience working with Agile methodologies such as Scrum, Kanban, etc.
- Ability to work independently and as part of a team.

What You'll Get
- Highly competitive compensation, benefits, and vacation package
- Ability to work for one of the fastest growing companies with some of the most talented people in the industry
- Team outings
- Fun, hardworking, and casual environment
- Endless growth opportunities

We have a culture of movers and shakers and are leading the way for everyone else with a vision to transform the industry. We are authentic in who we are. We believe in our abilities and strengths to change the world for the better. Being inclusive and working together is at the heart of everything we do. We are naturally curious. We ask the right questions, challenge what can be done differently, and come up with intelligent solutions to the problems we find. If that's you, get ready to bee impactful and join the hive.

Posted 3 weeks ago


6.0 years

0 Lacs

India

On-site


Experience: 6+ years

Preferred Qualifications:
- Bachelor's degree in Computer Science, Information Systems, or a related field
- 6-12 years of relevant experience in cloud engineering and architecture
- Google Cloud Professional Cloud Architect certification
- Experience with Kubernetes
- Familiarity with DevOps methodologies
- Strong problem-solving and analytical skills
- Excellent communication skills

Required Skills: Google Cloud Platform (GCP) services, including Compute Engine, Google Kubernetes Engine (GKE), Cloud Storage, Cloud SQL, Cloud Load Balancing, Identity and Access Management (IAM), Google Workflows, Google Cloud Pub/Sub, App Engine, Cloud Functions, Cloud Run, API Gateway, Cloud Build, Cloud Source Repositories, Artifact Registry, and Google Cloud Monitoring, Logging, and Error Reporting; Python; Terraform; Google Cloud Firestore; GraphQL; MongoDB; Cassandra; Neo4j; ETL (Extract, Transform, Load) paradigms; Google Cloud Dataflow; Apache Beam; BigQuery; Service Mesh; Content Delivery Network (CDN); Stackdriver; Google Cloud Trace

Posted 3 weeks ago


0 years

0 Lacs

Hyderabad, Telangana, India

On-site


About Us:
Our mission at micro1 is to match the most talented people in the world with their dream jobs. If you are looking to be at the forefront of AI innovation and work with some of the fastest growing companies in Silicon Valley, we invite you to apply for a role. By joining the micro1 community, your resume will become visible to top industry leaders, unlocking access to the best career opportunities on the market.

Role Overview:
Joining one of our top customers as a skilled AI Engineer, you will design, develop, and deploy machine learning models and systems that drive our products and enhance user experiences. You will work closely with cross-functional teams to implement cutting-edge AI solutions, including recommendation engines and large language models.

Key Responsibilities:
- Design and implement robust machine learning models and algorithms, focusing on recommendation systems.
- Conduct data analysis to identify trends, insights, and opportunities for model improvement.
- Collaborate with data scientists and software engineers to build and integrate end-to-end machine learning systems.
- Optimize and fine-tune models for performance and scalability, ensuring seamless deployment.
- Work with large datasets using SQL and Postgres to support model training and evaluation.
- Implement and refine prompt engineering techniques for large language models (LLMs).
- Stay current with advancements in AI/ML technologies, particularly core ML algorithms like clustering and community detection.
- Monitor model performance, conduct regular evaluations, and retrain models as needed.
- Document processes, model performance metrics, and technical specifications.

Required Skills and Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Science, or a related field.
- Strong expertise in Python and experience with machine learning libraries (e.g., TensorFlow, PyTorch, scikit-learn).
- Proven experience with SQL and Postgres for data manipulation and analysis.
- Demonstrated experience building and deploying recommendation engines.
- Solid understanding of core machine learning algorithms, including clustering and community detection.
- Prior experience building end-to-end machine learning systems.
- Familiarity with prompt engineering and working with large language models (LLMs).
- Experience working with near-real-time recommendation systems.
- Hands-on experience with graph databases such as Neo4j, Neptune, etc.
- Experience with the Flask or FastAPI frameworks.
- Experience with SQL to write, modify, and understand existing queries and to optimize DB connections.
- Experience with AWS services such as ECS, EC2, S3, and CloudWatch.

Preferred Qualifications:
- Experience with graph databases (specifically Neo4j and the Cypher query language).
- Knowledge of large-scale data handling and optimization techniques.
- Experience improving models with RLHF.
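The role above centers on recommendation engines built on core similarity measures. As a toy, hedged sketch (the user ids and item names are invented, and a production system would compute this inside Postgres or a graph database rather than in memory), user-neighborhood recommendation via Jaccard similarity can be illustrated as:

```python
def jaccard(a, b):
    """Jaccard similarity between two sets of item ids."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

# Toy interaction data: user -> items they engaged with.
history = {
    "u1": {"item1", "item2", "item3"},
    "u2": {"item2", "item3", "item4"},
    "u3": {"item9"},
}

def recommend(user, history, k=2):
    """Recommend unseen items, drawn from the most similar users first."""
    seen = history[user]
    neighbors = sorted(
        (u for u in history if u != user),
        key=lambda u: jaccard(seen, history[u]),
        reverse=True,
    )
    recs = []
    for u in neighbors:
        for item in sorted(history[u] - seen):
            if item not in recs:
                recs.append(item)
    return recs[:k]

# u2 overlaps most with u1, so u2's unseen item ranks first.
print(recommend("u1", history))  # ['item4', 'item9']
```

In a graph database the same neighborhood query would typically be expressed over `(:User)-[:INTERACTED_WITH]->(:Item)` paths instead of Python sets.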

Posted 3 weeks ago


2.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


We Are Looking For
- 2+ years of expertise in software development with one or more general-purpose programming languages (e.g., Python, Java, C/C++, Go). Experience in Python and Django is recommended.
- Deep understanding of how to build an application with optimized RESTful APIs.
- Knowledge of a web framework like Django (or similar) with an ORM, or of multi-tier, multi-DB, data-heavy web application development, will help your profile stand out.
- Knowledge of GenAI tools and technologies is a plus.
- Sound knowledge of SQL queries and databases like PostgreSQL (must) or MySQL.
- Working knowledge of NoSQL DBs (Elasticsearch, Mongo, Redis, etc.) is a plus.
- Knowledge of a graph DB like Neo4j or AWS Neptune adds extra credit to your profile.
- Knowing queue-based messaging frameworks like Celery, RQ, Kafka, etc., and an understanding of distributed systems, will be advantageous.
- Understands a programming language's limitations in order to exploit the language's behavior to its fullest potential.
- Understanding of accessibility and security compliance.
- Ability to communicate complex technical concepts to both technical and non-technical audiences with ease.
- Diversity in skills like version control tools, CI/CD, cloud basics, good debugging skills, and test-driven development will help your profile stand out.

Skills: Python, Java, and SQL
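The listing asks for sound knowledge of SQL queries against databases like PostgreSQL or MySQL. As a minimal, hedged sketch (the `users` table and rows are invented, and the stdlib `sqlite3` module stands in for a real PostgreSQL/MySQL connection), a parameterized query looks like:

```python
import sqlite3

# In-memory database standing in for PostgreSQL/MySQL in this sketch.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT, active INTEGER)")
conn.executemany(
    "INSERT INTO users (name, active) VALUES (?, ?)",
    [("asha", 1), ("ravi", 0), ("meena", 1)],
)

# Parameterized placeholders keep values out of the SQL string,
# guarding against SQL injection.
rows = conn.execute(
    "SELECT name FROM users WHERE active = ? ORDER BY name", (1,)
).fetchall()
print(rows)  # [('asha',), ('meena',)]
```

With Django, the equivalent query would normally go through the ORM (`User.objects.filter(active=True)`), which emits parameterized SQL of the same shape.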

Posted 3 weeks ago


3.0 - 20.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Experience: 3-20 Years
Location: Bangalore, Chennai

- Experience in building Java applications with Spring Boot.
- Should be able to design, develop, test, and deploy high-quality, reusable, and maintainable code.
- Develop and maintain unit tests.
- Experience in RESTful APIs and microservices architecture.
- Possess excellent analytical and problem-solving skills to troubleshoot and debug application issues.
- Experience with any one IDE (e.g., IntelliJ), version control systems (e.g., Git), build tools (e.g., Gradle), and unit testing frameworks.
- Knowledge of design patterns and principles (e.g., SOLID).
- Ability to work independently and as part of a team.
- Excellent communication, collaboration, and problem-solving skills.

Must-have skills:
- DB: Oracle, MySQL, MongoDB, Neo4j; experience in writing optimal queries.

Good-to-have skills:
- Solid understanding of software development methodologies (e.g., Agile, Scrum).
- Experience with CI/CD pipelines and a good knowledge of DevOps practices.
- Experience with open-source libraries and software (e.g., Apache Camel, Kafka, Redis, EFK).
- Experience with containerization technologies (e.g., Docker, Kubernetes).

Disclaimer: EdgeVerve Systems does not engage with external manpower agencies or charge any fees from candidates for recruitment. If you encounter such scams, please report them immediately.

Posted 3 weeks ago


3.0 - 5.0 years

1 - 5 Lacs

Bengaluru

Work from Office


Role Purpose
The purpose of this role is to design, test, and maintain software programs for operating systems or applications that need to be deployed at a client end, and to ensure they meet 100% quality assurance parameters.

- Machine Learning & Deep Learning: strong understanding of LLM architectures, transformers, and fine-tuning techniques.
- MLOps & DevOps: experience with CI/CD pipelines, model deployment, and monitoring.
- Vector Databases: knowledge of storing and retrieving embeddings efficiently.
- Prompt Engineering: ability to craft effective prompts for optimal model responses.
- Retrieval-Augmented Generation (RAG): implementing techniques to enhance LLM outputs with external knowledge.
- Cloud Platforms: familiarity with AWS, Azure, or GCP for scalable deployments.
- Containerization & Orchestration: using Docker and Kubernetes for model deployment.
- Observability & Monitoring: tracking model performance, latency, and drift.
- Security & Ethics: ensuring responsible AI practices and data privacy.
- Programming Skills: strong proficiency in Python, SQL, and API development.
- Knowledge of Open-Source LLMs: familiarity with models like LLaMA, Falcon, and Mistral.
- Fine-Tuning & Optimization: experience with LoRA, quantization, and efficient training techniques.
- LLM Frameworks: hands-on experience with Hugging Face, LangChain, or OpenAI APIs.
- Data Engineering: understanding of ETL pipelines and data preprocessing.
- Microservices Architecture: ability to design scalable AI-powered applications.
- Explainability & Interpretability: techniques for understanding and debugging LLM outputs.
- Graph Databases: knowledge of Neo4j or similar technologies for complex data relationships.
- Collaboration & Communication: ability to work with cross-functional teams and explain technical concepts clearly.

Deliverables

No. | Performance Parameter | Measure
1. | Continuous Integration, Deployment & Monitoring of Software | 100% error-free onboarding and implementation; throughput %; adherence to the schedule/release plan
2. | Quality & CSAT | On-time delivery; manage software; troubleshoot queries; customer experience; completion of assigned certifications for skill upgradation
3. | MIS & Reporting | 100% on-time MIS and report generation

Mandatory Skills: LLM Ops
Experience: 3-5 Years
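The skills above pair vector databases with retrieval-augmented generation. As a toy, hedged sketch (the three-element "embeddings" are invented stand-ins for real model embeddings, and a dict stands in for an actual vector store), the retrieval step of RAG can be illustrated as:

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(u, v))
    norm = math.sqrt(sum(x * x for x in u)) * math.sqrt(sum(y * y for y in v))
    return dot / norm if norm else 0.0

# Toy "vector store": pre-computed embeddings keyed by document text.
store = {
    "neo4j is a graph database": [0.9, 0.1, 0.0],
    "kubernetes orchestrates containers": [0.1, 0.9, 0.1],
    "transformers power modern llms": [0.0, 0.2, 0.9],
}

def retrieve(query_embedding, store, k=1):
    """Return the k stored documents closest to the query embedding."""
    ranked = sorted(store, key=lambda d: cosine(query_embedding, store[d]),
                    reverse=True)
    return ranked[:k]

# A query embedded near the "graph database" document retrieves it first;
# in full RAG, the retrieved text is then prepended to the LLM prompt.
print(retrieve([0.8, 0.2, 0.1], store))
```

Production vector databases replace the linear scan here with approximate nearest-neighbor indexes, but the similarity ranking is the same idea.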

Posted 3 weeks ago


8.0 years

0 Lacs

Delhi, India

On-site


About Neo4j
Neo4j is the leader in Graph Database & Analytics, helping organizations uncover hidden patterns and relationships across billions of data connections deeply, easily, and quickly. Customers use Neo4j to gain a deeper understanding of their business and reveal new ways of solving their most pressing problems. Over 84% of Fortune 100 companies use Neo4j, along with a vibrant community of 250,000+ developers, data scientists, and architects across the globe.

At Neo4j, we're proud to build the technology that powers breakthrough solutions for our customers. These solutions have helped NASA get to Mars two years earlier, broke the Panama Papers for the ICIJ, and are helping Transport for London to cut congestion by 10% and save $750M a year. Some of our other notable customers include Intuit, Lockheed Martin, Novartis, UBS, and Walmart.

Neo4j experienced rapid growth this year as organizations looking to deploy generative AI (GenAI) recognized graph databases as essential for improving its accuracy, transparency, and explainability. Growth was further fueled by enterprise demand for Neo4j's cloud offering and partnerships with leading cloud hyperscalers and ecosystem leaders. Learn more at neo4j.com and follow us on LinkedIn.

Our Vision
At Neo4j, we have always strived to help the world make sense of data. As business, society, and knowledge become increasingly connected, our technology promotes innovation by helping organizations find and understand data relationships. We created, drive, and lead the graph database category, and we're disrupting how organizations leverage their data to innovate and stay competitive.

The Opportunity
Neo4j is looking for a Director of Revenue Business Systems to build our India Systems Center of Excellence (CoE). Partnering with the global team, you will be responsible for overseeing a group of analysts, admins, developers, and others to deliver solutions for the business. You will be at the intersection of all departments, and at the intersection of business strategy and technology.

Primary Responsibilities
- Provide subject matter expertise for all aspects of the business applications you manage.
- Analyze and report key metrics to demonstrate the business value of investments in these applications.
- Collaborate with stakeholders to establish and manage SLAs, ensuring customer expectations are met through multi-tier support and uptime monitoring.
- Lead and mentor teams by fostering a collaborative, stable, and cohesive work environment.
- Monitor industry and market trends to guide strategic decision-making and proactively address challenges.
- Develop and communicate a clear application roadmap that aligns with business functions and adapts to emerging requirements.
- Establish and enforce standards, methods, and procedures for inspecting, testing, and evaluating the precision and reliability of applications.
- Forge long-term strategic partnerships with departmental leaders by understanding their challenges and opportunities, aligning team strategy with company-wide objectives.
- Be accountable for overall team performance and the associated performance management.

Requirements For Success
- B.S. in Computer Science, Information Systems, or related fields.
- 8-10+ years of IT experience, including demonstrated success in progressively broadening architecture and technology leadership.
- Expertise in managing application development at scale (delivery).
- A good understanding of software development life cycle methodologies, including both Agile and Scrum.
- Experience leading and coordinating cross-functional initiatives, conducting interviews, and performing analyses to create business cases for projects.
- Experience using Atlassian products (Confluence, Jira) for project management; strong project management skills.
- Experience in budget planning for projects.
- Experience using Salesforce Core, CPQ, Billing, and NetSuite.
- Strong communication, presentation, and public speaking skills, with the ability to interact effectively with co-workers and people who have strong opinions.
- Strong leadership and collaboration skills, with the ability to lead large projects autonomously.
- Exhibits a high level of initiative and integrity, and is empathetic to the needs of individuals across the organization.
- Strong problem-solving and critical thinking skills.
- Self-starter who's comfortable with ambiguity and asking questions, resourceful in resolving issues, and adept at shifting priorities.
- Strong customer service skills with a proven service mentality.
- Strong at documenting business processes and communicating system requirements.

Posted 3 weeks ago

Apply

7.0 - 10.0 years

6 - 11 Lacs

Bengaluru

Work from Office

Naukri logo

Job Title: DevOps Lead | Experience: 7-10 Years | Location: Bengaluru. Overall 7-10 years of experience in IT. In-depth knowledge of GCP services and resources to design, deploy, and manage cloud infrastructure efficiently; certification is a big plus. Proficiency in Java, Shell, or Python scripting. Develop, maintain, and optimize Infrastructure as Code scripts and templates using tools like Terraform and Ansible, ensuring resource automation and consistency. Strong expertise in Kubernetes using Helm, HAProxy, and containerization technologies. Manage and fine-tune databases, including Neo4j, MySQL, PostgreSQL, and Redis Cache clusters, to ensure performance and data integrity. Skill in managing and optimizing Apache Kafka and RabbitMQ to facilitate efficient data processing and communication. Design and maintain Virtual Private Cloud (VPC) network architecture for secure and efficient data transmission. Implement and maintain monitoring tools such as Prometheus, Zipkin, Loki and Grafana. Utilize Helm charts and Kubernetes (K8s) manifests for containerized application management. Proficient with Git, Jenkins, and ArgoCD to set up and enhance CI and CD pipelines. Utilize Google Artifact Registry and Google Container Registry for artifact and container image management. Familiarity with CI/CD practices, version control and branching, and DevOps methodologies. Strong understanding of cloud network design, security, and best practices. Strong Linux and network debugging skills. Primary Skills: strong Kubernetes (GKE clusters), Grafana, Prometheus; good working knowledge of Terraform, Ansible, and DevOps. Why Join Us: Opportunity to work in a fast-paced and innovative environment. Collaborative team culture with continuous learning and growth opportunities.

Posted 3 weeks ago

Apply

8.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Linkedin logo

Location: Chennai, Kolkata, Gurgaon, Bangalore and Pune Experience: 8-12 Years Work Mode: Hybrid Mandatory Skills: Python, PySpark, SQL, ETL, Data Pipeline, Azure Databricks, Azure Data Factory, Azure Synapse, Airflow, and Architecture Design. Overview We are seeking a skilled and motivated Data Engineer with experience in Python, SQL, Azure, and cloud-based technologies to join our dynamic team. The ideal candidate will have a solid background in building and optimizing data pipelines, working with cloud platforms, and leveraging modern data engineering tools like Airflow, PySpark, and Azure Data Engineering. If you are passionate about data and looking for an opportunity to work on cutting-edge technologies, this role is for you! Primary Roles And Responsibilities Developing Modern Data Warehouse solutions using Databricks and the AWS/Azure stack Ability to provide forward-thinking solutions in the data engineering and analytics space Collaborate with DW/BI leads to understand new ETL pipeline development requirements. Triage issues to find gaps in existing pipelines and fix the issues Work with business to understand the needs in the reporting layer and develop data models to fulfill reporting needs Help junior team members to resolve issues and technical challenges. Drive technical discussions with client architects and team members Orchestrate the data pipelines in a scheduler via Airflow Skills And Qualifications Bachelor's and/or master’s degree in computer science or equivalent experience. Must have 6+ yrs. of total IT experience and 3+ years' experience in Data warehouse/ETL projects. Deep understanding of Star and Snowflake dimensional modelling. Strong knowledge of Data Management principles Good understanding of the Databricks Data & AI platform and Databricks Delta Lake Architecture Should have hands-on experience in SQL, Python and Spark (PySpark) Candidate must have experience in the AWS/Azure stack Desirable to have ETL with batch and streaming (Kinesis).
Experience in building ETL / data warehouse transformation processes Experience with Apache Kafka for use with streaming data / event-based data Experience with other open-source big data products, including Hadoop (incl. Hive, Pig, Impala) Experience with open-source non-relational / NoSQL data repositories (incl. MongoDB, Cassandra, Neo4j) Experience working with structured and unstructured data, including imaging & geospatial data. Experience working in a DevOps environment with tools such as Terraform, CircleCI, Git. Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning and troubleshooting. Databricks Certified Data Engineer Associate/Professional certification (desirable). Comfortable working in a dynamic, fast-paced, innovative environment with several ongoing concurrent projects Should have experience working in Agile methodology Strong verbal and written communication skills. Strong analytical and problem-solving skills with a high attention to detail. Skills: Azure Databricks, SQL, Data Warehouse, Azure Data Factory, PySpark, Azure Synapse, Airflow, Python, Data Pipeline, Data Engineering, Architecture Design, ETL, Azure

Posted 3 weeks ago

Apply

18.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Linkedin logo

Name of company: The Smart Fellowship by Workverse Join our mission: We are building an automation-proof workforce for Bharat. We are rooting for Team Humans by training graduates to think, solve, and communicate beyond what AI can do. We want smart fellows to thrive alongside AI and remain in control, instead of being replaced by it. What we do: Formerly known as X Billion Skills Lab (since 2017), The Smart Fellowship is a hybrid workplace simulation where learners master in-demand workplace skills and GenAI skills through role play. In our immersive, narrative-based experience, learners "work" in imaginary companies and solve 50+ generalist workplace scenarios to build a strong intellectual foundation for rapid growth in the real world of work. To date we have worked with 50,000+ learners and are even a credit program in one of India's top private universities. The best part about this role: Get direct exposure to customer relationship building, HR strategy, and operations in a fast-growing startup Opportunity to work closely with leadership and see your ideas in action Contribute to building a future-ready, human-first workforce in the age of AI Location: Khar West, Mumbai (Work from office) P.S. We’re looking for someone who genuinely cares about the work we’re doing and sees themselves growing with us. If it’s the right fit on both sides, we’d love to offer a long-term commitment with fast-tracked career growth. Meet the founder Samyak Chakrabarty has been featured by Forbes as one of Asia's most influential young entrepreneurs and has founded several social impact initiatives that have won national and international recognition. He has over 18 years of entrepreneurial experience and is on a mission to empower humans to outsmart artificial intelligence at work. To date his work has positively impacted 1,00,000+ youth across the nation. For more information please visit his LinkedIn profile.
Your role: As an AI/ML Architect at Workverse, you'll play a key role in shaping intelligent agents that assist in enterprise decision-making, automate complex data science workflows, and integrate seamlessly into our simulation. These agents will shape Neuroda, the world's first AI soft-skills coach and workplace mentor, leveraging reasoning, tool use, and memory, while staying aligned with our focus on enhancing the soft-skills learning experience within our simulation environment. We're seeking a strong engineering generalist with deep experience in LLMs, agent frameworks, and production-scale systems. You’ve likely prototyped or shipped agent-based systems, pushed the boundaries of what LLMs can do, and are looking for a meaningful opportunity to build the future of a human-first workforce in the age of AI. Responsibilities: Lead the design and development of Enterprise AI Agents and Data Science Agent systems that combine reasoning, tool orchestration, and memory. Collaborate with product, research, and infrastructure teams to create scalable agent architectures tailored for enterprise users. Build agent capabilities for tasks like automated analysis, reporting, data wrangling, and domain-specific workflows across business verticals. Integrate real-time knowledge, enterprise APIs, RAG pipelines, and proprietary tools into agentic workflows. Work closely with the alignment and explainability teams to ensure agents remain safe, auditable, and transparent in their reasoning and output. Continuously evaluate and incorporate advances in GenAI (e.g., controllability, multi-modal models, memory layers) into the agent stack.
Requirements: Demonstrated experience building with LLMs and agentic frameworks (e.g., LangChain, LangFlow, Semantic Kernel, CrewAI, Haystack, ReAct, AutoGPT, etc.) Experience with productionizing AI/LLM workflows and integrating them into real-world applications or systems 2+ years of software engineering experience, ideally with some time in early-stage startups or AI-first environments. Strong Python skills and a solid understanding of full-stack backend architecture: APIs, cloud infrastructure (AWS), and relational, non-relational, and graph databases (SQL, NoSQL, ArangoDB, Neo4j) (Bonus) Experience working on agent toolchains for data science, MLOps, game data science and game analytics. Think you’re the one? Apply: Double-check that you are comfortable with the work-from-office requirement Share your CV with tanvi@workverse.in, along with a brief note about why you think this role was made for you!

Posted 3 weeks ago

Apply

8.0 years

0 Lacs

Delhi, India

On-site

Linkedin logo

About Neo4j Neo4j is the leader in Graph Database & Analytics, helping organizations uncover hidden patterns and relationships across billions of data connections deeply, easily, and quickly. Customers use Neo4j to gain a deeper understanding of their business and reveal new ways of solving their most pressing problems. Over 84% of Fortune 100 companies use Neo4j, along with a vibrant community of 250,000+ developers, data scientists, and architects across the globe. At Neo4j, we’re proud to build the technology that powers breakthrough solutions for our customers. These solutions have helped NASA get to Mars two years earlier, helped the ICIJ break the Panama Papers, and are helping Transport for London to cut congestion by 10% and save $750M a year. Some of our other notable customers include Intuit, Lockheed Martin, Novartis, UBS, and Walmart. Neo4j experienced rapid growth this year as organizations looking to deploy generative AI (GenAI) recognized graph databases as essential for improving its accuracy, transparency, and explainability. Growth was further fueled by enterprise demand for Neo4j’s cloud offering and partnerships with leading cloud hyperscalers and ecosystem leaders. Learn more at neo4j.com and follow us on LinkedIn. Our Vision At Neo4j, we have always strived to help the world make sense of data. As business, society and knowledge become increasingly connected, our technology promotes innovation by helping organizations to find and understand data relationships. We created, drive and lead the graph database category, and we’re disrupting how organizations leverage their data to innovate and stay competitive. The Role Develop and execute a territory plan based on target agencies and applicable use cases, resulting in a pipeline of opportunities in the target market that will help you achieve quarterly and annual sales metrics.
Develop expert knowledge of Neo4j solutions and their applicability in the target market, covering BFSI (Banking, Financial Services & Insurance) and Enterprise accounts. Develop and present to customers a strong understanding of the benefits and advantages of graph technology. Execute sales cycles that employ Strategic Selling strategies and tactics. Build and present proposals for Neo4j solutions that involve Neo4j products and services. Work with Pre-Sales Engineering resources to scope and deliver on customer needs. “Land & Expand”: grow the existing account base with a strategic, customer-first methodology. Provide guidance, direction, and support to your assigned SDR in their efforts to support your pipeline development. Ensure the execution of strategies for assigned key accounts to drive plans to increase revenue potential and growth. Collaborate with Field Marketing resources on targeted programs to increase awareness in the existing customer base, resulting in revenue growth. Maintain the Neo4j Salesforce.com CRM system with accurate information about your pipeline, in accordance with Neo4j forecasting guidelines. Ideally, You Should Have 8-10 years of consistent success meeting or exceeding sales objectives selling technical solutions and software products into BFSI (Banking, Financial Services & Insurance) and Enterprise accounts. Demonstrable experience executing complex enterprise sales strategies and tactics. Experience with the commercial open-source business model, selling subscriptions for on-premise deployments and/or hybrid on-prem/cloud deployments. Previous experience thriving in a smaller, high-growth software company, where you have leveraged dedicated SDR resources, Field Marketing resources, and Pre-Sales Engineering to help build the business. Strong conviction and a clear view of how and where graph solutions fit into the enterprise marketplace. Demonstrated attention to detail, ensuring accurate entry and management of lead data in our Salesforce.com CRM system.
Be proficient with standard corporate productivity tools (e.g., Google Docs, MS Office, Salesforce.com, web conferencing). Be a team player with the highest level of integrity.

Posted 3 weeks ago

Apply

8.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Linkedin logo

Location: Chennai, Kolkata, Gurgaon, Bangalore and Pune Experience: 8-12 Years Work Mode: Hybrid Mandatory Skills: Python, PySpark, SQL, ETL, Data Pipeline, Azure Databricks, Azure Data Factory, Azure Synapse, Airflow, and Architecture Design. Overview We are seeking a skilled and motivated Data Engineer with experience in Python, SQL, Azure, and cloud-based technologies to join our dynamic team. The ideal candidate will have a solid background in building and optimizing data pipelines, working with cloud platforms, and leveraging modern data engineering tools like Airflow, PySpark, and Azure Data Engineering. If you are passionate about data and looking for an opportunity to work on cutting-edge technologies, this role is for you! Primary Roles And Responsibilities Developing Modern Data Warehouse solutions using Databricks and the AWS/Azure stack Ability to provide forward-thinking solutions in the data engineering and analytics space Collaborate with DW/BI leads to understand new ETL pipeline development requirements. Triage issues to find gaps in existing pipelines and fix the issues Work with business to understand the needs in the reporting layer and develop data models to fulfill reporting needs Help junior team members to resolve issues and technical challenges. Drive technical discussions with client architects and team members Orchestrate the data pipelines in a scheduler via Airflow Skills And Qualifications Bachelor's and/or master’s degree in computer science or equivalent experience. Must have 6+ yrs. of total IT experience and 3+ years' experience in Data warehouse/ETL projects. Deep understanding of Star and Snowflake dimensional modelling. Strong knowledge of Data Management principles Good understanding of the Databricks Data & AI platform and Databricks Delta Lake Architecture Should have hands-on experience in SQL, Python and Spark (PySpark) Candidate must have experience in the AWS/Azure stack Desirable to have ETL with batch and streaming (Kinesis).
Experience in building ETL / data warehouse transformation processes Experience with Apache Kafka for use with streaming data / event-based data Experience with other open-source big data products, including Hadoop (incl. Hive, Pig, Impala) Experience with open-source non-relational / NoSQL data repositories (incl. MongoDB, Cassandra, Neo4j) Experience working with structured and unstructured data, including imaging & geospatial data. Experience working in a DevOps environment with tools such as Terraform, CircleCI, Git. Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning and troubleshooting. Databricks Certified Data Engineer Associate/Professional certification (desirable). Comfortable working in a dynamic, fast-paced, innovative environment with several ongoing concurrent projects Should have experience working in Agile methodology Strong verbal and written communication skills. Strong analytical and problem-solving skills with a high attention to detail. Skills: Azure Databricks, SQL, Data Warehouse, Azure Data Factory, PySpark, Azure Synapse, Airflow, Python, Data Pipeline, Data Engineering, Architecture Design, ETL, Azure

Posted 3 weeks ago

Apply

8.0 years

0 Lacs

Greater Kolkata Area

On-site

Linkedin logo

Location: Chennai, Kolkata, Gurgaon, Bangalore and Pune Experience: 8-12 Years Work Mode: Hybrid Mandatory Skills: Python, PySpark, SQL, ETL, Data Pipeline, Azure Databricks, Azure Data Factory, Azure Synapse, Airflow, and Architecture Design. Overview We are seeking a skilled and motivated Data Engineer with experience in Python, SQL, Azure, and cloud-based technologies to join our dynamic team. The ideal candidate will have a solid background in building and optimizing data pipelines, working with cloud platforms, and leveraging modern data engineering tools like Airflow, PySpark, and Azure Data Engineering. If you are passionate about data and looking for an opportunity to work on cutting-edge technologies, this role is for you! Primary Roles And Responsibilities Developing Modern Data Warehouse solutions using Databricks and the AWS/Azure stack Ability to provide forward-thinking solutions in the data engineering and analytics space Collaborate with DW/BI leads to understand new ETL pipeline development requirements. Triage issues to find gaps in existing pipelines and fix the issues Work with business to understand the needs in the reporting layer and develop data models to fulfill reporting needs Help junior team members to resolve issues and technical challenges. Drive technical discussions with client architects and team members Orchestrate the data pipelines in a scheduler via Airflow Skills And Qualifications Bachelor's and/or master’s degree in computer science or equivalent experience. Must have 6+ yrs. of total IT experience and 3+ years' experience in Data warehouse/ETL projects. Deep understanding of Star and Snowflake dimensional modelling. Strong knowledge of Data Management principles Good understanding of the Databricks Data & AI platform and Databricks Delta Lake Architecture Should have hands-on experience in SQL, Python and Spark (PySpark) Candidate must have experience in the AWS/Azure stack Desirable to have ETL with batch and streaming (Kinesis).
Experience in building ETL / data warehouse transformation processes Experience with Apache Kafka for use with streaming data / event-based data Experience with other open-source big data products, including Hadoop (incl. Hive, Pig, Impala) Experience with open-source non-relational / NoSQL data repositories (incl. MongoDB, Cassandra, Neo4j) Experience working with structured and unstructured data, including imaging & geospatial data. Experience working in a DevOps environment with tools such as Terraform, CircleCI, Git. Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning and troubleshooting. Databricks Certified Data Engineer Associate/Professional certification (desirable). Comfortable working in a dynamic, fast-paced, innovative environment with several ongoing concurrent projects Should have experience working in Agile methodology Strong verbal and written communication skills. Strong analytical and problem-solving skills with a high attention to detail. Skills: Azure Databricks, SQL, Data Warehouse, Azure Data Factory, PySpark, Azure Synapse, Airflow, Python, Data Pipeline, Data Engineering, Architecture Design, ETL, Azure

Posted 3 weeks ago

Apply

3.0 years

0 Lacs

Mumbai Metropolitan Region

On-site

Linkedin logo

TCS has a vacancy for the skill Java API for Chennai, Mumbai, Pune, Hyderabad Exp req: 3 to 9 yrs Mode of Interview: virtual JD · Good programming & analytical skills · Thorough knowledge of Java core, Spring core & microservices · Good understanding of CI/CD, Docker & Kubernetes · Authorization & authentication processes · Neo4j experience is a plus · Java, API, Microservices (minimum of 3 years) · Any relational DB (Oracle, DB2, etc.) · Spring Boot · Experience in deployment on OpenShift · Troubleshooting incidents reported on OS · Should be able to manage the container platform ecosystem: installation, upgrade, patching and monitoring · Must have knowledge of OpenShift capacity and availability management

Posted 3 weeks ago

Apply

Exploring Neo4j Jobs in India

Neo4j, a popular graph database management system, is seeing growing demand in the Indian job market. Companies are looking for professionals skilled in working with Neo4j to manage and analyze complex relationships in their data. If you are a job seeker interested in Neo4j roles, this article will provide you with valuable insights to help you navigate the job market in India.

Top Hiring Locations in India

  1. Bangalore
  2. Hyderabad
  3. Mumbai
  4. Pune
  5. Delhi/NCR

Average Salary Range

The average salary range for Neo4j professionals in India varies based on experience level:

  • Entry-level: INR 4-6 lakhs per annum
  • Mid-level: INR 8-12 lakhs per annum
  • Experienced: INR 15-20 lakhs per annum

Career Path

In the Neo4j skill area, a typical career progression may look like:

  1. Junior Developer
  2. Developer
  3. Senior Developer
  4. Tech Lead

Related Skills

Apart from expertise in Neo4j, professionals in this field are often expected to have or develop skills in:

  • Cypher Query Language
  • Data modeling
  • Database management
  • Java or Python programming
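To make the data-modeling skill concrete, here is a minimal, self-contained Python sketch of the property-graph model that Neo4j and Cypher operate on: nodes carry labels and properties, and relationships carry a type. The `Graph` class and the sample people/skills data are illustrative inventions for this article, not part of any Neo4j API.

```python
# A toy in-memory property graph mirroring Neo4j's data model:
# nodes have a label plus key/value properties; relationships are typed.
from collections import defaultdict

class Graph:
    def __init__(self):
        self.nodes = {}               # node_id -> (label, properties)
        self.rels = defaultdict(list) # node_id -> [(rel_type, other_id)]

    def add_node(self, node_id, label, **props):
        self.nodes[node_id] = (label, props)

    def add_rel(self, src, rel_type, dst):
        # Store both directions so undirected matches are easy.
        self.rels[src].append((rel_type, dst))
        self.rels[dst].append((rel_type, src))

    def neighbours(self, node_id, rel_type=None):
        """Roughly: MATCH (n)-[r]-(m) WHERE id(n) = node_id RETURN m"""
        return [dst for rt, dst in self.rels[node_id]
                if rel_type is None or rt == rel_type]

g = Graph()
g.add_node("alice", "Person", name="Alice")
g.add_node("bob", "Person", name="Bob")
g.add_node("neo4j", "Skill", name="Neo4j")
g.add_rel("alice", "KNOWS", "bob")
g.add_rel("alice", "HAS_SKILL", "neo4j")

print(g.neighbours("alice"))               # ['bob', 'neo4j']
print(g.neighbours("alice", "HAS_SKILL"))  # ['neo4j']
```

In Cypher, the equivalent of the last call would be a pattern match such as `MATCH (p:Person {name: 'Alice'})-[:HAS_SKILL]->(s) RETURN s`; understanding this node/relationship/property shape is the foundation the other listed skills build on.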

Interview Questions

  • What is a graph database? (basic)
  • Explain the difference between a graph database and a traditional relational database. (basic)
  • How does Neo4j handle relationships between nodes? (medium)
  • What is Cypher Query Language? (basic)
  • Can you give an example of a Cypher query to retrieve all nodes connected to a specific node? (medium)
  • How does Neo4j ensure data consistency in a distributed environment? (advanced)
  • What are the benefits of using Neo4j for social network analysis? (medium)
  • Explain the concept of indexing in Neo4j. (medium)
  • How does Neo4j handle transactions? (medium)
  • Can you explain the concept of graph traversal in Neo4j? (medium)
  • What are some common use cases for Neo4j in real-world applications? (medium)
  • How does Neo4j handle scalability? (advanced)
  • What is the significance of property graphs in Neo4j? (basic)
  • Explain the concept of cardinality in Neo4j. (medium)
  • How can you optimize Neo4j queries for better performance? (medium)
  • What are the key components of a Neo4j graph database? (basic)
  • How does Neo4j support ACID properties? (medium)
  • What are the limitations of Neo4j? (medium)
  • Can you explain the concept of graph algorithms in Neo4j? (medium)
  • How does Neo4j handle data import/export? (medium)
  • Explain the concept of labels and relationship types in Neo4j. (basic)
  • What are the different types of indexes supported by Neo4j? (medium)
  • How does Neo4j handle security and access control? (medium)
  • What are the advantages of using Neo4j over other graph databases? (medium)
  • How can you monitor and troubleshoot performance issues in Neo4j? (medium)
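Several of the questions above (graph traversal, and retrieving all nodes connected to a specific node) can be rehearsed without a running Neo4j instance. The sketch below is a plain-Python breadth-first traversal over an adjacency list; the graph data is made up for illustration. In Cypher, the one-hop version would look something like `MATCH (n {name: 'A'})--(m) RETURN m`, while the full reachability walk mirrors a variable-length pattern such as `(n)-[*]-(m)`.

```python
# Breadth-first traversal over an adjacency list: a pure-Python analogue
# of a variable-length Cypher pattern like MATCH (n)-[*]-(m) RETURN m.
from collections import deque

# Illustrative undirected graph (each edge listed in both directions).
graph = {
    "A": ["B", "C"],
    "B": ["A", "D"],
    "C": ["A"],
    "D": ["B", "E"],
    "E": ["D"],
}

def connected_nodes(start):
    """Return every node reachable from `start`, in BFS order (start excluded)."""
    seen, order, queue = {start}, [], deque([start])
    while queue:
        node = queue.popleft()
        for nbr in graph[node]:
            if nbr not in seen:
                seen.add(nbr)
                order.append(nbr)
                queue.append(nbr)
    return order

print(connected_nodes("A"))  # ['B', 'C', 'D', 'E']
```

Being able to explain why this visits each node and edge once, and how a native graph store avoids the join-heavy equivalent in a relational database, is a strong answer to the traversal and "graph vs. relational" questions.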

Conclusion

As you explore Neo4j job opportunities in India, it's essential to not only possess the necessary technical skills but also be prepared to showcase your expertise during interviews. Stay updated with the latest trends in Neo4j and continuously enhance your skills to stand out in the competitive job market. Prepare thoroughly, demonstrate your knowledge confidently, and land your dream Neo4j job in India. Good luck!


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies