Jobs
Interviews

626 Neo4J Jobs - Page 7

Set up a job alert
JobPe aggregates listings for easy access; you apply directly on the original job portal.

6.0 - 8.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Key Attributes
- Adaptability & agility: thrive in a fast-paced, ever-evolving environment with shifting priorities.
- Demonstrated ability to quickly learn and integrate new technologies and frameworks.
- Strong problem-solving mindset with the ability to juggle multiple priorities effectively.

Core Responsibilities
- Design, develop, test, and maintain robust Python applications and data pipelines using Python/PySpark.
- Define and implement smart data pipelines from RDBMS to graph databases.
- Build and expose APIs using AWS Lambda and ECS-based microservices.
- Collaborate with cross-functional teams to define, design, and deliver new features.
- Write clean, efficient, and scalable code following best practices.
- Troubleshoot, debug, and optimise applications for performance and reliability.
- Contribute to the setup and maintenance of CI/CD pipelines and deployment workflows as required.
- Ensure security, compliance, and observability across all development activities.

All you need is...

Required Skills & Experience
- Expert-level proficiency in Python with a strong grasp of object-oriented and functional programming.
- Solid experience with SQL and graph databases (e.g., Neo4j, Amazon Neptune).
- Hands-on experience with cloud platforms; AWS and/or Azure is a must.
- Proficiency in PySpark or similar data ingestion and processing frameworks.
- Familiarity with DevOps tools such as Docker, Kubernetes, Jenkins, and Git.
- Strong understanding of CI/CD, version control, and agile development practices.
- Excellent communication and collaboration skills.

Desirable Skills
- Experience with agentic AI, machine learning, or LLM-based systems.
- Familiarity with Apache Iceberg or similar modern data lakehouse formats.
- Knowledge of Infrastructure as Code (IaC) tools like Terraform or Ansible.
- Understanding of microservices architecture and distributed systems.
- Exposure to observability tools (e.g., Prometheus, Grafana, ELK stack).
- Experience working in Agile/Scrum environments.

Minimum Qualifications
- 6 to 8 years of hands-on experience in Python development and data engineering.
- Demonstrated success in delivering production-grade software and scalable data solutions.
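The "RDBMS to graph database" pipeline work this listing describes can be sketched in miniature: the snippet below (all table and field names are invented for illustration, not this employer's schema) flattens relational rows into node and relationship records that a graph loader such as the Neo4j driver or a Neptune bulk import could then consume.

```python
# Hedged sketch: turn relational order rows (joined with customers) into
# graph-ready node and relationship dicts. All names are illustrative.

def rows_to_graph(orders):
    """Flatten joined RDBMS rows into deduplicated nodes plus relationships."""
    nodes, rels = {}, []
    for row in orders:
        cust_key = ("Customer", row["customer_id"])
        order_key = ("Order", row["order_id"])
        # dict keyed by (label, id) deduplicates repeated customers
        nodes[cust_key] = {"label": "Customer", "id": row["customer_id"],
                           "name": row["customer_name"]}
        nodes[order_key] = {"label": "Order", "id": row["order_id"],
                            "total": row["total"]}
        rels.append({"type": "PLACED", "from": cust_key, "to": order_key})
    return list(nodes.values()), rels

rows = [
    {"customer_id": 1, "customer_name": "Acme", "order_id": 10, "total": 99.5},
    {"customer_id": 1, "customer_name": "Acme", "order_id": 11, "total": 42.0},
]
nodes, rels = rows_to_graph(rows)
# The repeated customer collapses into one node; each row yields one PLACED edge.
```

In a real pipeline the same shape would typically be emitted per PySpark partition and written with batched `MERGE` statements rather than held in memory.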

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Hyderabad, Telangana

On-site

You should have 3 to 5 years of experience in Java development with a focus on Spring Boot. The role involves working with Java + Spring Boot, Blue Planet inventory, and the Neo4j database. Your responsibilities will include:
- Utilizing your strong knowledge of UI frameworks, especially Angular, to develop dynamic and interactive web applications.
- Working with Kubernetes to manage microservices-based applications in a cloud environment.
- Handling Postgres for relational data and Neo4j as a graph database to manage complex data models.
- Engaging in metadata modeling and designing data structures that ensure high performance and scalability.
- Demonstrating expertise in Camunda BPMN and business process automation.
- Implementing rules using the Drools rules engine.
- Utilizing Unix/Linux systems for application deployment and management.
- Building data ingestion frameworks to process and manage large datasets effectively.

Your experience and skills in these areas will be crucial to the successful execution of your responsibilities in this role.

Posted 2 weeks ago

Apply

6.0 - 10.0 years

0 Lacs

Pune, Maharashtra

On-site

We are looking for a skilled, motivated, and quick-learning Full Stack Developer to join our team dedicated to cutting-edge Gen AI development work. As a Full Stack Developer, you will be responsible for creating innovative applications and solutions spanning both frontend and backend technologies. While our solutions often involve Retrieval Augmented Generation (RAG) and agentic frameworks, your role will extend beyond these to a variety of AI tools and techniques.

Your responsibilities will include developing and maintaining web applications using Angular, NDBX frameworks, and other modern technologies. You will design and implement databases in Postgres DB, build ingestion and retrieval pipelines using pgvector and Neo4j, and ensure efficient and secure data practices. Additionally, you will work with generative AI models and frameworks such as LangChain, Haystack, and LlamaIndex for tasks like chunking, embeddings, chat completions, and integration with different data sources.

You will collaborate with team members to integrate GenAI capabilities into applications, write clean and efficient code adhering to company standards, conduct testing to identify and fix bugs, and use collaboration tools like GitHub for effective teamwork and code management. Staying current with emerging technologies and applying them in practice is essential, along with a strong desire for continuous learning.

Qualifications and Experience:
- Bachelor's degree in Computer Science, Information Technology, or a related field with at least 6 years of working experience.
- Proven experience as a Full Stack Developer with a focus on designing, developing, and deploying end-to-end applications.
- Knowledge of front-end languages and libraries such as HTML/CSS, JavaScript, XML, and jQuery.
- Experience with Angular and NDBX frameworks, as well as database technologies like Postgres DB and vector databases.
- Proficiency in developing APIs following OpenAPI standards.
- Familiarity with generative AI models on cloud platforms like Azure and AWS, including techniques such as Retrieval Augmented Generation, prompt engineering, agentic RAG, and model context protocols.
- Experience with collaboration tools like GitHub and Docker images for packaging applications.

At Allianz, we believe in fostering a diverse and inclusive workforce. We are proud to be an equal opportunity employer that values bringing your authentic self to work, regardless of background, appearance, preferences, or beliefs. Together, we can create an environment where everyone feels empowered to explore, grow, and contribute to a better future for our customers and the global community. Join us at Allianz and let's work together to care for tomorrow.
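The "chunking" step this listing mentions, which precedes computing embeddings in a RAG pipeline, can be sketched with a minimal character-window chunker; the window and overlap sizes below are arbitrary choices for the example, not values from this employer.

```python
# Hedged sketch: split text into overlapping fixed-size windows, the shape
# of input typically handed to an embedding model before vector storage.

def chunk_text(text, size=40, overlap=10):
    """Return overlapping character windows; `size` and `overlap` are illustrative."""
    step = size - overlap
    return [text[i:i + size] for i in range(0, max(len(text) - overlap, 1), step)]

doc = "Retrieval Augmented Generation grounds model answers in retrieved context."
chunks = chunk_text(doc)
# Consecutive chunks share `overlap` characters so no sentence boundary is lost.
```

Production pipelines usually split on tokens or sentence boundaries (e.g., via a framework like LangChain) rather than raw characters, but the overlap idea is the same.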

Posted 2 weeks ago

Apply

3.0 - 6.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Title: Engineer - Data Engineering
Career Level: C3

Introduction to role
Are you ready to make a difference in the world of biopharmaceuticals? At AstraZeneca, we are committed to discovering and developing life-changing medicines for some of the most serious diseases. As a Data Engineer, you'll play a pivotal role in enhancing our data engineering capabilities within the D&A space. Join us in leveraging cutting-edge tools and technology to drive innovation and improve delivery performance. Are you up for the challenge?

Accountabilities
- Provide data engineering support (ETL, data products, reports) to the R&D IT portfolio.
- Deliver cost-effective solutions for data engineering activities, including data ETL pipelines using Python/PySpark/Lambda.
- Test and quality-assess new D&A solutions to ensure they are fit for release, including code assurance, unit and system integration testing, data testing, release management control, and support of UAT processes.
- Ensure business data and information assets are available as data services and artefacts for consumption by the wider AstraZeneca enterprise.

Essential Skills/Experience
- Relevant experience: 3 to 6 years.
- Good experience in Python/PySpark/Lambda, implementing best practices for error handling, optimisation, job layout, job design, and naming conventions.
- Good experience with graph databases such as Neo4j or Neptune, including Cypher QL.
- Amazon Web Services: connecting to, loading, and reading data from AWS services like S3, Athena, Glue, and Aurora.
- Experience and familiarity with data models and artefacts.
- Familiarity with version control (branching, merging, etc.), ideally Git, including working with Python project branches, merging them, publishing, and deploying code to runtime environments.
- Experience with a range of data analytics architectures, which may include traditional warehousing, distributed computing, and visualization analytics.
- Ability to interpret data, process data, analyse results, and provide ongoing support for productionized applications.
- Strong analytical skills with the ability to resolve production issues.
- Understanding of the business area/process in scope.
- Willingness to work in a cross-cultural environment across multiple time zones.
- Ability to work effectively independently or as part of a team to achieve objectives.
- Eagerness to learn and develop new tech skills as required.
- Good written and verbal skills; fluent English.

Desirable Skills/Experience
- Domain knowledge (processes and data) in Pharma R&D.
- Regulatory experience such as GxP.
- Agile/Scrum process.
- ITIL knowledge.

When we put unexpected teams in the same room, we unleash bold thinking with the power to inspire life-changing medicines. In-person working gives us the platform we need to connect, work at pace and challenge perceptions. That's why we work, on average, a minimum of three days per week from the office. But that doesn't mean we're not flexible. We balance the expectation of being in the office while respecting individual flexibility. Join us in our unique and ambitious world.

At AstraZeneca, we connect across the entire business to drive impactful transformation. Our commitment to becoming a digital and data-led enterprise means you'll be at the forefront of innovation, turning complex information into practical insights that improve patient outcomes. Collaborate with leading experts in our specialist communities and be part of novel solutions that shape the future of healthcare. Here, your work is valued and recognized, offering endless opportunities for personal growth and development. Ready to take your career to new heights? Apply now and be part of our journey towards transformative healthcare solutions!

Date Posted: 14-Jul-2025
Closing Date:

AstraZeneca embraces diversity and equality of opportunity.
We are committed to building an inclusive and diverse team representing all backgrounds, with as wide a range of perspectives as possible, and harnessing industry-leading skills. We believe that the more inclusive we are, the better our work will be. We welcome and consider applications to join our team from all qualified candidates, regardless of their characteristics. We comply with all applicable laws and regulations on non-discrimination in employment (and recruitment), as well as work authorization and employment eligibility verification requirements.
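The ETL "best practices for error handling" this listing asks for can be illustrated with a small Python transform step that validates and normalises records while capturing failures per record instead of aborting the job; the field names and rules below are assumptions for illustration, not AstraZeneca's.

```python
# Hedged sketch: a defensive ETL transform step that separates good records
# from failed ones so a pipeline run survives individual bad rows.

def transform_records(records):
    """Validate and normalise raw records, collecting failures per record."""
    ok, failed = [], []
    for rec in records:
        try:
            if "sample_id" not in rec:
                raise ValueError("missing sample_id")
            ok.append({"sample_id": str(rec["sample_id"]).strip().upper(),
                       "value": float(rec["value"])})
        except (ValueError, KeyError, TypeError) as exc:
            # capture the failure with its source record for later triage
            failed.append({"record": rec, "error": str(exc)})
    return ok, failed

good, bad = transform_records([
    {"sample_id": " az-01 ", "value": "3.5"},
    {"value": "7.1"},  # missing key: captured, not fatal
])
```

The same pattern maps directly onto a PySpark job, where the failed list would typically land in a quarantine table or S3 prefix for reprocessing.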

Posted 2 weeks ago

Apply

4.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Position Overview
Job Title: Data Science Engineer, AS
Location: Bangalore, India

Role Description
We are seeking a Data Science Engineer to contribute to the development of intelligent, autonomous AI systems. The ideal candidate will have a strong background in agentic AI, LLMs, SLMs, vector databases, and knowledge graphs. This role involves deploying AI solutions that leverage Retrieval-Augmented Generation (RAG), multi-agent frameworks, and hybrid search techniques to enhance enterprise applications.

What We'll Offer You
As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
- Best-in-class leave policy
- Gender-neutral parental leaves
- 100% reimbursement under the childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Employee Assistance Program for you and your family members
- Comprehensive hospitalization insurance for you and your dependents
- Accident and term life insurance
- Complimentary health screening for those 35 years and above

Your Key Responsibilities
- Design and develop agentic AI applications: utilise frameworks like LangChain, CrewAI, and AutoGen to build autonomous agents capable of complex task execution.
- Implement RAG pipelines: integrate LLMs with vector databases (e.g., Milvus, FAISS) and knowledge graphs (e.g., Neo4j) to create dynamic, context-aware retrieval systems.
- Fine-tune language models: customise LLMs (e.g., Gemini, ChatGPT, Llama) and SLM toolkits (e.g., spaCy, NLTK) using domain-specific data to improve performance and relevance in specialised applications.
- NER models: train OCR- and NLP-based models to parse domain-specific details from documents (e.g., DocAI, Azure AI DIS, AWS IDP).
- Develop knowledge graphs: construct and manage knowledge graphs to represent and query complex relationships within data, enhancing AI interpretability and reasoning.
- Collaborate cross-functionally: work with data engineers, ML researchers, and product teams to align AI solutions with business objectives and technical requirements.
- Optimise AI workflows: employ MLOps practices to ensure scalable, maintainable, and efficient AI model deployment and monitoring.

Your Skills And Experience
- 4+ years of professional experience in AI/ML development, with a focus on agentic AI systems.
- Proficient in Python, Python API frameworks, and SQL; familiar with AI/ML frameworks such as TensorFlow or PyTorch.
- Experience deploying AI models on cloud platforms (e.g., GCP, AWS).
- Experience with LLMs (e.g., GPT-4), SLM toolkits (e.g., spaCy), and prompt engineering.
- Understanding of semantic technologies, ontologies, and RDF/SPARQL.
- Familiarity with MLOps tools and practices for continuous integration and deployment.
- Skilled in building and querying knowledge graphs using tools like Neo4j.
- Hands-on experience with vector databases and embedding techniques.
- Experience developing AI solutions for specific industries such as healthcare, finance, or e-commerce.

How We'll Support You
- Training and development to help you excel in your career
- Coaching and support from experts in your team
- A culture of continuous learning to aid progression
- A range of flexible benefits that you can tailor to suit your needs

About Us And Our Teams
Please visit our company website for further information: https://www.db.com/company/company.htm

We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
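The "hybrid search techniques" named in this role combine dense vector similarity with sparse keyword matching. A toy sketch follows; the vectors, terms, and blending weight are invented for illustration, not any bank's production values.

```python
# Hedged sketch: blend cosine similarity over embeddings with keyword
# overlap, the basic shape of a hybrid retrieval score.
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def hybrid_score(query_vec, query_terms, doc_vec, doc_terms, alpha=0.7):
    """Weighted blend of dense similarity and sparse keyword overlap."""
    keyword = len(set(query_terms) & set(doc_terms)) / max(len(set(query_terms)), 1)
    return alpha * cosine(query_vec, doc_vec) + (1 - alpha) * keyword

# identical embeddings (cosine 1.0) plus one of two query terms matching (0.5)
score = hybrid_score([1.0, 0.0], ["loan", "risk"], [1.0, 0.0], ["loan", "pricing"])
```

Vector databases such as Milvus expose this blending natively; the sketch just makes the arithmetic visible.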

Posted 2 weeks ago

Apply

10.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

We are looking for a Full stack lead software engineer with deep understanding of Java/Python and its ecosystems, and strong hands-on experience in building high-performing, scalable, enterprise-grade applications. You will be part of a talented software team that works on mission-critical applications. As a full stack lead software engineer, your responsibilities include understanding user requirements and leading a development team on the design, implementation and deliver of Java/Python application while providing expertise in the full software development lifecycle, from concept and design to testing. Candidate will be working closely with business architecture group to design and implement current and target state business process by using various tools and technologies. Candidate should ideally be having knowledge in few of these technologies like Java/Python/Unix technology stack, Angular, java script, SQL / NonSQL and Graph DB are used for data storage (we tailor the tools to the needs) and is integrated with other bank systems via RESTful APIs/web services and Kafka Streams. Qualifications: 10+ years of industry experience, with a strong hands-on experience in the hands-on development of mission-critical applications using Java/Python technologies, aligning each project with the firm's strategic objectives, and overseeing team operations to ensure project success. Experience with complex system integration projects. Java, Spring, Spring Boot, Spring Cloud, J2EE Design Patterns, REST services. Front End Technologies like JavaScript and Angular version, CSS2/CSS3, HTML Strong Knowledge of SQL, JDBC, Unix commands. Hands-on Database experience in relational (Oracle/DB2) and No-SQL (MongoDB). Hands-on experience on working / deploying application on Cloud. Hands-on experience in code testing tools like Junit / Mockito / Cucumber. Deployment Acquaintance in Apache Tomcat, Open shift or other cloud environments. 
Expertise in Test driven development (Junit, JMeter), Continuous Integration (Jenkins), Build tool (Maven) and Version Control (Git), Development tools (Eclipse, IntelliJ). Excellent communication skills (written and verbal), ability to work in a team environment. Excellent analytical and problem-solving skills and the ability to work well independently. Experience working with business analysts, database administrators, project managers and technical architects in multiple geographical areas. Experience in the Financial Services industry is added advantage. Understanding Financial and Reporting Hierarchies will be beneficial. Education : Bachelor’s or equivalent degree in Computer Science Experience : Minimum 10 + years of relevant experience developing applications/solutions preferably in the financial services industry. Required Skills: Minimum 10 + years of application development experience in Java/Python with: Spring Boot & Microservices; REST Web Services; JPA with hibernate; Core Java/Python. Minimum 6+ years of Hands-on experience in designing architecture for enterprise applications. Angular and Java Script Experience in working on a native cloud platform. Experience with development IDEs such as Eclipse and IntelliJ Experience with SQL/NONSQL such as Oracle, PostgreSQL, Neo4j, and MongoDB Experience with caching framework such as Redis. Experience with CI/CD systems such as helm and harness. Experience with messaging services such as Kafka. Experience in Python, Unix shell scripting will be an added plus Excellent trouble shooting skills. Strong problem-solving skills, business acumen, and demonstrated excellent oral and written communication skills with both technical and non-technical audiences. Skilled in customer and leadership presentations Experience with Agile Software Development Lifecycle methodology and related tooling. For example -JIRA, Scrum. 
------------------------------------------------------ Job Family Group: Technology ------------------------------------------------------ Job Family: Applications Development ------------------------------------------------------ Time Type: Full time ------------------------------------------------------ Most Relevant Skills Please see the requirements listed above. ------------------------------------------------------ Other Relevant Skills Angular, Java, Microservice Framework, Spring Boot. ------------------------------------------------------ Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.

Posted 2 weeks ago

Apply

3.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Who You'll Work With Driving lasting impact and building long-term capabilities with our clients is not easy work. You are the kind of person who thrives in a high performance/high reward culture - doing hard things, picking yourself up when you stumble, and having the resilience to try another way forward. In return for your drive, determination, and curiosity, we'll provide the resources, mentorship, and opportunities you need to become a stronger leader faster than you ever thought possible. Your colleagues—at all levels—will invest deeply in your development, just as much as they invest in delivering exceptional results for clients. Every day, you’ll receive apprenticeship, coaching, and exposure that will accelerate your growth in ways you won’t find anywhere else. When you join us, you will have: Continuous learning: Our learning and apprenticeship culture, backed by structured programs, is all about helping you grow while creating an environment where feedback is clear, actionable, and focused on your development. The real magic happens when you take the input from others to heart and embrace the fast-paced learning experience, owning your journey. A voice that matters: From day one, we value your ideas and contributions. You’ll make a tangible impact by offering innovative ideas and practical solutions. We not only encourage diverse perspectives, but they are critical in driving us toward the best possible outcomes. Global community: With colleagues across 65+ countries and over 100 different nationalities, our firm’s diversity fuels creativity and helps us come up with the best solutions for our clients. Plus, you’ll have the opportunity to learn from exceptional colleagues with diverse backgrounds and experiences. 
World-class benefits: On top of a competitive salary (based on your location, experience, and skills), we provide a comprehensive benefits package, which includes medical, dental, mental health, and vision coverage for you, your spouse/partner, and children. Your Impact You will work in multi-disciplinary global Life Science focused environments, harnessing data to provide real-world impact for organizations globally. Our Life Sciences practice focuses on helping clients bring life-saving medicines and medical treatments to patients. This practice is one of the fastest growing practices and is comprised of a tight-knit community of consultants, research, solution, data, and practice operations colleagues across the firm. It is also one of the most globally connected sector practices, offering ample global exposure. The LifeSciences.AI (LS.AI) team is the practice’s assetization arm, focused on creating reusable digital and analytics assets to support our client work. LS.AI builds and operates tools that support senior executives in pharma and device manufacturers, for whom evidence-based decision-making and competitive intelligence are paramount. The team works directly with clients across Research & Development (R&D), Operations, Real World Evidence (RWE), Clinical Trials and Commercial to build and scale digital and analytical approaches to addressing their most persistent priorities. What you’ll learn: How to apply data and machine learning engineering, as well as product development expertise, to address complex client challenges through part-time staffing on client engagements. How to support the manager of data and machine learning engineering in developing a roadmap for data and machine learning engineering assets across cell-level initiatives. How to productionalize AI prototypes and create deployment-ready solutions. How to translate engineering concepts and explain design/architecture trade-offs and decisions to senior stakeholders. 
How to write optimized code to enhance our AI Toolbox and codify methodologies for future deployment. How to collaborate effectively within a multi-disciplinary team. How to leverage new technologies and apply problem-solving skills in a multicultural and creative environment. You will work on the frameworks and libraries that our teams of Data Scientists and Data Engineers use to progress from data to impact. You will guide global companies through analytics solutions to transform their businesses and enhance performance across industries including life sciences, global energy and materials (GEM), and advanced industries (AI) practices. Real-World Impact – We provide unique learning and development opportunities internationally. Fusing Tech & Leadership – We work with the latest technologies and methodologies and offer first class learning programs at all levels. Multidisciplinary Teamwork - Our teams include data scientists, engineers, project managers, UX and visual designers who work collaboratively to enhance performance. Innovative Work Culture – Creativity, insight and passion come from being balanced. We cultivate a modern work environment through an emphasis on wellness, insightful talks and training sessions. Striving for Diversity – With colleagues from over 40 nationalities, we recognize the benefits of working with people from all walks of life. 
Your Qualifications and Skills
- Bachelor's degree in computer science or a related field; a master's degree is a plus
- 3+ years of relevant work experience
- Experience with at least one of the following technologies: Python, Scala, Java, or C++, with the ability to write production code and use object-oriented programming
- Strong proven experience with distributed processing frameworks (Spark, Hadoop, EMR) and SQL/NoSQL is very much expected
- Commercial client-facing project experience is helpful, including working in close-knit teams
- Additional expertise is a plus: Python testing frameworks, data validation and data quality frameworks, feature engineering, chunking, document ingestion, graph data structures (e.g., Neo4j), basic Kubernetes (manifests, debugging, Docker, Argo Workflows), MLflow deployment and usage, generative AI frameworks (LangChain), and GPUs
- Ability to work across structured, semi-structured, and unstructured data, extracting information and identifying linkages across disparate data sets
- Proven ability to clearly communicate complex solutions; strong attention to detail
- Understanding of information security principles to ensure compliant handling and management of client data
- Experience and interest in cloud platforms such as AWS, Azure, Google Cloud Platform, or Databricks, including appropriate Bash/shell scripting
- Good to have: experience with CI/CD using GitHub Actions, CircleCI, or another CI/CD tech stack, and experience in end-to-end pipeline development including application deployment
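The "data validation and data quality frameworks" expertise listed above can be illustrated with a minimal rule-based checker; the rules and column names here are invented for the example, and real teams would typically reach for a dedicated library rather than hand-rolling this.

```python
# Hedged sketch: evaluate named quality rules against rows and report
# which row indices violate each rule.

def check_quality(rows, rules):
    """Return {rule_name: [indices of violating rows]} for the given rules."""
    violations = {name: [] for name in rules}
    for i, row in enumerate(rows):
        for name, predicate in rules.items():
            if not predicate(row):
                violations[name].append(i)
    return violations

rules = {
    "age_non_negative": lambda r: r.get("age", 0) >= 0,
    "has_id": lambda r: bool(r.get("id")),
}
report = check_quality([{"id": "a", "age": 30}, {"age": -1}], rules)
# row 1 fails both rules: negative age, missing id
```

The same rule-as-predicate shape is what frameworks like Great Expectations formalise, with scheduling, profiling, and reporting layered on top.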

Posted 2 weeks ago

Apply

8.0 years

0 Lacs

India

On-site

About Fincore
At Fincore, we're on a mission to build next-generation, AI-native finance technology for enterprises. Our core values - ownership, customer obsession, truth-seeking, and velocity - guide everything we do. We are venture-backed and collaborate closely with seasoned technology, finance, and AI leaders. We maintain a small, talent-dense team of domain experts and technologists.

What We're Looking For
We are seeking an extremely talented lead- to principal-level AI engineer to help us pioneer the future of finance and accounting. You must have experience building high-quality, complex, yet maintainable LLM apps and agents, and you should be able to do so in a fraction of the time that most competent people think is possible (in part because of your ability to wield the latest in code generation and your intuition for prompt engineering). You must have a strong ability to collaborate directly with customers to build solutions tailored to their business needs.

As a Lead AI Engineer, you will:
- Architect and build the core AI platform that powers finance and accounting solutions
- Design, develop, and productionize large-scale LLMs, RAG pipelines, and autonomous agents
- Define and enforce best practices for prompt engineering, model fine-tuning, and evaluation
- Mentor and grow a team of AI engineers, guiding technical design reviews and code quality
- Partner closely with Product, Data, and Customer teams to translate finance use cases into robust, scalable solutions

Must-Have Skills
- 8+ years of experience building backend services in Python, Node.js, or similar
- Deep experience and intuition with LLMs
- Cutting-edge knowledge of code generation and prompting techniques
- Experience building agents and tooling for agents
- Expertise in API design (REST, GraphQL) and relational databases
- Strong problem-solving, communication, and collaboration skills

Good to Have
- Degrees from top-tier institutes preferred (IITs, NITs, etc.)
- Experience building fintech software
- Experience working in a large enterprise / big tech

Our Tech Stack: NextJS | Python | LangGraph | AWS | Neo4j | PostgreSQL | MongoDB
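"Agents and tooling for agents" usually means giving a model a registry of callable tools and executing the tool calls it emits. A minimal dispatcher sketch follows; the tool names are invented, and the hand-written call dict stands in for real model output.

```python
# Hedged sketch: route a (mocked) model tool call to a registered Python
# function, the core loop step of most agent frameworks.

TOOLS = {
    "add": lambda a, b: a + b,
    "lookup_rate": lambda currency: {"USD": 1.0, "EUR": 1.08}.get(currency),
}

def run_tool_call(call):
    """Execute one {'tool': name, 'args': {...}} step emitted by the model."""
    fn = TOOLS.get(call["tool"])
    if fn is None:
        # unknown tools are reported back, never executed blindly
        return {"error": f"unknown tool {call['tool']}"}
    return {"result": fn(**call["args"])}

out = run_tool_call({"tool": "add", "args": {"a": 2, "b": 3}})
```

Frameworks like LangGraph wrap this dispatch in a stateful graph with retries and model feedback, but the call-validate-execute step is the same.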

Posted 2 weeks ago

Apply

3.0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site

Job Title: Python Developer - Data Science & AI Integration
Location: Chandkheda, Ahmedabad, Gujarat 382424
Experience: 2–3 Years
Employment Type: Full-time
Work Mode: On-site

About the Role
We are seeking a talented and driven Python Developer to join our AI & Data Science team. The ideal candidate will have experience in developing backend systems, working with legal datasets, and integrating AI/LLM-based chatbots (text and voice). This is a hands-on role where you'll work across modern AI architectures like RAG and embedding-based search using vector databases.

Key Responsibilities
- Design and implement Python-based backend systems for AI and data science applications.
- Analyze legal datasets and derive insights through automation and intelligent algorithms.
- Build and integrate AI-driven chatbots (text & voice) using LLMs and RAG architecture.
- Work with vector databases (e.g., Pinecone, ChromaDB) for semantic search and embedding pipelines.
- Implement graph-based querying systems using Neo4j and Cypher.
- Collaborate with cross-functional teams (data scientists, backend engineers, legal SMEs).
- Maintain data pipelines for structured, semi-structured, and unstructured data.
- Ensure code scalability, security, and performance.

Required Skills & Experience
- 2–3 years of hands-on Python development experience in AI/data science environments.
- Solid understanding of legal data structures and preprocessing.
- Experience with LLM integrations (OpenAI, Claude, Gemini) and RAG pipelines.
- Proficiency in vector databases (e.g., Pinecone, ChromaDB) and embedding-based similarity search.
- Experience with Neo4j and Cypher for graph-based querying.
- Familiarity with PostgreSQL and REST API design.
- Strong debugging and performance optimization skills.

Nice to Have
- Exposure to Agile development practices.
- Familiarity with tools like LangChain or LlamaIndex.
- Experience working with voice-based assistant/chatbot systems.
- Bachelor's degree in Computer Science, Data Science, or a related field.

Why Join Us?
- Work on cutting-edge AI integrations in a domain-focused environment.
- Collaborate with a passionate and experienced cross-functional team.
- Opportunity to grow in the legal-tech and AI solutions space.
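The embedding-based similarity search this role describes (the core operation a vector database like Pinecone or ChromaDB performs at scale) reduces to ranking stored vectors by similarity to a query vector. A toy sketch with invented two-dimensional vectors:

```python
# Hedged sketch: brute-force top-k retrieval by cosine similarity, the
# operation vector databases accelerate with approximate-nearest-neighbor
# indexes over high-dimensional embeddings.
import math

def top_k(query, index, k=2):
    """Rank stored (id, vector) pairs by cosine similarity to the query."""
    def cos(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        na = math.sqrt(sum(x * x for x in a))
        nb = math.sqrt(sum(x * x for x in b))
        return dot / (na * nb)
    return sorted(index, key=lambda item: cos(query, item[1]), reverse=True)[:k]

index = [("doc_a", [1.0, 0.0]), ("doc_b", [0.0, 1.0]), ("doc_c", [0.7, 0.7])]
best = top_k([1.0, 0.0], index, k=2)
# doc_a is an exact match; doc_c is closer to the query than doc_b
```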

Posted 2 weeks ago

Apply

7.0 - 11.0 years

0 Lacs

chennai, tamil nadu

On-site

At ITIDATA, an EXL company, you will be responsible for tasks including working with the Cypher or Gremlin query languages, Neo4j, Python, PySpark, Hive, and Hadoop. Your expertise in graph theory will be applied to creating and managing knowledge graphs with Neo4j. We are looking for Neo4j developers with 7-10 years of experience in data engineering, including 2-3 years of hands-on experience with Neo4j. If you are seeking an exciting opportunity in graph databases, this position offers the chance to optimize the performance and scalability of graph databases and to research and implement new technology solutions.

Key Skills & Responsibilities:
- Expertise in the Cypher or Gremlin query languages
- Strong understanding of graph theory
- Experience creating and managing knowledge graphs using Neo4j
- Optimizing the performance and scalability of graph databases
- Researching and implementing new technology solutions
- Working with application teams to integrate graph database solutions

We are looking for candidates who are available immediately or within 30 days to join our team and contribute to our dynamic projects.
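The knowledge-graph querying this listing describes can be illustrated with a tiny in-memory analogue; the Cypher string shows the equivalent Neo4j query, and all entities and relations are invented for the example.

```python
# Hedged sketch: an in-memory stand-in for a knowledge graph, with the
# Cypher a Neo4j deployment would run shown alongside for comparison.

# Equivalent parameterized Cypher for the function below (illustrative):
CYPHER = "MATCH (d:Drug)-[:TREATS]->(c:Condition {name: $name}) RETURN d.name"

graph = {
    ("aspirin", "TREATS"): ["headache", "fever"],
    ("ibuprofen", "TREATS"): ["headache"],
}

def drugs_treating(condition):
    """Pure-Python analogue of the Cypher MATCH above."""
    return sorted(drug for (drug, rel), targets in graph.items()
                  if rel == "TREATS" and condition in targets)

result = drugs_treating("headache")
```

Against a live database, the same query would run through the Neo4j Python driver with `$name` passed as a parameter rather than interpolated into the string.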

Posted 2 weeks ago

Apply

6.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

DataPune Posted On 15 Jul 2025 End Date 31 Dec 2025 Required Experience 6 - 9 Years Basic Section Grade Role Lead Employment Type Full Time Employee Category Organisational Group Company NewVision Company Name New Vision Softcom & Consultancy Pvt. Ltd Function Business Units (BU) Department/Practice Data Organization Unit Data Science Region APAC Country India Base Office Location Pune Working Model Hybrid Weekly Off Pune Office Standard State Maharashtra Skills Skill DATA SCIENCE - AI MACHINE LEARNING PYTHON COMPUTER VISION NLP NEO4J AWS Highest Education GRADUATION/EQUIVALENT COURSE CERTIFICATION No data available Working Language ENGLISH Job Description Senior Data Scientist – Gen AI, NLP & Graph Intelligence Experience: 7+ Years Location: Pune Employment Type: Full-time Role Overview: We are seeking a highly experienced and forward-thinking Senior Data Scientist to lead cutting-edge initiatives in Generative AI, Machine Learning, and Graph Intelligence. This role demands deep expertise in Neo4j, AWS Neptune, NLP, and LLM frameworks, with a strong foundation in predictive analytics and solution architecture. You will be instrumental in designing scalable, intelligent systems that transform data into actionable insights. Key Responsibilities: Lead the development and deployment of AI/ML models including predictive, prescriptive, and generative techniques. Architect and implement graph-based knowledge systems using Neo4j and AWS Neptune for semantic reasoning and relationship mapping. Design and fine-tune LLMs using frameworks like LangChain, LangGraph, LlamaIndex, and OpenAI/Azure OpenAI. Build and optimize RAG architectures, prompt engineering pipelines, and inferencing frameworks. Apply advanced NLP techniques for text analytics, document AI, OCR, sentiment analysis, entity recognition, and topic modeling. Collaborate with cross-functional teams to integrate models into production systems and drive business outcomes.
Mentor junior data scientists and contribute to technical leadership across projects. Stay current with emerging trends in Gen AI, graph intelligence, and cloud-native ML. Required Skills & Experience: 8+ years of experience in programming with Python and SQL. 5+ years of hands-on experience in machine learning, including supervised and unsupervised algorithms, deep learning (RNN, LSTM, GRU), and artificial neural networks. 3+ years of experience in NLP, text analytics, and document intelligence. Proficiency in LangChain, LangGraph, and open-source LLM frameworks for tasks like summarization, classification, NER, and QA. Strong understanding of Generative AI techniques, prompt engineering, vector databases, and agentic frameworks. Hands-on experience with Neo4j, AWS Neptune, and knowledge graph modeling. Experience with RAG architecture, fine-tuning, and LLM deployment. Familiarity with big data technologies and Microsoft Azure cloud services is a plus. Excellent problem-solving, communication, and stakeholder management skills. Preferred Qualifications: Bachelor's, Master's, or Ph.D. in Computer Science, Data Science, AI, or a related field. Experience with graph neural networks, semantic search, or knowledge graph reasoning. Exposure to ethical AI, data privacy, and responsible AI practices. Contributions to open-source AI/ML projects or research publications.

Posted 2 weeks ago

Apply

8.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

Seeking a highly experienced and technically adept AI/ML Engineer to spearhead a strategic initiative focused on analyzing annual changes in IRS-published TRCs and identifying their downstream impact on codebases. The role demands deep expertise in machine learning, knowledge graph construction, and software engineering processes. The ideal candidate will have a proven track record of delivering production-grade AI solutions in complex enterprise environments. Key Responsibilities: Lead the design and development of an AI/ML-based system to detect and analyze differences in IRS TRC publications year-over-year. Implement knowledge graphs to model relationships between TRC changes and impacted code modules. Collaborate with tax domain experts, software engineers, and DevOps teams to ensure seamless integration of the solution into existing workflows. Define and enforce engineering best practices, including CI/CD, version control, testing, and model governance. Drive the end-to-end lifecycle of the solution—from data ingestion and model training to deployment and monitoring. Ensure scalability, performance, and reliability of the deployed system in a production environment. Mentor junior engineers and contribute to a culture of technical excellence and innovation. Required Skills & Experience: 8+ years of experience in software engineering, with at least 5 years in AI/ML solution delivery. Strong understanding of tax-related data structures, especially IRS TRCs, is a plus. Expertise in building and deploying machine learning models using Python, TensorFlow/PyTorch, and ML Ops frameworks. Hands-on experience with knowledge graph technologies (e.g., Neo4j, RDF, SPARQL, GraphQL). Deep familiarity with software architecture, microservices, and API design. Experience with NLP techniques for document comparison and semantic analysis. Proven ability to lead cross-functional teams and deliver complex projects on time. Strong communication and stakeholder management skills.
Preferred Qualifications: Experience working on regulatory or compliance-driven AI applications. Familiarity with code analysis tools and static/dynamic code mapping techniques. Exposure to cloud platforms (Azure, AWS, GCP) and containerization (Docker, Kubernetes). Contributions to open-source AI/ML or graph-based projects. Skill Set Required – Must Have: 1. AI/ML 2. Python, TensorFlow/PyTorch, and ML Ops frameworks 3. Knowledge graph technologies 4. Data migration testing 5. Azure DevOps 6. Azure AI 7. US Tax understanding
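The core task this role describes, detecting year-over-year differences between publications, starts with a document diff. A minimal sketch using Python's stdlib difflib, with invented stand-in text (real TRC content would be parsed from the IRS source documents):

```python
import difflib

# Invented stand-ins for two annual publications of the same sections.
trc_2023 = [
    "Section 101: Standard deduction is $13,850.",
    "Section 102: Mileage rate is 65.5 cents.",
]
trc_2024 = [
    "Section 101: Standard deduction is $14,600.",
    "Section 102: Mileage rate is 67 cents.",
    "Section 103: New reporting threshold applies.",
]

def detect_changes(old, new):
    """Classify each changed line as added or removed via a unified diff."""
    changes = {"added": [], "removed": []}
    for line in difflib.unified_diff(old, new, lineterm="", n=0):
        if line.startswith("+") and not line.startswith("+++"):
            changes["added"].append(line[1:])
        elif line.startswith("-") and not line.startswith("---"):
            changes["removed"].append(line[1:])
    return changes

changes = detect_changes(trc_2023, trc_2024)
print(len(changes["added"]), len(changes["removed"]))  # 3 2
```

Downstream, each changed section would become a node in the knowledge graph, linked to the code modules that reference it.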

Posted 2 weeks ago

Apply

0 years

0 Lacs

Gurugram, Haryana, India

On-site

AI Developer – Student Engagement & Reporting Tool Full-time · Gurugram (onsite) · Immediate start Why we’re hiring Growth Valley Community (GVC) turns ambitious teens into real-world builders. Our next leap is our AI tool: an AI layer that ingests every student artefact. We need an engineer who can own that engine end-to-end. What you’ll build Multi-agent pipeline – craft & fine-tune LLM prompts that tag skills, spot gaps, and recommend next steps. Engagement brains – auto-generate nudges, streaks and Slack/WhatsApp messages that keep WAU > 75%. Parent & school dashboards – real-time, privacy-safe progress views (FastAPI + React). Data plumbing – vector DB (Pinecone/Weaviate) + graph DB (Neo4j) + Supabase/Postgres for event firehose. Guardrails – PII redaction, hallucination tests, compliance logging. Must-haves 2+ yrs Python ML/infra (TensorFlow/PyTorch, FastAPI, Docker). Hands-on with LLM APIs (OpenAI, Anthropic, or similar) & prompt engineering. Experience shipping production rec-sys or NLP features at scale. Comfortable with vector search, embeddings, and at least one graph database. Git, CI/CD, cloud (AWS/GCP) second nature. Nice-to-haves Ed-tech or youth-focused product background. LangChain / LlamaIndex, Supabase Edge Functions. Basic front-end chops (React, Next.js) for rapid UI tweaks. Culture & perks Speed over slide decks – we demo every Friday; shipping beats pitching. Lean squad. Impact – your model recommendations land in 5k+ teen inboxes each week. How to apply Send a GitHub / Kaggle link + résumé to sam@growthvalleycommunity.com with subject “AI Dev – ”. Include one paragraph on the coolest ML system you’ve shipped (numbers, not adjectives). We reply to every application within five working days and run a single 90-min tech interview (live coding + system design). Build the engine that maps teen talent to tangible outcomes, and see your models change lives, not ad clicks.

Posted 2 weeks ago

Apply

2.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Job Description Senior Software Engineer – Backend As a Backend developer for a game development team, you would be part of a small, passionate team building a mobile AAA game. In addition to standard backend development work, you will get a unique opportunity to try interesting technologies like game networking, voice chat, etc. What you'll do Microservices for cross-cutting functionality like user management, game currency management Deployment architecture of game backend and UDP-based voice chat system A small but scalable data analytics platform What you’ll bring Well, apart from a few cookies every day, it’d be great if you come with these 2 - 5 years of experience Expertise in any one of Java/Golang Decent proficiency in C++ Ability to learn other languages like NodeJS and Java Strong problem-solving abilities to identify simple solutions to complex problems Exposure to TCP/UDP socket programming and underlying concepts Good understanding of the concepts of distributed computing and multithreading Good knowledge of DB and cache technologies (We use MySQL, Redis, Cassandra, Neo4j, MongoDB, Aerospike) Good knowledge of message brokers (We use Kafka) Exposure to AWS Work Culture A true startup culture - young, fast-paced, where you are driven by personal ownership of solving challenges that help you grow fast Focus on innovation, data orientation, being results driven, taking on big goals, and adapting fast A high-performance, meritocratic environment, where we share ideas, debate and grow together with each new product Massive and direct impact on the work you do.
Growth through solving dynamic challenges Leveraging technology & analytics to solve large scale challenges Working with cross functional teams to create great product and take them to market Rub shoulders with some of the brightest & most passionate people in the gaming & consumer internet industry Compensation & Benefits Attractive compensation and ESOP packages INR 5 Lakh medical insurance cover for yourself and your family Fair & transparent performance appraisals An attractive Car Lease policy Relocation benefits A vibrant office space with fully stocked pantries. And your lunch is on us!
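The UDP socket exposure this listing asks for reduces to a small amount of datagram plumbing. A minimal loopback round-trip in Python (the listing's stack is Java/Golang/C++, so this is only an illustration of the concept, not their implementation):

```python
import socket

# Minimal UDP round-trip on localhost, sketching the kind of datagram
# plumbing behind a game's voice-chat or state-sync channel.
server = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
server.bind(("127.0.0.1", 0))          # port 0: let the OS pick a free port
server.settimeout(2.0)
host, port = server.getsockname()

client = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
client.sendto(b"ping", (host, port))   # fire-and-forget: UDP gives no delivery guarantee

data, addr = server.recvfrom(1024)
server.sendto(b"pong:" + data, addr)   # echo back to the client's ephemeral port

client.settimeout(2.0)
reply, _ = client.recvfrom(1024)
print(reply)  # b'pong:ping'

client.close()
server.close()
```

Voice chat favors UDP precisely because dropped packets are cheaper than the latency TCP retransmission would add.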

Posted 2 weeks ago

Apply

0 years

0 Lacs

Bengaluru East, Karnataka, India

On-site

Job Title: Machine Learning Engineer (NLP & Graph Analytics) Company Overview FWS is at the forefront of innovation in AI and ML. We are dedicated to solving complex challenges and creating impactful solutions through the power of artificial intelligence and machine learning. Our team is composed of passionate and driven experts who are committed to pushing the boundaries of what's possible. We are currently seeking a highly skilled and motivated Machine Learning Engineer to join our dynamic team and contribute to our next generation of intelligent products. Job Summary We are looking for an experienced Machine Learning Engineer with a strong background in building and deploying sophisticated machine learning models. The ideal candidate will have a proven track record of working with diverse datasets, a deep understanding of natural language processing (NLP), and hands-on experience with state-of-the-art deep learning frameworks. In this role, you will be responsible for the entire lifecycle of machine learning model development, from data acquisition and preprocessing (including JSON extraction and graph database modeling with Neo4j) to training, evaluation, and deployment via APIs. You will work with cutting-edge technologies, including Stanford NLP and large language models like Llama 3.2, to develop innovative solutions that drive our business forward. Key Responsibilities ML Model Development & Optimization: Design, develop, and implement robust machine learning models using Python and its associated libraries. You will be responsible for the end-to-end model lifecycle, including data preprocessing, feature engineering, model training, and performance evaluation. Deep Learning & NLP: Utilize your expertise in deep learning with PyTorch to build and fine-tune complex neural networks.
A key focus will be on Natural Language Processing (NLP) tasks, leveraging the Stanford NLP library for tasks such as sentiment analysis, named entity recognition, and coreference resolution. Large Language Models (LLMs) Stay at the forefront of AI by working with and fine-tuning large language models, specifically Llama 3.2. You will explore and implement its capabilities for tasks such as text generation, summarization, and question answering. Data Engineering & Analytics Conduct in-depth data analysis and handle complex data pipelines. This includes robust JSON extraction, transformation of semi-structured data, and modeling complex relationships using graph databases like Neo4j to build knowledge graphs that feed our ML systems. API Development & Deployment Develop and maintain scalable RESTful APIs to serve machine learning models in a production environment. Collaborate with our engineering teams to ensure seamless integration and contribute to our MLOps practices for scalability and reliability. Research & Innovation Stay current with the latest advancements in machine learning, deep learning, and NLP. Proactively identify and champion new technologies and techniques that can enhance our products and processes. Qualifications And Skills Educational Background : A Master's or Ph.D. in Computer Science, Artificial Intelligence, Machine Learning, or a related field. A Bachelor's degree with significant relevant experience will also be considered. Programming Proficiency : Expert-level programming skills in Python and a strong command of its data science and machine learning ecosystem (e.g., Pandas, NumPy, Scikit-learn). Deep Learning Expertise : Proven experience in building and deploying deep learning models using PyTorch. Natural Language Processing (NLP) : Demonstrable experience with NLP techniques and libraries, with specific expertise in using the Stanford NLP toolkit. 
Large Language Models (LLMs) : Hands-on experience with and a strong understanding of large language models, including practical experience with models like Llama 3.2. Data Handling : Strong proficiency in handling and parsing various data formats, with specific experience in JSON extraction and manipulation. API Development : Solid experience in developing and deploying models via RESTful APIs using frameworks like Flask, FastAPI, or Django. Graph Database Knowledge : Experience with graph databases, particularly Neo4j, and understanding of graph data modeling and querying (Cypher). Problem-Solving Skills : Excellent analytical and problem-solving abilities with a talent for tackling complex and ambiguous challenges. Preferred Qualifications Published research in top-tier machine learning or NLP conferences. Experience with MLOps tools and cloud platforms (e.g., AWS, GCP, Azure). Familiarity with other NLP libraries and frameworks (e.g., spaCy, Hugging Face Transformers). Contributions to open-source machine learning projects. (ref:hirist.tech)
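The JSON-extraction-to-knowledge-graph step described above can be sketched in a few lines. The payload shape and field names below are invented for illustration; real inputs would be the team's own semi-structured data:

```python
import json

# Invented example payload standing in for real semi-structured input.
raw = '''
{"article": {"title": "Q3 results", "entities": [
    {"name": "Acme Corp", "type": "ORG"},
    {"name": "Jane Doe", "type": "PERSON", "role": "CEO"}
]}}
'''

def extract_triples(payload):
    """Flatten nested JSON into (subject, relation, object) triples,
    ready to load into a graph database such as Neo4j."""
    doc = json.loads(payload)
    article = doc["article"]
    triples = []
    for ent in article.get("entities", []):
        triples.append((article["title"], "MENTIONS", ent["name"]))
        triples.append((ent["name"], "HAS_TYPE", ent["type"]))
        if "role" in ent:  # optional fields become optional edges
            triples.append((ent["name"], "HAS_ROLE", ent["role"]))
    return triples

for t in extract_triples(raw):
    print(t)
```

Each triple then maps onto a Cypher `MERGE` statement, so repeated loads stay idempotent.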

Posted 2 weeks ago

Apply

13.0 years

0 Lacs

Andhra Pradesh, India

On-site

Summary about Organization A career in our Advisory Acceleration Center is the natural extension of PwC’s leading global delivery capabilities. The team consists of highly skilled resources that can assist in the areas of helping clients transform their business by adopting technology using bespoke strategy, operating model, processes and planning. You’ll be at the forefront of helping organizations around the globe adopt innovative technology solutions that optimize business processes or enable scalable technology. Our team helps organizations transform their IT infrastructure, modernize applications and data management to help shape the future of business. An essential and strategic part of Advisory's multi-sourced, multi-geography Global Delivery Model, the Acceleration Centers are a dynamic, rapidly growing component of our business. The teams out of these Centers have achieved remarkable results in process quality and delivery capability, resulting in a loyal customer base and a reputation for excellence. Job Description Senior Data Architect with experience in design, build, and optimization of complex data landscapes and legacy modernization projects. The ideal candidate will have deep expertise in database management, data modeling, cloud data solutions, and ETL (Extract, Transform, Load) processes. This role requires a strong leader capable of guiding data teams and driving the design and implementation of scalable data architectures. Key areas of expertise include: Design and implement scalable and efficient data architectures to support business needs. Develop data models (conceptual, logical, and physical) that align with organizational goals. Lead the database design and optimization efforts for structured and unstructured data. Establish ETL pipelines and data integration strategies for seamless data flow. Define data governance policies, including data quality, security, privacy, and compliance.
Work closely with engineering, analytics, and business teams to understand requirements and deliver data solutions. Oversee cloud-based data solutions (AWS, Azure, GCP) and modern data warehouses (Snowflake, BigQuery, Redshift). Ensure high availability, disaster recovery, and backup strategies for critical databases. Evaluate and implement emerging data technologies, tools, and frameworks to improve efficiency. Conduct data audits, performance tuning, and troubleshooting to maintain optimal performance. Qualifications: Bachelor’s or Master’s degree in Computer Science, Information Systems, or a related field. 13+ years of experience in data modeling, including conceptual, logical, and physical data design. 5 – 8 years of experience in cloud data lake platforms such as AWS Lake Formation, Delta Lake, Snowflake, or Google BigQuery. Proven experience with NoSQL databases and data modeling techniques for non-relational data. Experience with data warehousing concepts, ETL/ELT processes, and big data frameworks (e.g., Hadoop, Spark). Hands-on experience delivering complex, multi-module projects in diverse technology ecosystems. Strong understanding of data governance, data security, and compliance best practices. Proficiency with data modeling tools (e.g., ER/Studio, ERwin, PowerDesigner). Excellent leadership and communication skills, with a proven ability to manage teams and collaborate with stakeholders. Preferred Skills Experience with modern data architectures, such as data fabric or data mesh. Knowledge of graph databases and modeling for technologies like Neo4j. Proficiency with programming languages like Python, Scala, or Java. Understanding of CI/CD pipelines and DevOps practices in data engineering.
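The ETL pipelines this role centers on follow one recurring shape: extract raw records, apply quality rules, load into a warehouse table. A minimal stdlib sketch, with SQLite standing in for the warehouse and an invented CSV payload:

```python
import csv
import io
import sqlite3

# Toy ETL: extract rows from CSV text, transform (type-cast, filter bad rows),
# and load into SQLite standing in for a warehouse table.
raw_csv = "order_id,amount,region\n1,250.0,EU\n2,,US\n3,90.5,EU\n"

def extract(text):
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    out = []
    for row in rows:
        if not row["amount"]:          # drop rows failing a simple quality rule
            continue
        out.append((int(row["order_id"]), float(row["amount"]), row["region"]))
    return out

def load(rows, conn):
    conn.execute("CREATE TABLE IF NOT EXISTS orders (id INTEGER, amount REAL, region TEXT)")
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)
    return conn.execute("SELECT COUNT(*), SUM(amount) FROM orders").fetchone()

conn = sqlite3.connect(":memory:")
print(load(transform(extract(raw_csv)), conn))  # (2, 340.5)
```

Production pipelines add orchestration, incremental loads, and lineage on top, but the extract/transform/load contract stays the same.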

Posted 2 weeks ago

Apply

2.0 - 9.0 years

0 Lacs

chennai, tamil nadu

On-site

Tiger Analytics is a global AI and analytics consulting firm that is at the forefront of solving complex problems using data and technology. With a team of over 2800 experts spread across the globe, we are dedicated to making a positive impact on the lives of millions worldwide. Our culture is built on expertise, respect, and collaboration, with a focus on teamwork. While our headquarters are in Silicon Valley, we have delivery centers and offices in various cities in India, the US, UK, Canada, and Singapore, as well as a significant remote workforce. As an Azure Big Data Engineer at Tiger Analytics, you will be part of a dynamic team that is driving an AI revolution. Your typical day will involve working on a variety of analytics solutions and platforms, including data lakes, modern data platforms, and data fabric solutions using Open Source, Big Data, and Cloud technologies on Microsoft Azure. Your responsibilities may include designing and building scalable data ingestion pipelines, executing high-performance data processing, orchestrating pipelines, designing exception handling mechanisms, and collaborating with cross-functional teams to bring analytical solutions to life. To excel in this role, we expect you to have 4 to 9 years of total IT experience with at least 2 years in big data engineering and Microsoft Azure. You should be well-versed in technologies such as Azure Data Factory, PySpark, Databricks, Azure SQL Database, Azure Synapse Analytics, Event Hub & Streaming Analytics, Cosmos DB, and Purview. Your passion for writing high-quality, scalable code and your ability to collaborate effectively with stakeholders are essential for success in this role. Experience with big data technologies like Hadoop, Spark, Airflow, NiFi, Kafka, Hive, and Neo4J, as well as knowledge of different file formats and REST API design, will be advantageous. 
At Tiger Analytics, we value diversity and inclusivity, and we encourage individuals with varying skills and backgrounds to apply. We are committed to providing equal opportunities for all our employees and fostering a culture of trust, respect, and growth. Your compensation package will be competitive and aligned with your expertise and experience. If you are looking to be part of a forward-thinking team that is pushing the boundaries of what is possible in AI and analytics, we invite you to join us at Tiger Analytics and be a part of our exciting journey towards building innovative solutions that inspire and energize.

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

About Cognite Embark on a transformative journey with Cognite, a global SaaS forerunner in leveraging AI and data to unravel complex business challenges through our cutting-edge offerings including Cognite Atlas AI, an industrial agent workbench, and the Cognite Data Fusion (CDF) platform. We were awarded the 2022 Technology Innovation Leader for Global Digital Industrial Platforms & Cognite was recognized as 2024 Microsoft Energy and Resources Partner of the Year. In the realm of industrial digital transformation, we stand at the forefront, reshaping the future of Oil & Gas, Chemicals, Pharma and other Manufacturing and Energy sectors. Join us in this venture where AI and data meet ingenuity, and together, we forge the path to a smarter, more connected industrial future. Learn More About Cognite Here Cognite Product Tour 2024 Cognite Product Tour 2023 Data Contextualization Masterclass 2023 Our values Impact: Cogniters strive to make an impact in all that they do. We are result-oriented, always asking ourselves what impact our work creates. Ownership: Cogniters embrace a culture of ownership. We go beyond our comfort zones to contribute to the greater good, fostering inclusivity and sharing responsibilities for challenges and success. Relentless: Cogniters are relentless in their pursuit of innovation. We are determined and deliberate (never ruthless or reckless), facing challenges head-on and viewing setbacks as opportunities for growth. About Atlas AI Our ambition with Cognite Atlas AI is to build the framework and the industrial agents that will truly revolutionize how manufacturing and energy are operated. This framework and these agents cannot happen without Cognite Data Fusion and its industrial fluency (from data types to simulators and specialized data models on its knowledge graph). Atlas AI has the potential to position Cognite as the operating system for industry.
To ensure that we reach this goal we have to proactively engage with the market and anchor this vision with all influential stakeholders. We need to garner active support from industry leaders, government bodies and the ecosystem of suppliers and service companies to commit to the journey. This means convincing them of the uniqueness of our approach and engaging in concrete projects on the path to this vision. The Atlas AI team and more specifically its deeply technical co-innovation team will execute the delivery of co-innovation projects on Cognite Data Fusion leveraging Atlas AI. The Technical Senior Full Stack Engineer will be part of the co-innovation pod that will closely engage with strategic clients to extend Atlas AI capabilities to deliver innovative solutions and expand our understanding of GenAI value for Industry. We're seeking an experienced Front End Engineer to join our Atlas AI co-innovation team building cutting-edge industrial agents and workflows. In this role, you would:
Architect and build responsive, modern web applications using React and TypeScript that integrate with large language models through Cognite’s Data Fusion. Design, develop, and implement full-stack generative AI solutions with a strong focus on AI agents, multi-agent systems, and the latest generative AI technologies to drive business innovation and enhance customer experiences. Collaborate with solution architects, data engineers, and domain experts to develop scalable AI agents that integrate seamlessly with existing systems. Implement best practices for software development, including Git workflows and CI/CD pipelines to meet robust quality criteria. Collaborate with cross-functional teams to understand business requirements and translate them into technical specifications for generative AI solutions while actively contributing to product strategy and technical decision-making. Engage with customers and stakeholders in a forward-deployed capacity, co-innovating and shaping solutions that address real business needs. We Believe Most Of These Should Match Your Experience 5+ years of Product development experience in software engineering with a focus on GenAI/ML applications Proficiency with working in full-stack contexts, using languages such as JavaScript/TypeScript and Python, or other relevant programming languages. Proficiency with React framework Experience using debugging tools to diagnose and resolve complex issues. Familiarity with knowledge graphs, graph databases (GraphQL, Neo4j, etc.) or related technologies Familiarity with cloud deployment environments (AWS, GCP, Azure) and containerization technologies (e.g., Docker, Kubernetes). Experience in developing and deploying multi-agent systems, preferably using frameworks like LangChain Understanding of multi-agent frameworks, including agent communication, decision-making, and learning strategies. Experience with REST API development and integration.
Experience with Python ML/AI frameworks and SDKs. Strong problem-solving skills and the ability to think critically about complex systems. Excellent communication skills, with the ability to explain technical concepts to both technical and non-technical stakeholders. Ability to work in a fast-paced, collaborative environment and manage multiple priorities. Proficiency in reading and writing data in various formats (CSV, JSON, SQL) and using storage tools like SQLite and SQL databases. Join the global Cognite community! 🌐 Join an organization of 70 different nationalities 🌐 with Diversity, Equality and Inclusion (DEI) in focus 🤝 Office location Rathi Legacy (Rohan Tech Park) Hoodi (Bengaluru) A highly modern and fun working environment with sublime culture across the organization, follow us on Instagram @cognitedata 📷 to know more Flat structure with direct access to decision-makers, with minimal amount of bureaucracy Opportunity to work with and learn from some of the best people on some of the most ambitious projects found anywhere, across industries Join our HUB 🗣️ to be part of the conversation directly with Cogniters and our partners. Hybrid work environment globally Why choose Cognite? 🏆 🚀 Join us in making a real and lasting impact in one of the most exciting and fastest-growing new software companies in the world. We have repeatedly demonstrated that digital transformation, when anchored on strong DataOps, drives business value and sustainability for clients and allows front-line workers, as well as domain experts, to make better decisions every single day. We were recognized as one of CNBC's top global enterprise technology startups powering digital transformation! And just recently, Frost & Sullivan named Cognite a Technology Innovation Leader! 🥇 Most recently Cognite Data Fusion® Achieved Industry First DNV Compliance for Digital Twins 🥇 Apply today!
If you're excited about the opportunity to work at Cognite and make a difference in the tech industry, we encourage you to apply today! We welcome candidates of all backgrounds and identities to join our team. We encourage you to follow us on Cognite LinkedIn ; we post all our openings there.

Posted 2 weeks ago

Apply

1.0 - 5.0 years

0 Lacs

haryana

On-site

We are looking for a highly skilled AI/ML Engineer with expertise in developing machine learning solutions, utilizing graph databases like Neo4j, and constructing scalable production systems. As our ideal candidate, you should have a strong passion for applying artificial intelligence, machine learning, and data science techniques to solve real-world problems. You should also possess experience in dealing with complex rules, logic, and reasoning systems. Your responsibilities will include designing, developing, and deploying machine learning models and algorithms for production environments, ensuring their scalability and robustness. You will be expected to utilize graph databases, particularly Neo4j, to model, query, and analyze data relationships in large-scale connected data systems. Building and optimizing ML pipelines to ensure they are production-ready and capable of handling real-time data volumes will be a crucial aspect of your role. In addition, you will develop rule-based systems and collaborate with data scientists, software engineers, and product teams to integrate ML solutions into existing products and platforms. Implementing algorithms for entity resolution, recommendation engines, fraud detection, or other graph-related tasks using graph-based ML techniques will also be part of your responsibilities. You will work with large datasets, perform exploratory data analysis, feature engineering, and model evaluation. Post-deployment, you will monitor, test, and iterate on ML models to ensure continuous improvement in model performance and adaptability. Furthermore, you will participate in architectural decisions to ensure the efficient use of graph databases and ML models, while staying up-to-date with the latest advancements in AI/ML research, particularly in graph-based machine learning, reasoning systems, and logical AI. Requirements: - Bachelor's or Master's degree in Computer Science, Machine Learning, Artificial Intelligence, or a related field. 
- 1+ years of experience in AI/ML engineering or a related field, with hands-on experience in building and deploying ML models in production environments and self-made projects using graph databases. - Proficiency in Neo4j or other graph databases, with a deep understanding of Cypher query language and graph theory concepts. - Strong programming skills in Python, Java, or Scala, along with experience using ML frameworks (e.g., TensorFlow, PyTorch, Scikit-learn). - Experience with machine learning pipelines and tools like Airflow, Kubeflow, or MLflow for model tracking and deployment. - Hands-on experience with rule engines or logic programming systems (e.g., Drools, Prolog). - Experience with cloud platforms such as AWS, GCP, or Azure for ML deployments. - Familiarity with containerization and orchestration technologies like Docker and Kubernetes. - Experience working with large datasets, SQL/NoSQL databases, and handling data preprocessing at scale.
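The entity-resolution and rule-based-system work this listing describes combines naturally: match rules decide whether two records refer to the same entity, then matched records are clustered. A minimal sketch with invented records and thresholds (a production system would use a rule engine and blocking strategies rather than pairwise comparison):

```python
from difflib import SequenceMatcher

# Invented customer records; real ones would come from SQL/NoSQL stores.
records = [
    {"id": 1, "name": "Jon Smith", "email": "jon@example.com"},
    {"id": 2, "name": "John Smith", "email": "jon@example.com"},
    {"id": 3, "name": "Mary Jones", "email": "mary@example.com"},
]

def same_entity(a, b):
    """Rule-based match: identical email, or highly similar names."""
    if a["email"] == b["email"]:
        return True
    ratio = SequenceMatcher(None, a["name"].lower(), b["name"].lower()).ratio()
    return ratio > 0.9  # threshold chosen arbitrarily for this sketch

def resolve(records):
    """Greedy clustering of records judged to be the same entity."""
    clusters = []
    for rec in records:
        for cluster in clusters:
            if any(same_entity(rec, other) for other in cluster):
                cluster.append(rec)
                break
        else:
            clusters.append([rec])
    return clusters

clusters = resolve(records)
print([[r["id"] for r in c] for c in clusters])  # [[1, 2], [3]]
```

In a graph database the same idea appears as `SAME_AS` edges between record nodes, with connected components giving the resolved entities.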

Posted 3 weeks ago

Apply

6.0 - 10.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Summary Position Summary AI & Data In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. The offering portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets. AI & Data will work with our clients to: Implement large-scale data ecosystems including data management, governance and the integration of structured and unstructured data to generate insights leveraging cloud-based platforms. Leverage automation, cognitive and science-based techniques to manage data, predict scenarios and prescribe actions. Drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise and providing As-a-Service offerings for continuous insights and improvements. Job Title: Senior Generative AI Developer/Team Lead Job Summary: We are looking for a Generative AI Team Lead with hands-on experience to design, develop, and deploy AI and Generative AI models that generate high-quality content, such as text, images, chatbots, etc. The ideal candidate will have expertise in deep learning, natural language processing, and computer vision. Key Responsibilities: Lead and deliver large-scale AI/Gen AI projects across multiple industries and domains. Liaise with on-site and client teams to understand various business problem statements and project requirements. Lead a team of Data Engineers, ML/AI Engineers, Prompt engineers, and other Data & AI professionals to deliver projects from inception to implementation.
Brainstorm, build & improve AI/Gen AI models developed by the team & identify scope for model improvements & best practices. Assist and participate in pre-sales, client pursuits and proposals. Drive a human-led culture of Inclusion & Diversity by caring deeply for all team members. Qualifications: 6-10 years of relevant experience in Generative AI, Deep Learning, or NLP Bachelor’s or master’s degree in a quantitative field. Led a 3–5-member team on multiple end to end AI/GenAI projects Excellent communication and client/stakeholder management skills Must have strong hands-on experience with programming languages like Python, Cuda and SQL, and frameworks such as TensorFlow, PyTorch and Keras Hands-on experience with top LLM models like OpenAI GPT-3.5/4, Google Gemini, AWS Bedrock, LLaMA 3.0, and Mistral, along with RAG and Agentic workflows Well versed with GANs and Transformer architecture, knows about Diffusion models, up to date with new research/progress in the field of Gen AI Should follow research papers, comprehend and innovate/present the best approaches/solutions related to Generative AI components. Knowledge of hyperscaler offerings (NVIDIA, AWS, Azure, GCP, Oracle) and Gen AI tools (Copilot, Vertex AI). Knowledge of Vector DB, Neo4J/relevant Graph DBs Familiar with Docker containerization, GIT, etc. AI/Cloud certification from a premier institute is preferred. Our purpose Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities. Our people and culture Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. 
It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work. Professional development At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India . Benefits To Help You Thrive At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that helps that empowers our professionals to thrive mentally, physically, and financially—and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment and/ or other criteria. Learn more about what working at Deloitte can mean for you. Recruiting tips From developing a stand out resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters. Requisition code: 303629
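The Transformer architecture named in the qualifications is built around scaled dot-product attention. The following is a minimal single-query sketch in plain Python (toy vectors only; no batching, masking, or multi-head logic, and all values are illustrative):

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attention(query, keys, values):
    """Scaled dot-product attention for a single query vector.

    Scores each key against the query, normalizes the scores with
    softmax, and returns the weighted sum of the value vectors.
    """
    d = len(query)
    scores = [sum(q * k for q, k in zip(query, key)) / math.sqrt(d)
              for key in keys]
    weights = softmax(scores)
    dim = len(values[0])
    return [sum(w * v[i] for w, v in zip(weights, values))
            for i in range(dim)]

# Toy 2-d example: the query matches the first key more strongly,
# so the output is pulled toward the first value vector.
out = attention([1.0, 0.0], [[1.0, 0.0], [0.0, 1.0]],
                [[10.0, 0.0], [0.0, 10.0]])
print(out)
```

Production models implement the same computation as batched matrix multiplications in a framework such as TensorFlow or PyTorch, typically across many heads in parallel.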

Posted 3 weeks ago

Apply

6.0 - 10.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Posted 3 weeks ago

Apply

6.0 - 10.0 years

0 Lacs

Kolkata, West Bengal, India

On-site


Posted 3 weeks ago

Apply

6.0 - 10.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Posted 3 weeks ago

Apply

6.0 - 10.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site


Posted 3 weeks ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

STAND 8 provides end-to-end IT solutions to enterprise partners across the United States, with offices in Los Angeles, New York, New Jersey, Atlanta, and more, as well as internationally in Mexico and India.

We are seeking a Senior AI Engineer / Data Engineer to join our engineering team and help build the future of AI-powered business solutions. In this role, you'll develop intelligent systems that leverage advanced large language models (LLMs), real-time AI interactions, and cutting-edge retrieval architectures. Your work will directly contribute to products that are reshaping how businesses operate, particularly in recruitment, data extraction, and intelligent decision-making. This is an exciting opportunity for someone who thrives on building production-grade AI systems and working across the full stack of modern AI technologies.

Responsibilities
- Design, build, and optimize AI-powered systems using multi-modal architectures (text, voice, visual).
- Integrate and deploy LLM APIs from providers such as OpenAI, Anthropic, and AWS Bedrock.
- Build and maintain RAG (Retrieval-Augmented Generation) systems with hybrid search, re-ranking, and knowledge graphs.
- Develop real-time AI features using streaming analytics and voice interaction tools (e.g., ElevenLabs).
- Build APIs and pipelines using FastAPI or similar frameworks to support AI workflows.
- Process and analyze unstructured documents with layout and semantic understanding.
- Implement predictive models that power intelligent business recommendations.
- Deploy and maintain scalable solutions using AWS services (EC2, S3, RDS, Lambda, Bedrock, etc.).
- Use Docker for containerization and manage CI/CD workflows and version control via Git.
- Debug, monitor, and optimize performance of large-scale data pipelines.
- Collaborate cross-functionally with product, data, and engineering teams.

Qualifications
- 5+ years of experience in AI/ML or data engineering with Python in production environments.
- Hands-on experience with LLM APIs and frameworks such as OpenAI, Anthropic, Bedrock, or LangChain.
- Production experience with vector databases such as pgvector, Weaviate, FAISS, or Pinecone.
- Strong understanding of NLP, document extraction, and text processing.
- Proficiency in AWS cloud services, including Bedrock, EC2, S3, Lambda, and monitoring tools.
- Experience with FastAPI or similar frameworks for building AI/ML APIs.
- Familiarity with embedding models, prompt engineering, and RAG systems.
- Knowledge of asynchronous programming for high-throughput pipelines.
- Experience with Docker, Git workflows, CI/CD pipelines, and testing best practices.

Preferred
- Background in HRTech or ATS integrations (e.g., Greenhouse, Workday, Bullhorn).
- Experience working with knowledge graphs (e.g., Neo4j) for semantic relationships.
- Experience with real-time AI systems (e.g., WebRTC, OpenAI Realtime API) and voice AI tools (e.g., ElevenLabs).
- Advanced Python development skills using design patterns and clean architecture.
- Large-scale data processing experience (1-2M+ records) with cost optimization techniques for LLMs.
- Event-driven architecture experience using AWS SQS, SNS, or EventBridge.
- Hands-on experience with fine-tuning, evaluating, and deploying foundation models.
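The retrieval core of the RAG systems described above can be sketched in a few lines of plain Python. Here a toy bag-of-words `embed` function stands in for a real embedding model (production systems would call an embedding API and a vector database instead); the vocabulary, documents, and helper names are all hypothetical:

```python
import math

def embed(text):
    """Hypothetical stand-in for an embedding model: a tiny
    bag-of-words vector over a fixed vocabulary."""
    vocab = ["python", "graph", "vector", "search", "invoice"]
    words = text.lower().split()
    return [float(words.count(term)) for term in vocab]

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, documents, top_k=2):
    """Core retrieval step of a RAG system: rank stored documents
    by embedding similarity to the query and return the best matches."""
    q = embed(query)
    ranked = sorted(documents, key=lambda d: cosine(q, embed(d)),
                    reverse=True)
    return ranked[:top_k]

docs = [
    "python search pipeline",
    "graph relationships in neo4j graph",
    "invoice totals report",
]
print(retrieve("vector search in python", docs, top_k=1))
```

A full RAG pipeline would then pass the retrieved documents to an LLM as context; hybrid search and re-ranking refine this basic similarity ordering.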

Posted 3 weeks ago

Apply
cta

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies