
356 Neo4J Jobs - Page 11

JobPe aggregates job listings for easy access; applications are submitted directly on the original job portal.

8.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Location: Chennai, Kolkata, Gurgaon, Bangalore and Pune. Experience: 8-12 Years. Work Mode: Hybrid. Mandatory Skills: Python, PySpark, SQL, ETL, Data Pipelines, Azure Databricks, Azure Data Factory, Azure Synapse, Airflow, and Architecture Design.

Overview
We are seeking a skilled and motivated Data Engineer with experience in Python, SQL, Azure, and cloud-based technologies to join our dynamic team. The ideal candidate will have a solid background in building and optimizing data pipelines, working with cloud platforms, and leveraging modern data engineering tools such as Airflow, PySpark, and Azure data engineering services. If you are passionate about data and looking for an opportunity to work on cutting-edge technologies, this role is for you!

Primary Roles and Responsibilities
Develop Modern Data Warehouse solutions using Databricks and the AWS/Azure stack. Provide forward-thinking solutions in the data engineering and analytics space. Collaborate with DW/BI leads to understand new ETL pipeline development requirements. Triage issues to find gaps in existing pipelines and fix them. Work with the business to understand reporting-layer needs and develop data models to fulfil them. Help junior team members resolve issues and technical challenges. Drive technical discussions with client architects and team members. Orchestrate the data pipelines in the scheduler via Airflow.

Skills and Qualifications
Bachelor's and/or master's degree in computer science or equivalent experience. Must have 6+ years of total IT experience and 3+ years of experience in data warehouse/ETL projects. Deep understanding of Star and Snowflake dimensional modelling. Strong knowledge of data management principles. Good understanding of the Databricks Data & AI platform and Databricks Delta Lake architecture. Hands-on experience in SQL, Python and Spark (PySpark). Experience with the AWS/Azure stack. Desirable to have ETL with batch and streaming (Kinesis). Experience in building ETL / data warehouse transformation processes. Experience with Apache Kafka for streaming / event-based data. Experience with other open-source big data products, including Hadoop (Hive, Pig, Impala). Experience with open-source non-relational / NoSQL data repositories (including MongoDB, Cassandra, Neo4j). Experience working with structured and unstructured data, including imaging and geospatial data. Experience working in a DevOps environment with tools such as Terraform, CircleCI and Git. Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning and troubleshooting. Databricks Certified Data Engineer Associate/Professional certification (desirable). Comfortable working in a dynamic, fast-paced, innovative environment with several ongoing concurrent projects. Experience working in Agile methodology. Strong verbal and written communication skills. Strong analytical and problem-solving skills with a high attention to detail.

Skills: Azure Databricks, SQL, data warehouse, Azure Data Factory, PySpark, Azure Synapse, Airflow, Python, data pipelines, data engineering, architecture design, ETL, Azure
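
Purely as an illustration (not part of the listing above), here is a minimal sketch of the kind of Airflow orchestration of an Azure Databricks pipeline that the posting describes. The DAG name, connection ID, cluster spec, and notebook path are all hypothetical.

```python
# Illustrative only: an Airflow DAG (2.x style) that triggers an Azure
# Databricks notebook job. Connection ID, cluster spec, and notebook path
# are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.providers.databricks.operators.databricks import DatabricksSubmitRunOperator

with DAG(
    dag_id="daily_sales_ingestion",        # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="0 2 * * *",                  # nightly at 02:00 (Airflow 2.4+ argument)
    catchup=False,
) as dag:
    run_ingestion = DatabricksSubmitRunOperator(
        task_id="run_ingestion_notebook",
        databricks_conn_id="databricks_default",          # assumed Airflow connection
        new_cluster={
            "spark_version": "13.3.x-scala2.12",
            "node_type_id": "Standard_DS3_v2",
            "num_workers": 2,
        },
        notebook_task={"notebook_path": "/pipelines/ingest_sales"},  # hypothetical path
    )
```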

Posted 3 weeks ago

Apply

0 years

0 Lacs

India

On-site


About Us: Our mission at micro1 is to match the most talented people in the world with their dream jobs. If you are looking to be at the forefront of AI innovation and work with some of the fastest-growing companies in Silicon Valley, we invite you to apply for a role. By joining the micro1 community, your resume will become visible to top industry leaders, unlocking access to the best career opportunities on the market.

Role Overview: Join one of our top customers as a skilled AI Engineer. You will design, develop, and deploy machine learning models and systems that drive our products and enhance user experiences. You will work closely with cross-functional teams to implement cutting-edge AI solutions, including recommendation engines and large language models.

Key Responsibilities: Design and implement robust machine learning models and algorithms, focusing on recommendation systems. Conduct data analysis to identify trends, insights, and opportunities for model improvement. Collaborate with data scientists and software engineers to build and integrate end-to-end machine learning systems. Optimize and fine-tune models for performance and scalability, ensuring seamless deployment. Work with large datasets using SQL and Postgres to support model training and evaluation. Implement and refine prompt engineering techniques for large language models (LLMs). Stay current with advancements in AI/ML technologies, particularly core ML algorithms such as clustering and community detection. Monitor model performance, conduct regular evaluations, and retrain models as needed. Document processes, model performance metrics, and technical specifications.

Required Skills and Qualifications: Bachelor's or Master's degree in Computer Science, Data Science, or a related field. Strong expertise in Python and experience with machine learning libraries (e.g., TensorFlow, PyTorch, scikit-learn). Proven experience with SQL and Postgres for data manipulation and analysis. Demonstrated experience building and deploying recommendation engines. Solid understanding of core machine learning algorithms, including clustering and community detection. Prior experience building end-to-end machine learning systems. Familiarity with prompt engineering and working with large language models (LLMs). Experience with near-real-time recommendation systems. Hands-on experience with graph databases such as Neo4j or Neptune. Experience with Flask or FastAPI frameworks. Experience writing, modifying, and optimizing SQL queries and DB connections. Experience with AWS services such as ECS, EC2, S3, and CloudWatch.

Preferred Qualifications: Experience with graph databases (specifically Neo4j and the Cypher query language). Knowledge of large-scale data handling and optimization techniques. Experience improving models with RLHF.
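
For context on the graph-database skills the listing above asks for, a small, hedged sketch of a recommendation-style query against Neo4j using the official Python driver. The URI, credentials, and graph model (User/Item nodes with RATED relationships) are assumptions made for the example.

```python
# Illustrative sketch (not from the posting): a simple collaborative-filtering
# style recommendation query against Neo4j. Connection details and the graph
# model are assumptions.
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

RECOMMEND = """
MATCH (u:User {id: $user_id})-[:RATED]->(i:Item)<-[:RATED]-(peer:User)-[:RATED]->(rec:Item)
WHERE NOT (u)-[:RATED]->(rec)
RETURN rec.id AS item, count(*) AS score
ORDER BY score DESC
LIMIT $k
"""

def recommend(user_id: str, k: int = 10):
    # Items rated by users who rated the same items as `user_id`.
    with driver.session() as session:
        return [dict(record) for record in session.run(RECOMMEND, user_id=user_id, k=k)]

print(recommend("u123"))
```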

Posted 3 weeks ago

Apply

6.0 - 12.0 years

0 Lacs

Greater Kolkata Area

On-site


Job Description
We are looking for a highly skilled GCP Technical Lead with 6 to 12 years of experience to join our dynamic team. In this role, you will be responsible for designing and implementing scalable, secure, and highly available cloud infrastructure solutions on Google Cloud Platform (GCP). You will lead the architecture and development of cloud-native applications and ensure that infrastructure and applications are optimized for performance, security, and scalability. Your expertise will play a key role in the design and execution of workload migrations, CI/CD pipelines, and infrastructure automation.

Responsibilities:
Cloud Architecture and Design: Lead the design and implementation of scalable, secure, and highly available cloud infrastructure solutions on GCP using services such as Compute Engine, Kubernetes Engine, Cloud Storage, Cloud SQL, and Cloud Load Balancing.
Cloud-Native Application Design: Develop architecture designs and guidelines for the development, deployment, and lifecycle management of cloud-native applications, ensuring optimization for security, performance, and scalability with services such as App Engine, Cloud Functions, and Cloud Run.
API Management: Implement secure API interfaces and granular access control using IAM, RBAC, and API Gateway for workloads running on GCP.
Workload Migration: Lead the migration of on-premises workloads to GCP, ensuring minimal downtime, data integrity, and smooth transitions.
CI/CD: Design and implement CI/CD pipelines using Cloud Build, Cloud Source Repositories, and Artifact Registry to automate development and deployment processes.
Infrastructure as Code (IaC): Automate cloud infrastructure provisioning and management using Terraform.
Collaboration: Collaborate closely with cross-functional teams to define requirements, design solutions, and ensure successful project delivery, utilizing tools like Google Workspace and Jira.
Monitoring and Optimization: Continuously monitor cloud environments to ensure optimal performance, availability, and security, and perform regular audits and tuning.
Documentation: Prepare and maintain comprehensive documentation for cloud infrastructure, configurations, and procedures using Google Docs.

Qualifications:
Bachelor's degree in Computer Science, Information Systems, or a related field. 6-12 years of relevant experience in cloud engineering and architecture. Google Cloud Professional Cloud Architect certification. Experience with Kubernetes. Familiarity with DevOps methodologies. Strong problem-solving and analytical skills. Excellent communication skills.

Required Skills
Google Cloud Platform (GCP) services: Compute Engine, Google Kubernetes Engine (GKE), Cloud Storage, Cloud SQL, Cloud Load Balancing, Identity and Access Management (IAM), Google Workflows, Google Cloud Pub/Sub, App Engine, Cloud Functions, Cloud Run, API Gateway, Cloud Build, Cloud Source Repositories, Artifact Registry, Google Cloud Monitoring, Logging and Error Reporting, Python, Terraform, Google Cloud Firestore, GraphQL, MongoDB, Cassandra, Neo4j, ETL (Extract, Transform, Load) paradigms, Google Cloud Dataflow, Apache Beam, BigQuery, Service Mesh, Content Delivery Network (CDN), Stackdriver, Google Cloud Trace (ref: hirist.tech)
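
As a small illustration of one of the GCP services named above, a hedged Python sketch that publishes an event to Google Cloud Pub/Sub. The project, topic, and message attribute are placeholders, not details from the posting.

```python
# Illustration only: publishing an event to Google Cloud Pub/Sub.
# Project and topic names are hypothetical.
import json

from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-gcp-project", "workload-events")  # assumed names

def publish_event(payload: dict) -> str:
    """Serialize a payload and publish it; returns the server-assigned message ID."""
    data = json.dumps(payload).encode("utf-8")
    future = publisher.publish(topic_path, data, source="migration-job")  # attribute is illustrative
    return future.result()

print(publish_event({"status": "migrated", "vm": "app-server-01"}))
```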

Posted 3 weeks ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Job Summary
We are seeking a highly skilled Data Engineer with expertise in leveraging Data Lake architecture and the Azure cloud platform to develop, deploy, and optimise data-driven solutions. You will play a pivotal role in transforming raw data into actionable insights, supporting strategic decision-making across the organisation.

Responsibilities
Design and implement scalable data science solutions using Azure Data Lake, Azure Databricks, Azure Data Factory and related Azure services. Develop, train, and deploy machine learning models to address business challenges. Collaborate with data engineering teams to optimise data pipelines and ensure seamless data integration within Azure cloud infrastructure. Conduct exploratory data analysis (EDA) to identify trends, patterns, and insights. Build predictive and prescriptive models to support decision-making processes. Develop the end-to-end machine learning lifecycle following CRISP-DM, including data collection, cleansing, visualization, preprocessing, model development, model validation and model retraining. Build and implement RAG systems that enhance the accuracy and relevance of model outputs by integrating retrieval mechanisms with generative models. Ensure data security, compliance, and governance within the Azure cloud ecosystem. Monitor and optimise model performance and scalability in production environments. Prepare clear and concise documentation for developed models and workflows.

Skills Required
Good experience with PySpark, Python, MLOps (optional), MLflow (optional), Azure Data Lake Storage and Unity Catalog. Worked with data from various RDBMS such as MySQL, SQL Server and Postgres, NoSQL databases such as MongoDB, Cassandra and Redis, and graph databases such as Neo4j and Grakn. Proven experience as a Data Engineer with a strong focus on the Azure cloud platform and Data Lake architecture. Proficiency in Python and PySpark. Hands-on experience with Azure services such as Azure Data Lake, Azure Synapse Analytics, Azure Machine Learning, Azure Databricks, and Azure Functions. Strong knowledge of SQL and experience querying large datasets from Data Lakes. Familiarity with data engineering tools and frameworks for data ingestion and transformation in Azure. Experience with version control systems (e.g., Git) and CI/CD pipelines for machine learning projects. Excellent problem-solving skills and the ability to work collaboratively in a team environment.
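
To make the pipeline work described above concrete, a minimal PySpark sketch that reads raw files from Azure Data Lake Storage, applies a simple quality filter, and writes a Delta table. The storage paths and column names are assumptions for illustration only.

```python
# Sketch only: read raw CSVs from ADLS, apply a basic quality filter,
# and write a Delta table. Paths and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("adls_ingest_example").getOrCreate()

raw = spark.read.option("header", True).csv(
    "abfss://raw@examplelake.dfs.core.windows.net/sales/*.csv"  # assumed container/path
)

clean = (
    raw.dropDuplicates(["order_id"])                 # hypothetical key column
       .filter(F.col("amount").cast("double").isNotNull())
       .withColumn("ingested_at", F.current_timestamp())
)

clean.write.format("delta").mode("overwrite").save(
    "abfss://curated@examplelake.dfs.core.windows.net/sales_delta"
)
```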

Posted 3 weeks ago

Apply

1.0 - 3.0 years

11 - 15 Lacs

Mumbai

Work from Office


Overview
The Data Technology team at MSCI is responsible for meeting the data requirements across various business areas, including Index, Analytics, and Sustainability. Our team collates data from multiple sources such as vendors (e.g., Bloomberg, Reuters), website acquisitions, and web scraping (e.g., financial news sites, company websites, exchange websites, filings). This data can be in structured or semi-structured formats. We normalize the data, perform quality checks, assign internal identifiers, and release it to downstream applications.

Responsibilities
As data engineers, we build scalable systems to process data in various formats and volumes, ranging from megabytes to terabytes. Our systems perform quality checks, match data across various sources, and release it in multiple formats. We leverage the latest technologies, sources, and tools to process the data. Some of the exciting technologies we work with include Snowflake, Databricks, and Apache Spark.

Qualifications
Core Java, Spring Boot, Apache Spark, Spring Batch, Python. Exposure to SQL databases such as Oracle, MySQL and Microsoft SQL Server is a must. Any experience, knowledge or certification in cloud technology, preferably Microsoft Azure or Google Cloud Platform, is good to have. Exposure to NoSQL databases such as Neo4j or document databases is also good to have.

What we offer you
Transparent compensation schemes and comprehensive employee benefits, tailored to your location, ensuring your financial security, health, and overall wellbeing. Flexible working arrangements, advanced technology, and collaborative workspaces. A culture of high performance and innovation where we experiment with new ideas and take responsibility for achieving results. A global network of talented colleagues, who inspire, support, and share their expertise to innovate and deliver for our clients. A Global Orientation program to kickstart your journey, followed by access to our Learning@MSCI platform, LinkedIn Learning Pro and tailored learning opportunities for ongoing skills development. Multi-directional career paths that offer professional growth and development through new challenges, internal mobility and expanded roles. We actively nurture an environment that builds a sense of inclusion, belonging and connection, including eight Employee Resource Groups: All Abilities, Asian Support Network, Black Leadership Network, Climate Action Network, Hola! MSCI, Pride & Allies, Women in Tech, and Women's Leadership Forum.

At MSCI we are passionate about what we do, and we are inspired by our purpose – to power better investment decisions. You'll be part of an industry-leading network of creative, curious, and entrepreneurial pioneers. This is a space where you can challenge yourself, set new standards and perform beyond expectations for yourself, our clients, and our industry. MSCI is a leading provider of critical decision support tools and services for the global investment community. With over 50 years of expertise in research, data, and technology, we power better investment decisions by enabling clients to understand and analyze key drivers of risk and return and confidently build more effective portfolios. We create industry-leading research-enhanced solutions that clients use to gain insight into and improve transparency across the investment process. MSCI Inc. is an equal opportunity employer.
It is the policy of the firm to ensure equal employment opportunity without discrimination or harassment on the basis of race, color, religion, creed, age, sex, gender, gender identity, sexual orientation, national origin, citizenship, disability, marital and civil partnership/union status, pregnancy (including unlawful discrimination on the basis of a legally protected parental leave), veteran status, or any other characteristic protected by law. MSCI is also committed to working with and providing reasonable accommodations to individuals with disabilities. If you are an individual with a disability and would like to request a reasonable accommodation for any part of the application process, please email Disability.Assistance@msci.com and indicate the specifics of the assistance needed. Please note, this e-mail is intended only for individuals who are requesting a reasonable workplace accommodation; it is not intended for other inquiries. To all recruitment agencies MSCI does not accept unsolicited CVs/Resumes. Please do not forward CVs/Resumes to any MSCI employee, location, or website. MSCI is not responsible for any fees related to unsolicited CVs/Resumes. Note on recruitment scams We are aware of recruitment scams where fraudsters impersonating MSCI personnel may try and elicit personal information from job seekers. Read our full note on careers.msci.com

Posted 3 weeks ago

Apply

2.0 - 3.0 years

3 - 7 Lacs

Bengaluru

Work from Office


Solid experience in Node.js, TypeScript, React, Neo4j and Firestore (GCP). In-depth knowledge of software design and development practices. Design and develop scalable systems using advanced concepts in Node.js, TypeScript, JavaScript, and React. Should have a good understanding of deploying and working with GKE. Ability to design for scale and performance and to conduct peer code reviews. Architecture/platform development, API design, and data modelling at scale. Excellent working experience in Express, Knex, and serverless Google Cloud Functions. Solid experience in JavaScript frameworks (Angular / React), Redux, JavaScript, jQuery, CSS, HTML5, ES5, ES6 & ES7, in-memory databases (Redis / Hazelcast), and build tools (webpack). Good error and exception handling skills. Ability to work with Git repositories and remote code hosting services such as GitHub and GitLab. Ability to deliver amazing results with minimal guidance and supervision. Passionate (especially about web development!), highly motivated, and fun to work with.

Key skills: React, ES5/ES6, HTML & CSS, JavaScript & TypeScript, API, Node

Posted 3 weeks ago

Apply

10.0 - 20.0 years

25 - 35 Lacs

Pune

Remote


Job Description: Technical Delivery Manager, Saama Technologies

Responsibilities: Oversee the end-to-end development and delivery of the Graph RAG system. Manage project timelines, ensuring timely delivery and adherence to milestones. Establish and maintain strong communication with client technical leads, providing regular updates and addressing technical concerns. Offer technical leadership and expertise in graph databases (e.g., Neo4j) and LLM-based applications. Collaborate with the team on architectural decisions, ensuring solutions are scalable, robust, and aligned with client requirements. Mitigate technical risks and address challenges proactively.

Qualifications: Proven experience in technical project management and delivery, ideally within the AI/ML or data science domain. Strong understanding of graph databases and LLM-based systems. Experience with cloud-based development and deployment (AWS, GCP, or Azure). Excellent communication and interpersonal skills, with the ability to bridge the gap between technical and non-technical stakeholders. Ability to work independently and lead a team in a fast-paced environment. Experience with Agile methodologies.

Required Skills: Knowledge of graph databases (Neo4j). Experience with LLM-based systems. Proficiency in LangChain. API development and cloud deployment expertise. Experience managing engineering teams and Agile methodologies.

Desired Skills: Familiarity with LangChain and API development. Knowledge of MLOps and CI/CD practices.
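
For readers unfamiliar with the Graph RAG pattern mentioned above, a hedged sketch of one common approach: an LLM generates Cypher against Neo4j and answers from the query result. Import paths differ between LangChain releases, and the credentials and model name are placeholders.

```python
# Hedged sketch of a Graph-RAG style flow: natural-language question -> Cypher
# -> Neo4j -> answer. Import locations vary across LangChain versions, and
# newer releases may also require allow_dangerous_requests=True on the chain.
from langchain_community.graphs import Neo4jGraph
from langchain.chains import GraphCypherQAChain
from langchain_openai import ChatOpenAI

graph = Neo4jGraph(
    url="bolt://localhost:7687", username="neo4j", password="password"  # placeholders
)

chain = GraphCypherQAChain.from_llm(
    llm=ChatOpenAI(model="gpt-4o-mini", temperature=0),  # assumed model name
    graph=graph,
    verbose=True,
)

print(chain.invoke({"query": "Which suppliers are linked to delayed shipments?"}))
```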

Posted 3 weeks ago

Apply

7.0 - 9.0 years

25 - 40 Lacs

Pune

Work from Office


Our world is transforming, and PTC is leading the way. Our software brings the physical and digital worlds together, enabling companies to improve operations, create better products, and empower people in all aspects of their business. Our people make all the difference in our success. Today, we are a global team of nearly 7,000 and our main objective is to create opportunities for our team members to explore, learn, and grow – all while seeing their ideas come to life and celebrating the differences that make us who we are and the work we do possible. PTC is a dynamic and innovative company dedicated to creating innovative products that transform industries and improve lives. We are looking for a talented Product Architect who can lead the conceptualization and development of groundbreaking products and leverage the power of cutting-edge AI technologies to drive enhanced productivity and innovation.

Job Description
Responsibilities: Design and implement scalable, secure, and high-performing Java applications. Focus on designing, building, and maintaining complex, large-scale systems with intrinsic multi-tenant SaaS characteristics. Define architectural standards, best practices, and technical roadmaps. Lead the integration of modern technologies, frameworks, and cloud solutions. Collaborate with DevOps, product teams, and UI/UX designers to ensure cohesive product development. Conduct code reviews, mentor developers, and enforce best coding practices. Stay up to date with the latest design patterns, technological trends, and industry best practices. Ensure scalability, performance, and security of product designs. Conduct feasibility studies and risk assessments.

Requirements: Proven experience as a Software Solution Architect or similar role. Strong expertise in vector and graph databases (e.g., Pinecone, Chroma DB, Neo4j, ArangoDB, Elasticsearch). Extensive experience with content repositories and content management systems. Familiarity with SaaS and microservices implementation models. Proficiency in programming languages such as Java, Python, or C#. Excellent problem-solving skills and ability to think strategically. Strong technical, analytical, communication, interpersonal, and presentation skills. Bachelor's or Master's degree in Computer Science, Engineering, or a related field. Experience with cloud platforms (e.g., AWS, Azure). Knowledge of containerization technologies (e.g., Docker, Kubernetes). Experience with artificial intelligence (AI) and machine learning (ML) technologies.

Benefits: Competitive salary and benefits package. Opportunities for professional growth and development. Collaborative and inclusive work environment. Flexible working hours and hybrid work options.

Life at PTC is about more than working with today's most cutting-edge technologies to transform the physical world. It's about showing up as you are and working alongside some of today's most talented industry leaders to transform the world around you. If you share our passion for problem-solving through innovation, you'll likely become just as passionate about the PTC experience as we are. Are you ready to explore your next career move with us? We respect the privacy rights of individuals and are committed to handling Personal Information responsibly and in accordance with all applicable privacy and data protection laws. Review our Privacy Policy here.

Posted 3 weeks ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


About the Company: Our client is a prominent Indian multinational corporation specializing in information technology (IT), consulting, and business process services. It is headquartered in Bengaluru, has gross revenue of ₹222.1 billion, a global workforce of 234,054, and a NASDAQ listing, and it operates in over 60 countries and serves clients across various industries, including financial services, healthcare, manufacturing, retail, and telecommunications. The company consolidated its cloud, data, analytics, AI, and related businesses under the tech services business line. It has major delivery centers in India, including Chennai, Pune, Hyderabad, Bengaluru, Kochi, Kolkata and Noida.

Job Title: Java Backend Developer (Contract to Hire)
Experience: 6 to 10 Years
Job Location: Bengaluru / Chennai
Interview Mode: Face-to-Face
Interview Date: 24th May 2025
Venue: [Will be shared shortly]
Notice Period: Immediate joiners.

Standard Job Requirements
6+ years of experience in application development using Java and advanced technology tools. Strong understanding of fundamental architecture and design principles, object-orientation principles, and coding standards. Ability to design and build smart, scalable, and resilient solutions with tight deadlines, both high- and low-level. Strong analytical and problem-solving skills. Strong verbal and written communication skills. Good knowledge of DevOps and CI/CD. Understanding of source control, versioning, branching, etc. Experienced in Agile methodology and Waterfall models. Strong experience in application delivery, including production support. Very good presentation and documentation skills. Ability to learn and adapt to new technologies and frameworks. Awareness of release management. Strong team player who can collaborate effectively with relevant stakeholders. Ability to recommend future technology capabilities and architecture design considering business objectives, technology strategy, trends and regulatory requirements.

Technical Competence (Must Have)
Strong programming and hands-on skills in Java 8 or above (preferably Java 17). Good hands-on experience with Java Collections and Streams. Good hands-on experience with data structures and algorithms. Good experience in developing vulnerability-free Spring Framework applications. Good knowledge of Spring DI/Blueprints, Spring Boot, etc. Good knowledge of design patterns and principles. Good knowledge of OR frameworks such as Hibernate and JPA. Good knowledge of API building (web services, SOAP/REST). Good knowledge of unit testing and code coverage using JUnit/Mockito. Good knowledge of code quality tools such as SonarQube and security scans. Good knowledge of containerized platforms such as Kubernetes, OpenShift and EKS (AWS). Good knowledge of enterprise application integration patterns (synchronous, asynchronous). Good knowledge of multi-threading and multi-processing implementations. Experience in RDBMS (Oracle, PostgreSQL, MySQL) and knowledge of SQL queries. Ability to work in a quick-paced, dynamic environment adapting Agile methodologies. Ability to work with minimal guidance and/or high-level design input. Knowledge of microservices-based development and implementation. Knowledge of CI/CD patterns and related tools such as Azure DevOps, Git and Bitbucket. Knowledge of JSON libraries such as Jackson/GSON. Knowledge of basic Unix commands. Good documentation and presentation skills; able to articulate ideas, designs, and suggestions. Mentoring fellow team members and conducting code reviews.

Good to Have
Hands-on skills in J2EE specifications such as JAX-RS and JAX-WS. Experience working with and supporting OLTP and OLAP systems. Good knowledge of Spring Batch and Spring Security. Good knowledge of the Linux operating system (preferably RHEL). Good knowledge of NoSQL offerings (Cassandra, MongoDB, graph databases, etc.). Knowledge of testing methodologies such as performance testing, smoke testing, stress testing and endurance testing. Knowledge of Python and Groovy. Knowledge of middleware technologies such as Kafka and Solace. Knowledge of DSE DataStax or Neo4j. Knowledge of cloud environments (AWS / Azure, etc.). Knowledge of IMDGs (Hazelcast, Ignite). Knowledge of rule engines such as Drools, OpenL Tablets and Easy Rules. Experience presenting solutions to architecture forums and following the principles and standards in implementation.

Domain (Good to Have)
Experience in application development for Client Due Diligence (CDD), onboarding, FATCA & CRS, AML, KYC, and screening. Good knowledge of cloud-native application development and cloud computing services.

Training, Qualifications and Certifications
Training, qualifications and certifications in some of the functional and/or technical domains mentioned above will be an added advantage.

Posted 4 weeks ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site


About Us: Our mission at micro1 is to match the most talented people in the world with their dream jobs. If you are looking to be at the forefront of AI innovation and work with some of the fastest-growing companies in Silicon Valley, we invite you to apply for a role. By joining the micro1 community, your resume will become visible to top industry leaders, unlocking access to the best career opportunities on the market.

Role Overview: Join one of our top customers as a skilled AI Engineer. You will design, develop, and deploy machine learning models and systems that drive our products and enhance user experiences. You will work closely with cross-functional teams to implement cutting-edge AI solutions, including recommendation engines and large language models.

Key Responsibilities: Design and implement robust machine learning models and algorithms, focusing on recommendation systems. Conduct data analysis to identify trends, insights, and opportunities for model improvement. Collaborate with data scientists and software engineers to build and integrate end-to-end machine learning systems. Optimize and fine-tune models for performance and scalability, ensuring seamless deployment. Work with large datasets using SQL and Postgres to support model training and evaluation. Implement and refine prompt engineering techniques for large language models (LLMs). Stay current with advancements in AI/ML technologies, particularly core ML algorithms such as clustering and community detection. Monitor model performance, conduct regular evaluations, and retrain models as needed. Document processes, model performance metrics, and technical specifications.

Required Skills and Qualifications: Bachelor's or Master's degree in Computer Science, Data Science, or a related field. Strong expertise in Python and experience with machine learning libraries (e.g., TensorFlow, PyTorch, scikit-learn). Proven experience with SQL and Postgres for data manipulation and analysis. Demonstrated experience building and deploying recommendation engines. Solid understanding of core machine learning algorithms, including clustering and community detection. Prior experience building end-to-end machine learning systems. Familiarity with prompt engineering and working with large language models (LLMs). Experience with near-real-time recommendation systems. Hands-on experience with graph databases such as Neo4j or Neptune. Experience with Flask or FastAPI frameworks. Experience writing, modifying, and optimizing SQL queries and DB connections. Experience with AWS services such as ECS, EC2, S3, and CloudWatch.

Preferred Qualifications: Experience with graph databases (specifically Neo4j and the Cypher query language). Knowledge of large-scale data handling and optimization techniques. Experience improving models with RLHF.

Posted 4 weeks ago

Apply

0 years

0 Lacs

Kolkata metropolitan area, West Bengal, India

On-site


We are seeking an experienced AI Solution Architect to lead the design and implementation of AI-driven, cloud-native applications. The ideal candidate will possess deep expertise in Generative AI, Agentic AI, cloud platforms (AWS, Azure, GCP), and modern data engineering practices. This role involves collaborating with cross-functional teams to deliver scalable, secure, and intelligent solutions in a fast-paced, innovation-driven environment.

Key Responsibilities: Design and architect AI/ML solutions, including Generative AI, Retrieval-Augmented Generation (RAG), and fine-tuning of Large Language Models (LLMs) using frameworks like LangChain, LangGraph, and Hugging Face. Implement cloud migration strategies for moving monolithic systems to microservices/serverless architectures using AWS, Azure, and GCP. Lead development of document automation systems leveraging models such as BART, LayoutLM, and Agentic AI workflows. Architect and optimize data lakes, ETL pipelines, and analytics dashboards using Databricks, PySpark, Kibana, and MLOps tools. Build centralized search engines using ElasticSearch, Solr, and Neo4j for intelligent content discovery and sentiment analysis. Ensure application and ML pipeline security with tools like SonarQube, WebInspect, and container security tools. Collaborate with InfoSec and DevOps teams to maintain CI/CD pipelines, perform vulnerability analysis, and ensure compliance. Guide modernization initiatives across application stacks and coordinate BCDR-compliant infrastructure for mission-critical services. Provide technical leadership and mentoring to engineering teams during all phases of the SDLC.

Hands-on experience with: Generative AI, LLMs, prompt engineering, LangChain, AutoGen, Vertex AI, AWS Bedrock; Python, Java (Spring Boot, Spring AI), PyTorch; vector and graph databases: ElasticSearch, Solr, Neo4j; cloud platforms: AWS, Azure, GCP (CAF, serverless, containerization); DevSecOps: SonarQube, OWASP, OAuth2, container security. Strong background in application modernization, cloud-native architecture, and MLOps orchestration. Familiarity with front-end technologies: HTML, JavaScript, React, jQuery.

Certifications: Any AI/ML certification from a reputed institute.

Required Skills & Qualifications: Bachelor's degree in Computer Science, Engineering, or Mathematics. 10+ years of total experience, with extensive tenure as a Solution Architect in AI- and cloud-driven transformations. Advanced knowledge of leading architecture solutions in the industry area. Strong interpersonal and collaboration skills. Ability to explain technical concepts to non-technical audiences.
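
As an aside on the search-engine work mentioned above, a small, hedged sketch of a full-text query with the Elasticsearch Python client. The index name, field, and 8.x call style are assumptions for illustration.

```python
# Hedged example: a basic full-text search with the Elasticsearch Python
# client (8.x call style). Index and field names are made up for illustration.
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")   # placeholder endpoint

def search_documents(text: str, size: int = 10):
    # Match query over a hypothetical `content` field in a `documents` index.
    resp = es.search(
        index="documents",
        query={"match": {"content": text}},
        size=size,
    )
    return [hit["_source"] for hit in resp["hits"]["hits"]]

print(search_documents("supply chain delays"))
```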

Posted 4 weeks ago

Apply

0 years

0 Lacs

India

Remote


Role Description
This is a contract remote role for a Senior Graph Data Engineer. The Senior Graph Data Engineer will be responsible for designing, developing, and maintaining graph database solutions, creating data models, and implementing ETL processes. The role includes data warehousing tasks and data analytics to support decision-making processes. Collaborating with cross-functional teams, the Senior Graph Data Engineer will ensure the integrity and performance of graph databases.

Qualifications
Proficiency in data engineering and data modeling. Experience with Extract, Transform, Load (ETL) processes. Knowledge of data warehousing. Strong data analytics skills. Excellent problem-solving and critical-thinking skills. Strong communication and teamwork abilities. Bachelor's degree in Computer Science, Engineering, or a related field. Experience with graph databases such as Neo4j is a plus. Exposure to AI and machine learning techniques is beneficial.
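
To illustrate the graph ETL work this listing describes, a hedged sketch of batch-loading relational rows into Neo4j with the official Python driver (5.x style). The CSV layout, node labels, and relationship type are hypothetical.

```python
# Illustrative ETL fragment (not from the listing): load rows from a CSV into
# a simple graph model. Labels, properties, and credentials are assumptions.
import csv

from neo4j import GraphDatabase

LOAD = """
UNWIND $rows AS row
MERGE (c:Customer {id: row.customer_id})
MERGE (p:Product  {sku: row.sku})
MERGE (c)-[:PURCHASED {at: row.purchased_at}]->(p)
"""

def load_orders(path: str, uri: str = "bolt://localhost:7687") -> None:
    driver = GraphDatabase.driver(uri, auth=("neo4j", "password"))  # placeholder auth
    with open(path, newline="") as f, driver.session() as session:
        rows = list(csv.DictReader(f))
        # One managed write transaction; a real job would chunk large files.
        session.execute_write(lambda tx: tx.run(LOAD, rows=rows))
    driver.close()

load_orders("orders.csv")
```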

Posted 4 weeks ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site


About Us: Our mission at micro1 is to match the most talented people in the world with their dream jobs. If you are looking to be at the forefront of AI innovation and work with some of the fastest-growing companies in Silicon Valley, we invite you to apply for a role. By joining the micro1 community, your resume will become visible to top industry leaders, unlocking access to the best career opportunities on the market.

Role Overview: Join one of our top customers as a skilled AI Engineer. You will design, develop, and deploy machine learning models and systems that drive our products and enhance user experiences. You will work closely with cross-functional teams to implement cutting-edge AI solutions, including recommendation engines and large language models.

Key Responsibilities: Design and implement robust machine learning models and algorithms, focusing on recommendation systems. Conduct data analysis to identify trends, insights, and opportunities for model improvement. Collaborate with data scientists and software engineers to build and integrate end-to-end machine learning systems. Optimize and fine-tune models for performance and scalability, ensuring seamless deployment. Work with large datasets using SQL and Postgres to support model training and evaluation. Implement and refine prompt engineering techniques for large language models (LLMs). Stay current with advancements in AI/ML technologies, particularly core ML algorithms such as clustering and community detection. Monitor model performance, conduct regular evaluations, and retrain models as needed. Document processes, model performance metrics, and technical specifications.

Required Skills and Qualifications: Bachelor's or Master's degree in Computer Science, Data Science, or a related field. Strong expertise in Python and experience with machine learning libraries (e.g., TensorFlow, PyTorch, scikit-learn). Proven experience with SQL and Postgres for data manipulation and analysis. Demonstrated experience building and deploying recommendation engines. Solid understanding of core machine learning algorithms, including clustering and community detection. Prior experience building end-to-end machine learning systems. Familiarity with prompt engineering and working with large language models (LLMs). Experience with near-real-time recommendation systems. Hands-on experience with graph databases such as Neo4j or Neptune. Experience with Flask or FastAPI frameworks. Experience writing, modifying, and optimizing SQL queries and DB connections. Experience with AWS services such as ECS, EC2, S3, and CloudWatch.

Preferred Qualifications: Experience with graph databases (specifically Neo4j and the Cypher query language). Knowledge of large-scale data handling and optimization techniques. Experience improving models with RLHF.

Posted 4 weeks ago

Apply

0 years

0 Lacs

Sadar, Uttar Pradesh, India

On-site


Role Overview: We are seeking a motivated Junior AI Testing Engineer to join our team. In this role, you will support the testing of AI models and pipelines, with a special focus on data ingestion into knowledge graphs and knowledge graph administration. You will collaborate with data scientists, engineers, and product teams to ensure the quality, reliability, and performance of our AI-driven solutions.

Key Responsibilities:
AI Model & Pipeline Testing: Design and execute test cases for AI models and data pipelines, ensuring accuracy, stability, and fairness.
Knowledge Graph Ingestion: Support the development and testing of Python scripts for data extraction, transformation, and loading (ETL) into enterprise knowledge graphs.
Knowledge Graph Administration: Assist in maintaining, monitoring, and troubleshooting knowledge graph environments (e.g., Neo4j, RDF stores), including user access and data integrity.
Test Automation: Develop and maintain basic automation scripts (preferably in Python) to streamline testing processes for AI functionalities.
Data Quality Assurance: Evaluate and validate the quality of input and output data for AI models, reporting and documenting issues as needed.
Bug Reporting & Documentation: Identify, document, and communicate bugs or issues discovered during testing. Maintain clear testing documentation and reports.
Collaboration: Work closely with knowledge graph engineers, data scientists, and product managers to understand requirements and deliver robust solutions.

Requirements:
Education: Bachelor's degree in Computer Science, Information Technology, or a related field.
Experience: Ideally, experience in software/AI testing, data engineering, or a similar technical role.
Technical Skills: Proficient in Python (must have). Experience with test case design, execution, and bug reporting. Exposure to knowledge graph technologies (e.g., Neo4j, RDF, SPARQL) and data ingestion/ETL processes.
Analytical & Problem-Solving Skills: Strong attention to detail, ability to analyze data and systems, and troubleshoot issues.
Communication: Clear verbal and written communication skills for documentation and collaboration.

Preferred Qualifications: Experience with graph query languages (e.g., Cypher, SPARQL). Exposure to cloud platforms (AWS, Azure, GCP) and CI/CD workflows. Familiarity with data quality and governance practices.
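
As an illustration of the Python test automation this role mentions, a hypothetical pytest sketch that checks a knowledge-graph ingestion run in Neo4j. The connection details, node label, and property names are assumptions.

```python
# Hypothetical pytest sketch: after an ETL run, verify expected node counts
# and mandatory properties in Neo4j. All names and credentials are placeholders.
import pytest
from neo4j import GraphDatabase

@pytest.fixture(scope="module")
def session():
    driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))
    with driver.session() as s:
        yield s
    driver.close()

def test_documents_were_ingested(session):
    count = session.run("MATCH (d:Document) RETURN count(d) AS n").single()["n"]
    assert count > 0, "ingestion produced no Document nodes"

def test_documents_have_titles(session):
    missing = session.run(
        "MATCH (d:Document) WHERE d.title IS NULL RETURN count(d) AS n"
    ).single()["n"]
    assert missing == 0, f"{missing} Document nodes are missing a title"
```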

Posted 4 weeks ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site


About Us: Our mission at micro1 is to match the most talented people in the world with their dream jobs. If you are looking to be at the forefront of AI innovation and work with some of the fastest-growing companies in Silicon Valley, we invite you to apply for a role. By joining the micro1 community, your resume will become visible to top industry leaders, unlocking access to the best career opportunities on the market.

Role Overview: Join one of our top customers as a skilled AI Engineer. You will design, develop, and deploy machine learning models and systems that drive our products and enhance user experiences. You will work closely with cross-functional teams to implement cutting-edge AI solutions, including recommendation engines and large language models.

Key Responsibilities: Design and implement robust machine learning models and algorithms, focusing on recommendation systems. Conduct data analysis to identify trends, insights, and opportunities for model improvement. Collaborate with data scientists and software engineers to build and integrate end-to-end machine learning systems. Optimize and fine-tune models for performance and scalability, ensuring seamless deployment. Work with large datasets using SQL and Postgres to support model training and evaluation. Implement and refine prompt engineering techniques for large language models (LLMs). Stay current with advancements in AI/ML technologies, particularly core ML algorithms such as clustering and community detection. Monitor model performance, conduct regular evaluations, and retrain models as needed. Document processes, model performance metrics, and technical specifications.

Required Skills and Qualifications: Bachelor's or Master's degree in Computer Science, Data Science, or a related field. Strong expertise in Python and experience with machine learning libraries (e.g., TensorFlow, PyTorch, scikit-learn). Proven experience with SQL and Postgres for data manipulation and analysis. Demonstrated experience building and deploying recommendation engines. Solid understanding of core machine learning algorithms, including clustering and community detection. Prior experience building end-to-end machine learning systems. Familiarity with prompt engineering and working with large language models (LLMs). Experience with near-real-time recommendation systems. Hands-on experience with graph databases such as Neo4j or Neptune. Experience with Flask or FastAPI frameworks. Experience writing, modifying, and optimizing SQL queries and DB connections. Experience with AWS services such as ECS, EC2, S3, and CloudWatch.

Preferred Qualifications: Experience with graph databases (specifically Neo4j and the Cypher query language). Knowledge of large-scale data handling and optimization techniques. Experience improving models with RLHF.

Posted 4 weeks ago

Apply

2 - 5 years

2 - 5 Lacs

Bengaluru

Work from Office


Databricks Engineer (Full-time). Department: Digital, Data and Cloud.

Company Description
Version 1 has celebrated over 26 years in technology services and continues to be trusted by global brands to deliver solutions that drive customer success. Version 1 has several strategic technology partners including Microsoft, AWS, Oracle, Red Hat, OutSystems and Snowflake. We're also an award-winning employer, reflecting how employees are at the heart of Version 1. We've been awarded: Innovation Partner of the Year Winner 2023 Oracle EMEA Partner Awards, Global Microsoft Modernising Applications Partner of the Year Award 2023, AWS Collaboration Partner of the Year - EMEA 2023, and Best Workplaces for Women by Great Place To Work in UK and Ireland 2023. As a consultancy and service provider, Version 1 is a digital-first environment and we do things differently. We're focused on our core values; using these we've seen significant growth across our practices, and our Digital, Data and Cloud team is preparing for the next phase of expansion. This creates new opportunities for driven and skilled individuals to join one of the fastest-growing consultancies globally.

About The Role
This is an exciting opportunity for an experienced developer of large-scale data solutions. You will join a team delivering a transformative cloud-hosted data platform for a key Version 1 customer. The ideal candidate will have a proven track record as a senior, self-starting data engineer in implementing data ingestion and transformation pipelines for large-scale organisations. We are seeking someone with deep technical skills in a variety of technologies, specifically Spark performance tuning/optimisation and Databricks, to play an important role in developing and delivering early proofs of concept and production implementation. You will ideally have experience in building solutions using a variety of open-source tools and Microsoft Azure services, and a proven track record in delivering high-quality work to tight deadlines.

Your main responsibilities will be: Designing and implementing highly performant, metadata-driven data ingestion and transformation pipelines from multiple sources using Databricks and Spark. Streaming and batch processing in Databricks. Spark performance tuning/optimisation. Providing technical guidance for complex geospatial problems and Spark dataframes. Developing scalable and re-usable frameworks for ingestion and transformation of large data sets. Data quality system and process design and implementation. Integrating the end-to-end data pipeline to take data from source systems to target data repositories, ensuring the quality and consistency of data is maintained at all times. Working with other members of the project team to support delivery of additional project components (reporting tools, API interfaces, search). Evaluating the performance and applicability of multiple tools against customer requirements. Working within an Agile delivery / DevOps methodology to deliver proof of concept and production implementation in iterative sprints.

Qualifications
Direct experience of building data pipelines using Azure Data Factory and Databricks. Experience required is 6 to 8 years. Building data integration with Python. Databricks Engineer certification. Microsoft Azure Data Engineer certification. Hands-on experience designing and delivering solutions using the Azure Data Analytics platform. Experience building data warehouse solutions using ETL / ELT tools such as Informatica and Talend. Comprehensive understanding of data management best practices, including demonstrated experience with data profiling, sourcing, and cleansing routines utilizing typical data quality functions involving standardization, transformation, rationalization, linking and matching.

Nice to have: Experience working in a DevOps environment with tools such as Microsoft Visual Studio Team Services, Chef, Puppet or Terraform. Experience working with structured and unstructured data, including imaging and geospatial data. Experience with open-source non-relational / NoSQL data repositories (including MongoDB, Cassandra, Neo4j). Experience with Azure Event Hub, IoT Hub, Apache Kafka or NiFi for use with streaming data / event-based data.

Additional Information
At Version 1, we believe in providing our employees with a comprehensive benefits package that prioritises their well-being, professional growth, and financial stability. One of our standout advantages is the ability to work with a hybrid schedule along with business travel, allowing our employees to strike a balance between work and life. We also offer a range of tech-related benefits, including an innovative Tech Scheme to help keep our team members up to date with the latest technology. We prioritise the health and safety of our employees, providing private medical and life insurance coverage, as well as free eye tests and contributions towards glasses. Our team members can also stay ahead of the curve with incentivized certifications and accreditations, including AWS, Microsoft, Oracle, and Red Hat. Our employee-designed Profit Share scheme divides a portion of our company's profits each quarter amongst employees. We are dedicated to helping our employees reach their full potential, offering Pathways Career Development Quarterly, a programme designed to support professional growth.
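
To give a flavour of the streaming work described above, a hedged sketch of a Spark Structured Streaming job using Databricks Auto Loader to land JSON events in a Delta table. The storage, schema, and checkpoint paths are placeholders.

```python
# Sketch only: a Spark Structured Streaming job with Databricks Auto Loader
# ("cloudFiles"). All paths are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("streaming_ingest_example").getOrCreate()

events = (
    spark.readStream.format("cloudFiles")               # Databricks Auto Loader source
         .option("cloudFiles.format", "json")
         .option("cloudFiles.schemaLocation",
                 "abfss://meta@examplelake.dfs.core.windows.net/schemas/events")
         .load("abfss://landing@examplelake.dfs.core.windows.net/events/")
)

(
    events.withColumn("ingested_at", F.current_timestamp())
          .writeStream.format("delta")
          .option("checkpointLocation",
                  "abfss://meta@examplelake.dfs.core.windows.net/checkpoints/events")
          .outputMode("append")
          .start("abfss://curated@examplelake.dfs.core.windows.net/events_delta")
)
```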

Posted 4 weeks ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

Remote


As an AI Product Manager in the GenAI Core team you will help shape the future of our Generative AI platform, with a strong focus on GenAI and agentic capabilities. This isn't a traditional AI/ML development role. Instead, it's about transforming ideas into impactful GenAI product features, particularly in the space of autonomous agents, multi-agent collaboration, and LLM-driven automation.

Duties and Responsibilities
Translate stakeholder needs into scalable, implementable features in the GenAI platform. Collaborate across teams to build next-gen GenAI experiences. Continuously refine and optimize the product based on feedback and trends such as agentic and multi-agent capabilities. Create and maintain the product feature backlog and user guides. Refine backlog features into implementation epics, user stories and artefacts. Provide prompt and effective support and guidance to developers. Ensure the quality of implementation and deliverables. Collaborate with the engineering team to address and resolve product issues. Stay abreast of the latest industry trends and technologies. Assume overall responsibility for product management in respective projects. Demonstrate a strong desire for continuous learning and the ability to quickly adapt and implement new technologies.

Qualifications, Experience, Technical and Functional Skills
Bachelor's degree in Computer Science, Information Technology, or a related field with 5+ years of working experience. 3-6 years in product management (preferably in AI/tech platforms). Experience in GenAI product development and agent-based design. Strong understanding of LLMs, prompt engineering, and agent orchestration. Proficiency in implementing Generative AI-based applications using different large language models. Understanding of and experience with various generative AI models on cloud platforms such as Azure/AWS, including Retrieval-Augmented Generation, prompt engineering and agentic frameworks. Proficiency in cloud platforms like Azure or AWS; familiarity with deploying and managing AI applications, handling cloud storage, compute instances, and cloud security. Familiarity with Angular or similar JavaScript frameworks for building user interfaces, along with a solid understanding of HTML, CSS, and JavaScript. Familiarity with database technologies like Postgres, SQL, NoSQL, MongoDB, and vector/graph databases such as pgvector and Neo4j. Understanding of security principles as they apply to AI applications, especially in a cloud environment. Experience with collaboration and versioning tools such as JIRA, Confluence, GitHub. Excellent problem-solving skills and the ability to debug code effectively. Ability to quickly learn and apply new technologies.

72622 | Customer Services & Claims | Professional | Allianz Technology | Full-Time | Permanent

We offer a hybrid work model which recognizes the value of striking a balance between in-person collaboration and remote working, including up to 25 days per year working from abroad. We believe in rewarding performance, and our compensation and benefits package includes a company bonus scheme, pension, employee shares program and multiple employee discounts (details vary by location). From career development and digital learning programs to international career mobility, we offer lifelong learning for our employees worldwide and an environment where innovation, delivery and empowerment are fostered. Flexible working, health and wellbeing offers (including healthcare and parental leave benefits) support to balance family and career and help our people return from career breaks with experience that nothing else can teach.

About Allianz Technology
Allianz Technology is the global IT service provider for Allianz and delivers IT solutions that drive the digitalization of the Group. With more than 13,000 employees located in 22 countries around the globe, Allianz Technology works together with other Allianz entities in pioneering the digitalization of the financial services industry. We oversee the full digitalization spectrum – from one of the industry's largest IT infrastructure projects that includes data centers, networking and security, to application platforms that span from workplace services to digital interaction. In short, we deliver full-scale, end-to-end IT solutions for Allianz in the digital age.

D&I statement
Allianz Technology is proud to be an equal opportunity employer encouraging diversity in the working environment. We are interested in your strengths and experience. We welcome all applications from all people regardless of gender identity and/or expression, sexual orientation, race or ethnicity, age, nationality, religion, disability, or philosophy of life.

Posted 4 weeks ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

Remote


We are seeking a skilled, motivated, and quick-learner Full Stack Developer to join our team working on cutting-edge Gen AI development work. The successful candidate will be responsible for developing innovative applications and solutions using including frontend and backend. While the solutions will often utilize Retrieval Augmented Generation (RAG), Agentic frameworks the role will not be limited to this, and will involve various AI technologies. Duties And Responsibilities Develop and maintain web applications using Angular, NDBX frameworks, and other modern technologies. Design and implement databases in Postgres DB, apply & implement ingestion and retrieval pipelines using pgvector, neo4j, ensuring efficient and secure data practices. Use different generative AI models & frameworks such as LangChain, Haystack, LlamIndex etc for chucking, embeddings, chat completions, integration with different data sources etc. Familiarity and experience with different agentic frameworks and technique like Langgraph, AutoGen, CrewAI, tool using techniques like MCP (Model Context Protocol). Use Azure & AWS cloud platforms in implementation to stay relevant to company AI guidelines requirements. Usage of OpenAPI standards, API first approach to develop APIs for communication between different software components. Collaborate with the team members to integrate various GenAI capabilities into the applications, including but not limited to RAG. Write clean, maintainable, and efficient code that adheres to company standards. Conduct testing to identify and fix bugs or vulnerabilities. Use collaboration and versioning tools such as GitHub for effective team working and code management. Stay updated with emerging technologies and apply them into operations and activities. Show a strong desire for continuous learning and the ability to quickly adapt and implement new technologies Skills Qualification, Experience, Technical and Functional Skills Bachelor's degree in Computer Science, Information Technology, or a related field with 6+ years of working experience. Proven experience as a Full Stack Developer or similar role in designing, developing and deploying end to end applications. Knowledge of multiple front-end languages and libraries (e.g. HTML/ CSS, JavaScript, XML, jQuery). Experience with Angular and NDBX frameworks. Good experience with database technology such as Postgres DB, vector databases. Experience developing APIs following the OpenAPI standards. Understanding & experience in various generative AI models on cloud platforms such as Azure/ AWS, including Retrieval Augmented Generation, Prompt engineering, Agentic RAG, Agentic frameworks, Model context protocols etc. Experience with collaboration and versioning tools such as GitHub Experience with docker images, containers to package up an application with all the parts it needs, such as libraries and other dependencies, and ship it all out as one package 72617 | IT & Tech Engineering | Professional | Allianz Technology | Full-Time | Permanent Warning: When posting this job advertisment on an external job board, the length of the following fields combined must not exceed 3950 characters: "External Posting Description", "External Posting Footer" We offer a hybrid work model which recognizes the value of striking a balance between in-person collaboration and remote working incl. 
up to 25 days per year working from abroad. We believe in rewarding performance, and our compensation and benefits package includes a company bonus scheme, pension, employee shares program and multiple employee discounts (details vary by location). From career development and digital learning programs to international career mobility, we offer lifelong learning for our employees worldwide and an environment where innovation, delivery and empowerment are fostered. Flexible working and health and wellbeing offers (including healthcare and parental leave benefits) support the balance of family and career and help our people return from career breaks with experience that nothing else can teach. About Allianz Technology Allianz Technology is the global IT service provider for Allianz and delivers IT solutions that drive the digitalization of the Group. With more than 13,000 employees located in 22 countries around the globe, Allianz Technology works together with other Allianz entities in pioneering the digitalization of the financial services industry. We oversee the full digitalization spectrum – from one of the industry’s largest IT infrastructure projects that includes data centers, networking and security, to application platforms that span from workplace services to digital interaction. In short, we deliver full-scale, end-to-end IT solutions for Allianz in the digital age. D&I statement Allianz Technology is proud to be an equal opportunity employer encouraging diversity in the working environment. We are interested in your strengths and experience. We welcome all applications from all people regardless of gender identity and/or expression, sexual orientation, race or ethnicity, age, nationality, religion, disability, or philosophy of life.
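
To make the retrieval side of the role above concrete, here is a minimal, hypothetical sketch of a pgvector similarity search in Postgres from Python. The connection string, table name, and columns are illustrative assumptions rather than details from the posting, and the query embedding would normally come from whichever embedding model the stack uses (for example via LangChain).

import psycopg2

# Hypothetical connection string and schema: a docs table with id, content,
# and a pgvector "embedding" column.
conn = psycopg2.connect("dbname=ragdb user=app password=secret host=localhost")

def top_k_chunks(query_embedding, k=5):
    # pgvector's <=> operator computes cosine distance, so ordering by it
    # returns the k stored chunks most similar to the query embedding.
    vector_literal = "[" + ",".join(str(x) for x in query_embedding) + "]"
    with conn.cursor() as cur:
        cur.execute(
            "SELECT id, content FROM docs ORDER BY embedding <=> %s::vector LIMIT %s",
            (vector_literal, k),
        )
        return cur.fetchall()

# A short dummy vector stands in for a real model-generated embedding here.
print(top_k_chunks([0.1, 0.2, 0.3], k=3))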

Posted 4 weeks ago

Apply

3 years

0 Lacs

Pune, Maharashtra, India

On-site

Linkedin logo

Exp: 6 - 14 Yrs Work Mode: Hybrid Location: Bangalore, Chennai, Kolkata, Pune and Gurgaon Primary Skills: Python, PySpark, Azure Data Factory, Snowflake, Snowpipe, SnowSQL, Snowsight, Snowpark, ETL and SQL. SnowPro certification is a plus; architect experience is mandatory. Primary Roles And Responsibilities Developing Modern Data Warehouse solutions using Snowflake, Databricks and ADF. Ability to provide solutions that are forward-thinking in the data engineering and analytics space. Collaborate with DW/BI leads to understand new ETL pipeline development requirements. Triage issues to find gaps in existing pipelines and fix the issues. Work with the business to understand the needs in the reporting layer and develop a data model to fulfill reporting needs. Help junior team members resolve issues and technical challenges. Drive technical discussions with the client architect and team members. Orchestrate the data pipelines in the scheduler via Airflow. Skills And Qualifications Bachelor's and/or master's degree in computer science or equivalent experience. Must have a total of 6+ yrs. of IT experience and 3+ years' experience in Data warehouse/ETL projects. Expertise in Snowflake security, Snowflake SQL and designing/implementing other Snowflake objects. Hands-on experience with Snowflake utilities, SnowSQL, Snowpipe, Snowsight and Snowflake connectors. Deep understanding of Star and Snowflake dimensional modeling. Strong knowledge of Data Management principles. Good understanding of the Databricks Data & AI platform and Databricks Delta Lake Architecture. Should have hands-on experience in SQL and Spark (PySpark). Experience in building ETL / data warehouse transformation processes. Experience with Open Source non-relational / NoSQL data repositories (incl. MongoDB, Cassandra, Neo4J). Experience working with structured and unstructured data including imaging & geospatial data. Experience working in a Dev/Ops environment with tools such as Terraform, CircleCI, GIT. Proficiency in RDBMS, complex SQL, PL/SQL, Unix Shell Scripting, performance tuning, troubleshooting and query optimization. Databricks Certified Data Engineer Associate/Professional Certification (desirable). Comfortable working in a dynamic, fast-paced, innovative environment with several ongoing concurrent projects. Should have experience working in Agile methodology. Strong verbal and written communication skills. Strong analytical and problem-solving skills with a high attention to detail. Skills: python,snowpro,sql,azure data factory,azure,snowpipe,neo4j,skills,data engineering,snowflake,terraform,nosql,circleci,git,data management,unix shell scripting,pl/sql,data warehouse,data bricks,pipelines,cassandra,snowsql,architects,rdbms,databricks,data,projects,pyspark,adf,snowsight,etl,snowpark,mongodb
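
Several of the postings on this page ask for orchestrating data pipelines in a scheduler via Airflow. Purely as a rough sketch (the DAG name, schedule, and task callables are illustrative, not taken from any posting), a minimal Airflow DAG wiring an extract-transform-load sequence might look like this:

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    # Pull raw data from the source system (placeholder).
    pass

def transform():
    # Apply warehouse transformations, e.g. with PySpark or Snowflake SQL (placeholder).
    pass

def load():
    # Publish curated data to the reporting layer (placeholder).
    pass

with DAG(
    dag_id="dw_daily_pipeline",        # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",        # run once per day
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    t_extract >> t_transform >> t_load  # define run order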

Posted 4 weeks ago

Apply

8 - 12 years

0 Lacs

Pune, Maharashtra, India

On-site

Linkedin logo

Location: Chennai, Kolkata, Gurgaon, Bangalore and Pune Experience: 8-12 Years Work Mode: Hybrid Mandatory Skills: Python, PySpark, SQL, ETL, Data Pipeline, Azure Databricks, Azure Data Factory, Azure Synapse, Airflow, and Architect Designing, Architect. Overview We are seeking a skilled and motivated Data Engineer with experience in Python, SQL, Azure, and cloud-based technologies to join our dynamic team. The ideal candidate will have a solid background in building and optimizing data pipelines, working with cloud platforms, and leveraging modern data engineering tools like Airflow, PySpark, and Azure Data Engineering. If you are passionate about data and looking for an opportunity to work on cutting-edge technologies, this role is for you! Primary Roles And Responsibilities Developing Modern Data Warehouse solutions using Databricks and the AWS/Azure stack. Ability to provide solutions that are forward-thinking in the data engineering and analytics space. Collaborate with DW/BI leads to understand new ETL pipeline development requirements. Triage issues to find gaps in existing pipelines and fix the issues. Work with the business to understand the needs in the reporting layer and develop a data model to fulfill reporting needs. Help junior team members resolve issues and technical challenges. Drive technical discussions with the client architect and team members. Orchestrate the data pipelines in the scheduler via Airflow. Skills And Qualifications Bachelor's and/or master's degree in computer science or equivalent experience. Must have a total of 6+ yrs. of IT experience and 3+ years' experience in Data warehouse/ETL projects. Deep understanding of Star and Snowflake dimensional modelling. Strong knowledge of Data Management principles. Good understanding of the Databricks Data & AI platform and Databricks Delta Lake Architecture. Should have hands-on experience in SQL, Python and Spark (PySpark). Candidate must have experience in the AWS/Azure stack. Desirable to have ETL with batch and streaming (Kinesis). Experience in building ETL / data warehouse transformation processes. Experience with Apache Kafka for use with streaming data / event-based data. Experience with other Open-Source big data products such as Hadoop (incl. Hive, Pig, Impala). Experience with Open Source non-relational / NoSQL data repositories (incl. MongoDB, Cassandra, Neo4J). Experience working with structured and unstructured data including imaging & geospatial data. Experience working in a Dev/Ops environment with tools such as Terraform, CircleCI, GIT. Proficiency in RDBMS, complex SQL, PL/SQL, Unix Shell Scripting, performance tuning and troubleshooting. Databricks Certified Data Engineer Associate/Professional Certification (Desirable). Comfortable working in a dynamic, fast-paced, innovative environment with several ongoing concurrent projects. Should have experience working in Agile methodology. Strong verbal and written communication skills. Strong analytical and problem-solving skills with a high attention to detail. Skills: data,azure synapses,pl/sql,skills,nosql,git,terraform,apache kafka,unix shell scripting,hadoop,pyspark,architects,azure datafactory,azure data factory,circleci,azure functions,architect designing,data warehouse,python,sql,pipelines,etl,data engineering,azure synapse,data pipeline,data warehousing,rdbms,azure databricks,azure,airflow

Posted 4 weeks ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Linkedin logo

About Us: Our mission at micro1 is to match the most talented people in the world with their dream jobs. If you are looking to be at the forefront of AI innovation and work with some of the fastest growing companies in Silicon Valley, we invite you to apply for a role. By joining the micro1 community, your resume will become visible to top industry leaders, unlocking access to the best career opportunities on the market. Role Overview: Join one of our top customers as a skilled AI Engineer; you will design, develop, and deploy machine learning models and systems that drive our products and enhance user experiences. You will work closely with cross-functional teams to implement cutting-edge AI solutions, including recommendation engines and large language models. Key Responsibilities: Design and implement robust machine learning models and algorithms, focusing on recommendation systems. Conduct data analysis to identify trends, insights, and opportunities for model improvement. Collaborate with data scientists and software engineers to build and integrate end-to-end machine learning systems. Optimize and fine-tune models for performance and scalability, ensuring seamless deployment. Work with large datasets using SQL and Postgres to support model training and evaluation. Implement and refine prompt engineering techniques for large language models (LLMs). Stay current with advancements in AI/ML technologies, particularly in core ML algorithms like clustering and community detection. Monitor model performance, conduct regular evaluations, and retrain models as needed. Document processes, model performance metrics, and technical specifications. Required Skills and Qualifications: Bachelor's or Master's degree in Computer Science, Data Science, or a related field. Strong expertise in Python and experience with machine learning libraries (e.g., TensorFlow, PyTorch, Scikit-learn). Proven experience with SQL and Postgres for data manipulation and analysis. Demonstrated experience building and deploying recommendation engines. Solid understanding of core machine learning algorithms, including clustering and community detection. Prior experience in building end-to-end machine learning systems. Familiarity with prompt engineering and working with large language models (LLMs). Experience working with near-real-time recommendation systems. Hands-on experience with graph databases such as Neo4j or Neptune. Experience with Flask or FastAPI frameworks. Experience with SQL to write, modify, and understand existing queries and optimize DB connections. Experience with AWS services like ECS, EC2, S3, and CloudWatch. Preferred Qualifications: Experience with graph databases (specifically Neo4j and the Cypher query language). Knowledge of large-scale data handling and optimization techniques. Experience improving models with RLHF.
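
The role above mentions serving models through Flask or FastAPI. Purely as an illustration (the endpoint path, response shape, and in-memory lookup are assumptions, not part of the posting), a minimal FastAPI endpoint for a recommender might look like this:

from fastapi import FastAPI

app = FastAPI()

# Illustrative in-memory stand-in for a trained recommender; in practice this
# would be backed by the models and the Postgres/Neo4j stores named above.
FAKE_RECOMMENDATIONS = {"u1": ["item-42", "item-7"], "u2": ["item-3"]}

@app.get("/recommendations/{user_id}")
def recommendations(user_id: str, limit: int = 10):
    # Fetch the (fake) recommendations for the user and cap the result size.
    items = FAKE_RECOMMENDATIONS.get(user_id, [])[:limit]
    return {"user_id": user_id, "items": items}

# Run locally with: uvicorn app:app --reload  (assuming this file is saved as app.py)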

Posted 4 weeks ago

Apply

8 - 12 years

0 Lacs

Gurugram, Haryana, India

On-site

Linkedin logo

Location: Chennai, Kolkata, Gurgaon, Bangalore and Pune Experience: 8-12 Years Work Mode: Hybrid Mandatory Skills: Python, PySpark, SQL, ETL, Data Pipeline, Azure Databricks, Azure Data Factory, Azure Synapse, Airflow, and Architect Designing, Architect. Overview We are seeking a skilled and motivated Data Engineer with experience in Python, SQL, Azure, and cloud-based technologies to join our dynamic team. The ideal candidate will have a solid background in building and optimizing data pipelines, working with cloud platforms, and leveraging modern data engineering tools like Airflow, PySpark, and Azure Data Engineering. If you are passionate about data and looking for an opportunity to work on cutting-edge technologies, this role is for you! Primary Roles And Responsibilities Developing Modern Data Warehouse solutions using Databricks and the AWS/Azure stack. Ability to provide solutions that are forward-thinking in the data engineering and analytics space. Collaborate with DW/BI leads to understand new ETL pipeline development requirements. Triage issues to find gaps in existing pipelines and fix the issues. Work with the business to understand the needs in the reporting layer and develop a data model to fulfill reporting needs. Help junior team members resolve issues and technical challenges. Drive technical discussions with the client architect and team members. Orchestrate the data pipelines in the scheduler via Airflow. Skills And Qualifications Bachelor's and/or master's degree in computer science or equivalent experience. Must have a total of 6+ yrs. of IT experience and 3+ years' experience in Data warehouse/ETL projects. Deep understanding of Star and Snowflake dimensional modelling. Strong knowledge of Data Management principles. Good understanding of the Databricks Data & AI platform and Databricks Delta Lake Architecture. Should have hands-on experience in SQL, Python and Spark (PySpark). Candidate must have experience in the AWS/Azure stack. Desirable to have ETL with batch and streaming (Kinesis). Experience in building ETL / data warehouse transformation processes. Experience with Apache Kafka for use with streaming data / event-based data. Experience with other Open-Source big data products such as Hadoop (incl. Hive, Pig, Impala). Experience with Open Source non-relational / NoSQL data repositories (incl. MongoDB, Cassandra, Neo4J). Experience working with structured and unstructured data including imaging & geospatial data. Experience working in a Dev/Ops environment with tools such as Terraform, CircleCI, GIT. Proficiency in RDBMS, complex SQL, PL/SQL, Unix Shell Scripting, performance tuning and troubleshooting. Databricks Certified Data Engineer Associate/Professional Certification (Desirable). Comfortable working in a dynamic, fast-paced, innovative environment with several ongoing concurrent projects. Should have experience working in Agile methodology. Strong verbal and written communication skills. Strong analytical and problem-solving skills with a high attention to detail. Skills: data,azure synapses,pl/sql,skills,nosql,git,terraform,apache kafka,unix shell scripting,hadoop,pyspark,architects,azure datafactory,azure data factory,circleci,azure functions,architect designing,data warehouse,python,sql,pipelines,etl,data engineering,azure synapse,data pipeline,data warehousing,rdbms,azure databricks,azure,airflow

Posted 4 weeks ago

Apply

7 years

0 Lacs

Gurugram, Haryana, India

On-site

Linkedin logo

Company Description 👋🏼We're Nagarro. We are a Digital Product Engineering company that is scaling in a big way! We build products, services, and experiences that inspire, excite, and delight. We work at scale across all devices and digital mediums, and our people exist everywhere in the world (18000+ experts across 36 countries, to be exact). Our work culture is dynamic and non-hierarchical. We are looking for great new colleagues. That's where you come in! Job Description REQUIREMENTS: Experience: 7+ Years Extensive Experience with the Azure cloud platform Good Experience in maintaining cost-efficient, scalable cloud environments for the organization, following best practices for monitoring and cloud governance Experience with CI tools like Jenkins and building end-to-end CI/CD pipelines for projects Experience with various build tools like Maven/Ant/Gradle Rich Experience with container frameworks like Docker, Kubernetes or cloud-native container services Good Experience in Infrastructure as Code (IaC) using tools like Terraform Good Experience with any one of the following CM tools: Ansible, Chef, SaltStack, Puppet Good Experience in monitoring tools like Prometheus & Grafana, Nagios/DataDog/Zabbix and logging tools like Splunk/LogStash Good Experience in scripting and automation using languages like Bash/Shell, Python, PowerShell, Groovy, Perl Configure and manage data sources like MySQL, Mongo, Elasticsearch, Redis, Cassandra, Hadoop, PostgreSQL, Neo4J etc. Good experience in managing version control tools like Git, SVN/BitBucket Good problem-solving ability, strong written and verbal communication skills RESPONSIBILITIES: Understanding the client's business use cases and technical requirements and being able to convert them into a technical design that elegantly meets the requirements. Mapping decisions with requirements and being able to translate the same to developers. Identifying different solutions and being able to narrow down the best option that meets the client's requirements. Defining guidelines and benchmarks for NFR considerations during project implementation. Writing and reviewing design documents explaining the overall architecture, framework, and high-level design of the application for the developers. Reviewing architecture and design on various aspects like extensibility, scalability, security, design patterns, user experience, NFRs, etc., and ensuring that all relevant best practices are followed. Developing and designing the overall solution for defined functional and non-functional requirements, and defining technologies, patterns, and frameworks to materialize it. Understanding and relating technology integration scenarios and applying these learnings in projects. Resolving issues raised during code review through exhaustive, systematic analysis of the root cause, and being able to justify the decisions taken. Carrying out POCs to make sure that the suggested design/technologies meet the requirements. Qualifications Bachelor's or master's degree in computer science, Information Technology, or a related field.

Posted 1 month ago

Apply

0 years

0 Lacs

Mumbai Metropolitan Region

On-site

Linkedin logo

Position Overview ABOUT APOLLO Apollo is a high-growth, global alternative asset manager. In our asset management business, we seek to provide our clients excess return at every point along the risk-reward spectrum from investment grade to private equity with a focus on three investing strategies: yield, hybrid, and equity. For more than three decades, our investing expertise across our fully integrated platform has served the financial return needs of our clients and provided businesses with innovative capital solutions for growth. Through Athene, our retirement services business, we specialize in helping clients achieve financial security by providing a suite of retirement savings products and acting as a solutions provider to institutions. Our patient, creative, and knowledgeable approach to investing aligns our clients, businesses we invest in, our employees, and the communities we impact, to expand opportunity and achieve positive outcomes. OUR PURPOSE AND CORE VALUES Our Clients Rely On Our Investment Acumen To Help Secure Their Future. We Must Never Lose Our Focus And Determination To Be The Best Investors And Most Trusted Partners On Their Behalf. We Strive To Be The leading provider of retirement income solutions to institutions, companies, and individuals. The leading provider of capital solutions to companies. Our breadth and scale enable us to deliver capital for even the largest projects – and our small firm mindset ensures we will be a thoughtful and dedicated partner to these organizations. We are committed to helping them build stronger businesses. A leading contributor to addressing some of the biggest issues facing the world today – such as energy transition, accelerating the adoption of new technologies, and social impact – where innovative approaches to investing can make a positive difference. We are building a unique firm of extraordinary colleagues who: Outperform expectations. Challenge Convention Champion Opportunity Lead responsibly. Drive collaboration As One Apollo team, we believe that doing great work and having fun go hand in hand, and we are proud of what we can achieve together. Our Benefits Apollo relies on its people to keep it a leader in alternative investment management, and the firm’s benefit programs are crafted to offer meaningful coverage for both you and your family. Please reach out to your Human Capital Business Partner for more detailed information on specific benefits. Position Overview At Apollo, we are a global team of alternative investment managers passionate about delivering uncommon value to our investors and shareholders. With over 30 years of proven expertise across Private Equity, Credit, and Real Assets in various regions and industries, we are known for our integrated businesses, our strong investment performance, our value-oriented philosophy, and our people. We seek a Senior Engineer/Full Stack Developer to innovate, manage, direct, architect, design, and implement solutions focused on our trade operations and controller functions across Private Equity, Credit, and Real Assets. The ideal candidate is a well-rounded hands-on engineer passionate about delivering quality software on the Java stack. Our Senior Engineer will work closely with key stakeholders in our Middle Office and Controllers teams and in the Credit and Opportunistic Technology teams to successfully deliver business requirements, projects, and programs. 
The candidate will have proven skills in independently managing the full software development lifecycle, working with end-users, business analysts, and project managers in defining and refining the problem statement, and delivering quality solutions on time. They will have the aptitude to quickly learn and embrace emerging technologies and proven methodologies to innovate and improve the correctness, quality, and timeliness of solutions delivered by the team. Primary Responsibilities Contribute to the development of elegant solutions for systems that result in simple, extensible, maintainable, high-quality code. Participate in design discussions, hands-on technical development, code reviews, quality assurance, observability, and product support. Use technical knowledge of patterns and code to identify risks and prevent software defects. Foster a culture of collaboration, disciplined software engineering practices, and a mindset to leave things better than you found them. Optimize team processes to improve productivity and responsiveness to feedback and changing priorities. Build strong relationships with key stakeholders, collaborate, and communicate effectively to reach successful outcomes. Passionate about delivering high-impact and breakthrough value to stakeholders. Desire to learn the domain and deliver enterprise solutions at a higher velocity. Contribute to deliverables from the early stages of requirement gathering through development, testing, UAT, deployment and post-production. Lead in the planning, execution, and delivery of the team’s commitments. Qualifications & Experience Master’s or bachelor’s degree in Computer Science or another STEM field Experience with software development in the Alternative Asset Management or Investment Banking domain 5+ years of software development experience in at least one of the following OO languages: Java, C++, or C# 3+ years of Web 2.0 UI/UX development experience in at least one of the following frameworks using JavaScript/TypeScript: ExtJS, ReactJS, AngularJS, or Vue. Hands-on development expertise in Java, Spring Boot, REST, Messaging, JPA, and SQL for the last 2+ years Hands-on development expertise in building applications using RESTful and Microservices architecture Expertise in developing applications using TDD/BDD/ATDD with hands-on experience with at least one of JUnit, Spring Test, TestNG, or Cucumber A strong understanding of SOLID principles, Design Patterns, Enterprise Integration Patterns A strong understanding of relational databases, SQL, ER modeling, and ORM technologies A strong understanding of BPM and its application Hands-on experience with various CI/CD practices and tools such as Jenkins, Azure DevOps, TeamCity, etcetera Exceptional problem-solving & debugging skills. Awareness of emerging application development methodologies, design patterns, and technologies. Ability to quickly learn new and emerging technologies and adopt solutions from within the company or the open-source community.
Experience with the below will be a plus: buy-side operational and fund accounting processes; business processes and workflows using modern BPM/Low Code/No Code platforms (JBPM, Bonitasoft, Appian, Logic Apps, Unqork, etcetera…); OpenAPI, GraphQL, gRPC, ESB, SOAP, WCF, Kafka, and Node; serverless architecture; Microsoft Azure; designing and implementing microservices on AKS; Azure DevOps; Sencha platform; NoSQL databases (MongoDB, Cosmos DB, Neo4J); Python software development; functional programming paradigm. Apollo provides equal employment opportunities regardless of age, disability, gender reassignment, marital or civil partner status, pregnancy or maternity, race, color, nationality, ethnic or national origin, religion or belief, veteran status, gender/sex or sexual orientation, or any other criterion or circumstance protected by applicable law, ordinance, or regulation. The above criteria are intended to be used as a guide only – candidates who do not meet all the above criteria may still be considered if they are deemed to have relevant experience/equivalent levels of skill or knowledge to fulfil the requirements of the role. Any job offer will be conditional upon and subject to satisfactory reference and background screening checks, all necessary corporate and regulatory approvals or certifications as required from time to time and entering into definitive contractual documentation satisfactory to Apollo.

Posted 1 month ago

Apply

5 - 9 years

7 - 11 Lacs

Kochi, Coimbatore, Thiruvananthapuram

Work from Office

Naukri logo

Job Title - Senior Data Engineer (Graph DB Specialist) + Global Song Management Level: 9, Specialist Location: Kochi, Coimbatore Must have skills: Data Modeling Techniques and Methodologies Good to have skills: Proficiency in Python and PySpark programming. Job Summary: We are seeking a highly skilled Data Engineer with expertise in graph databases to join our dynamic team. The ideal candidate will have a strong background in data engineering, graph querying languages, and data modeling, with a keen interest in leveraging cutting-edge technologies like vector databases and LLMs to drive functional objectives. Your responsibilities will include: Design, implement, and maintain ETL pipelines to prepare data for graph-based structures. Develop and optimize graph database solutions using querying languages such as Cypher, SPARQL, or GQL; Neo4j DB experience is preferred. Build and maintain ontologies and knowledge graphs, ensuring efficient and scalable data modeling. Integrate vector databases and implement similarity search techniques, with a focus on Retrieval-Augmented Generation (RAG) methodologies and GraphRAG. Collaborate with data scientists and engineers to operationalize machine learning models and integrate them with graph databases. Work with Large Language Models (LLMs) to achieve functional and business objectives. Ensure data quality, integrity, and security while delivering robust and scalable solutions. Communicate effectively with stakeholders to understand business requirements and deliver solutions that meet objectives. Roles & Responsibilities: Experience: At least 5 years of hands-on experience in data engineering, including 2 years of experience working with graph databases. Querying: Advanced knowledge of Cypher, SPARQL, or GQL querying languages. ETL Processes: Expertise in designing and optimizing ETL processes for graph structures. Data Modeling: Strong skills in creating ontologies and knowledge graphs, and in presenting data for Graph RAG-based solutions. Vector Databases: Understanding of similarity search techniques and RAG implementations. LLMs: Experience working with Large Language Models for functional objectives. Communication: Excellent verbal and written communication skills. Cloud Platforms: Experience with Azure analytics platforms, including Function Apps, Logic Apps, and Azure Data Lake Storage (ADLS). Graph Analytics: Familiarity with graph algorithms and analytics. Agile Methodology: Hands-on experience working in Agile teams and processes. Machine Learning: Understanding of machine learning models and their implementation. Qualifications Experience: Minimum 5-10 years of experience is required. Educational Qualification: Any graduation / BE / B Tech
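
Since the role above centres on loading data into graph structures and querying them with Cypher, here is a minimal, hypothetical sketch using the official neo4j Python driver. The connection details, labels, and properties are illustrative assumptions, not taken from the posting.

from neo4j import GraphDatabase

# Hypothetical connection details for a local Neo4j instance.
driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

def load_employment(tx, person, company):
    # MERGE keeps the load idempotent: nodes and the relationship are
    # created only if they do not already exist.
    tx.run(
        "MERGE (p:Person {name: $person}) "
        "MERGE (c:Company {name: $company}) "
        "MERGE (p)-[:WORKS_AT]->(c)",
        person=person, company=company,
    )

with driver.session() as session:
    # Index the lookup property so MERGE matching stays fast as the graph grows.
    session.run("CREATE INDEX person_name IF NOT EXISTS FOR (p:Person) ON (p.name)")
    session.execute_write(load_employment, "Asha", "Acme Corp")

driver.close()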

Posted 1 month ago

Apply

Exploring Neo4j Jobs in India

Neo4j, a popular graph database management system, is seeing a growing demand in the job market in India. Companies are looking for professionals who are skilled in working with Neo4j to manage and analyze complex relationships in their data. If you are a job seeker interested in Neo4j roles, this article will provide you with valuable insights to help you navigate the job market in India.

Top Hiring Locations in India

  1. Bangalore
  2. Hyderabad
  3. Mumbai
  4. Pune
  5. Delhi/NCR

Average Salary Range

The average salary range for Neo4j professionals in India varies based on experience level:

  • Entry-level: INR 4-6 lakhs per annum
  • Mid-level: INR 8-12 lakhs per annum
  • Experienced: INR 15-20 lakhs per annum

Career Path

In the Neo4j skill area, a typical career progression may look like:

  1. Junior Developer
  2. Developer
  3. Senior Developer
  4. Tech Lead

Related Skills

Apart from expertise in Neo4j, professionals in this field are often expected to have or develop skills in:

  • Cypher Query Language
  • Data modeling
  • Database management
  • Java or Python programming

Interview Questions

  • What is a graph database? (basic)
  • Explain the difference between a graph database and a traditional relational database. (basic)
  • How does Neo4j handle relationships between nodes? (medium)
  • What is Cypher Query Language? (basic)
  • Can you give an example of a Cypher query to retrieve all nodes connected to a specific node? (medium) A sample answer appears in the sketch after this list.
  • How does Neo4j ensure data consistency in a distributed environment? (advanced)
  • What are the benefits of using Neo4j for social network analysis? (medium)
  • Explain the concept of indexing in Neo4j. (medium)
  • How does Neo4j handle transactions? (medium)
  • Can you explain the concept of graph traversal in Neo4j? (medium)
  • What are some common use cases for Neo4j in real-world applications? (medium)
  • How does Neo4j handle scalability? (advanced)
  • What is the significance of property graphs in Neo4j? (basic)
  • Explain the concept of cardinality in Neo4j. (medium)
  • How can you optimize Neo4j queries for better performance? (medium)
  • What are the key components of a Neo4j graph database? (basic)
  • How does Neo4j support ACID properties? (medium)
  • What are the limitations of Neo4j? (medium)
  • Can you explain the concept of graph algorithms in Neo4j? (medium)
  • How does Neo4j handle data import/export? (medium)
  • Explain the concept of labels and relationship types in Neo4j. (basic)
  • What are the different types of indexes supported by Neo4j? (medium)
  • How does Neo4j handle security and access control? (medium)
  • What are the advantages of using Neo4j over other graph databases? (medium)
  • How can you monitor and troubleshoot performance issues in Neo4j? (medium)
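
For the sample question above about retrieving all nodes connected to a specific node, a minimal answer (assuming a hypothetical Person node keyed by a name property, and run here through the official Python driver) could be:

from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

with driver.session() as session:
    # The undirected pattern (p)--(n) matches neighbours regardless of
    # relationship type or direction.
    result = session.run(
        "MATCH (p:Person {name: $name})--(n) RETURN DISTINCT n",
        name="Asha",
    )
    for record in result:
        print(record["n"])

driver.close()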

Conclusion

As you explore Neo4j job opportunities in India, it's essential to not only possess the necessary technical skills but also be prepared to showcase your expertise during interviews. Stay updated with the latest trends in Neo4j and continuously enhance your skills to stand out in the competitive job market. Prepare thoroughly, demonstrate your knowledge confidently, and land your dream Neo4j job in India. Good luck!

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies