
638 Neo4j Jobs - Page 26

Set up a job alert

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

18 - 22 years

0 Lacs

Hyderabad, Telangana, India

Hybrid

DATAECONOMY is one of the fastest-growing Data & AI companies, with a global presence. We are well differentiated and known for our thought leadership, out-of-the-box products, cutting-edge solutions, accelerators, innovative use cases, and cost-effective service offerings. We offer products and solutions in Cloud, Data Engineering, Data Governance, AI/ML, DevOps, and Blockchain to large corporates across the globe, and are strategic partners with AWS, Collibra, Cloudera, Neo4j, DataRobot, Global IDs, Tableau, MuleSoft, and Talend.

Job Title: Delivery Head
Experience: 18 - 22 years
Location: Hyderabad
Notice Period: Immediate joiners preferred

Job Summary: We are seeking a seasoned Technical Delivery Manager with deep expertise in Data Engineering and Data Science to lead complex data initiatives and drive successful delivery across cross-functional teams. The ideal candidate brings a blend of strategic thinking, technical leadership, and project-execution skill, along with hands-on knowledge of modern data platforms, machine learning, and analytics frameworks.
Key Responsibilities:

Program & Delivery Management
- Oversee end-to-end delivery of large-scale data programs, ensuring alignment with business goals, timelines, and quality standards.
- Manage cross-functional project teams including data engineers, data scientists, analysts, and DevOps personnel.
- Ensure agile delivery through structured sprint planning, backlog grooming, and iterative delivery.

Technical Leadership
- Provide architectural guidance and review of data engineering pipelines and machine learning models.
- Evaluate and recommend modern data platforms (e.g., Snowflake, Databricks, Azure Data Services, AWS Redshift, GCP BigQuery).
- Ensure best practices in data governance, quality, and compliance (e.g., GDPR, HIPAA).

Stakeholder & Client Management
- Act as the primary point of contact for technical discussions with clients, business stakeholders, and executive leadership.
- Translate complex data requirements into actionable project plans.
- Present technical roadmaps and delivery status to stakeholders and C-level executives.

Team Development & Mentoring
- Lead, mentor, and grow a high-performing team of data professionals.
- Conduct code and design reviews; promote innovation and continuous improvement.

Key Skills and Qualifications:
- Bachelor's or master's degree in Computer Science, Data Science, Engineering, or a related field.
- 18-22 years of total IT experience, with at least 8-10 years in data engineering, analytics, or data science.
- Proven experience delivering enterprise-scale data platforms, including:
  - ETL/ELT pipelines using tools like Apache Spark, Airflow, Kafka, Talend, or Informatica.
  - Data warehouse and lake architectures (e.g., Snowflake, Azure Synapse, AWS Redshift, Delta Lake).
  - Machine learning lifecycle management (e.g., model training, deployment, MLOps using MLflow, SageMaker, or Vertex AI).
- Strong knowledge of cloud platforms (Azure, AWS, or GCP).
- Deep understanding of Agile, Scrum, and DevOps principles.
- Excellent problem-solving, communication, and leadership skills.
Preferred Certifications (Optional but Beneficial):
- PMP, SAFe Agile, or similar project management certifications.
- Certifications in cloud platforms (e.g., AWS Certified Data Analytics, Azure Data Engineer Associate).
- Certified Scrum Master (CSM) or equivalent.

Posted 2 months ago

Apply

5 - 8 years

0 Lacs

Pune, Maharashtra, India

Remote Contract Role - Full Stack Developer (AWS, Node.js, Python, Terraform)

Location: Remote (Offshore)
Contract Type: Day rate / Contract

A UK-based cyber consultancy is seeking an experienced Full Stack Developer for an offshore contract role. You'll help build a next-gen SaaS platform processing IoT edge data, all within a secure, serverless AWS environment.

Key Skills:
- AWS (Lambda, API Gateway, S3, IAM, CodePipeline)
- Node.js & Python - strong backend/API development
- Terraform - modular IaC, state management
- CI/CD & DevOps - build, deploy, secure
- Graph databases (Neo4j/AuraDB), Cypher queries
- Security-first mindset - SAST, DAST, OWASP, etc.

Nice to Have:
- IoT, event-driven design (SNS/SQS), Cognito
- Docker/Fargate, AWS certifications

Join a collaborative team pushing the boundaries of secure SaaS and IoT solutions. Apply now to be part of something innovative!
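The stack above centers on Lambda functions behind API Gateway processing IoT edge data. As a rough, hedged sketch (not taken from the posting), a Lambda-style handler for such a reading could look like the following Python; the event shape, field names, and alert threshold are all illustrative assumptions:

```python
import json

def handler(event, context):
    """Minimal Lambda-style handler: parse an IoT edge reading from an
    API Gateway proxy event and return a JSON response.
    Field names and the threshold are illustrative assumptions."""
    body = json.loads(event.get("body") or "{}")
    reading = body.get("temperature_c")
    if reading is None:
        # API Gateway proxy integration expects statusCode + string body
        return {"statusCode": 400,
                "body": json.dumps({"error": "missing temperature_c"})}
    alert = reading > 75.0  # hypothetical alerting rule
    return {"statusCode": 200,
            "body": json.dumps({"alert": alert, "temperature_c": reading})}
```

The `{"statusCode": ..., "body": ...}` return shape follows the API Gateway proxy-integration contract; everything else here is a placeholder.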

Posted 2 months ago

Apply

12 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

About the Company

We are Mindsprint! A leading-edge technology and business services firm that provides impact-driven solutions to businesses, enabling them to outpace the speed of change. For over three decades we have been accelerating technology transformation for the Olam Group and their large base of global clients. Working with leading technologies and empowered with the freedom to create new solutions and better existing ones, we have been inspiring businesses with pioneering initiatives.

Awards bagged in recent years:
- Great Place To Work® Certified™ for 2023-2024
- Best Shared Services in India Award by Shared Services Forum - 2019
- Asia's No.1 Shared Services in Process Improvement and Value Creation by Shared Services and Outsourcing Network Forum - 2019
- International Innovation Award for Best Services and Solutions - 2019
- Kincentric Best Employer India - 2020
- Creative Talent Management Impact Award - SSON Impact Awards 2021
- The Economic Times Best Workplaces for Women - 2021 & 2022
- #SSFExcellenceAward for Delivering Business Impact through Innovative People Practices - 2022

For more info: https://www.mindsprint.org/
Follow us on LinkedIn: Mindsprint

Position: Associate Director

Responsibilities:
- Lead, mentor, and manage the Data Architects, Apps DBA, and DB Operations teams.
- Possess strong experience and a deep understanding of major RDBMS, NoSQL, and Big Data technologies, with expertise in system design and advanced troubleshooting in high-pressure production environments.
- Core technologies include SQL Server, PostgreSQL, MySQL, TigerGraph, Neo4j, Elasticsearch, ETL concepts, and a high-level understanding of data warehouse platforms such as Snowflake, ClickHouse, etc.
- Define, validate, and implement robust data models and database solutions for clients across sectors such as Agriculture, Supply Chain, and Life Sciences.
- Oversee end-to-end database resource provisioning in the cloud, primarily on Azure, covering IaaS, PaaS, and SaaS models, along with proactive cost management and optimization.
- Hands-on expertise in data migration strategies between on-premises and cloud environments, ensuring minimal downtime and secure transitions.
- Experienced in database performance tuning: identifying and resolving SQL code bottlenecks, code review, optimization for high throughput, and regular database maintenance including defragmentation.
- Solid understanding of High Availability (HA) and Disaster Recovery (DR) solutions, with experience in setting up failover, replication, backup, and recovery strategies.
- Expertise in implementing secure data protection measures such as encryption (at rest and in transit), data masking, access controls, and DLP strategies, and ensuring regulatory compliance with GDPR, PII, PCI-DSS, HIPAA, etc.
- Skilled in managing data integration, data movement, and reporting pipelines using tools like Azure Data Factory (ADF), Apache NiFi, and Talend.
- Fair understanding of database internals, storage engines, indexing strategies, and partitioning for optimal resource and performance management.
- Strong knowledge of Master Data Management (MDM), data cataloging, metadata management, and building comprehensive data lineage frameworks.
- Proven experience implementing monitoring and alerting for database health and capacity planning using tools like Azure Monitor, Grafana, or custom scripts.
- Exposure to DevOps practices for database management, including CI/CD pipelines for database deployments, version control of database schemas, and Infrastructure as Code (IaC) practices (e.g., Terraform, ARM templates).
- Experience collaborating with data analytics teams to provision optimized environments as data is shared between RDBMS, NoSQL, and Snowflake layers.
- Knowledge of security best practices for multi-tenant database environments and data segmentation strategies.
- Ability to guide the evolution of data governance frameworks, defining policies, standards, and best practices for database environments.

Job Location: Chennai
Notice Period: 15 days / immediate / currently serving notice period (max 30 days)
Shift: Day shift
Experience: Min 12 years
Work Mode: Hybrid
Grade: D1 Associate Director
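The listing above asks for data masking among its data-protection measures. As a hedged illustration of the idea (the masking rules, field names, and sample record are assumptions, not a compliance-grade implementation):

```python
import re

def mask_pii(record):
    """Illustrative data-masking pass: redact an email's local part and
    all but the last 4 digits of a phone number. The rules and field
    names here are assumptions for demonstration only."""
    masked = dict(record)  # leave the original record untouched
    if "email" in masked:
        local, _, domain = masked["email"].partition("@")
        masked["email"] = local[0] + "***@" + domain
    if "phone" in masked:
        digits = re.sub(r"\D", "", masked["phone"])  # strip non-digits
        masked["phone"] = "*" * (len(digits) - 4) + digits[-4:]
    return masked

print(mask_pii({"email": "alice@example.com", "phone": "+91 98765 43210"}))
```

In practice this kind of masking is usually done by the database engine (e.g., dynamic data masking features) rather than application code; the sketch only shows the transformation itself.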

Posted 2 months ago

Apply

4 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Title: Mid-Level Data Scientist / AI-ML Engineer
Location: On-site, Hyderabad
Experience: 4+ years of relevant experience

Job Summary: We are looking for a Data Scientist to join our team in Hyderabad. This role is ideal for candidates with 4+ years of experience and strong ML, GenAI, and cloud skills.

Must-Have Skills:
- Experience with GenAI: working with LLMs such as Llama, Mistral, DeepSeek, and GPTs
- Open-source GenAI tools such as Flowise/Langflow
- Knowledge graphs (e.g., Neo4j) and LangChain
- Solid grasp of machine learning algorithms and model deployment
- REST APIs and microservices (e.g., FastAPI, Flask, and Streamlit)
- Experience with AWS, GCP, or Azure (S3, Lambda, SageMaker, Bedrock, or Vertex AI)
- Containerization (Docker) and CI/CD basics
- SQL and NoSQL databases
- Data preprocessing & ETL pipelines
- Agentic AI, MLOps, prompt engineering, and RAG pipelines
- Git; vector DBs (Pinecone, FAISS, and ChromaDB)
- Experience integrating LLMs into applications
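The RAG pipelines and vector DBs mentioned above boil down to retrieving the documents most similar to a query before prompting an LLM. A toy sketch of that retrieval step, using bag-of-words counts in place of a real embedding model (the corpus and scoring are assumptions for illustration):

```python
import math
from collections import Counter

def embed(text):
    # Toy bag-of-words "embedding"; a real RAG pipeline would use a
    # sentence-embedding model and a vector DB (e.g., FAISS or ChromaDB).
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a if t in b)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, corpus, k=1):
    """Return the k corpus documents most similar to the query."""
    q = embed(query)
    ranked = sorted(corpus, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

docs = [
    "Neo4j is a graph database queried with Cypher",
    "FastAPI serves REST APIs in Python",
    "Docker packages applications into containers",
]
print(retrieve("graph database Cypher", docs))
```

The retrieved text would then be stuffed into the LLM prompt as context; swapping `embed` for a real model and the list scan for an approximate-nearest-neighbor index is what the vector DBs in the listing provide.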

Posted 2 months ago

Apply

0 years

0 Lacs

Kolkata, West Bengal, India

Remote

Unified Infotech is a 14-year-old, multi-award-winning digital transformation partner. We turbocharge business growth for Fortune 500 companies, multinational corporations (MNCs), small and medium-sized enterprises (SMEs), and startups using emerging tech and streamlined digital processes.

We're your go-to partner for:
- Digital Transformation; Custom Web, Mobile, and Desktop Software Development
- Digital Customer Experience - UX/UI Research & Design
- SaaS and Software Product Development
- IT Consulting & Staff Augmentation
- Software Modernization & Cloud Migration
- Data and Analytics
- Cloud Engineering

You can get more details about us from our website www.unifiedinfotech.net

Position Overview

We are looking for a highly skilled and experienced Solution Architect to join our team. This role is responsible for delivering both technical and functional expertise to clients across various projects. The ideal candidate will have a strong background in designing, implementing, and optimizing scalable and highly available cloud (SaaS) services and solutions. This role involves collaborating closely with business development, account management, and executive leadership teams to ensure that technical solutions align with business goals and are implemented seamlessly.

Key Responsibilities
- Solution Design & Development: Analyze client requirements and functional specifications, and collaborate with development teams to design and implement scalable, distributed cloud-based solutions (SaaS).
- Cloud Architecture: Lead the design and implementation of highly available, resilient, and efficient cloud architectures. Build complex distributed systems from the ground up with a focus on minimizing downtime, ensuring fail-proof deployments, and maintaining data integrity.
- Stakeholder Collaboration: Work closely with business development, account managers, and executive management to align technical solutions with business goals and increase overall company productivity and profitability.
- Database Expertise: Provide expertise in SQL and NoSQL databases such as MySQL, Oracle, MongoDB, Cassandra, Redis, and Neo4j to design efficient data models and schemas.
- Continuous Improvement: Evaluate and recommend improvements to current technologies and processes within the organization to drive greater efficiency and performance.
- Mentorship & Best Practices: Mentor development teams by guiding them in best practices for coding, architecture design, and software development methodologies.
- Version Control & CI/CD: Implement and manage version control systems (e.g., Git) and Continuous Integration/Continuous Deployment (CI/CD) pipelines to ensure smooth, efficient development workflows.
- Security & Compliance: Ensure that all solutions adhere to security best practices and comply with relevant standards to protect data and systems.
- Agile Methodology: Participate in Agile/Scrum development processes, collaborating with cross-functional teams to deliver high-quality solutions on time.
- Strategic Planning: Contribute to long-term architectural strategy, identifying areas for improvement and ensuring solutions meet business requirements and performance goals.

Desired Candidate Profile
- Experience: Proven experience in solution architecture, design, and implementation of scalable cloud-based solutions (SaaS). Hands-on experience with high availability and distributed systems is essential.
- Technical Skills:
  - Strong proficiency in SQL and NoSQL databases (e.g., MySQL, MongoDB, Cassandra, Neo4j, Redis).
  - Expertise in cloud architectures, distributed systems, and high-performance computing.
  - Proficient in version control systems, particularly Git.
  - Familiarity with CI/CD processes and pipeline automation.
  - Understanding of web application security principles.
- Programming & Frameworks: Experience with technologies and frameworks such as NodeJS, Laravel, Spring, Angular, React, or similar is highly desirable.
- Leadership & Mentorship: Strong ability to mentor and guide technical teams in adopting best practices and delivering high-quality solutions.
- Methodology: Practical experience in Agile/Scrum development methodologies with a collaborative approach to team success.
- Communication: Excellent communication skills, with the ability to effectively present complex technical concepts to both technical and non-technical stakeholders.

Posted 2 months ago

Apply

3 years

0 Lacs

Greater Kolkata Area

On-site

This role is for one of Weekday's clients.

Salary range: Rs 600000 - Rs 1700000 (i.e., INR 6-17 LPA)
Min Experience: 3 years
Location: Bangalore, Chennai, Pune, Kolkata, Gurugram
Job Type: Full-time
Experience: 6+ years in IT, with 3+ years in Data Warehouse/ETL projects

Primary Responsibilities:
- Design and develop modern data warehouse solutions using Snowflake, Databricks, and Azure Data Factory (ADF).
- Deliver forward-looking data engineering and analytics solutions that scale with business needs.
- Work with DW/BI leads to gather and implement requirements for new ETL pipelines.
- Troubleshoot and resolve issues in existing pipelines, identifying root causes and implementing fixes.
- Partner with business stakeholders to understand reporting requirements and build corresponding data models.
- Provide technical mentorship to junior team members and assist with issue resolution.
- Engage in technical discussions with client architects and team members to align on best practices.
- Orchestrate data workflows using scheduling tools like Apache Airflow.

Qualifications:
- Bachelor's or Master's degree in Computer Science or a related field.
- Expertise in Snowflake, including security, SQL, and object design/implementation.
- Proficient with Snowflake tools such as SnowSQL, Snowpipe, Snowsight, and Snowflake connectors.
- Strong understanding of Star and Snowflake schema modeling.
- Deep knowledge of data management principles and data warehousing.
- Experience with Databricks and a solid grasp of Delta Lake architecture.
- Hands-on with SQL and Spark (preferably PySpark).
- Experience developing ETL processes and transformations for data warehousing solutions.
- Familiarity with NoSQL and open-source databases such as MongoDB, Cassandra, or Neo4j.
- Exposure to structured and unstructured data, including imaging and geospatial formats.
- Proficient in DevOps tools and practices including Terraform, CircleCI, and Git.
- Strong background in RDBMS, PL/SQL, Unix shell scripting, and query performance tuning.
- Databricks Certified Data Engineer Associate/Professional certification is a plus.
- Ability to thrive in a fast-paced, dynamic environment managing multiple projects.
- Experience working within Agile development frameworks.
- Excellent communication, analytical, and problem-solving skills with strong attention to detail.

Mandatory Skills: Snowflake, Azure Data Factory, PySpark, Databricks, SQL, Python
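The Star schema modeling asked for above is a fact table of measures surrounded by dimension tables of descriptive attributes. A minimal, self-contained sketch using Python's built-in sqlite3 (table and column names are illustrative, not from the posting):

```python
import sqlite3

# Toy star schema: one fact table referencing two dimension tables.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE dim_date    (date_id INTEGER PRIMARY KEY, iso_date TEXT);
CREATE TABLE fact_sales  (
    product_id INTEGER REFERENCES dim_product(product_id),
    date_id    INTEGER REFERENCES dim_date(date_id),
    amount     REAL
);
INSERT INTO dim_product VALUES (1, 'widget'), (2, 'gadget');
INSERT INTO dim_date    VALUES (10, '2024-01-01'), (11, '2024-01-02');
INSERT INTO fact_sales  VALUES (1, 10, 100.0), (1, 11, 50.0), (2, 10, 75.0);
""")

# Typical star-schema query: join fact to dimension, aggregate measures.
rows = con.execute("""
    SELECT p.name, SUM(f.amount)
    FROM fact_sales f JOIN dim_product p USING (product_id)
    GROUP BY p.name ORDER BY p.name
""").fetchall()
print(rows)  # [('gadget', 75.0), ('widget', 150.0)]
```

A Snowflake schema would further normalize the dimensions (e.g., `dim_product` pointing at a `dim_category` table); the fact table stays the same.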

Posted 2 months ago

Apply

3 - 8 years

6 - 10 Lacs

Chennai

Work from Office

Overview
- Java development with hands-on experience in Spring Boot.
- Strong knowledge of UI frameworks, particularly Angular, for developing dynamic, interactive web applications.
- Experience with Kubernetes for managing microservices-based applications in a cloud environment.
- Familiarity with Postgres (relational) and Neo4j (graph database) for managing complex data models.
- Experience in metadata modeling and designing data structures that support high performance and scalability.
- Expertise in Camunda BPMN and business process automation.
- Experience implementing rules with the Drools rules engine.
- Knowledge of Unix/Linux systems for application deployment and management.
- Experience building data ingestion frameworks to process and handle large datasets.

Responsibilities and Requirements
As listed in the overview above.

Posted 2 months ago

Apply

2 - 7 years

4 - 8 Lacs

Chennai

Work from Office

Overview
- Java development with hands-on experience in Spring Boot.
- Strong knowledge of UI frameworks, particularly Angular, for developing dynamic, interactive web applications.
- Experience with Kubernetes for managing microservices-based applications in a cloud environment.
- Familiarity with Postgres (relational) and Neo4j (graph database) for managing complex data models.
- Experience in metadata modeling and designing data structures that support high performance and scalability.
- Expertise in Camunda BPMN and business process automation.
- Experience implementing rules with the Drools rules engine.
- Knowledge of Unix/Linux systems for application deployment and management.
- Experience building data ingestion frameworks to process and handle large datasets.

Responsibilities and Requirements
As listed in the overview above.

Posted 2 months ago

Apply

6 - 11 years

6 - 9 Lacs

Chennai

Work from Office

Overview
- Java development with hands-on experience in Spring Boot.
- Strong knowledge of UI frameworks, particularly Angular, for developing dynamic, interactive web applications.
- Experience with Kubernetes for managing microservices-based applications in a cloud environment.
- Familiarity with Postgres (relational) and Neo4j (graph database) for managing complex data models.
- Experience in metadata modeling and designing data structures that support high performance and scalability.
- Expertise in Camunda BPMN and business process automation.
- Experience implementing rules with the Drools rules engine.
- Knowledge of Unix/Linux systems for application deployment and management.
- Experience building data ingestion frameworks to process and handle large datasets.

Responsibilities
- Metadata Modeling: Develop and implement metadata models that represent complex data structures and relationships across the system. Collaborate with cross-functional teams to design flexible, efficient, and scalable metadata models that support application and data processing requirements.
- Software Development (Java & Spring Boot): Develop high-quality, efficient, and scalable Java applications using Spring Boot and other Java-based frameworks. Participate in the full software development lifecycle: design, coding, testing, deployment, and maintenance. Optimize Java applications for performance and scalability.
- UI Development (Angular, optional): Design and implement dynamic, responsive, and user-friendly web UIs using Angular. Integrate the UI with backend microservices, ensuring a seamless and efficient user experience. Ensure that the UI adheres to best practices for accessibility, security, and usability.
- Containerization & Microservices (Kubernetes): Design, develop, and deploy microservices using Kubernetes to ensure high availability and scalability of applications. Use Docker containers and Kubernetes for continuous deployment and automation of the application lifecycle. Maintain and troubleshoot containerized applications in a cloud or on-premise Kubernetes environment.

Requirements
- Database Management (Postgres & Neo4j): Design and implement database schemas and queries for both relational databases (Postgres) and graph databases (Neo4j). Develop efficient data models and support high-performance query optimization. Collaborate with the data engineering team to integrate data pipelines and ensure the integrity of data storage.
- Business Process Modeling (BPMN): Utilize BPMN to model business processes and workflows. Design and optimize process flows to improve operational efficiency. Work with stakeholders to understand business requirements and implement process automation.
- Rules Engine (Drools): Implement business logic using the Drools rules engine to automate decision-making processes. Work with stakeholders to design and define business rules and integrate them into applications.
- Ingestion Framework: Build and maintain robust data ingestion frameworks that process large volumes of data efficiently. Ensure proper data validation, cleansing, and enrichment during the ingestion process.
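The ingestion-framework requirement above centers on validating, cleansing, and enriching records as they arrive. A minimal sketch of that pattern in Python (field names, types, and the rejection rule are assumptions for illustration; the posting's actual stack is Java/Spring Boot):

```python
def clean_record(raw):
    """Validate and cleanse one ingested record; return None to reject.
    Field names and rules are illustrative assumptions."""
    try:
        rec = {
            "id": int(raw["id"]),                          # type coercion
            "name": str(raw.get("name", "")).strip().title(),  # cleansing
            "amount": float(raw["amount"]),
        }
    except (KeyError, ValueError, TypeError):
        return None  # structurally invalid record
    if rec["amount"] < 0:  # hypothetical business rule
        return None
    return rec

def ingest(records):
    """Run every record through validation; report how many were rejected."""
    cleaned = [clean_record(r) for r in records]
    good = [r for r in cleaned if r is not None]
    return good, len(records) - len(good)

good, rejected = ingest([
    {"id": "1", "name": " alice ", "amount": "10.5"},
    {"id": "x", "name": "bad", "amount": "1"},    # invalid id
    {"id": "2", "name": "bob", "amount": "-3"},   # negative amount
])
print(good, rejected)
```

Production frameworks add dead-letter queues for rejected records, schema registries, and enrichment lookups, but the validate-or-reject core is the same.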

Posted 2 months ago

Apply

7 - 10 years

0 Lacs

Bengaluru, Karnataka

Work from Office

About Us: Data Scientist - 3 - Kotak811

Kotak811 is a neobank incubated by Kotak Mahindra Bank, with a view to providing completely digitized banking services in the convenience of the customer's mobile phone. 811 is an early mover in the Indian fintech space that started off as a downloadable savings bank account in 2017, post demonetization, when India took one step closer to a digital economy.

The Data Scientist-3 in Bangalore (or Mumbai) will be part of the 811 Data Strategy Group, which comprises Data Engineers, Data Scientists, and Data Analytics professionals. He/she will be associated with one of the key functional areas such as Product Strategy, Cross-Sell, Asset Risk, Fraud Risk, or Customer Experience, and help build robust and scalable solutions that are deployed for real-time or near-real-time consumption and integrated into our proprietary Customer Data Platform (CDP). This is an exciting opportunity to work on data-driven analytical solutions and have a profound influence on the growth trajectory of a fast-evolving digital product.

Key Requirements of the Role
- Advanced degree in an analytical field (e.g., Data Science, Computer Science, Engineering, Applied Mathematics, Statistics, Data Analysis) or substantial hands-on work experience in the space.
- 7 - 10 years of relevant experience in the space.
- Expertise in mining AI/ML opportunities from open-ended business problems and driving solution design/development while closely collaborating with engineering, product, and business teams.
- Strong understanding of advanced data mining techniques, curating, processing, and transforming data to produce sound datasets. Strong experience in NLP, time series forecasting, and recommendation engines preferred.
- Ability to create great data stories, with expertise in robust EDA and statistical inference. Should have at least a foundational understanding of experimentation design.
- Strong understanding of the machine learning lifecycle: feature engineering, training, validation, scaling, deployment, scoring, monitoring, and feedback loop. Exposure to deep learning applications and tools like TensorFlow, Theano, Torch, or Caffe preferred.
- Experience with analytical programming languages, tools, and libraries (Python a must) as well as shell scripting. Should be proficient in developing production-ready code as per best practices. Experience using Scala/Java/Go-based libraries a big plus.
- Very proficient in SQL and other relational databases, along with PySpark or Spark SQL. Proficient in using NoSQL databases; experience using graph DBs like Neo4j a plus. Should be able to handle unstructured data with ease.
- Experience working with MLEs and proficiency in using MLOps tools, consuming their capabilities with a deep understanding of the deployment lifecycle. Experience in CI/CD deployment is a big plus.
- Knowledge of key concepts in distributed systems, like replication, serialization, and concurrency control, a big plus.
- Good understanding of programming best practices and building code artifacts for reuse. Should be comfortable with version control and collaborating in tools like Git.
- Ability to create frameworks that perform model RCAs using analytical and interpretability tools. Should be able to peer-review model documentation/code bases and find opportunities.
- Experience in end-to-end delivery of AI-driven solutions (deep learning, traditional data science projects).
- Strong communication, partnership, and teamwork skills. Should be able to guide and mentor teams while leading by example, and be an integral part of creating a team culture focused on collaboration, technical expertise, and partnerships with other teams.
- Ability to work in an extremely fast-paced environment, meet deadlines, and perform at high standards with limited supervision.
- A self-starter looking to build from the ground up and contribute to the making of a potential big name in the space.
- Experience in banking and financial services is a plus; however, sound logical reasoning and first-principles problem solving are even more critical.

A typical day in the life of the job role:
1. As a key partner at the table, attend key meetings with the business team to bring the data perspective to the discussions.
2. Perform comprehensive data explorations to generate inquisitive insights and scope out the problem.
3. Develop simplistic to advanced solutions to address the problem at hand. We believe in making swift (albeit sometimes marginal) impact to business KPIs and hence adopt an MVP approach to solution development.
4. Build reusable code analytical frameworks to address commonly occurring business questions.
5. Perform 360-degree customer profiling and opportunity analyses to guide new product strategy. This is a nascent business, so opportunities to guide business strategy are plenty.
6. Guide team members on data science and analytics best practices to help them overcome bottlenecks and challenges.
7. The role will be approximately 60% IC and 40% leading, and the ratio can vary based on need and fit.
8. Develop Customer-360 features that will be integrated into the Customer Data Platform (CDP) to enhance the single view of our customer.

Website: https://www.kotak811.com/
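The experimentation design and statistical inference the role above calls for often reduce, in the simplest A/B-test case, to comparing two conversion rates. A minimal sketch using only the standard library (the sample numbers are made up; real analysis would use statsmodels or scipy and account for test design):

```python
import math

def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference in conversion rates, the kind
    of check used when reading out an A/B experiment. A minimal sketch
    under a pooled-variance normal approximation."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Normal CDF via erf: Phi(x) = 0.5 * (1 + erf(x / sqrt(2)))
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical readout: 12.0% vs 15.0% conversion on 1000 users each.
z, p = two_proportion_ztest(conv_a=120, n_a=1000, conv_b=150, n_b=1000)
print(round(z, 2), round(p, 4))
```

The foundational-understanding bar in the listing is roughly: knowing why sample size, the pooled standard error, and the two-sided p-value matter before declaring variant B a winner.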

Posted 2 months ago

Apply

8 - 10 years

25 - 30 Lacs

Chennai, Pune, Delhi

Work from Office

Description:

ACCOUNTABILITIES
- Provides database operations support, monitoring databases and backups, resolving repetitive and simple events, and escalating more complex incidents according to the standard operating procedures.
- Provides production database support with specific focus on availability, capacity, performance, security, and recoverability.
- Performs database installation and configuration, tuning, capacity planning, health checks, backup & recovery, and change management according to documented standards.

Additional Details:
- India general shift (IST); need to join the global meeting during the evening IST hours.
- Candidate needs to be onsite at the Dell Bangalore or Dell Hyderabad office location.

Education and Experience:
- Bachelor's degree in computer science or a related technical discipline, or the equivalent combination of education, technical training, or work experience.
- Requires 8-10 years of related experience in the design, maintenance, and administration of NoSQL databases - Elasticsearch, Neo4j, Cassandra, SingleStore, etc.
- Hands-on experience in Ansible automation development for NoSQL DB platform provisioning, DB installation, upgrade/patching, and DBA administration activities.
- Deep understanding of DB cluster management, replication, and multi-datacenter configuration.
- Strong knowledge of monitoring, management, capacity planning, and compaction strategy.
- Good knowledge of database backup and recovery, connectivity and security, and role management.
- Ability to express complex technical concepts effectively, both verbally and in writing.
- Ability to work well with people from many different disciplines with varying degrees of technical experience.
- Must be versatile, flexible, and proactive when resolving technical issues.
- Excellent interpersonal communication skills.

Professional Certifications: Ansible automation development; Elasticsearch / Neo4j / Cassandra / SingleStore certification is preferred.

Posted 3 months ago

Apply

2 - 4 years

15 - 15 Lacs

Hyderabad

Work from Office

We are seeking a highly skilled AI/ML Engineer to join our dynamic team and build the next generation of applications for our global customers. If you are a passionate technology enthusiast, we are eager to discuss the potential role with you.

Responsibilities:
- Implement and deploy Machine Learning solutions that solve complex problems and deliver real business value, i.e. revenue, engagement, and customer satisfaction.
- Collaborate with data product managers, software engineers and SMEs to identify AI/ML opportunities for improving process efficiency.
- Develop production-grade ML models to enhance customer experience, content recommendation, content generation, and predictive analysis.
- Monitor and improve model performance via data enhancement, feature engineering, experimentation and online/offline evaluation.
- Stay up to date with the latest in machine learning and artificial intelligence, and influence AI/ML for the life sciences industry.

Requirements
- 2-4 years of experience in AI/ML engineering, with a track record of handling increasingly complex projects.
- Strong programming skills in Python and Rust.
- Experience with Pandas, NumPy, SciPy, and OpenCV (for image processing).
- Experience with ML frameworks such as scikit-learn, TensorFlow, and PyTorch.
- Experience with GenAI tools such as LangChain, LlamaIndex, and open-source vector DBs.
- Experience with one or more graph DBs: Neo4j, ArangoDB.
- Experience with MLOps platforms such as Kubeflow or MLflow.
- Expertise in one or more of the following AI/ML domains: Causal AI, Reinforcement Learning, Generative AI, NLP, Dimension Reduction, Computer Vision, Sequential Models.
- Expertise in building, deploying, measuring, and maintaining machine learning models to address real-world problems.
- Thorough understanding of the software product development lifecycle, DevOps (build, continuous integration, deployment tools) and best practices.
- Excellent written and verbal communication skills and interpersonal skills.
- Advanced degree in Computer Science, Machine Learning or a related field.

Benefits

We're on a Mission: In 2005, we disrupted the life sciences industry by introducing the world's first digital validation lifecycle management system. ValGenesis VLMS® revolutionized compliance-based corporate validation activities and has remained the industry standard. Today, we continue to push the boundaries of innovation, enhancing and expanding our portfolio beyond validation with an end-to-end digital transformation platform. We combine our purpose-built systems with world-class consulting services to help every facet of GxP meet evolving regulations and quality expectations.

The Team You'll Join: Our customers' success is our success. We keep the customer experience centered in our decisions, from product to marketing to sales to services to support. Life sciences companies exist to improve humanity's quality of life, and we honor that mission. We work together: we communicate openly, support each other without reservation, and never hesitate to wear multiple hats to get the job done. We think big: innovation is the heart of ValGenesis; that spirit drives product development as well as personal growth, and we never stop aiming upward. We're in it to win it: we're on a path to becoming the number one intelligent validation platform in the market, and we won't settle for anything less than being a market leader.

How We Work: Our Chennai and Bangalore offices are onsite, 5 days per week. We believe that in-person interaction and collaboration foster creativity and a sense of community, and are critical to our future success as a company. ValGenesis is an equal-opportunity employer that makes employment decisions on the basis of merit.
Our goal is to have the best-qualified people in every job. All qualified applicants will receive consideration for employment without regard to race, religion, sex, sexual orientation, gender identity, national origin, disability, or any other characteristics protected by local law.
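The "online/offline evaluation" responsibility above can be made concrete with a minimal sketch (an assumed example, not ValGenesis code): offline evaluation of a binary classifier typically reports precision, recall and F1 computed from held-out labels.

```python
# Minimal sketch of offline binary-classifier evaluation:
# precision, recall and F1 from true labels and predictions.

def precision_recall_f1(y_true, y_pred):
    """y_true, y_pred: sequences of 0/1 labels of equal length."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1
```

In practice a library such as scikit-learn (listed in the requirements) provides these metrics, but the arithmetic is exactly this.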

Posted 3 months ago

Apply

10.0 - 18.0 years

0 Lacs

Vadodara, Gujarat, India

On-site

Building on our past. Ready for the future. Worley is a global professional services company of energy, chemicals and resources experts. We partner with customers to deliver projects and create value over the life of their assets. We're bridging two worlds, moving towards more sustainable energy sources while helping to provide the energy, chemicals and resources needed now.

The Role

As a Data Science Lead with Worley, you will work closely with our existing team to deliver projects for our clients while continuing to develop your skills and experience.
- Conceptualise, build and manage an AI/ML platform (with a focus on unstructured data) by evaluating and selecting best-in-industry AI/ML tools and frameworks.
- Drive and take ownership of developing cognitive solutions for internal stakeholders and external customers.
- Conduct research in areas such as Explainable AI, image segmentation, 3D object detection and statistical methods.
- Evaluate not only algorithms and models but also the tools and technologies available in the market, to maximise organisational spend.
- Utilise existing frameworks, standards and patterns to create the architectural foundation and services necessary for AI/ML applications that scale from multi-user to enterprise class.
- Analyse marketplace trends (economic, social, cultural and technological) to identify opportunities and create value propositions.
- Offer a global perspective in stakeholder discussions and when shaping solutions/recommendations.

IT Skills & Experience
- Thorough understanding of the complete AI/ML project life cycle, to establish processes and provide guidance and expert support to the team.
- Expert knowledge of emerging technologies in Deep Learning and Reinforcement Learning.
- Knowledge of MLOps processes for efficient management of AI/ML projects.
- Must have led project execution with other data scientists/engineers on large and complex data sets.
- Understanding of machine learning algorithms such as k-NN, GBM, Neural Networks, Naive Bayes, SVM, and Decision Forests.
- Experience with AI/ML components such as JupyterHub, Zeppelin Notebook, Azure ML Studio, Spark MLlib, TensorFlow, Keras, PyTorch and scikit-learn.
- Strong knowledge of deep learning, with special focus on CNN/R-CNN/LSTM/Encoder/Transformer architectures.
- Hands-on experience with large networks such as Inception-ResNet and ResNeXt-50.
- Demonstrated capability using RNNs for text and speech data, and generative models.
- Working knowledge of NoSQL (GraphX/Neo4j), document, columnar and in-memory database models.
- Working knowledge of ETL tools and techniques, such as Talend, SAP BI Platform/SSIS or MapReduce.
- Experience in building KPI/storytelling dashboards on visualization tools like Tableau or Zoomdata.

People Skills
- Professional and open communication to all internal and external interfaces.
- Ability to communicate clearly and concisely, and a flexible mindset to handle a quickly changing culture.
- Strong analytical skills.

Industry Specific Experience
- 10-18 years of experience in AI/ML project execution and AI/ML research.

Education - Qualifications, Accreditation, Training
- Master's or Doctorate degree in Computer Science Engineering / Information Technology / Artificial Intelligence.

Moving forward together: We're committed to building a diverse, inclusive and respectful workplace where everyone feels they belong, can bring themselves, and are heard. We provide equal employment opportunities to all qualified applicants and employees without regard to age, race, creed, color, religion, sex, national origin, ancestry, disability status, veteran status, sexual orientation, gender identity or expression, genetic information, marital status, citizenship status or any other basis as protected by law. We want our people to be energized and empowered to drive sustainable impact. So, our focus is on a values-inspired culture that unlocks brilliance through belonging, connection and innovation. And we're not just talking about it; we're doing it. We're reskilling our people, leveraging transferable skills, and supporting the transition of our workforce to become experts in today's low carbon energy infrastructure and technology. Whatever your ambition, there's a path for you here. And there's no barrier to your potential career success. Join us to broaden your horizons, explore diverse opportunities, and be part of delivering sustainable change.

Company: Worley
Primary Location: IND-MM-Navi Mumbai
Other Locations: IND-KR-Bangalore, IND-MM-Mumbai, IND-MM-Pune, IND-TN-Chennai, IND-GJ-Vadodara, IND-AP-Hyderabad, IND-WB-Kolkata
Job: Digital Platforms & Data Science
Schedule: Full-time
Employment Type: Employee
Job Level: Experienced
Job Posting: Jul 4, 2025
Unposting Date: Aug 3, 2025
Reporting Manager Title: Head of Data Intelligence
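One of the algorithms named in the requirements, k-NN, is simple enough to sketch in full (a toy illustration under assumed inputs, not Worley code): classify a query point by majority vote among its k closest training examples under Euclidean distance.

```python
# Toy k-nearest-neighbours classifier: majority vote among the
# k training examples closest to the query (Euclidean distance).
from collections import Counter
import math

def knn_predict(train, query, k=3):
    """train: list of (feature_tuple, label) pairs; query: feature tuple."""
    # Sort training examples by distance to the query, keep the k nearest.
    neighbours = sorted(train, key=lambda ex: math.dist(ex[0], query))[:k]
    # Majority vote over the neighbours' labels.
    votes = Counter(label for _, label in neighbours)
    return votes.most_common(1)[0][0]
```

Production systems would use an indexed implementation (e.g. scikit-learn's KNeighborsClassifier) rather than this O(n log n) scan, but the decision rule is the same.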

Posted Date not available

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies