5.0 - 9.0 years
10 - 16 Lacs
Pune, Greater Noida, Delhi / NCR
Work from Office
Responsibilities:
- Create and optimize complex SPARQL (SPARQL Protocol and RDF Query Language) queries to retrieve and analyze data from graph databases.
- Develop graph-based applications and models to solve real-world problems and extract valuable insights from data.
- Design, develop, and maintain scalable data pipelines in Python, using REST APIs to pull data from different cloud platforms.
- Study and understand the nodes, edges, and properties in graphs used to represent and store data in graph databases.
- Write clean, efficient, and well-documented code; troubleshoot and fix bugs.
- Collaborate with other developers and stakeholders to deliver high-quality solutions.
- Stay up to date with the latest technologies and trends.
Qualifications:
- Strong proficiency in SPARQL and the RDF data model.
- Strong proficiency in Python and REST APIs.
- Experience with database technologies (SQL and SPARQL).
- Strong problem-solving and debugging skills.
- Ability to work independently and as part of a team.
Preferred Skills:
- Knowledge of cloud platforms such as AWS, Azure, or GCP.
- Experience with version control systems such as GitHub.
- Understanding of environments, deployment processes, and cloud infrastructure.
Share your resume at Aarushi.Shukla@coforge.com if you are an early or immediate joiner.
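To make the graph-querying responsibility concrete, here is a minimal sketch of SPARQL-style basic graph pattern matching over an in-memory triple set. It is illustrative only: a real deployment would query a triple store (e.g. GraphDB) over HTTP, and the data and names below are invented for the example.

```python
# Tiny in-memory triple store with SPARQL-style pattern matching.
# '?x'-prefixed terms act as variables, plain terms must match exactly.

def match(triples, pattern):
    """Return variable bindings for one (s, p, o) pattern."""
    results = []
    for s, p, o in triples:
        binding = {}
        ok = True
        for term, value in zip(pattern, (s, p, o)):
            if term.startswith("?"):
                if term in binding and binding[term] != value:
                    ok = False
                    break
                binding[term] = value
            elif term != value:
                ok = False
                break
        if ok:
            results.append(binding)
    return results

triples = [
    ("alice", "worksFor", "coforge"),
    ("bob", "worksFor", "coforge"),
    ("alice", "knows", "bob"),
]

# Analogous to: SELECT ?person WHERE { ?person :worksFor :coforge }
employees = match(triples, ("?person", "worksFor", "coforge"))
print(sorted(b["?person"] for b in employees))  # ['alice', 'bob']
```

In a real SPARQL engine, multiple such patterns are joined on shared variables; this sketch shows only a single pattern.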
Posted 3 weeks ago
5.0 - 10.0 years
15 - 27 Lacs
Pune
Work from Office
Hi, wishes from GSN! Pleasure connecting with you.
About the job: This is a Neo4j Developer opportunity with a leading bootstrapped product company, a valued client of GSN HR.
WORK LOCATION: Pune
JOB ROLE: Neo4j Developer
EXPERIENCE: 5+ Yrs
CTC Range: 15 - 30 LPA
WORK TYPE: Work from Office
Key Responsibilities:
- Neo4j expertise: proven experience with Neo4j, including its core concepts, the Cypher query language, and best practices.
- Designing and implementing graph database solutions: creating and maintaining graph schemas, models, and architectures.
- Familiarity with graph theory, graph data modelling, and other graph database technologies.
- Developing and optimizing Cypher queries.
- Integrating Neo4j with BI and other systems.
- Providing technical guidance to junior developers.
- Creating and maintaining documentation for system architecture, design, and operational processes.
If interested, click Apply now for an IMMEDIATE response.
Best, KAVIYA
GSN HR | Kaviya@gsnhr.net | Google review: https://g.co/kgs/UAsF9W
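As a flavor of the Cypher development mentioned above, here is a hedged sketch of building a parameterized Cypher read query. The query text is real Cypher, but the node labels, relationship types, and graph contents are invented for illustration; the actual driver call is shown only in a comment because it needs a running Neo4j server.

```python
# Build a parameterized Cypher query that finds people who worked on
# the same things as a given person (labels/relationships are invented).

def coauthor_query(name: str):
    """Return (query, params) for a collaborator lookup."""
    query = (
        "MATCH (p:Person {name: $name})-[:WORKED_ON]->(x)"
        "<-[:WORKED_ON]-(other:Person) "
        "RETURN DISTINCT other.name AS name"
    )
    return query, {"name": name}

query, params = coauthor_query("Alice")
print(params)  # {'name': 'Alice'}

# Against a live instance it would run roughly like this (requires the
# `neo4j` Python driver and a server, so it is not executed here):
#
#   from neo4j import GraphDatabase
#   with GraphDatabase.driver(uri, auth=(user, pwd)) as driver:
#       result = driver.execute_query(query, params)
```

Parameterized queries (the `$name` placeholder) are preferable to string interpolation: the server can cache the query plan and injection is avoided.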
Posted 3 weeks ago
4.0 - 9.0 years
10 - 19 Lacs
Pune, Greater Noida, Delhi / NCR
Work from Office
Responsibilities:
- Create and optimize complex SPARQL (SPARQL Protocol and RDF Query Language) queries to retrieve and analyze data from graph databases.
- Develop graph-based applications and models to solve real-world problems and extract valuable insights from data.
- Design, develop, and maintain scalable data pipelines in Python, using REST APIs to pull data from different cloud platforms.
- Study and understand the nodes, edges, and properties in graphs used to represent and store data in graph databases.
Mandatory Skills: Python, RDF, Neo4j, GraphDB, version control systems, API frameworks.
Qualifications:
- Strong proficiency in SPARQL and the RDF data model.
- Strong proficiency in Python and REST APIs.
- Experience with database technologies (SQL and SPARQL).
Preferred Skills:
- Knowledge of cloud platforms such as AWS, Azure, or GCP.
- Experience with version control systems such as GitHub.
- Understanding of environments, deployment processes, and cloud infrastructure.
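A common step in the pipelines described above is flattening relational-style records into RDF-like (subject, predicate, object) triples before loading them into a graph store. The sketch below illustrates that shape; the field names and `employee:` prefix are invented for the example.

```python
# Convert record dicts into (subject, predicate, object) triples,
# keyed on an identifier column.

def rows_to_triples(rows, key="id"):
    """One triple per non-key field of each record."""
    triples = []
    for row in rows:
        subject = f"employee:{row[key]}"
        for field, value in row.items():
            if field != key:
                triples.append((subject, field, value))
    return triples

rows = [
    {"id": 1, "name": "Asha", "city": "Pune"},
    {"id": 2, "name": "Ravi", "city": "Delhi"},
]
triples = rows_to_triples(rows)
print(len(triples))  # 4
```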
Posted 3 weeks ago
6.0 - 11.0 years
25 - 30 Lacs
Chennai
Work from Office
Seeking a GenAI Software Engineer with 5+ years' experience to develop LLM-based applications, RAG, and AI agents. Must be skilled in LangChain, vector databases, and cloud-based AI deployment in agile teams.
Required Candidate Profile: Experienced AI developer with expertise in GenAI, LLM apps, and RAG. Proficient in LangChain, vector DBs, and cloud AI services. Strong collaboration and agile development background preferred.
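The retrieval step at the heart of RAG can be sketched in a few lines. This toy version uses bag-of-words cosine similarity in place of learned embeddings; a production system would use an embedding model plus a vector database (the documents below are invented).

```python
# Minimal RAG retrieval: score documents against a query and return
# the top-k as context for the LLM prompt.
from collections import Counter
from math import sqrt

def similarity(a: str, b: str) -> float:
    """Cosine similarity between bag-of-words vectors."""
    va, vb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(va[w] * vb[w] for w in va)
    norm = sqrt(sum(c * c for c in va.values())) * sqrt(sum(c * c for c in vb.values()))
    return dot / norm if norm else 0.0

def retrieve(query: str, docs: list[str], k: int = 1) -> list[str]:
    """Return the k documents most similar to the query."""
    return sorted(docs, key=lambda d: similarity(query, d), reverse=True)[:k]

docs = [
    "Invoices are processed within five business days.",
    "The vacation policy allows twenty days per year.",
]
context = retrieve("how many vacation days do I get", docs)
print(context[0])  # the vacation-policy document
```

Frameworks like LangChain wrap exactly this pattern: retrieve top-k chunks, then stuff them into the prompt sent to the LLM.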
Posted 4 weeks ago
9.0 - 13.0 years
30 - 45 Lacs
Noida
Work from Office
Job Summary: The role involves designing and developing solutions to support business needs. Optimizing and tuning existing programs and developing new routines will be an integral part of the profile.
Key Responsibility Areas:
- Architect, design, and develop solutions to support business requirements.
- Analyze and manage a variety of database environments such as Oracle, PostgreSQL, Cassandra, MySQL, graph databases, etc.
- Provide optimal design of database environments, analyse complex distributed production deployments, and make recommendations to optimize performance.
- Work closely with programming teams to deliver high-quality software.
- Provide innovative solutions to complex business and technology problems.
- Propose the best solutions in logical and physical data modelling.
- Perform administration tasks, including DB resource planning and DB tuning.
- Mentor and train junior developers; lead and manage teams.
Skill Sets / Requirements:
- Experience designing/architecting database solutions.
- Experience with multiple RDBMS and NoSQL databases at TB data sizes, preferably Oracle, PostgreSQL, Cassandra, and graph databases.
- Must be well versed in PL/SQL and PostgreSQL, with strong query optimization skills.
- Expert knowledge of DB installation, configuration, replication, upgrades, security, and HA/DR setup.
- Experience in database deployment, performance tuning, and troubleshooting.
- Knowledge of scripting languages (such as Unix shell, PHP); advanced knowledge of PostgreSQL preferred.
- Experience working with cloud platforms and services.
- Experience migrating database environments from one platform to another.
- Ability to work well under pressure.
- Experience with big data technologies and DWH is a plus.
Experience: 10+ years in a Data Engineering role. Bachelor's degree in Computer Science or related experience.
Qualification: B.Tech.
Location: Noida Sector 135
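The query-optimization skill called for above follows the same workflow everywhere: inspect the plan, add or adjust an index, inspect again. The demo below uses SQLite's `EXPLAIN QUERY PLAN` (the analogue of `EXPLAIN`/`EXPLAIN ANALYZE` in PostgreSQL or Oracle); the table and data are invented.

```python
# Show a query plan changing from a full table scan to an index lookup.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (id INTEGER, customer TEXT, total REAL)")
con.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(i, f"cust{i % 100}", i * 1.5) for i in range(1000)],
)

def plan(sql):
    """Concatenate the 'detail' column of EXPLAIN QUERY PLAN output."""
    return " ".join(row[3] for row in con.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT total FROM orders WHERE customer = 'cust7'"
print(plan(query))   # e.g. "SCAN orders" (full table scan)

con.execute("CREATE INDEX idx_customer ON orders(customer)")
print(plan(query))   # now e.g. "SEARCH orders USING INDEX idx_customer (customer=?)"
```

The same discipline applies at TB scale: read the plan first, then index, rather than guessing.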
Posted 1 month ago
8.0 - 11.0 years
35 - 37 Lacs
Kolkata, Ahmedabad, Bengaluru
Work from Office
Dear Candidate,
We are looking for a Svelte Developer to build lightweight, reactive web applications with excellent performance and maintainability.
Key Responsibilities:
- Design and implement applications using Svelte and SvelteKit.
- Build reusable components and libraries for future use.
- Optimize applications for speed and responsiveness.
- Collaborate with design and backend teams to create cohesive solutions.
Required Skills & Qualifications:
- 8+ years of experience with Svelte or similar reactive frameworks.
- Strong understanding of JavaScript, HTML, CSS, and reactive programming concepts.
- Familiarity with SSR and JAMstack architectures.
- Experience integrating RESTful APIs or GraphQL endpoints.
Soft Skills:
- Strong troubleshooting and problem-solving skills.
- Ability to work independently and in a team.
- Excellent communication and documentation skills.
Note: If interested, please share your updated resume and preferred time for a discussion. If shortlisted, our HR team will contact you.
Kandi Srinivasa Reddy, Delivery Manager, Integra Technologies
Posted 1 month ago
4.0 - 8.0 years
15 - 30 Lacs
Bengaluru
Hybrid
Role & responsibilities:
- Develop back-end code logic that leverages semantic object linking (ontologies) within Palantir Foundry Pipeline Builder, Code Workbook, and Ontology Manager.
- Create servers, databases, and datasets for functionality as needed.
- Ensure the health of data connections and pipelines (utilizing filesystem, JDBC, SFTP, and webhook sources).
- Ensure conformance with security protocols and markings on sensitive data sets.
- Ensure responsiveness of web applications developed on low-code/no-code solutions.
- Ensure cross-platform optimization for mobile phones.
- See projects through from conception to finished product.
- Proficiency with fundamental front-end languages such as HTML, CSS, and JavaScript, and with databases such as MySQL, Oracle, and MongoDB, preferred.
- Proficiency with server-side languages for structured data processing (Python, PySpark, Java, Apache Spark, and Spark SQL) preferred.
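The pipeline logic described above (filter, derive, aggregate) can be sketched in plain Python; in Foundry one would express the same steps in Pipeline Builder or as a PySpark transform. The dataset and field names below are invented.

```python
# Plain-Python sketch of a filter / derive-column / aggregate pipeline.

def run_pipeline(rows):
    """Filter active records, derive a column, count per region."""
    cleaned = [r for r in rows if r.get("status") == "active"]   # filter
    for r in cleaned:                                            # derive column
        r["name_upper"] = r["name"].upper()
    totals = {}                                                  # aggregate
    for r in cleaned:
        totals[r["region"]] = totals.get(r["region"], 0) + 1
    return cleaned, totals

rows = [
    {"name": "asha", "region": "west", "status": "active"},
    {"name": "ravi", "region": "west", "status": "inactive"},
    {"name": "mei", "region": "east", "status": "active"},
]
cleaned, totals = run_pipeline(rows)
print(totals)  # {'west': 1, 'east': 1}

# The PySpark equivalent would be roughly:
#   df.filter(df.status == "active").withColumn("name_upper", F.upper("name"))
```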
Posted 1 month ago
8.0 - 13.0 years
10 - 15 Lacs
Bengaluru
Work from Office
Overview: We are seeking a highly skilled and experienced Azure OpenAI Architect to join our growing team. You will play a key role in designing, developing, and implementing Gen AI solutions across various domains, including chatbots. The ideal candidate will have experience with the latest natural language processing and generative AI technologies, and the ability to produce diverse content such as text, audio, images, or video. You will be responsible for integrating general-purpose AI models into our systems and ensuring they serve a variety of purposes effectively.
Tasks and Responsibilities:
- Collaborate with cross-functional teams to design and implement Gen AI solutions that meet business requirements.
- Develop, train, test, and validate the AI system to ensure it meets the required standards and performs as intended.
- Design, develop, and deploy Gen AI solutions using advanced LLMs such as OpenAI models and open-source LLMs (Llama 2, Mistral, etc.), and frameworks such as LangChain and pandas.
- Leverage expertise in Transformer/neural network models and vector/graph databases to build robust and scalable AI systems.
- Integrate AI models into existing systems to enhance their capabilities.
- Create data pipelines to ingest, process, and prepare data for analysis and modeling using Azure services such as Azure AI Document Intelligence and Azure Databricks.
- Integrate speech-to-text functionality using Azure-native services to create user-friendly interfaces for chatbots.
- Deploy and manage Azure services and resources using Azure DevOps or other deployment tools.
- Monitor and troubleshoot deployed solutions to ensure optimal performance and reliability.
- Ensure compliance with security and regulatory requirements related to AI solutions.
- Stay up to date with the latest Azure AI technologies and industry developments, and share knowledge and best practices with the team.
Qualifications:
- Overall 8+ years of combined experience in IT, with the most recent 5 years as an AI engineer.
- Bachelor's or master's degree in computer science, information technology, or a related field.
- Experience designing, developing, and delivering successful Gen AI solutions.
- Experience with the Azure cloud platform and Azure AI services such as Azure AI Search, Azure OpenAI, Document Intelligence, Speech, and Vision.
- Experience with Azure infrastructure and solutioning.
- Familiarity with OpenAI models, open-source LLMs, and Gen AI frameworks such as LangChain and pandas.
- Solid understanding of Transformer/neural network architectures and their application in Gen AI.
- Hands-on experience with vector/graph databases and their use in semantic and vector search.
- Proficiency in programming languages, especially Python (essential).
- Relevant industry certifications, such as Microsoft Certified: Azure AI Engineer or Azure Solutions Architect, are a plus.
- Excellent problem-solving, analytical, and critical thinking skills.
- Strong communication and collaboration skills to work effectively in a team environment.
- A passion for innovation and a desire to push the boundaries of what's possible with Gen AI.
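One small but central piece of the chatbot work described above is assembling a grounded prompt from retrieved passages before calling the model. The sketch below shows that step only; the LLM call itself is elided since it needs service credentials, and the template wording and character budget are our own invention.

```python
# Pack retrieved passages into a grounded prompt within a rough budget.

def build_grounded_prompt(question: str, passages: list[str], budget: int = 500) -> str:
    """Concatenate as many passages as fit, then append the question."""
    context, used = [], 0
    for p in passages:
        if used + len(p) > budget:
            break
        context.append(p)
        used += len(p)
    return (
        "Answer using ONLY the context below. "
        "If the answer is not in the context, say you don't know.\n\n"
        "Context:\n" + "\n---\n".join(context) + f"\n\nQuestion: {question}"
    )

prompt = build_grounded_prompt(
    "What is the refund window?",
    ["Refunds are accepted within 30 days of purchase.", "Shipping takes 5 days."],
)
print("30 days" in prompt)  # True
```

The "answer ONLY from context" instruction is one of the standard mitigations against hallucination in RAG chatbots.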
Posted 1 month ago
6.0 - 8.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Hello Visionary! We empower our people to stay resilient and relevant in a constantly evolving world. We're looking for people who are always searching for creative ways to grow and learn, people who want to make a real impact, now and in the future. Does that sound like you? Then it seems like you'd make a great addition to our vibrant team. We are looking for a Semantic Web ETL Developer.
Before our software developers write even a single line of code, they have to understand what drives our customers. What is the environment? What is the user story based on? Implementation means trying, testing, and improving outcomes until a final solution emerges. Knowledge means exchange: discussions with colleagues from all over the world. Join our Digitalization Technology and Services (DTS) team based in Bangalore.
YOU'LL MAKE A DIFFERENCE BY:
- Implementing innovative products and solution development processes and tools by applying your expertise in the field of responsibility.
JOB REQUIREMENTS:
- International experience with global projects and collaboration with intercultural teams is preferred.
- 6-8 years of experience developing software solutions with the Python language.
- Experience in research and development processes (software-based solutions and products), in commercial topics, and in implementation of strategies and POCs.
- Manage end-to-end development of web applications and knowledge graph projects, ensuring best practices and high code quality.
- Provide technical guidance and mentorship to junior developers, fostering their growth and development.
- Design scalable and efficient architectures for web applications, knowledge graphs, and database models.
- Uphold code standards and perform code reviews, ensuring alignment with standard methodologies like PEP 8, DRY, and SOLID principles.
- Collaborate with frontend developers, DevOps teams, and database administrators to deliver cohesive solutions.
- Strong, expert-level proficiency in Python web frameworks (Django, Flask, FastAPI) and knowledge graph libraries.
- Experience in designing and developing complex RESTful APIs and microservices architectures.
- Strong understanding of standard security processes in web applications (e.g., authentication, authorization, and data protection).
- Extensive experience in building and querying knowledge graphs using Python libraries like RDFLib, Py2neo, or similar.
- Proficiency in SPARQL for advanced graph data querying.
- Experience with graph databases like Neo4j, GraphDB, Blazegraph, or AWS Neptune.
- Experience in expert functions like software development/architecture and software testing (unit testing, integration testing).
- Excellent in DevOps practices, including CI/CD pipelines, containerization (Docker), and orchestration (Kubernetes).
- Excellent in cloud technologies and architecture; should have exposure to S3, EKS, ECR, and AWS Neptune.
- Exposure to and working experience in the relevant Siemens sector domain (Industry, Energy, Healthcare, Infrastructure and Cities) required.
LEADERSHIP QUALITIES:
- Visionary Leadership: ability to lead the team towards long-term technical goals while managing immediate priorities.
- Strong Communication: good interpersonal skills to work effectively with both technical and non-technical stakeholders.
- Mentorship & Coaching: foster a culture of continuous learning, skill development, and collaboration within the team.
- Conflict Resolution: ability to manage team conflicts and provide constructive feedback to improve team dynamics.
Create a better #TomorrowWithUs! This role is in Bangalore, where you'll get the chance to work with teams impacting entire cities, countries, and the craft of things to come. We're Siemens. A collection of over 312,000 minds building the future, one day at a time, in over 200 countries. All employment decisions at Siemens are based on qualifications, merit and business need.
Bring your curiosity and creativity and help us craft tomorrow. At Siemens, we are always challenging ourselves to build a better future. We need the most innovative and diverse Digital Minds to develop tomorrow's reality. Find out more about the Digital world of Siemens here/digitalminds (http:///digitalminds)
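The knowledge-graph querying this role centers on often reduces to graph traversal. Here is an illustrative breadth-first search over (subject, predicate, object) triples; in practice this would be a SPARQL property path (e.g. `?s :connectedTo+ ?o`) or an RDFLib graph, and the triples below are invented.

```python
# BFS over a triple set: everything reachable from a start node
# following edges in the subject -> object direction.
from collections import deque

def reachable(triples, start):
    """Set of nodes reachable from `start` (excluding `start` itself)."""
    adj = {}
    for s, _, o in triples:
        adj.setdefault(s, []).append(o)
    seen, queue = {start}, deque([start])
    while queue:
        node = queue.popleft()
        for nxt in adj.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen - {start}

triples = [
    ("plant", "hasLine", "line1"),
    ("line1", "hasMachine", "m1"),
    ("m1", "hasSensor", "s1"),
]
print(sorted(reachable(triples, "plant")))  # ['line1', 'm1', 's1']
```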
Posted 1 month ago
15.0 - 25.0 years
15 - 20 Lacs
Bengaluru
Work from Office
Project Role: IoT Architect
Project Role Description: Design end-to-end IoT platform architecture solutions, including data ingestion, data processing, and analytics across different vendor platforms for highly interconnected device workloads at scale.
Must-have skills: Data Architecture Principles
Good-to-have skills: NA
Minimum 15 year(s) of experience is required.
Educational Qualification: 15 years full-time education
Summary: We are seeking a highly skilled and experienced Industrial Data Architect with a proven track record in providing functional and/or technical expertise to plan, analyze, define, and support the delivery of future functional and technical capabilities for an application or group of applications. Well versed with OT data quality, data modelling, data governance, data contextualization, database design, and data warehousing.
Roles & Responsibilities:
1. The Industrial Data Architect will be responsible for developing and overseeing the industrial data architecture strategies to support advanced data analytics, business intelligence, and machine learning initiatives. This role involves collaborating with various teams to design and implement efficient, scalable, and secure data solutions for industrial operations.
2. Focused on designing, building, and managing the data architecture of industrial systems.
3. Assist in facilitating impact assessment efforts and in producing and reviewing estimates for client work requests.
4. Own the offerings and assets on key components of the data supply chain: data governance, curation, data quality and master data management, data integration, data replication, data virtualization.
5. Create scalable and secure data structures, integrate with existing systems, and ensure efficient data flow.
Professional & Technical Skills:
1. Must-have skills: domain knowledge in Manufacturing IT/OT in one or more of the following verticals: Automotive, Discrete Manufacturing, Consumer Packaged Goods, Life Science.
2. Data Modeling and Architecture: proficiency in data modeling techniques (conceptual, logical, and physical models).
3. Knowledge of database design principles and normalization.
4. Experience with data architecture frameworks and methodologies (e.g., TOGAF).
5. Database Technologies - Relational Databases: expertise in SQL databases such as MySQL, PostgreSQL, Oracle, and Microsoft SQL Server.
6. NoSQL Databases: experience with at least one NoSQL database like MongoDB, Cassandra, or Couchbase for handling unstructured data.
7. Graph Databases: proficiency with at least one graph database such as Neo4j, Amazon Neptune, or ArangoDB; understanding of graph data models, including property graphs and RDF (Resource Description Framework).
8. Query Languages: experience with at least one query language like Cypher (Neo4j), SPARQL (RDF), or Gremlin (Apache TinkerPop). Familiarity with ontologies, RDF Schema, and OWL (Web Ontology Language). Exposure to semantic web technologies and standards.
9. Data Integration and ETL (Extract, Transform, Load): proficiency in ETL tools and processes (e.g., Talend, Informatica, Apache NiFi).
10. Experience with data integration tools and techniques to consolidate data from various sources.
11. IoT and Industrial Data Systems: familiarity with Industrial Internet of Things (IIoT) platforms and protocols (e.g., MQTT, OPC UA).
12. Experience with IoT data platforms like AWS IoT, Azure IoT Hub, and Google Cloud IoT Core.
13. Experience working with one or more streaming data platforms like Apache Kafka, Amazon Kinesis, or Apache Flink.
14. Ability to design and implement real-time data pipelines; familiarity with processing frameworks such as Apache Storm, Spark Streaming, or Google Cloud Dataflow.
15. Understanding of event-driven design patterns and practices; experience with message brokers like RabbitMQ or ActiveMQ.
16. Exposure to edge computing platforms like AWS IoT Greengrass or Azure IoT Edge.
17. AI/ML, GenAI: experience preparing data for AI/ML/GenAI applications.
18. Exposure to machine learning frameworks such as TensorFlow, PyTorch, or Keras.
19. Cloud Platforms: experience with cloud data services from at least one of AWS (Amazon Redshift, AWS Glue), Microsoft Azure (Azure SQL Database, Azure Data Factory), or Google Cloud Platform (BigQuery, Dataflow).
20. Data Warehousing and BI Tools: expertise in data warehousing solutions (e.g., Snowflake, Amazon Redshift, Google BigQuery).
21. Proficiency with Business Intelligence (BI) tools such as Tableau, Power BI, and QlikView.
22. Data Governance and Security: understanding of data governance principles, data quality management, and metadata management.
23. Knowledge of data security best practices, compliance standards (e.g., GDPR, HIPAA), and data masking techniques.
24. Big Data Technologies: experience with big data platforms and tools such as Hadoop, Spark, and Apache Kafka.
25. Understanding of distributed computing and data processing frameworks.
Additional Info:
1. A minimum of 15-18 years of progressive information technology experience is required.
2. This position is based at the Bengaluru location.
3. 15 years of full-time education is required.
4. AWS Certified Data Engineer - Associate / Microsoft Certified: Azure Data Engineer Associate / Google Cloud Certified Professional Data Engineer certification is mandatory.
Qualification: 15 years full-time education
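The event-driven patterns listed above (MQTT topics, message brokers) can be illustrated with a minimal in-process publish/subscribe sketch. A real system would use an MQTT broker (e.g. via paho-mqtt) or Kafka; the broker class and topic names below are invented for the example.

```python
# MQTT-style topic pub/sub with a simple '#' multi-level wildcard.

class MiniBroker:
    """In-process stand-in for a message broker."""
    def __init__(self):
        self.subscribers = []

    def subscribe(self, topic_filter, callback):
        self.subscribers.append((topic_filter, callback))

    def publish(self, topic, payload):
        for topic_filter, callback in self.subscribers:
            if topic_filter.endswith("/#"):
                # '#' matches everything under the prefix.
                if topic.startswith(topic_filter[:-1]):
                    callback(topic, payload)
            elif topic == topic_filter:
                callback(topic, payload)

broker = MiniBroker()
readings = []
broker.subscribe("plant1/temp/#", lambda t, p: readings.append((t, p)))
broker.publish("plant1/temp/machine7", 71.5)
broker.publish("plant1/vibration/machine7", 0.02)  # not matched by this subscriber
print(readings)  # [('plant1/temp/machine7', 71.5)]
```

Decoupling producers (sensors) from consumers (pipelines, alerting) via topics is the core of the event-driven designs named in the posting.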
Posted 1 month ago
15.0 - 20.0 years
15 - 20 Lacs
Bengaluru
Work from Office
Project Role: IoT Architect
Project Role Description: Design end-to-end IoT platform architecture solutions, including data ingestion, data processing, and analytics across different vendor platforms for highly interconnected device workloads at scale.
Must-have skills: Data Architecture Principles
Good-to-have skills: NA
Minimum 12 year(s) of experience is required.
Educational Qualification: 15 years full-time education
Summary: We are seeking a highly skilled and experienced Industrial Data Architect with a proven track record in providing functional and/or technical expertise to plan, analyze, define, and support the delivery of future functional and technical capabilities for an application or group of applications. Well versed with OT data quality, data modelling, data governance, data contextualization, database design, and data warehousing.
Roles & Responsibilities:
1. The Industrial Data Architect will be responsible for developing and overseeing the industrial data architecture strategies to support advanced data analytics, business intelligence, and machine learning initiatives. This role involves collaborating with various teams to design and implement efficient, scalable, and secure data solutions for industrial operations.
2. Focused on designing, building, and managing the data architecture of industrial systems.
3. Assist in facilitating impact assessment efforts and in producing and reviewing estimates for client work requests.
4. Own the offerings and assets on key components of the data supply chain: data governance, curation, data quality and master data management, data integration, data replication, data virtualization.
5. Create scalable and secure data structures, integrate with existing systems, and ensure efficient data flow.
Professional & Technical Skills:
1. Must-have skills: domain knowledge in Manufacturing IT/OT in one or more of the following verticals: Automotive, Discrete Manufacturing, Consumer Packaged Goods, Life Science.
2. Data Modeling and Architecture: proficiency in data modeling techniques (conceptual, logical, and physical models).
3. Knowledge of database design principles and normalization.
4. Experience with data architecture frameworks and methodologies (e.g., TOGAF).
5. Database Technologies - Relational Databases: expertise in SQL databases such as MySQL, PostgreSQL, Oracle, and Microsoft SQL Server.
6. NoSQL Databases: experience with at least one NoSQL database like MongoDB, Cassandra, or Couchbase for handling unstructured data.
7. Graph Databases: proficiency with at least one graph database such as Neo4j, Amazon Neptune, or ArangoDB; understanding of graph data models, including property graphs and RDF (Resource Description Framework).
8. Query Languages: experience with at least one query language like Cypher (Neo4j), SPARQL (RDF), or Gremlin (Apache TinkerPop). Familiarity with ontologies, RDF Schema, and OWL (Web Ontology Language). Exposure to semantic web technologies and standards.
9. Data Integration and ETL (Extract, Transform, Load): proficiency in ETL tools and processes (e.g., Talend, Informatica, Apache NiFi).
10. Experience with data integration tools and techniques to consolidate data from various sources.
11. IoT and Industrial Data Systems: familiarity with Industrial Internet of Things (IIoT) platforms and protocols (e.g., MQTT, OPC UA).
12. Experience with IoT data platforms like AWS IoT, Azure IoT Hub, and Google Cloud IoT Core.
13. Experience working with one or more streaming data platforms like Apache Kafka, Amazon Kinesis, or Apache Flink.
14. Ability to design and implement real-time data pipelines; familiarity with processing frameworks such as Apache Storm, Spark Streaming, or Google Cloud Dataflow.
15. Understanding of event-driven design patterns and practices; experience with message brokers like RabbitMQ or ActiveMQ.
16. Exposure to edge computing platforms like AWS IoT Greengrass or Azure IoT Edge.
17. AI/ML, GenAI: experience preparing data for AI/ML/GenAI applications.
18. Exposure to machine learning frameworks such as TensorFlow, PyTorch, or Keras.
19. Cloud Platforms: experience with cloud data services from at least one of AWS (Amazon Redshift, AWS Glue), Microsoft Azure (Azure SQL Database, Azure Data Factory), or Google Cloud Platform (BigQuery, Dataflow).
20. Data Warehousing and BI Tools: expertise in data warehousing solutions (e.g., Snowflake, Amazon Redshift, Google BigQuery).
21. Proficiency with Business Intelligence (BI) tools such as Tableau, Power BI, and QlikView.
22. Data Governance and Security: understanding of data governance principles, data quality management, and metadata management.
23. Knowledge of data security best practices, compliance standards (e.g., GDPR, HIPAA), and data masking techniques.
24. Big Data Technologies: experience with big data platforms and tools such as Hadoop, Spark, and Apache Kafka.
25. Understanding of distributed computing and data processing frameworks.
Additional Info:
1. A minimum of 15-18 years of progressive information technology experience is required.
2. This position is based at the Bengaluru location.
3. 15 years of full-time education is required.
4. AWS Certified Data Engineer - Associate / Microsoft Certified: Azure Data Engineer Associate / Google Cloud Certified Professional Data Engineer certification is mandatory.
Qualification: 15 years full-time education
Posted 1 month ago
5.0 - 8.0 years
4 - 9 Lacs
Bengaluru
Work from Office
- Own the infrastructure: coordinate provisioning, maintenance, and monitoring.
- Work closely with the Platform Architecture and Engineering team to improve/establish governance.
- Collaborate with the use-case team and the Graph DB product team to enable the technical capabilities.
- Onboard new use cases in Graph DB.
- Own the DevOps pipeline and processes: create, enhance, and maintain.
- Own admin-related activities: Graph DB new feature enablement, access, authorization, etc.
Posted 1 month ago
3.0 - 8.0 years
20 - 35 Lacs
Bengaluru
Work from Office
Responsibilities:
- Develop and optimize prompts for AI systems to assist developers in generating high-quality code, improving documentation, and automating tasks.
- Design, develop, and refine prompts to enhance the quality and relevance of generated responses.
- Monitor and analyze prompt performance to identify areas for improvement.
- Understand user behavior and requirements, and develop accordingly.
- Stay proactive in identifying opportunities for innovation and improvement in our AI systems.
- Stay up to date with advancements in AI, NLP, and prompt engineering, and apply this knowledge to improve our products and processes.
- Conduct experiments to test and evaluate the effectiveness of different prompts in producing optimal results for end users.
Qualifications:
- Knowledge of LLM parameters and experience configuring them to get different results from a prompt.
- Understanding of all prompt elements and experience modifying them to get better results.
- Experience with existing prompting techniques.
- Experience using suitable prompts for various applications (e.g., generating code, generating data).
- Experience with RAG and handling hierarchical data (e.g., a codebase) for RAG.
- Knowledge of the capabilities of various LLMs and how to use different prompts efficiently to extend those capabilities.
- Knowledge of scaling, agents, efficiency, hallucination, prompt injection, etc.
- Understanding of graph databases and tabular databases, and use of prompts to make an LLM interact with these databases.
- Knowledge of the latest research on prompt engineering for LLMs.
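The prompt elements mentioned above (instruction, examples, input) compose mechanically. Here is a hedged sketch of a few-shot template; the classification task, example pairs, and the sampling parameters shown are all invented for illustration.

```python
# Compose a few-shot prompt: instruction + example pairs + new input.

def build_prompt(instruction, examples, user_input):
    """Return the full prompt text, ending at the model's turn."""
    shots = "\n".join(f"Input: {i}\nOutput: {o}" for i, o in examples)
    return f"{instruction}\n\n{shots}\n\nInput: {user_input}\nOutput:"

# Typical LLM parameters a caller might pass alongside the prompt;
# low temperature keeps classification output near-deterministic.
params = {"temperature": 0.2, "top_p": 0.9, "max_tokens": 64}

prompt = build_prompt(
    "Classify the sentiment as positive or negative.",
    [("Great tool, saved me hours", "positive"),
     ("Crashes constantly", "negative")],
    "Docs are clear and helpful",
)
print(prompt.endswith("Output:"))  # True
```

Systematic prompt experiments then vary one element at a time (instruction wording, number of shots, parameter values) and score the outputs.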
Posted 1 month ago
3.0 - 8.0 years
4 - 8 Lacs
Chennai
Work from Office
We are looking for immediate joiners for a Neo4j Graph Database opening in Chennai (contract).
Skills: Neo4j Graph Database | Experience: 3+ years | Location: Chennai | Notice period: immediate | Employment type: contract
Job Description:
- Build knowledge graph solutions leveraging large-scale datasets.
- Design and build graph database schemas to support various use cases, including knowledge graphs.
- Design and develop a Neo4j data model for a new application as per the use cases.
- Design and build graph database load processes to efficiently populate the knowledge graphs.
- Migrate an existing relational database (BigQuery) to Neo4j.
- Build design/integration patterns for both batch and real-time update processes to keep the knowledge graphs in sync.
- Work with stakeholders to understand the requirements and translate them into technical architecture.
- Select and configure appropriate Neo4j features and capabilities as applicable for the given use case(s).
- Optimize the performance of a Neo4j-based recommendation engine.
- Set up a Neo4j cluster in the cloud.
- Configure Neo4j security features to protect sensitive data; ensure the security and reliability of Neo4j deployments.
- Provide guidance and support to other developers on Neo4j best practices.
Qualifications:
- Minimum 3+ years of working experience with knowledge graphs/graph databases.
- Expertise with graph database technology, especially Neo4j.
- Expertise with Python and related software engineering platforms/frameworks.
- Experience designing and building highly scalable knowledge graphs in production.
- Experience developing APIs leveraging knowledge graph data.
- Experience querying knowledge graphs using a graph query language (e.g., Cypher).
- Experience working with end-to-end CI/CD pipelines and related frameworks.
The ideal candidate will have strong knowledge of graph solutions (especially Neo4j) and Python, and have experience working with massive amounts of data in the retail space.
Candidate must have a strong curiosity for data and a proven track record of successfully implementing graph database solutions with proficiency in software engineering.
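For the relational-to-Neo4j migration mentioned above, one common approach is emitting idempotent, parameterized Cypher `MERGE` statements per row (in practice one might instead use `LOAD CSV`, `neo4j-admin import`, or the driver's batched `UNWIND` pattern). This is a hedged sketch; the `Product` label and field names are invented.

```python
# Generate a parameterized MERGE statement for one relational row.

def merge_statement(label: str, key: str, row: dict):
    """One MERGE per row, idempotent on the key property."""
    props = ", ".join(f"n.{k} = ${k}" for k in row if k != key)
    query = f"MERGE (n:{label} {{{key}: ${key}}}) SET {props}"
    return query, row

rows = [{"sku": "A1", "name": "Widget", "price": 9.5}]
query, params = merge_statement("Product", "sku", rows[0])
print(query)
# MERGE (n:Product {sku: $sku}) SET n.name = $name, n.price = $price
```

Because `MERGE` matches on the key before creating, re-running the load after a partial failure does not duplicate nodes, which matters for keeping batch and real-time update paths in sync.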
Posted 1 month ago
8.0 - 10.0 years
15 - 20 Lacs
Bengaluru
Work from Office
Role Purpose
The purpose of the role is to create exceptional architectural solution design and thought leadership, and to enable delivery teams to provide exceptional client engagement and satisfaction.

Do

1. Develop architectural solutions for new deals/major change requests in existing deals
Create an enterprise-wide architecture that ensures systems are scalable, reliable, and manageable.
Provide solutioning for RFPs received from clients and ensure overall design assurance.
Develop a direction to manage the portfolio of to-be solutions, including systems, shared infrastructure services, and applications, in order to better match business outcome objectives.
Analyse the technology environment, enterprise specifics, and client requirements to set a collaboration solution design framework/architecture.
Provide technical leadership for the design, development, and implementation of custom solutions through thoughtful use of modern technology.
Define and understand current-state solutions and identify improvements, options, and tradeoffs to define target-state solutions.
Clearly articulate, document, and sell architectural targets, recommendations, and reusable patterns, and accordingly propose investment roadmaps.
Evaluate and recommend solutions to integrate with the overall technology ecosystem.
Work closely with various IT groups to transition tasks, ensure performance, and manage issues through to resolution.
Perform detailed documentation (application view, multiple sections and views) of the architectural design and solution, mentioning all artefacts in detail.
Validate the solution/prototype from technology, cost-structure, and customer-differentiation points of view.
Identify problem areas, perform root cause analysis of architectural designs and solutions, and provide relevant solutions to the problem.
Collaborate with sales, program/project, and consulting teams to reconcile solutions to architecture.
Track industry and application trends and relate these to planning current and future IT needs.
Provide technical and strategic input during the project planning phase in the form of technical architectural designs and recommendations.
Collaborate with all relevant parties in order to review the objectives and constraints of solutions and determine conformance with the Enterprise Architecture.
Identify implementation risks and potential impacts.

2. Enable delivery teams by providing optimal delivery solutions/frameworks
Build and maintain relationships with executives, technical leaders, product owners, peer architects, and other stakeholders to become a trusted advisor.
Develop and establish relevant technical, business process, and overall support metrics (KPI/SLA) to drive results.
Manage multiple projects and accurately report the status of all major assignments while adhering to all project management standards.
Identify technical, process, and structural risks, and prepare a risk mitigation plan for all projects.
Ensure quality assurance of all architecture and design decisions, and provide technical mitigation support to the delivery teams.
Recommend tools for reuse and automation for improved productivity and reduced cycle times.
Lead the development and maintenance of the enterprise framework and related artefacts.
Develop trust and build effective working relationships through respectful, collaborative engagement across individual product teams.
Ensure architecture principles and standards are consistently applied to all projects.
Ensure optimal client engagement: support the pre-sales team in presenting the entire solution design and its principles to the client; negotiate, manage, and coordinate with client teams to ensure all requirements are met and the proposed solution creates an impact; demonstrate thought leadership with strong technical capability in front of the client to win confidence and act as a trusted advisor.

3. Competency building and branding
Ensure completion of necessary trainings and certifications.
Develop Proofs of Concept (POCs), case studies, demos, etc. for new growth areas based on market and customer research.
Develop and present Wipro's point of view on solution design and architecture by writing white papers, blogs, etc.
Attain market referenceability and recognition through highest analyst rankings, client testimonials, and partner credits.
Be the voice of Wipro's thought leadership by speaking in forums (internal and external).
Mentor developers, designers, and junior architects in the project for their further career development and enhancement.
Contribute to the architecture practice by conducting selection interviews, etc.

4. Team Management
Resourcing: anticipate new talent requirements as per market/industry trends or client requirements; hire adequate and right resources for the team.
Talent management: ensure adequate onboarding and training for team members to enhance capability and effectiveness; build an internal talent pool and ensure their career progression within the organization; manage team attrition; drive diversity in leadership positions.
Performance management: set goals for the team, conduct timely performance reviews, and provide constructive feedback to own direct reports; ensure that Performance Nxt is followed for the entire team.
Employee satisfaction and engagement: lead and drive engagement initiatives for the team; track team satisfaction scores and identify initiatives to build engagement within the team.

Mandatory Skills: Graph Database.
Experience: 8-10 Years.
Posted 1 month ago
4.0 - 8.0 years
7 - 11 Lacs
Itanagar
Work from Office
Our Company. We’re Hitachi Digital, a company at the forefront of digital transformation and the fastest growing division of Hitachi Group. We’re crucial to the company’s strategy and ambition to become a premier global player in the massive and fast-moving digital transformation market. Our group companies, including GlobalLogic, Hitachi Digital Services, Hitachi Vantara and more, offer comprehensive services that span the entire digital lifecycle, from initial idea to full-scale operation and the infrastructure to run it on. Hitachi Digital represents One Hitachi, integrating domain knowledge and digital capabilities, and harnessing the power of the entire portfolio of services, technologies, and partnerships, to accelerate synergy creation and make real-world impact for our customers and society as a whole. Imagine the sheer breadth of talent it takes to unleash a digital future. We don’t expect you to ‘fit’ every requirement – your life experience, character, perspective, and passion for achieving great things in the world are equally as important to us. The team. We are seeking a talented and experienced Full Stack Developer to join our team, specializing in low-code/no-code process automation. The ideal candidate will be responsible for designing, developing, and implementing solutions using the Appian platform to streamline and automate business processes. This role involves working closely with business stakeholders to understand their requirements and deliver efficient, scalable, and user-friendly applications. Additionally, the candidate should have prior experience in building digital apps on Google Cloud Platform (GCP), Graph DB, and leveraging advanced technologies. The role: Full Stack Developer/Specialist. Responsibilities. Design, develop, and maintain applications using the Appian low-code platform. Collaborate with business stakeholders to gather requirements and translate them into technical specifications and functional designs.
Implement end-to-end solutions, including front-end interfaces and back-end services, ensuring seamless integration and functionality. Develop and maintain custom components and integrations using Java, JavaScript, and other relevant technologies. Optimize applications for performance, scalability, and user experience. Conduct system testing, validation, and troubleshooting to ensure the quality and performance of solutions. Provide training and support to end-users and IT staff on Appian functionalities and best practices. Stay up-to-date with the latest developments in Appian, low-code/no-code platforms, and industry trends. Participate in project planning, execution, and post-implementation support. Mentor and guide junior developers, fostering a culture of continuous learning and improvement. Qualifications. What you’ll bring. Bachelor’s degree in Computer Science, Information Technology, or a related field. A Master’s degree is a plus. Proven experience as a Full Stack Developer with a strong portfolio of low-code/no-code applications. Expertise in Appian development and customisation. Proficiency in Java, JavaScript, and other relevant programming languages. Strong understanding of front-end technologies such as HTML, CSS, and modern JavaScript frameworks. Experience with RESTful APIs and web services. Excellent problem-solving skills and attention to detail. Strong communication and collaboration skills. Ability to work independently and as part of a team in a fast-paced environment. Prior experience in building custom digital apps with a strong SQL background leveraging Graph DB, BigQuery, and PostgreSQL is required. Prior experience in building AI applications is a plus. Prior experience with automation tools like UiPath is a plus. Working knowledge of AI/Machine Learning, with a focus on Agentic AI, is a plus. Preferred Skills. Certification in Appian development. Experience with other low-code/no-code platforms.
Knowledge of DevOps practices and tools, including CI/CD pipelines. Familiarity with cloud platforms such as AWS, Azure, or Google Cloud. Experience with Agile development methodologies. About Us. We’re a global, 1000-strong, diverse team of professional experts, promoting and delivering Social Innovation through our One Hitachi initiative (OT x IT x Product) and working on projects that have a real-world impact. We’re curious, passionate and empowered, blending our legacy of 110 years of innovation with shaping our future. Here you’re not just another employee; you’re part of a tradition of excellence and a community working towards creating a digital future. Championing diversity, equity, and inclusion. Diversity, equity, and inclusion (DEI) are integral to our culture and identity. Diverse thinking, a commitment to allyship, and a culture of empowerment help us achieve powerful results. We want you to be you, with all the ideas, lived experience, and fresh perspective that brings. We support your uniqueness and encourage people from all backgrounds to apply and realize their full potential as part of our team. How We Look After You. We help take care of your today and tomorrow with industry-leading benefits, support, and services that look after your holistic health and wellbeing. We’re also champions of life balance and offer flexible arrangements that work for you (role and location dependent). We’re always looking for new ways of working that bring out our best, which leads to unexpected ideas. So here, you’ll experience a sense of belonging, and discover autonomy, freedom, and ownership as you work alongside talented people you enjoy sharing knowledge with. We’re proud to say we’re an equal opportunity employer and welcome all applicants for employment without attention to race, colour, religion, sex, sexual orientation, gender identity, national origin, veteran, age, disability status or any other protected characteristic.
Should you need reasonable accommodations during the recruitment process, please let us know so that we can do our best to set you up for success.
Posted 1 month ago
4.0 - 8.0 years
7 - 12 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
Our Company. We’re Hitachi Digital, a company at the forefront of digital transformation and the fastest growing division of Hitachi Group. We’re crucial to the company’s strategy and ambition to become a premier global player in the massive and fast-moving digital transformation market. Our group companies, including GlobalLogic, Hitachi Digital Services, Hitachi Vantara and more, offer comprehensive services that span the entire digital lifecycle, from initial idea to full-scale operation and the infrastructure to run it on. Hitachi Digital represents One Hitachi, integrating domain knowledge and digital capabilities, and harnessing the power of the entire portfolio of services, technologies, and partnerships, to accelerate synergy creation and make real-world impact for our customers and society as a whole. Imagine the sheer breadth of talent it takes to unleash a digital future. We don’t expect you to ‘fit’ every requirement – your life experience, character, perspective, and passion for achieving great things in the world are equally as important to us. The team. We are seeking a talented and experienced Full Stack Developer to join our team, specializing in low-code/no-code process automation. The ideal candidate will be responsible for designing, developing, and implementing solutions using the Appian platform to streamline and automate business processes. This role involves working closely with business stakeholders to understand their requirements and deliver efficient, scalable, and user-friendly applications. Additionally, the candidate should have prior experience in building digital apps on Google Cloud Platform (GCP), Graph DB, and leveraging advanced technologies. The role: Full Stack Developer/Specialist. Responsibilities: Design, develop, and maintain applications using the Appian low-code platform. Collaborate with business stakeholders to gather requirements and translate them into technical specifications and functional designs.
Implement end-to-end solutions, including front-end interfaces and back-end services, ensuring seamless integration and functionality. Develop and maintain custom components and integrations using Java, JavaScript, and other relevant technologies. Optimize applications for performance, scalability, and user experience. Conduct system testing, validation, and troubleshooting to ensure the quality and performance of solutions. Provide training and support to end-users and IT staff on Appian functionalities and best practices. Stay up-to-date with the latest developments in Appian, low-code/no-code platforms, and industry trends. Participate in project planning, execution, and post-implementation support. Mentor and guide junior developers, fostering a culture of continuous learning and improvement. What You’ll Bring. Qualifications: Bachelor’s degree in Computer Science, Information Technology, or a related field. A Master’s degree is a plus. Proven experience as a Full Stack Developer with a strong portfolio of low-code/no-code applications. Expertise in Appian development and customisation. Proficiency in Java, JavaScript, and other relevant programming languages. Strong understanding of front-end technologies such as HTML, CSS, and modern JavaScript frameworks. Experience with RESTful APIs and web services. Excellent problem-solving skills and attention to detail. Strong communication and collaboration skills. Ability to work independently and as part of a team in a fast-paced environment. Prior experience in building custom digital apps with a strong SQL background leveraging Graph DB, BigQuery, and PostgreSQL is required. Prior experience in building AI applications is a plus. Prior experience with automation tools like UiPath is a plus. Working knowledge of AI/Machine Learning, with a focus on Agentic AI, is a plus. Preferred Skills. Certification in Appian development. Experience with other low-code/no-code platforms.
Knowledge of DevOps practices and tools, including CI/CD pipelines. Familiarity with cloud platforms such as AWS, Azure, or Google Cloud. Experience with Agile development methodologies. About Us. We’re a global, 1000-strong, diverse team of professional experts, promoting and delivering Social Innovation through our One Hitachi initiative (OT x IT x Product) and working on projects that have a real-world impact. We’re curious, passionate and empowered, blending our legacy of 110 years of innovation with shaping our future. Here you’re not just another employee; you’re part of a tradition of excellence and a community working towards creating a digital future. Championing diversity, equity, and inclusion. Diversity, equity, and inclusion (DEI) are integral to our culture and identity. Diverse thinking, a commitment to allyship, and a culture of empowerment help us achieve powerful results. We want you to be you, with all the ideas, lived experience, and fresh perspective that brings. We support your uniqueness and encourage people from all backgrounds to apply and realize their full potential as part of our team. How We Look After You. We help take care of your today and tomorrow with industry-leading benefits, support, and services that look after your holistic health and wellbeing. We’re also champions of life balance and offer flexible arrangements that work for you (role and location dependent). We’re always looking for new ways of working that bring out our best, which leads to unexpected ideas. So here, you’ll experience a sense of belonging, and discover autonomy, freedom, and ownership as you work alongside talented people you enjoy sharing knowledge with. We’re proud to say we’re an equal opportunity employer and welcome all applicants for employment without attention to race, colour, religion, sex, sexual orientation, gender identity, national origin, veteran, age, disability status or any other protected characteristic.
Should you need reasonable accommodations during the recruitment process, please let us know so that we can do our best to set you up for success.
Posted 1 month ago
8.0 - 13.0 years
25 - 30 Lacs
Hyderabad
Work from Office
Job Title: Data Engineer. Job Type: Full-Time. Location: On-site, Hyderabad, Telangana, India. Job Summary: We are seeking an accomplished Data Engineer to join the dynamic team of one of our top customers in Hyderabad. You will be instrumental in designing, implementing, and optimizing data pipelines that drive our business insights and analytics. If you are passionate about harnessing the power of big data, possess a strong technical skill set, and thrive in a collaborative environment, we would love to hear from you. Key Responsibilities: Develop and maintain scalable data pipelines using Python, PySpark, and SQL. Implement robust data warehousing and data lake architectures. Leverage the Databricks platform to enhance data processing and analytics capabilities. Model, design, and optimize complex database schemas. Collaborate with cross-functional teams to understand data requirements and deliver actionable insights. Lead and mentor junior data engineers and establish best practices. Troubleshoot and resolve data processing issues promptly. Required Skills and Qualifications: Strong proficiency in Python and PySpark. Extensive experience with the Databricks platform. Advanced SQL and data modeling skills. Demonstrated experience in data warehousing and data lake architectures. Exceptional problem-solving and analytical skills. Strong written and verbal communication skills. Preferred Qualifications: Experience with graph databases, particularly MarkLogic. Proven track record of leading data engineering teams. Understanding of data governance and best practices in data management.
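The pipeline work this posting describes (landing raw data, then modeling it into query-ready tables) can be sketched in miniature with the standard library's sqlite3. The table names and schema here are invented for illustration; a production pipeline would use PySpark and Databricks as the posting specifies.

```python
import sqlite3

def load_orders(conn, rows):
    """Tiny ELT step: land raw rows, then build a modeled aggregate table."""
    conn.execute("CREATE TABLE IF NOT EXISTS raw_orders (id INTEGER, amount REAL)")
    conn.executemany("INSERT INTO raw_orders VALUES (?, ?)", rows)
    # Model the landed data into a per-id summary table (hypothetical schema).
    conn.execute(
        """CREATE TABLE IF NOT EXISTS orders AS
           SELECT id, SUM(amount) AS total FROM raw_orders GROUP BY id"""
    )
    return conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]

conn = sqlite3.connect(":memory:")
n = load_orders(conn, [(1, 10.0), (1, 5.0), (2, 7.5)])
```

The same land-then-model split is how the raw/curated layers of a lakehouse pipeline are usually structured, whatever the engine.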
Posted 1 month ago
6.0 - 11.0 years
10 - 20 Lacs
Bengaluru
Work from Office
Position Overview: We are seeking an experienced and skilled Senior Database Developer to join our dynamic team. The ideal candidate will have at least 8 years of hands-on experience in database development, with a strong focus on Neo4j (graph) databases. The role involves working on cutting-edge projects, contributing to data modelling, and ensuring the scalability and efficiency of our database systems. Responsibilities: Design, develop, and maintain databases, with a primary focus on Cypher/graph databases. Modify databases according to requests and perform tests. Advanced querying, performance tuning, and optimization of database systems. Solve database usage issues and malfunctions. Analyze all databases, monitor them against design specifications, and prepare associated test strategies. Evaluate and engineer efficient backup-recovery processes for various databases. Promote uniformity of database-related programming effort by developing methods and procedures for database programming. Remain current with the industry by researching available products and methodologies to determine the feasibility of alternative database management systems, communication protocols, middleware, and query tools. Liaise with developers to improve applications and establish best practices. Ensure the performance, security, and scalability of database systems. Develop and optimize PL/SQL queries for efficient data storage and retrieval. Implement and maintain data models, ensuring accuracy and alignment with business needs. Train, mentor and motivate junior team members. Contribute to assessing the team's performance evaluations. Stay updated on emerging database technologies and contribute to continuous improvement initiatives. Skills Required: 6+ years work experience as a Database Developer. Bachelor's or Master's degree in Computer Science, Engineering, or a related field. Proficiency in Neo4j (graph) databases is mandatory.
Strong experience with PL/SQL, data modeling, and database optimization techniques. Why us? Impactful Work: Your contributions will play a pivotal role in ensuring the quality and reliability of our platform. Professional Growth: We believe in investing in our employees' growth and development. You will have access to various learning resources, books, training programs, and opportunities to enhance your technical skills and expand your knowledge. Collaborative Culture: We value teamwork and collaboration. You will work alongside talented professionals from diverse backgrounds, including developers, product managers, and business analysts, to collectively solve challenges and deliver exceptional software. Benefits: Health insurance covered for you and your family. Quarterly team outings, twice-a-month team lunches, and personal and professional learning development sessions. Top performers win a chance at an international trip, fully sponsored by the company.
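The graph query work this role centres on (Cypher pattern matching, path finding, tuning) ultimately rests on traversals. A pure-Python sketch of a breadth-first shortest path over an adjacency map, analogous in spirit to Cypher's shortestPath; the toy graph and node names are made up, and this is not Neo4j driver code:

```python
from collections import deque

def shortest_path(graph, start, goal):
    """BFS shortest path over an adjacency dict; returns a node list or None."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in graph.get(path[-1], []):
            if nxt not in seen:       # avoid revisiting nodes
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

g = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": []}
route = shortest_path(g, "A", "D")
```

In Cypher the equivalent intent would be expressed declaratively, e.g. `MATCH p = shortestPath((a)-[*]-(d)) RETURN p`, with the database engine doing the traversal.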
Posted 1 month ago
9.0 - 14.0 years
20 - 35 Lacs
Bengaluru
Hybrid
AI Tech Lead to spearhead the architecture, development, and deployment of advanced enterprise AI solutions built on the Microsoft Azure ecosystem, leveraging Azure Foundry, vector databases, and graph-based Retrieval-Augmented Generation (Graph RAG). Required Candidate profile: This strategic role involves leading full-stack AI development, connecting SharePoint, SQL, and custom application layers, to build intelligent bots, real-time decision systems, and AI transformation initiatives.
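As a rough illustration of the Graph RAG pattern named above: retrieval starts from keyword-matched nodes and expands along graph edges to pull in related context before it reaches a model. Everything here (document names, the naive keyword match, the single-hop expansion) is a hypothetical sketch, not a production retriever:

```python
def graph_rag_context(docs, edges, query, hops=1):
    """Pick seed docs by naive keyword match, then expand along graph edges."""
    seeds = {d for d, text in docs.items() if query.lower() in text.lower()}
    frontier = set(seeds)
    for _ in range(hops):
        # Each hop pulls in the graph neighbours of everything found so far.
        frontier |= {n for d in frontier for n in edges.get(d, [])}
    return sorted(docs[d] for d in frontier)

docs = {"d1": "Azure Foundry overview", "d2": "Vector search basics", "d3": "Billing FAQ"}
edges = {"d1": ["d2"], "d2": ["d1"], "d3": []}
ctx = graph_rag_context(docs, edges, "foundry")
```

In a real system the keyword match would be a vector similarity search and the edges would live in a graph database; the structure of the retrieval step is the same.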
Posted 1 month ago
5.0 - 10.0 years
20 - 35 Lacs
Pune, Gurugram
Work from Office
In one sentence We are seeking a highly skilled and adaptable Senior Python Developer to join our fast-paced and dynamic team. The ideal candidate is a hands-on technologist with deep expertise in Python and a strong background in data engineering, cloud platforms, and modern development practices. You will play a key role in building scalable, high-performance applications and data pipelines that power critical business functions. You will be instrumental in designing and developing high-performance data pipelines from relational to graph databases, and leveraging Agentic AI for orchestration. You'll also define APIs using AWS Lambda and containerised services on AWS ECS. Join us on an exciting journey where you'll work with cutting-edge technologies, including Generative AI, Agentic AI, and modern cloud-native architectures, while continuously learning and growing alongside a passionate team. What will your job look like? Key Attributes: Adaptability & Agility: Thrive in a fast-paced, ever-evolving environment with shifting priorities. Demonstrated ability to quickly learn and integrate new technologies and frameworks. Strong problem-solving mindset with the ability to juggle multiple priorities effectively. Core Responsibilities: Design, develop, test, and maintain robust Python applications and data pipelines using Python/PySpark. Define and implement smart data pipelines from RDBMS to graph databases. Build and expose APIs using AWS Lambda and ECS-based microservices. Collaborate with cross-functional teams to define, design, and deliver new features. Write clean, efficient, and scalable code following best practices. Troubleshoot, debug, and optimise applications for performance and reliability. Contribute to the setup and maintenance of CI/CD pipelines and deployment workflows if required. Ensure security, compliance, and observability across all development activities. All you need is...
Required Skills & Experience: Expert-level proficiency in Python with a strong grasp of object-oriented and functional programming. Solid experience with SQL and graph databases (e.g., Neo4j, Amazon Neptune). Hands-on experience with cloud platforms (AWS and/or Azure) is a must. Proficiency in PySpark or similar data ingestion and processing frameworks. Familiarity with DevOps tools such as Docker, Kubernetes, Jenkins, and Git. Strong understanding of CI/CD, version control, and agile development practices. Excellent communication and collaboration skills. Desirable Skills: Experience with Agentic AI, machine learning, or LLM-based systems. Familiarity with Apache Iceberg or similar modern data lakehouse formats. Knowledge of Infrastructure as Code (IaC) tools like Terraform or Ansible. Understanding of microservices architecture and distributed systems. Exposure to observability tools (e.g., Prometheus, Grafana, ELK stack). Experience working in Agile/Scrum environments. Minimum Qualifications: 6 to 8 years of hands-on experience in Python development and data engineering. Demonstrated success in delivering production-grade software and scalable data solutions.
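The AWS Lambda API layer this role mentions reduces to a handler function that receives an event and returns a response. A minimal sketch following the API Gateway proxy convention (`statusCode`/`body`); the request field names (`id`) and response shape are hypothetical:

```python
import json

def handler(event, context=None):
    """Minimal API-Gateway-style Lambda handler: validate and echo a record."""
    try:
        body = json.loads(event.get("body") or "{}")
        record_id = body["id"]  # required field; missing -> KeyError -> 400
    except (KeyError, json.JSONDecodeError):
        return {"statusCode": 400, "body": json.dumps({"error": "bad request"})}
    return {"statusCode": 200, "body": json.dumps({"id": record_id, "status": "queued"})}

# Invoked directly here; on AWS, API Gateway supplies the event and context.
resp = handler({"body": json.dumps({"id": 42})})
```

Keeping the handler a plain function like this makes it trivially unit-testable without any AWS infrastructure.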
Posted 1 month ago
4.0 - 6.0 years
7 - 10 Lacs
Hyderabad
Work from Office
What you will do In this vital role you will be part of Research's Semantic Graph Team, which is seeking a dedicated and skilled Semantic Data Engineer to build and optimize knowledge graph-based software and data resources. This role primarily focuses on working with technologies such as RDF, SPARQL, and Python. In addition, the position involves semantic data integration and cloud-based data engineering. The ideal candidate should possess experience in the pharmaceutical or biotech industry, demonstrate deep technical skills, be proficient with big data technologies, and have experience in semantic modeling. A deep understanding of data architecture and ETL processes is also essential for this role. In this role, you will be responsible for constructing semantic data pipelines, integrating both relational and graph-based data sources, ensuring seamless data interoperability, and leveraging cloud platforms to scale data solutions effectively. Roles & Responsibilities: Develop and maintain semantic data pipelines using Python, RDF, SPARQL, and linked data technologies. Develop and maintain semantic data models for biopharma scientific data. Integrate relational databases (SQL, PostgreSQL, MySQL, Oracle, etc.) with semantic frameworks. Ensure interoperability across federated data sources, linking relational and graph-based data. Implement and optimize CI/CD pipelines using GitLab and AWS. Leverage cloud services (AWS Lambda, S3, Databricks, etc.) to support scalable knowledge graph solutions. Collaborate with global multi-functional teams, including research scientists, Data Architects, Business SMEs, Software Engineers, and Data Scientists, to understand data requirements, design solutions, and develop end-to-end data pipelines to meet fast-paced business needs across geographic regions. Collaborate with data scientists, engineers, and domain experts to improve research data accessibility.
Adhere to standard processes for coding, testing, and designing reusable code/components. Explore new tools and technologies to improve ETL platform performance. Participate in sprint planning meetings and provide estimations on technical implementation. Maintain comprehensive documentation of processes, systems, and solutions. Harmonize research data to appropriate taxonomies, ontologies, and controlled vocabularies for context and reference knowledge. Basic Qualifications and Experience: Doctorate degree, OR Master's degree with 4-6 years of experience in Computer Science, IT, Computational Chemistry, Computational Biology/Bioinformatics or a related field, OR Bachelor's degree with 6-8 years of experience in Computer Science, IT, Computational Chemistry, Computational Biology/Bioinformatics or a related field, OR Diploma with 10-12 years of experience in Computer Science, IT, Computational Chemistry, Computational Biology/Bioinformatics or a related field. Preferred Qualifications and Experience: 6+ years of experience in designing and supporting biopharma scientific research data analytics (software platforms). Functional Skills: Must-Have Skills: Advanced Semantic and Relational Data Skills: Proficiency in Python, RDF, SPARQL, graph databases (e.g. AllegroGraph), SQL, relational databases, ETL pipelines, big data technologies (e.g. Databricks), semantic data standards (OWL, W3C, FAIR principles), ontology development, and semantic modeling practices. Cloud and Automation Expertise: Good experience in using cloud platforms (preferably AWS) for data engineering, along with Python for automation, data federation techniques, and model-driven architecture for scalable solutions. Technical Problem-Solving: Excellent problem-solving skills with hands-on experience in test automation frameworks (pytest), scripting tasks, and handling large, complex datasets.
Good-to-Have Skills: Experience in biotech/drug discovery data engineering. Experience applying knowledge graph, taxonomy, and ontology concepts in life sciences and chemistry domains. Experience with graph databases (AllegroGraph, Neo4j, GraphDB, Amazon Neptune). Familiarity with Cypher, GraphQL, or other graph query languages. Experience with big data tools (e.g. Databricks). Experience in biomedical or life sciences research data management. Soft Skills: Excellent critical-thinking and problem-solving skills. Good communication and collaboration skills. Demonstrated awareness of how to function in a team setting. Demonstrated presentation skills.
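The RDF/SPARQL work these postings describe boils down to pattern matching over subject–predicate–object triples. A dependency-free conceptual sketch, where `None` plays the role of a SPARQL variable; the triples and prefixes are illustrative, and this is not rdflib or a real SPARQL engine:

```python
def match(triples, s=None, p=None, o=None):
    """SPARQL-like basic graph pattern: None acts as a variable (wildcard)."""
    return [t for t in triples
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)]

# Hypothetical biopharma triples using made-up ex:/rdf: prefixes.
triples = [
    ("ex:aspirin", "rdf:type", "ex:Drug"),
    ("ex:aspirin", "ex:targets", "ex:COX1"),
    ("ex:COX1", "rdf:type", "ex:Protein"),
]
# Analogous to: SELECT ?s WHERE { ?s rdf:type ex:Drug }
drugs = match(triples, p="rdf:type", o="ex:Drug")
```

Real SPARQL engines add joins across multiple patterns, inference, and federation, but every query bottoms out in this triple-pattern match.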
Posted 1 month ago
5.0 - 10.0 years
15 - 25 Lacs
Pune, Chennai, Bengaluru
Work from Office
Role & responsibilities Expertise in Graph Databases: A deep understanding of graph databases (architecture, structures, and operations) and of query languages such as SPARQL and Gremlin. Experience with AWS Neptune is preferred. Knowledge of Data Pipelines: Proficiency in designing and managing data pipelines is crucial for ensuring the efficient flow and transformation of data into the knowledge graph. High level of proficiency in Python programming. AWS services, including EKS, K8s, S3, and Lambda. Secondary Skills: CI/CD, Kubernetes, Docker. This is compulsory - Expertise in Graph Databases and Python programming
Posted 1 month ago
4.0 - 6.0 years
8 - 11 Lacs
Bengaluru
Work from Office
Job Title: Python Developer MDM Integration (MongoDB & Neo4j). Location: Bengaluru. Experience Level: 4+ years. Job Summary: We are seeking a skilled Python Developer to work on our Master Data Management (MDM) initiative. The ideal candidate will have hands-on experience with MongoDB, Neo4j, and data integration from MySQL using Debezium and Kafka. Key Responsibilities: Design and develop MDM solutions using MongoDB and Neo4j. Build data ingestion pipelines from MySQL via Debezium and Apache Kafka. Write clean, modular, and scalable Python code for data processing and transformation. Collaborate with architects and data engineers to maintain data consistency across systems. Optimize database queries and data models for performance and scalability. Support data governance, lineage, and traceability efforts. Required Skills & Qualifications: 4+ years of professional experience in Python development. Strong knowledge of MongoDB and NoSQL data modelling. Experience with Neo4j and graph database concepts. Hands-on experience with Debezium and Kafka for real-time data streaming. Solid understanding of MySQL and relational-to-NoSQL data mapping. Experience working with REST APIs and microservices architecture. Good problem-solving and communication skills. Preferred Qualifications: Experience with data governance or MDM frameworks. Knowledge of data lineage and data quality principles. Familiarity with Docker and containerized environments. Exposure to cloud platforms (e.g., AWS, Azure, or GCP).
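A core task in the pipeline described above is translating Debezium change events arriving via Kafka into document-store operations. A sketch assuming Debezium's standard envelope fields (`op`, `before`, `after`); the target instruction shape and the `id` key are hypothetical:

```python
def debezium_to_doc(event):
    """Map a Debezium change event to an upsert/delete instruction for a doc store."""
    payload = event.get("payload", event)   # unwrapped events have no "payload" layer
    op = payload["op"]                      # c=create, u=update, d=delete, r=snapshot read
    if op in ("c", "u", "r"):
        return {"action": "upsert", "doc": payload["after"]}
    if op == "d":
        return {"action": "delete", "key": payload["before"]["id"]}
    return None                             # ignore other ops (e.g. truncate)

evt = {"payload": {"op": "u", "before": {"id": 7, "name": "x"}, "after": {"id": 7, "name": "y"}}}
instr = debezium_to_doc(evt)
```

A Kafka consumer loop would call a function like this per record, then apply the instruction to MongoDB (and mirror the relationship edges into Neo4j).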
Posted 1 month ago
2.0 - 5.0 years
4 - 8 Lacs
Chennai
Work from Office
Overview Java development with hands-on experience in Spring Boot. Strong knowledge of UI frameworks, particularly Angular, for developing dynamic, interactive web applications. Experience with Kubernetes for managing microservices-based applications in a cloud environment. Familiarity with Postgres (relational) and Neo4j (graph database) for managing complex data models. Experience in Meta Data Modeling and designing data structures that support high-performance and scalability. Expertise in Camunda BPMN and business process automation. Experience implementing rules with Drools Rules Engine. Knowledge of Unix/Linux systems for application deployment and management. Experience building data Ingestion Frameworks to process and handle large datasets. Responsibilities Java development with hands-on experience in Spring Boot. Strong knowledge of UI frameworks, particularly Angular, for developing dynamic, interactive web applications. Experience with Kubernetes for managing microservices-based applications in a cloud environment. Familiarity with Postgres (relational) and Neo4j (graph database) for managing complex data models. Experience in Meta Data Modeling and designing data structures that support high-performance and scalability. Expertise in Camunda BPMN and business process automation. Experience implementing rules with Drools Rules Engine. Knowledge of Unix/Linux systems for application deployment and management. Experience building data Ingestion Frameworks to process and handle large datasets.
Posted 1 month ago