Posted: 4 hours ago
Work from Office
Full Time
Responsibilities
Support developers in designing and maintaining graph schemas (vertices, edges, loading jobs);
Install and configure TigerGraph clusters;
Optimize data loading pipelines (batch and streaming);
Monitor data skew and partition balance across cluster nodes;
Create and manage users, roles, and privileges using TigerGraph's RBAC model;
Audit query activity and ensure compliance with data governance policies;
Monitor query performance, memory usage, and disk I/O; analyze GSQL query plans and optimize long-running traversals;
Validate HA setup and ensure zero data loss within recovery objectives;
Apply TigerGraph version upgrades and security patches safely;
Handle node recovery, rebalancing, and partition reallocation;
Collaborate with data engineers and platform teams on architecture and scaling;
Handle day-to-day operations and issues, and provide detailed RCA for outages;
Thrive in a fast-paced environment with tight delivery deadlines;
Participate in the evaluation, design, and development of new technologies, extensions, and features for highly scalable distributed databases;
Position requires a 12x7 on-call support rotation shared with other team members.
Minimum Qualifications
8+ years of working experience;
Proficiency in at least one programming language (Python, Java, or JavaScript) for data integration and application development;
Experience with graph database platforms such as TigerGraph, Neo4j, Amazon Neptune, JanusGraph, or ArangoDB;
Solid grasp of data modeling concepts, graph theory basics, and relationship-driven architectures;
Familiarity with ETL processes, API integrations, or data ingestion frameworks;
Knowledge of cloud services (AWS, Azure, GCP) and their managed graph database offerings;
Understanding of RDF, SPARQL, or semantic web technologies;
Exposure to big data ecosystems (Spark, Kafka) for graph data processing;
Experience with version control systems (Git) and CI/CD workflows;
Ability to manage databases deployed on cloud infrastructure, including AWS/GCP and Kubernetes.
Preferred Qualifications
BE/B.Tech or ME/M.Tech in Computer Science, Information Systems, or a related field, with practical experience;
8+ years of hands-on database engineering experience, with at least 5 years working directly with graph databases;
Strong proficiency in at least one graph query language, such as Cypher, Gremlin, or GSQL;
Certified Kubernetes Administrator (CKA);
AWS Certified Solutions Architect - Associate;
Knowledge of Python and shell/Bash scripting.
Apple
Bengaluru
25.0 - 30.0 Lacs P.A.