Posted: 16 hours ago | Platform: LinkedIn

Work Mode: On-site

Job Type: Full Time

Job Description

  • Collaborate with core engineering, customers, and solution engineering teams for functional and technical discovery sessions to understand requirements and architect TigerGraph-based solutions.
  • Prepare and deliver compelling product demonstrations, live software prototypes, and proofs of concept showcasing TigerGraph's multi-modal Graph + Vector capabilities, including hybrid search for AI applications.
  • Participate in an on-call rotation to ensure 24/7 support coverage for enterprise deployments.
  • Create and maintain public documentation, internal knowledge base articles, FAQs, and best practices for TigerGraph implementations.
  • Design efficient graph schemas, develop GSQL queries and algorithms, and build prototypes that address customer requirements (e.g., Fraud Detection, Recommendation Engines, Knowledge Graphs, Entity Resolution, Anti-Money Laundering, and Cybersecurity).
  • Optimize indexing strategies, partitioning, and query performance in TigerGraph's distributed environment, leveraging GSQL for parallel processing and real-time analytics.
  • Lead large-scale production implementations of TigerGraph solutions for enterprise clients, ensuring seamless integration with existing systems like Kafka for streaming, K8s for orchestration, and cloud platforms.
  • Provide expert guidance on Graph Neural Networks (GNNs), Retrieval-Augmented Generation (RAG), semantic search, and AI-driven optimizations to enhance customer outcomes.
  • Troubleshoot complex issues in distributed systems, including networking, load balancing, and performance monitoring.
  • Foster cross-functional collaboration, including data modeling sessions, whiteboarding architectures, and stakeholder management to validate solutions.
  • Drive customer success through exceptional service, project management, and communication of TigerGraph's value in AI/enterprise use cases.
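The fraud-detection and entity-resolution use cases above typically reduce to traversals over an entity-relationship graph. As a minimal, platform-neutral sketch (pure Python with hand-made data, not GSQL or TigerGraph-specific), accounts that share attributes such as devices or cards can be grouped into candidate fraud rings by finding connected components:

```python
from collections import defaultdict, deque

def fraud_rings(edges):
    """Group accounts into rings: connected components of a graph
    where an edge links two accounts sharing an attribute
    (device, card, address, ...)."""
    adj = defaultdict(set)
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)
    seen, rings = set(), []
    for node in adj:
        if node in seen:
            continue
        # BFS to collect one connected component
        comp, queue = [], deque([node])
        seen.add(node)
        while queue:
            cur = queue.popleft()
            comp.append(cur)
            for nxt in adj[cur]:
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(nxt)
        rings.append(sorted(comp))
    return sorted(rings)

# Toy data: A-B-C share devices; D-E-F share a card chain.
edges = [("A", "B"), ("B", "C"), ("D", "E"), ("E", "F")]
print(fraud_rings(edges))  # [['A', 'B', 'C'], ['D', 'E', 'F']]
```

In a graph database this same traversal would be expressed as a parallel GSQL query over the installed schema rather than an in-memory BFS.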

Requirements

  • **Graph and Vector Data Science**: 4+ years applying graph algorithms, vector embeddings, and data science techniques for enterprise analytics.
  • **SQL Expertise**: 4+ years in SQL for querying, performance tuning, and debugging in relational and graph contexts.
  • **Graph Databases and Platforms**: 4+ years with TigerGraph or similar systems, focusing on multi-modal graph + vector integrations.
  • **Programming & Scripting**: 4+ years in Python, C++, and automation tools for task management, issue resolution, and GSQL development.
  • **HTTP/REST and APIs**: Expertise in building and integrating RESTful services for database interactions.
  • **Linux and Systems**: Strong background in Linux administration, scripting (bash/Python), and distributed environments.
  • **Kafka and Streaming**: Experience with Kafka for real-time data ingestion and event-driven architectures.
  • **K8s (Kubernetes)**: Proficiency in container orchestration for scalable deployments.
  • **Customer Service and Project Management**: 4+ years delivering exceptional support, managing projects, and ensuring client success.
  • **Cloud Computing**: 4+ years with AWS, Azure, or GCP for virtualization, deployments, and hybrid setups.
  • **Graph Neural Networks (GNNs) and Graph Machine Learning**: Hands-on with frameworks like PyTorch Geometric for predictive analytics on graphs.
  • **Retrieval-Augmented Generation (RAG) and Semantic Search**: Building pipelines with vector embeddings and LLMs for AI applications.
  • **Big Data Processing Tools**: Proficiency in Apache Spark, Hadoop, or Flink for distributed workloads.
  • **AI-Driven Database Management and Optimization**: Skills in AI-enhanced query optimization and performance tuning.
  • **Data Governance, Security, and Compliance**: Knowledge of encryption, access controls, GDPR/HIPAA, and ethical AI practices.
  • **DevOps and CI/CD Pipelines**: Advanced use of Git, Jenkins, or ArgoCD for automation.
  • **Real-Time Analytics and Streaming Integration**: Beyond Kafka, experience with Flink or Pulsar.
  • **Multimodal Data Handling**: Managing text, images, video in graph + vector setups.
  • **Monitoring & Observability Tools**: 4+ years with Prometheus, Grafana, Datadog, or ELK Stack.
  • **Networking & Load Balancing**: Proficient in TCP/IP, load balancers (NGINX, HAProxy), and troubleshooting.
  • **Agile Methodologies and Tools**: 3+ years with Scrum/Agile, JIRA, or Confluence.
  • **Presentation and Technical Communication**: Advanced whiteboarding, architecture reviews, and demos.
  • **Cross-Functional Collaboration**: Leading discovery, data modeling (UML, ER diagrams), and on-call incident management.
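The RAG and semantic-search requirements above hinge on nearest-neighbor retrieval over vector embeddings. A minimal sketch in pure Python (the 3-d "embeddings" here are toy hand-made vectors, not output of a real model; a production pipeline would use an embedding model plus a vector index):

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def retrieve(query_vec, docs, k=2):
    """Return the ids of the k documents most similar to the query."""
    ranked = sorted(docs, key=lambda d: cosine(query_vec, d["vec"]), reverse=True)
    return [d["id"] for d in ranked[:k]]

# Hypothetical document embeddings for illustration only.
docs = [
    {"id": "fraud-faq",  "vec": [0.9, 0.1, 0.0]},
    {"id": "gsql-howto", "vec": [0.1, 0.9, 0.1]},
    {"id": "k8s-deploy", "vec": [0.0, 0.2, 0.9]},
]
print(retrieve([0.8, 0.2, 0.0], docs, k=1))  # ['fraud-faq']
```

The retrieved passages would then be injected into an LLM prompt as context, which is the "augmented generation" half of RAG.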

Agivant Technologies

Information Technology

Tech City
