Experience: 9 - 12 years

Salary: 9 - 15 Lacs

Posted: 4 hours ago | Platform: Foundit

Work Mode: On-site

Job Type: Full Time

Job Description

Senior Data Engineer

Roles & Responsibilities

  • Graph Data Engineering: Design, build, and maintain robust data pipelines using Databricks (Spark, Delta Lake, PySpark) for complex graph data processing workflows.
  • Graph Database Optimization: Own the implementation of graph-based data models, capturing complex relationships and hierarchies. Build and optimize graph databases such as Stardog, Neo4j, MarkLogic, or similar to support query performance, scalability, and reliability.
  • Query Implementation: Implement graph query logic using SPARQL, Cypher, Gremlin, or GSQL, depending on platform requirements.
  • Data Integration & Analytics: Collaborate with data architects to integrate graph data with existing data lakes and warehouses. Work closely with data scientists and analysts to enable graph analytics, link analysis, and recommendation systems.
  • Metadata & Governance: Develop metadata-driven pipelines and lineage tracking for graph and relational data processing. Ensure data quality, governance, and security standards are met across all graph data initiatives.
  • Mentorship & Innovation: Mentor junior engineers and contribute to data engineering best practices, especially around graph-centric patterns and technologies. Stay up to date with the latest developments in graph technology, graph ML, and network analytics.
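
To give a sense of the query work described above: the posting names Cypher among the supported query languages, and a two-hop traversal for link analysis might look like the sketch below. The `Person` label and `KNOWS` relationship type are hypothetical examples, not part of the posting:

```cypher
// Hypothetical schema: (:Person {name})-[:KNOWS]->(:Person)
// Find distinct people reachable within two hops of Alice.
MATCH (a:Person {name: 'Alice'})-[:KNOWS*1..2]->(b:Person)
WHERE a <> b
RETURN DISTINCT b.name AS connection;
```

Equivalent logic could be written in SPARQL, Gremlin, or GSQL, depending on the platform in use.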

Qualifications

  • A Master's or Bachelor's degree in Computer Science, IT, or a related field, with relevant experience.
  • Hands-on experience in Databricks, including PySpark and Delta Lake.
  • Hands-on experience with graph database platforms such as Stardog, Neo4j, or MarkLogic.
  • Strong understanding of graph theory, graph modeling, and traversal algorithms.
  • Proficiency in workflow orchestration and performance tuning for big data processing.
  • Strong understanding of AWS services.
  • Experience with software engineering best practices, including version control (Git), CI/CD (Jenkins), and automated unit testing.
  • AWS Certified Data Engineer, a Databricks certification, or a Scaled Agile (SAFe) certification is preferred.
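
The graph theory and traversal-algorithms qualification above covers the kind of fundamentals a candidate might be asked to demonstrate. As a minimal sketch, breadth-first traversal over an adjacency list looks like the following; the toy graph is invented for illustration:

```python
from collections import deque

def bfs_order(graph, start):
    """Return nodes reachable from `start` in breadth-first order."""
    seen = {start}
    order = []
    queue = deque([start])
    while queue:
        node = queue.popleft()
        order.append(node)
        for neighbor in graph.get(node, []):
            if neighbor not in seen:
                seen.add(neighbor)
                queue.append(neighbor)
    return order

# Toy directed graph as an adjacency list.
graph = {"a": ["b", "c"], "b": ["d"], "c": ["d"], "d": []}
print(bfs_order(graph, "a"))  # → ['a', 'b', 'c', 'd']
```

The same traversal idea underlies variable-length path queries in graph databases, just executed engine-side rather than in application code.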

Soft Skills

  • Problem-Solving: Excellent analytical and troubleshooting skills, with the ability to quickly learn, adapt, and apply new technologies.
  • Collaboration: Excellent collaboration and communication skills, with experience working with the Scaled Agile Framework (SAFe) and DevOps practices.
  • Proactiveness: High degree of initiative and self-motivation, with the ability to manage multiple priorities successfully.
  • Communication: Strong verbal and written communication skills, including presentation and public speaking skills.
