BALI Graph Solutions

3 Job openings at BALI Graph Solutions
Neo4j Data Engineer | Jammu & Kashmir, India | 5 years | Not disclosed | Remote | Full Time

**Hiring Alert** Role: Neo4j Data Engineer with Databricks experience (5+ years). Location: Remote/Jammu. Notice period: max 30 days. Skills: Neo4j, Cypher, Databricks, SQL, ETL, PySpark, Python. This is a permanent role with additional benefits.
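The skills list above pairs Neo4j and Cypher with Python. As a hedged illustration only (the node labels, property names, and connection details below are invented, not taken from the posting), a parameterized Cypher statement of the kind this role involves might be prepared and sent from Python like this:

```python
# Hypothetical sketch of the Neo4j + Cypher + Python skills named in the
# posting. Labels (:Person, :Product), property names, and the localhost
# URI are illustrative assumptions only.

CREATE_PURCHASE = """
MERGE (p:Person {email: $email})
MERGE (i:Product {sku: $sku})
MERGE (p)-[r:PURCHASED]->(i)
SET r.at = datetime($at)
"""

def purchase_params(email: str, sku: str, at: str) -> dict:
    """Bundle query parameters; parameterized Cypher avoids string
    injection and lets the server reuse the cached query plan."""
    return {"email": email, "sku": sku, "at": at}

# With the official `neo4j` Python driver this would execute as below
# (commented out because it needs a live server):
#
#   from neo4j import GraphDatabase
#   with GraphDatabase.driver("bolt://localhost:7687",
#                             auth=("neo4j", "password")) as driver:
#       with driver.session() as session:
#           session.run(CREATE_PURCHASE,
#                       purchase_params("a@example.com", "SKU-1",
#                                       "2024-01-01T00:00:00Z"))

if __name__ == "__main__":
    print(purchase_params("a@example.com", "SKU-1", "2024-01-01T00:00:00Z"))
```

Using `MERGE` rather than `CREATE` keeps repeated ETL runs idempotent, which matters for the pipeline-style workloads the posting describes.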

Neo4j Graph Developer | Karnataka | 3-7 years | Not disclosed | On-site | Full Time

You will work as a full-time Neo4j Graph Developer at a Neo4j-certified Service Solution partner. Your main responsibilities include designing and developing graph-based applications, implementing back-end services, optimizing graph databases, and writing efficient code. You should have expertise in Neo4j or TigerGraph, experience with GCP's ETL-related services, hands-on experience with Kubernetes (K8s), and prior work in an Agile methodology. You will also spend two days a week working from the office in either Pune or Bengaluru. The role involves close collaboration with the team to ensure smooth integration and performance of the graph database system.

Data Engineer (3+ YOE) | Jammu, Jammu & Kashmir, India | 4 years | Not disclosed | On-site | Full Time

Job Description

We are looking for a skilled Data Engineer to join our data team and build scalable data infrastructure on Google Cloud Platform (GCP). You'll design and maintain data pipelines, optimize warehousing solutions, and enable data-driven insights by collaborating with analysts, scientists, and engineers.

Responsibilities
- Build and maintain data pipelines with Dataform/DBT and Apache Airflow
- Develop and optimize data warehouses in BigQuery
- Write efficient SQL and Python for data processing and automation
- Implement data quality checks and monitoring
- Ensure pipelines are scalable, reliable, and cost-effective
- Partner with data teams to deliver actionable insights
- Stay current with GCP data engineering best practices

Qualifications
- Bachelor's in Computer Science or a related field (or equivalent experience)
- 3-4 years as a Data Engineer
- Strong SQL (BigQuery preferred) and Python skills
- Hands-on GCP data services experience
- Solid knowledge of data warehousing and ETL/ELT
- Experience with Dataform/DBT and Airflow
- Strong analytical, problem-solving, and collaboration skills

Preferred
- Google Cloud Professional Data Engineer certification
- Knowledge of data modeling (dimensional, star schema)
- Familiarity with Agile methodologies
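One of the responsibilities above is implementing data quality checks and monitoring. As a rough, framework-free sketch (the column names and the 5% null threshold are invented for illustration; in this role such logic would typically run as an Airflow task validating a BigQuery table rather than an in-memory list), a minimal check might look like:

```python
# Hypothetical data-quality check of the kind the posting describes.
# Column names and the 5% null-rate threshold are illustrative
# assumptions, not requirements from the job description.

def null_rate(rows: list[dict], column: str) -> float:
    """Fraction of rows where `column` is missing or None."""
    if not rows:
        return 0.0
    missing = sum(1 for r in rows if r.get(column) is None)
    return missing / len(rows)

def check_quality(rows: list[dict], key: str,
                  max_null_rate: float = 0.05) -> list[str]:
    """Run null-rate and uniqueness checks on a key column.

    Returns a list of human-readable failures; an empty list means
    the batch passed and the pipeline can proceed.
    """
    failures = []
    if null_rate(rows, key) > max_null_rate:
        failures.append(f"{key}: null rate above {max_null_rate:.0%}")
    keys = [r[key] for r in rows if r.get(key) is not None]
    if len(keys) != len(set(keys)):
        failures.append(f"{key}: duplicate values found")
    return failures

if __name__ == "__main__":
    sample = [{"id": 1}, {"id": 2}, {"id": 2}, {"id": None}]
    print(check_quality(sample, "id"))
```

In an Airflow deployment this would usually sit in its own task downstream of the load step, failing the DAG run (and alerting) when the returned list is non-empty, so bad batches never reach consumers.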