Experience: 4 - 7 years

Salary: 25 Lacs

Posted: 2 weeks ago | Platform: Naukri

Work Mode: Work from Office

Job Type: Full Time

Job Description

We are seeking a highly skilled and motivated Data Engineer to join our growing team. The ideal candidate will have a strong background in building and maintaining scalable data pipelines, with hands-on experience in ETL processes, Python programming, Apache Airflow, Apache Spark, and Neo4j graph database technology. You will play a crucial role in designing, implementing, and managing our data infrastructure to support our data-driven initiatives.

Responsibilities:

  • Design, build, and maintain robust and scalable ETL pipelines to ingest data from a wide variety of sources.
  • Develop and optimize data processing workflows and data models for efficiency and reliability.
  • Utilize Apache Spark for large-scale data processing and complex data transformations.
  • Orchestrate and schedule data pipelines using Apache Airflow, ensuring timely and accurate data delivery (see the sketch after this list).
  • Write clean, efficient, and well-documented Python code for data manipulation and automation.
  • Model, implement, and manage graph databases using Neo4j to uncover relationships and insights within our data.
  • Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver actionable insights.
  • Ensure data quality and integrity through rigorous testing and validation.
  • Monitor, troubleshoot, and resolve issues related to data pipelines and infrastructure.
  • Stay up-to-date with emerging technologies and best practices in data engineering.
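
By way of illustration, the sketch below shows the kind of orchestration described above: a minimal Apache Airflow DAG that schedules a daily PySpark aggregation. It is a sketch only, not part of the job description; the DAG id, file paths, and column names are hypothetical, and it assumes a recent Airflow 2.x environment with PySpark available on the workers.

    # Minimal illustrative sketch: DAG id, paths, and column names are
    # hypothetical; assumes Airflow 2.x with PySpark installed on the workers.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def transform_orders(ds, **_):
        """Aggregate one day of raw orders with PySpark and write the result."""
        from pyspark.sql import SparkSession, functions as F

        spark = SparkSession.builder.appName("daily_orders_etl").getOrCreate()
        orders = spark.read.parquet("/data/raw/orders")  # hypothetical source path
        daily = (
            orders.filter(F.col("order_date") == ds)
                  .groupBy("customer_id")
                  .agg(F.sum("amount").alias("total_amount"))
        )
        daily.write.mode("overwrite").parquet(f"/data/curated/orders/{ds}")  # hypothetical sink
        spark.stop()


    with DAG(
        dag_id="daily_orders_etl",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        PythonOperator(task_id="transform_orders", python_callable=transform_orders)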

Qualifications:

  • Proven experience as a Data Engineer or in a similar role.
  • Strong proficiency in Python and its data-related libraries (e.g., Pandas, PySpark).
  • Hands-on experience with ETL concepts and tools.
  • Demonstrable experience with Apache Spark for distributed data processing.
  • In-depth knowledge of workflow management and orchestration using Apache Airflow.
  • Experience with graph databases, specifically Neo4j, including data modeling and the Cypher query language (see the sketch after this list).
  • Solid understanding of SQL and NoSQL databases.
  • Familiarity with cloud platforms (e.g., AWS, GCP, Azure) and their data services.
  • Excellent problem-solving and analytical skills.
  • Strong communication and collaboration abilities.
  • Bachelor's degree in Computer Science, Engineering, or a related field.
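
To illustrate the Neo4j and Cypher experience listed above, the sketch below uses the official neo4j Python driver to merge a small purchase graph and query co-purchased products. It is an assumption-laden example, not part of the role: the connection details and the graph model (Customer and Product nodes with PURCHASED relationships) were invented purely for illustration.

    # Illustrative sketch only: connection details and the graph model
    # (Customer/Product nodes, PURCHASED relationships) are assumptions.
    from neo4j import GraphDatabase

    driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

    # Upsert a customer, a product, and the purchase relationship between them.
    MERGE_PURCHASE = """
    MERGE (c:Customer {id: $customer_id})
    MERGE (p:Product {sku: $sku})
    MERGE (c)-[r:PURCHASED]->(p)
    ON CREATE SET r.first_purchase = date()
    """

    # Products most often bought by customers who also bought a given product.
    CO_PURCHASED = """
    MATCH (:Product {sku: $sku})<-[:PURCHASED]-(:Customer)-[:PURCHASED]->(other:Product)
    WHERE other.sku <> $sku
    RETURN other.sku AS sku, count(*) AS times
    ORDER BY times DESC LIMIT 5
    """

    with driver.session() as session:
        session.run(MERGE_PURCHASE, customer_id="C001", sku="SKU-42")
        for record in session.run(CO_PURCHASED, sku="SKU-42"):
            print(record["sku"], record["times"])

    driver.close()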

Flatworld Solutions (FWS)

IT Services and IT Consulting

Princeton, New Jersey
