Hiring Data Engineer (Work From Home)

Experience: 4 - 8 years

Salary: 8 - 18 Lacs

Posted: 5 days ago | Platform: Naukri

Work Mode: Remote

Job Type: Full Time

Job Description

We are seeking a highly skilled Data Engineer with strong expertise in Databricks, ETL processes, Azure Cloud, and Python to join our dynamic team. The ideal candidate will be responsible for designing, developing, and maintaining scalable data pipelines and architectures to support our data-driven initiatives.

Key Responsibilities:

  • Design and Develop Data Pipelines: Create and optimize scalable data pipelines on Databricks to process and analyze large datasets.
  • ETL Processes: Implement and manage ETL processes to ensure data is accurately extracted, transformed, and loaded from various sources.
  • DLT Framework: Use the Delta Live Tables (DLT) framework to build reliable, maintainable, and testable data pipelines, leveraging its declarative approach to simplify the development of both batch and streaming pipelines (a brief sketch follows this list).
  • Azure Cloud Integration: Utilize Azure Cloud services to deploy and manage data solutions, ensuring high availability and performance.
  • Collaboration: Work closely with data scientists, analysts, and other stakeholders to understand data requirements and deliver high-quality data solutions.
  • Data Quality and Integrity: Ensure data quality and integrity across various data sources and implement data governance best practices.
  • Performance Optimization: Monitor and optimize the performance of data pipelines and troubleshoot any issues that arise.
  • Documentation: Document data pipeline processes, technical specifications, and best practices.
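
For illustration, here is a minimal Delta Live Tables pipeline sketch in Python. It is only a sketch: it assumes a Databricks workspace with DLT enabled and runs inside a DLT pipeline notebook (where spark is provided by the runtime), and the table names, landing path, and quality rule are hypothetical.

    import dlt
    from pyspark.sql.functions import col

    # "spark" is supplied by the Databricks runtime inside a DLT pipeline notebook.

    @dlt.table(comment="Raw orders ingested incrementally from cloud storage")
    def orders_raw():
        # Auto Loader streams in new JSON files as they land (hypothetical path).
        return (
            spark.readStream.format("cloudFiles")
            .option("cloudFiles.format", "json")
            .load("/mnt/landing/orders")
        )

    @dlt.table(comment="Cleaned orders with a basic data-quality expectation")
    @dlt.expect_or_drop("valid_amount", "amount > 0")  # drop rows that fail the check
    def orders_clean():
        return dlt.read_stream("orders_raw").where(col("order_id").isNotNull())

Each @dlt.table function declares a dataset rather than scheduling steps; DLT infers the dependency between orders_raw and orders_clean and can run the same definitions against batch or streaming sources.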

Qualifications:

  • Education: Bachelor's degree in Computer Science, Engineering, or a related field.
  • Experience: 4+ years of experience in data engineering.
  • Technical Skills:

    • Proficiency with Databricks and Apache Spark.
    • Strong knowledge of ETL processes and data warehousing concepts.
    • Experience with Azure Cloud services.
    • Strong programming skills in Python.
    • Proficiency in SQL and experience with relational databases.
    • Familiarity with big data technologies (e.g., Hadoop, Kafka) is a plus.
  • Soft Skills:

    • Excellent problem-solving and analytical skills.
    • Strong communication and collaboration abilities.
    • Ability to work in a fast-paced, dynamic environment.

Preferred Qualifications:

  • Experience with containerization (Docker, Kubernetes) and DevOps practices.
  • Certifications in Databricks, Azure, or related technologies.

Transformhub

IT Services and IT Consulting

Singapore
