Senior Data Engineer - Azure (Databricks, Data Factory, Data Lake Storage, SQL)

8 - 13 years

13 - 17 Lacs

Posted: 1 day ago | Platform: Naukri


Work Mode: Work from Office

Job Type: Full Time

Job Description

Senior Data Engineer - Azure (Databricks, Data Factory, Data Lake Storage, SQL)

About The Role:

Job Summary

This position provides input and support for full systems life cycle management activities (e.g., analysis, technical requirements, design, coding, testing, and implementation of systems and applications software). He/She performs tasks within planned durations and established deadlines, collaborates with teams to ensure effective communication and to support the achievement of objectives, and provides knowledge, development, maintenance, and support for applications.

Responsibilities:
  • Generates application documentation.
  • Contributes to systems analysis and design.
  • Designs and develops moderately complex applications.
  • Contributes to integration builds.
  • Contributes to maintenance and support.
  • Monitors emerging technologies and products.
  • Technical Skills:
  • Cloud Platforms: Azure (Databricks, Data Factory, Data Lake Storage, Synapse Analytics)
  • Data Processing: Databricks (PySpark, Spark SQL), Apache Spark
  • Programming Languages: Python, SQL
  • Data Engineering Tools: Delta Lake, Azure Data Factory, Apache Airflow
  • Other: Git, CI/CD
  • Professional Experience:
  • Design and implementation of a scalable data lakehouse on Azure Databricks, optimizing data ingestion, processing, and analysis for improved business insights.
  • Develop and maintain efficient data pipelines using PySpark and Spark SQL for extracting, transforming, and loading (ETL) data from diverse sources (Azure and GCP).
  • Strong proficiency in SQL; develop SQL stored procedures to enforce data integrity, and ensure data accuracy and consistency across all layers.
  • Implement Delta Lake for ACID transactions and data versioning, ensuring data quality and reliability.
  • Create frameworks using Databricks and Data Factory to process incremental data for external vendors and applications.
  • Implement Azure Functions to trigger and manage data processing workflows.
  • Design and implement data pipelines to integrate various data sources and manage Databricks workflows for efficient data processing.
  • Conduct performance tuning and optimization of data processing workflows.
  • Provide technical support and troubleshooting for data processing issues.
  • Experience with successful migrations from legacy data infrastructure to Azure Databricks, improving scalability and reducing costs.
  • Collaborate with data scientists and analysts to build interactive dashboards and visualizations on Databricks for data exploration and analysis.

  • Effective oral and written management communication skills.

Qualifications:
  • Minimum 8 years of relevant experience
  • Bachelor's Degree or international equivalent in Computer Science, Information Systems, Mathematics, Statistics, or a related field
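The incremental-processing and Delta Lake merge responsibilities listed above follow a common watermark-plus-upsert pattern. The plain-Python sketch below is only a toy stand-in for what Delta Lake's `MERGE INTO` does at scale on Databricks; the table shape, key, and timestamp column names are illustrative assumptions, not part of this role's actual codebase:

```python
from datetime import datetime

# Toy incremental "merge": apply only source rows newer than the last
# watermark, upserting by key. Delta Lake's MERGE INTO implements the
# same idea transactionally; all names here are illustrative.
def incremental_upsert(target, source_rows, watermark, key="id", ts="updated_at"):
    """Upsert rows newer than `watermark` into `target` (a dict keyed by `key`).

    Returns the new watermark (latest timestamp applied)."""
    new_watermark = watermark
    for row in source_rows:
        if row[ts] <= watermark:
            continue  # already processed in a previous run
        target[row[key]] = row  # insert or update (upsert)
        new_watermark = max(new_watermark, row[ts])
    return new_watermark

target = {}
batch1 = [
    {"id": 1, "value": "a", "updated_at": datetime(2024, 1, 1)},
    {"id": 2, "value": "b", "updated_at": datetime(2024, 1, 2)},
]
wm = incremental_upsert(target, batch1, datetime.min)

# Second run: only rows newer than the stored watermark are applied.
batch2 = [
    {"id": 2, "value": "b2", "updated_at": datetime(2024, 1, 3)},  # update
    {"id": 1, "value": "a", "updated_at": datetime(2024, 1, 1)},   # skipped
]
wm = incremental_upsert(target, batch2, wm)
print(target[2]["value"])  # b2
print(wm)                  # 2024-01-03 00:00:00
```

In a real Databricks pipeline, the watermark would typically be persisted (e.g., in a control table) and the upsert expressed as a `MERGE INTO` over a Delta table, with Data Factory or an Azure Function triggering each incremental run.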

    UPS Supply Chain Solutions (UPS SCS)

    Logistics and Supply Chain

    Atlanta
