Azure-Databricks Engineer

3 years

30 Lacs

Posted: 1 day ago | Platform: GlassDoor


Work Mode: On-site

Job Type: Full Time

Job Description

JOB SUMMARY

We are seeking an Azure Cloud & Databricks Developer to join our offshore team. The developer will be part of the Risk Management group within the JRI Americas Division and will be responsible for designing, developing, and maintaining scalable data pipelines and ETL processes using the Azure cloud platform and Azure Databricks.

SCOPE

The Azure Cloud Data Engineer is responsible for strategizing, designing, and developing scalable cloud infrastructure and DevOps solutions. Working collaboratively with a team of skilled and passionate data engineers, this role plays a critical part in driving the automation and optimization of our Azure-based technology environment.

A key focus of this role is leveraging Azure Databricks to build and manage advanced data pipelines, perform large-scale data processing, and support analytics and machine learning initiatives. The person will contribute to the growth and scalability of our cloud infrastructure while also managing complex interface development, resolving technical issues, and providing support during weekend maintenance and production operations.

PRIMARY RESPONSIBILITIES

The primary responsibilities include:

  • Develop and maintain Databricks notebooks using Python and SQL.
  • Configure and manage Databricks clusters and integrate with version control systems such as GitHub.
  • Enable seamless integration between on-premises databases and Power BI for reporting and analytics.
  • Design and build large-scale data pipelines using Azure native data processing frameworks (an illustrative sketch follows this list).

  • Collaborate with architects, engineers, analysts, and business stakeholders to deliver enterprise-grade, data-driven solutions.
  • Provide technical leadership and guidance on cloud architecture and implementation strategies.
  • Coordinate with platform, Azure API Management (APIM), GitHub, and support teams to ensure smooth operations.
  • Analyze business requirements and design scalable, secure, and efficient solutions on the Azure cloud platform.
  • Develop, test, and optimize software components to enhance the performance and reliability of data platforms.
  • Lead end-to-end project execution, working closely with business users, IT teams, data stewards, and third-party vendors.
  • Integrate and standardize data from diverse sources while ensuring compliance with data quality and accessibility standards.
  • Implement streaming data solutions and reusable design patterns in a big data environment.
  • Collaborate with data scientists to operationalize machine learning models and algorithms within automated data workflows.
  • Apply sound judgment and technical expertise to resolve moderately complex data engineering challenges.
  • Review and provide feedback on core code changes and support production deployments.
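
For illustration, the sketch below shows the kind of notebook-based batch pipeline work referenced in the list above: reading a raw extract from Azure Data Lake Storage Gen2, standardizing it with PySpark, and persisting it as a Delta table. The storage account, paths, columns, and table name are hypothetical placeholders, and spark is the session object that a Databricks notebook provides.

```python
# Minimal sketch of a batch pipeline cell in a Databricks notebook (PySpark).
# The storage account, container, paths, columns, and table name are hypothetical.
from pyspark.sql import functions as F

raw_path = "abfss://risk-data@examplestorage.dfs.core.windows.net/positions/raw/"
target_table = "risk.curated_positions"

# Read the latest raw extract from Azure Data Lake Storage Gen2.
# 'spark' is provided by the Databricks runtime in every notebook.
raw_df = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "true")
    .csv(raw_path)
)

# Basic standardization: trim keys, cast amounts, stamp the load time, de-duplicate.
curated_df = (
    raw_df
    .withColumn("position_id", F.trim(F.col("position_id")))
    .withColumn("market_value", F.col("market_value").cast("decimal(18,2)"))
    .withColumn("load_ts", F.current_timestamp())
    .dropDuplicates(["position_id", "as_of_date"])
)

# Persist as a Delta table so downstream Power BI / Synapse queries can consume it.
(
    curated_df.write
    .format("delta")
    .mode("overwrite")
    .option("overwriteSchema", "true")
    .saveAsTable(target_table)
)
```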

CORE TECHNOLOGIES

Azure: Azure Databricks, Azure Data Factory, Azure Synapse Analytics, Azure Functions, Azure Data Lake Storage Gen2, Azure Event Grid, Azure Event Hubs, Azure Service Bus, Azure Key Vault, Azure Monitor, Azure Log Analytics, Azure API Management (APIM), Azure DevOps.

Scripting: Python, SQL, Bash.

Databases: SQL Server, Oracle, PostgreSQL, Delta Lake

Big Data: Apache Spark

Version Control: GitHub, Git, Azure DevOps

Visualization: Power BI, including integration with REST APIs for custom dashboards.

Data Integration & Workflow Orchestration: Azure Data Factory, Databricks Workflows
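
As a hedged illustration of how the streaming pieces listed above (Event Hubs, Spark, Delta Lake) typically fit together on Databricks, the sketch below reads events from Azure Event Hubs and appends them to a Delta table. The secret scope, connection string, event schema, paths, and table name are placeholders, and it assumes the Azure Event Hubs Spark connector (azure-eventhubs-spark) is installed on the cluster.

```python
# Sketch only: Structured Streaming from Azure Event Hubs into a Delta table.
# Assumes the com.microsoft.azure:azure-eventhubs-spark connector is installed
# on the cluster; the secret scope, schema, paths, and table names are placeholders.
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

# Connection string pulled from a Key Vault-backed secret scope (hypothetical names).
connection_string = dbutils.secrets.get(scope="risk-kv", key="eventhub-connection")
eh_conf = {
    "eventhubs.connectionString":
        sc._jvm.org.apache.spark.eventhubs.EventHubsUtils.encrypt(connection_string)
}

# Expected JSON payload of each event (illustrative schema).
event_schema = StructType([
    StructField("trade_id", StringType()),
    StructField("notional", DoubleType()),
])

# The connector exposes each event payload as a binary 'body' column.
stream_df = (
    spark.readStream
    .format("eventhubs")
    .options(**eh_conf)
    .load()
    .select(F.from_json(F.col("body").cast("string"), event_schema).alias("event"))
    .select("event.*")
)

# Append parsed events to a Delta table; the checkpoint tracks streaming progress.
(
    stream_df.writeStream
    .format("delta")
    .option("checkpointLocation",
            "abfss://risk-data@examplestorage.dfs.core.windows.net/checkpoints/trade_events/")
    .outputMode("append")
    .toTable("risk.trade_events")
)
```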

QUALIFICATIONS

  • IT professional experience in Azure Cloud, with a minimum of 3 years of experience developing and maintaining data pipelines using Azure Databricks, Spark, and other Big Data technologies.
  • Proficiency in programming languages such as Python and SQL.
  • Ability to re-create existing legacy application logic and functionality in Azure Databricks/Data Lake, SQL Database, and SQL Data Warehouse environments.
  • Experience with Azure services such as Data Factory, Azure Machine Learning, and Azure DevOps.
  • Strong understanding of ETL processes and data warehousing concepts.
  • Excellent interpersonal and communication skills
  • Experience with software configuration management tools such as Git/GitHub

Job Type: Full-time

Pay: Up to ₹250,000.00 per month

Benefits:

  • Health insurance

Education:

  • Bachelor's (Preferred)

Experience:

  • Databricks: 2 years (Required)
  • Total work: 4 years (Preferred)
  • Azure: 3 years (Required)

Work Location: In person
