Associate Application Developer

1 - 4 years

10 - 12 Lacs

Posted: 4 hours ago | Platform: Naukri


Work Mode

Work from Office

Job Type

Full Time

Job Description

We are seeking an experienced Databricks developer to support and evolve our enterprise data integration workflows. The ideal candidate will have strong hands-on experience with Azure Data Factory and Databricks, and a passion for building scalable, reliable ETL pipelines. This role is critical both for day-to-day operational reliability and for the long-term modernization of our data engineering stack in the Azure cloud.

What you'll be DOING

What will your essential responsibilities include?

  • Develop and orchestrate PySpark notebooks in Azure Databricks for data transformation, cleansing, and enrichment.
  • Configure and manage Databricks clusters for performance optimization and cost efficiency.
  • Implement Delta Lake solutions that support ACID compliance, versioning, and time travel for reliable data lake operations.
  • Automate data workflows using Databricks Jobs and Azure Data Factory (ADF) pipelines.
  • Design and manage scalable ADF pipelines, including parameterized workflows and reusable integration patterns.
  • Integrate with Azure Blob Storage and ADLS Gen2 using Spark APIs for high-performance data ingestion and output (a short illustrative sketch follows this list).
  • Ensure data quality, consistency, and governance across legacy and cloud-based pipelines.
  • Collaborate with data analysts, engineers, and business teams to deliver clean, validated data for reporting and analytics.
  • Participate in the full Software Development Life Cycle (SDLC) from design through deployment, with an emphasis on maintainability and audit readiness.
  • Develop maintainable and efficient ETL logic and scripts following best practices in security and performance.
  • Troubleshoot pipeline issues across data infrastructure layers, identifying and resolving root causes to maintain reliability.
  • Create and maintain clear documentation of technical designs, workflows, and data processing logic for long-term maintainability and knowledge sharing.
  • Stay informed on emerging cloud and data engineering technologies to recommend improvements and drive innovation.
  • Follow internal controls, audit protocols, and secure data handling procedures to support compliance and operational standards.
  • Provide accurate time and effort estimates for assigned development tasks, accounting for complexity and risk.
  • Maintain, monitor, and troubleshoot existing Informatica PowerCenter ETL workflows to ensure operational reliability and data accuracy.
  • Enhance and extend ETL processes to support new data sources, updated business logic, and scalability improvements.
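
For illustration only, here is a minimal PySpark sketch of the kind of work described above: reading from ADLS Gen2 via Spark APIs, writing a Delta table, and using time travel. The storage account, container names, and columns are hypothetical placeholders, not the actual environment.

    # Minimal illustrative sketch; all paths and column names below are hypothetical.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.getOrCreate()  # provided automatically in Databricks

    # Read raw data from ADLS Gen2 through the abfss:// Spark API (hypothetical path)
    raw = spark.read.format("parquet").load(
        "abfss://raw@examplestorageacct.dfs.core.windows.net/sales/"
    )

    # Basic cleansing and enrichment
    clean = (
        raw.dropDuplicates(["order_id"])
           .withColumn("ingested_at", F.current_timestamp())
    )

    # Write as a Delta table to get ACID guarantees, versioning, and time travel
    clean.write.format("delta").mode("overwrite").save(
        "abfss://curated@examplestorageacct.dfs.core.windows.net/sales_delta/"
    )

    # Time travel: read an earlier version of the same table for audit or debugging
    previous = (
        spark.read.format("delta")
             .option("versionAsOf", 0)
             .load("abfss://curated@examplestorageacct.dfs.core.windows.net/sales_delta/")
    )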

You will report to the Application Manager.

What you will BRING

We're looking for someone who has these abilities and skills:

Required Skills and Abilities:

  • Expertise in Azure Databricks + PySpark, including notebook development, cluster configuration and tuning, Delta Lake (ACID, versioning, time travel), job orchestration via Databricks Jobs or ADF, and integration with Azure Blob Storage and ADLS Gen2 using Spark APIs.
  • Solid hands-on experience with Azure Data Factory: building and managing pipelines, parameterization and dynamic datasets, and notebook integration and pipeline monitoring (a parameterized-notebook sketch follows this list).
  • Proficiency in SQL, PL/SQL, and scripting languages such as Python, Bash, or PowerShell.
  • Strong understanding of data warehousing, dimensional modeling, and data profiling.
  • Familiarity with GitHub, CI/CD pipelines, and modern DevOps practices.
  • Working knowledge of data governance, audit trails, metadata management, and compliance standards.
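
For illustration only, a minimal sketch of a parameterized Databricks notebook of the sort an ADF pipeline would invoke. The parameter name, path, and column are hypothetical, and dbutils is assumed to be available because the code runs inside a Databricks notebook.

    # Illustrative sketch of a parameterized notebook that an ADF pipeline could call;
    # the parameter name "run_date" and all paths are hypothetical.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.getOrCreate()

    # dbutils is available inside Databricks notebooks; ADF notebook-activity
    # parameters surface here as widget values.
    dbutils.widgets.text("run_date", "")        # declare the parameter with an empty default
    run_date = dbutils.widgets.get("run_date")  # value supplied by the ADF pipeline

    # Use the parameter to process only the requested slice of data (hypothetical path)
    daily = (
        spark.read.format("delta")
             .load("abfss://curated@examplestorageacct.dfs.core.windows.net/sales_delta/")
             .where(F.col("order_date") == run_date)
    )

    # Return a value to ADF so the pipeline can log it or branch on it
    dbutils.notebook.exit(str(daily.count()))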

Desired Skills and Abilities:

  • Effective problem-solving and troubleshooting skills with the ability to resolve performance bottlenecks and job failures.
  • Awareness of Azure Functions, App Services, API Management, and Application Insights.
  • Understanding of Azure Key Vault for secrets and credential management.
  • Familiarity with the Informatica PowerCenter ETL tool is a plus.

AXA XL

Insurance and Reinsurance

Greenwich
