IT Engineer, Data Lakehouse - Sales & Finance

Experience: 3 - 6 years

Salary: 13 - 17 Lacs

Posted: 2 days ago | Platform: Naukri


Work Mode: Work from Office

Job Type: Full Time

Job Description

- Design, develop, and operate scalable and maintainable data pipelines in the Azure Databricks environment
- Develop all technical artefacts as code, implemented in professional IDEs, with full version control and CI/CD automation
- Global delivery footprint; cross-functional data engineering support across the Sales & Finance domains
- Collaboration with business stakeholders, functional IT partners, product owners, architects, ML/AI engineers, and Power BI developers

Main Tasks


- Design scalable batch and streaming pipelines in Azure Databricks using PySpark and/or Scala
- Implement ingestion from structured and semi-structured sources (e.g., SAP, APIs, flat files)
- Implement use-case-driven dimensional models (star/snowflake schemas) tailored to Sales & Finance needs
- Ensure compatibility with reporting tools (e.g., Power BI) via curated data marts and semantic models
- Implement enterprise-level data warehouse models (domain-driven 3NF models) for Sales & Finance data, in close alignment with data engineers for other business domains
- Develop and apply master data management strategies (e.g., Slowly Changing Dimensions; an illustrative PySpark sketch follows this list)
- Develop automated data validation tests using established testing frameworks
- Monitor pipeline health, identify anomalies, and implement quality thresholds
- Develop and structure pipelines using modular, reusable code in a professional IDE
- Apply test-driven development (TDD) principles with automated unit, integration, and validation tests (see the pytest sketch below)
- Document pipeline logic, data contracts, and technical decisions in Markdown or in documentation auto-generated from code
- Align designs with governance and metadata standards (e.g., Unity Catalog)
- Profile and tune data transformation performance
- Reduce job execution times and optimize cluster resource usage
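
The tasks above describe a fairly standard Databricks lakehouse workflow. Purely as an illustration (not part of the posting), a Slowly Changing Dimension Type 2 historization step in PySpark on Delta Lake might look roughly like the following; the table, column, and path names (sales.dim_customer, customer_id, attr_hash, and so on) are assumptions:

```python
# Hypothetical sketch: SCD Type 2 merge of an incoming customer batch into a
# Delta dimension table on Databricks. All table/column/path names are assumed.
from pyspark.sql import SparkSession, functions as F
from delta.tables import DeltaTable

spark = SparkSession.builder.getOrCreate()

# Incoming batch of customer master data (e.g., ingested from SAP or an API)
updates = spark.read.format("delta").load("/mnt/raw/customers_batch")

dim = DeltaTable.forName(spark, "sales.dim_customer")

# Rows whose attributes changed need two actions: close the current version and
# insert a new one. Staging changed rows a second time with a NULL merge key is
# the usual trick for handling both in a single MERGE.
new_versions = (
    updates.alias("s")
    .join(dim.toDF().alias("t"),
          (F.col("s.customer_id") == F.col("t.customer_id")) & F.col("t.is_current"),
          "inner")
    .where("t.attr_hash <> s.attr_hash")
    .select("s.*")
    .withColumn("merge_key", F.lit(None).cast("string"))
)
staged = (
    updates.withColumn("merge_key", F.col("customer_id").cast("string"))
    .unionByName(new_versions)
)

(dim.alias("t")
 .merge(staged.alias("s"), "t.customer_id = s.merge_key AND t.is_current = true")
 .whenMatchedUpdate(
     condition="t.attr_hash <> s.attr_hash",      # only rows that actually changed
     set={"is_current": "false",
          "valid_to": "current_timestamp()"})
 .whenNotMatchedInsert(
     values={"customer_id": "s.customer_id",
             "name": "s.name",
             "segment": "s.segment",
             "attr_hash": "s.attr_hash",
             "is_current": "true",
             "valid_from": "current_timestamp()",
             "valid_to": "null"})
 .execute())
```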
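Likewise, the TDD and automated-validation items usually boil down to small unit tests around transformation functions. A minimal pytest sketch, again with an assumed function and column names (add_net_revenue, gross_revenue, discount):

```python
# Hypothetical sketch: a pytest unit test for a small PySpark transformation,
# in the spirit of the TDD and validation-test tasks listed above.
import pytest
from pyspark.sql import SparkSession, functions as F


def add_net_revenue(df):
    """Derive net revenue as gross revenue minus discounts (assumed rule)."""
    return df.withColumn("net_revenue", F.col("gross_revenue") - F.col("discount"))


@pytest.fixture(scope="module")
def spark():
    # Local Spark session so the test runs without a cluster
    return SparkSession.builder.master("local[1]").appName("tests").getOrCreate()


def test_add_net_revenue(spark):
    source = spark.createDataFrame(
        [("1000", 120.0, 20.0)], ["order_id", "gross_revenue", "discount"]
    )
    result = add_net_revenue(source).collect()[0]
    assert result["net_revenue"] == pytest.approx(100.0)
```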

Qualifications

- Degree in Computer Science, Data Engineering, Information Systems, or a related discipline.
- Certifications in software development and data engineering (e.g., Databricks Data Engineer Associate, Azure Data Engineer, or relevant DevOps certifications).
- 3-6 years of hands-on experience in data engineering roles in enterprise environments, with demonstrated experience building production-grade codebases in IDEs, with test coverage and version control.
- Proven experience implementing complex data pipelines and contributing to full-lifecycle data projects (development through deployment).
- Experience in at least one relevant business domain, such as Sales & Finance or a comparable field.
- Experience working in international teams across multiple time zones and cultures, preferably with teams in India, Germany, and the Philippines.

Continental

Motor Vehicle Manufacturing

Hannover, Lower Saxony
