Posted: 2 weeks ago
Work from Office
Full Time
- Lead and manage the global IT data engineering team, which develops all technical artefacts as code in professional IDEs, with full version control and CI/CD automation. The team combines lakehouse modeling of common and business-use-case artefacts and semantics with generalist data integration and metadata services.
- Ensure high-quality delivery of data engineering assets that enable business analytics, AI/ML integration, and data governance at scale.
- Direct management of approximately 10 to 15 data engineers (generalists and specialists). The role reports to the global head of Data & Analytics within the IT Competence Center.
- The team delivers data engineering and analytics assets to all business domains via the Product Owner for Data & Analytics.
Main Tasks
- Line management for a high-performing, cross-functional data engineering team.
- Drive skill development, mentorship, and performance management.
- Own timely delivery of data & analytics assets from data acquisition to semantic layers.
- Align work with business priorities and architectural standards.
- Act as primary escalation and coordination point across business domains.
- Bridge infrastructure, functional IT, cybersecurity, and platform decisions.
- Guide adoption of engineering best practices (TDD, CI/CD, IaC) and the building of all technical artefacts as code, including scalable batch and streaming pipelines in Azure Databricks using PySpark and/or Scala.
- Lead the design and operation of scalable batch/stream pipelines in Databricks, including ingestion from structured and semi-structured sources and implementation of bronze/silver/gold layers under lakehouse governance.
- Build an inclusive, high-performance team culture in Bengaluru.
- Champion DevSecOps, reuse, automation, and reliability. Commit all artefacts to version control with peer review and CI/CD integration.
- Lead the design and operation of scalable, secure ingestion services, including CDC, delta, full-load, and SAP extractions via tools such as Theobald Xtract Universal.
- Oversee integration with APIs, legacy systems, Salesforce, and file-based sources, aligning all interfaces with cybersecurity standards and compliance protocols.
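The bronze/silver/gold (medallion) flow referenced in the tasks above can be sketched, purely illustratively, as plain Python functions — no Databricks or PySpark dependency is assumed here, and all record shapes and field names (`id`, `amount`, the "crm" source tag) are hypothetical. In a real Databricks pipeline each step would be a PySpark DataFrame transformation, committed to version control with tests and CI/CD as the posting describes.

```python
# Hypothetical sketch of a bronze/silver/gold flow using plain Python
# in place of PySpark DataFrames. Field names and sources are invented
# for illustration only.

def to_bronze(raw_records):
    """Bronze: land raw records as-is, tagging each with its source."""
    return [{"source": "crm", "payload": r} for r in raw_records]

def to_silver(bronze):
    """Silver: cleanse and conform -- drop malformed rows, cast types."""
    silver = []
    for row in bronze:
        p = row["payload"]
        if "id" in p and p.get("amount") is not None:
            silver.append({"id": p["id"], "amount": float(p["amount"])})
    return silver

def to_gold(silver):
    """Gold: aggregate into a business-ready view (total amount per id)."""
    totals = {}
    for row in silver:
        totals[row["id"]] = totals.get(row["id"], 0.0) + row["amount"]
    return totals

raw = [{"id": "a", "amount": "10.5"}, {"id": "a", "amount": 2}, {"bad": True}]
gold = to_gold(to_silver(to_bronze(raw)))  # {"a": 12.5}
```

The point of the pattern is that each layer has a single responsibility (land, cleanse, serve), so each step can be unit-tested and peer-reviewed independently — the same property the posting asks for in "artefacts as code" with test coverage.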
Qualifications
Degree in Computer Science, Data Engineering, Information Systems, or a related discipline.
Certifications in software development and data engineering (e.g., Databricks DE Associate, Azure Data Engineer, or relevant DevOps certifications).
Minimum 8 years in enterprise data engineering, including data ingestion and pipeline design. Experience across structured and semi-structured source systems is required. Demonstrated experience building production-grade codebases in IDEs, with test coverage and version control.
Hands-on experience with secure SAP/API ingestion, lakehouse development in Databricks, and metadata-driven data platforms. Delivered high-impact enterprise data products in cross-functional environments.
At least 3 years of team leadership or technical lead experience, including hiring, mentoring, and representing team interests in enterprise-wide planning forums.
Demonstrated success leading globally distributed teams and collaborating with stakeholders across multiple time zones and cultures.
Continental