Are you
- Interested in working for an international and diverse company?
- Interested in working for a company that is dedicated to sustainability?
- Looking to use your troubleshooting skills?
- Looking for a friendly and supportive team?
If so, read on!
Esko, a Veralto company, is a global provider of integrated software and hardware solutions that accelerate the go-to-market process of packaged goods. Our products empower teams to support and manage the packaging design and print processes for brand owners, retailers, pre-media and trade shops, manufacturers, and converters. Together they form an innovative, integrated platform and a comprehensive portfolio of tools that intelligently digitize, connect, automate, and accelerate the concept-to-market process for every packaged product.
You will be part of a flexible, family-friendly organization that cares about its people just as it cares about the environment.
We recognize that people come with a wealth of experience and talent. Diversity of experience and skills, combined with passion, is a key to innovation and excellence. Therefore, we encourage people from all backgrounds to apply for our positions.
This role will be remote, with a quarterly visit to Bangalore for a minimum of two weeks.
Experience summary
- ETL experience: 5 to 6 years total.
- Minimum 3 years hands-on with: Azure Data Factory (ADF), notebooks, PySpark, Python, AI/ML, data management, data quality, data architecture, data modelling, Microsoft Fabric, Medallion architecture, CI/CD pipelines, data governance
- Additional stack exposure: SSIS, SQL Server, ERP, CRM
- Deeper Microsoft Fabric experience would be an added advantage
Technical focus
- 20% Data Architecture / 80% Data Engineering
Key responsibilities
- Architect, develop, and operate enterprise-grade data pipelines using Microsoft Fabric, ADF, PySpark, and Python, aligned to Medallion architecture
- Rationalize and modernize the BI data landscape; deliver curated bronze/silver/gold layers and performant semantic models (see the sketch after this list)
- Implement data quality, governance, lineage, and observability across pipelines; embed controls for auditability and compliance
- Migrate/retire legacy SSIS/SQL Server integrations to cloud-native, testable, and cost-efficient solutions
- Apply CI/CD for data engineering (branching, PRs, automated testing, environment promotion) using Azure DevOps/Git
- Diagnose and resolve complex integration issues across cloud-native and legacy platforms; optimize performance and costs
- Collaborate effectively with business analysts, transformation leads, and external partners; communicate progress and risks clearly
- Contribute to Agile delivery cycles, showing initiative and independence in problem solving
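To give candidates a concrete feel for the day-to-day work, here is a minimal sketch of the bronze/silver/gold (Medallion) pattern referenced above. All paths, dataset names, and columns are illustrative assumptions, not Esko's actual pipelines; a production pipeline on Microsoft Fabric or ADF would typically write Delta tables rather than Parquet and be parameterized per environment.

```python
# Minimal Medallion-style sketch in plain PySpark. Paths, the "orders"
# dataset, and all column names are hypothetical examples.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("medallion-sketch").getOrCreate()

# Bronze: land raw data as-is, adding only ingestion metadata.
raw = spark.createDataFrame(
    [("1001", "2024-01-05", "149.90"), ("1002", "2024-01-06", None)],
    ["order_id", "order_date", "amount"],
)
bronze = raw.withColumn("_ingested_at", F.current_timestamp())
bronze.write.mode("overwrite").parquet("/tmp/lake/bronze/orders")

# Silver: enforce types and drop records that fail basic quality rules.
silver = (
    spark.read.parquet("/tmp/lake/bronze/orders")
    .withColumn("order_date", F.to_date("order_date"))
    .withColumn("amount", F.col("amount").cast("decimal(10,2)"))
    .filter(F.col("amount").isNotNull())
)
silver.write.mode("overwrite").parquet("/tmp/lake/silver/orders")

# Gold: business-level aggregate ready for a semantic model.
gold = (
    spark.read.parquet("/tmp/lake/silver/orders")
    .groupBy("order_date")
    .agg(F.sum("amount").alias("daily_revenue"))
)
gold.write.mode("overwrite").parquet("/tmp/lake/gold/daily_revenue")
```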
Required expertise
- Advanced proficiency in Azure-based data engineering: ADF, notebooks, PySpark, Python, Delta/Lakehouse patterns, and Microsoft Fabric
- Demonstrated success building resilient, scalable, and observable pipelines with Medallion architecture and solid data modelling
- Strong SQL skills; experience transitioning from SQL Server/SSIS to Azure/Fabric
- Hands-on CI/CD for data solutions (e.g., Azure DevOps), including automated deployments and quality gates (a minimal test sketch follows this list)
- Strong communication, stakeholder engagement, and teamwork; integrity and ownership mindset
- Able to pass a rigorous employment background verification
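As an illustration of the automated quality gates mentioned above, here is a minimal pytest-style sketch of the kind of check that could run in an Azure DevOps pipeline stage before environment promotion. The silver path and rules are assumptions carried over from the earlier sketch, not a prescribed test suite.

```python
# Hypothetical data-quality gate tests; run with pytest in a CI stage.
import pytest
from pyspark.sql import SparkSession, functions as F

@pytest.fixture(scope="session")
def spark():
    return SparkSession.builder.appName("quality-gates").getOrCreate()

def test_silver_orders_has_no_null_amounts(spark):
    # Promotion fails if any silver record slipped through with a null amount.
    df = spark.read.parquet("/tmp/lake/silver/orders")
    assert df.filter(F.col("amount").isNull()).count() == 0

def test_silver_order_ids_are_unique(spark):
    # Promotion fails if order_id is duplicated in the silver layer.
    df = spark.read.parquet("/tmp/lake/silver/orders")
    assert df.count() == df.select("order_id").distinct().count()
```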