The Data Engineer is responsible for developing, maintaining, and leading the execution of data ingestion, transformation, and modelling for analytics solutions that improve performance.
The role is based in the IT organization and reports to the Team Lead, GBS Reporting - Business Insights team.
You will partner with various COEs, business teams, and internal teams to understand requirements, and collaborate with the Advanced Analytics teams to ensure data availability. You will also be responsible for delivering the required end-to-end data solutions used for descriptive, diagnostic, predictive, and prescriptive analytics within AkzoNobel.
Role & responsibilities
Requirement Gathering:
- Work with business partners and COE team leaders to gather, understand, and align on definitions and requirements
- Discuss and align with enterprise solution architects on the design and implementation of data solutions
- Identify opportunities and requirements for data analytics solutions that can drive revenue growth, operational efficiency, and customer value
Solution design & development:
- Develop new solutions using data transformation tools (SQL Server Management Studio, Alteryx, Azure Databricks, Azure Data Factory)
- Design, develop, and maintain data pipelines that extract data from a variety of sources and populate the data lake and data warehouse
- Fetch and integrate data from SAP HANA, BW, Azure SQL, Excel, flat files, SharePoint lists, Delta Lake, Databricks, etc.
- Perform data transformation and data modelling (joins, exclusions, enrichment, pivoting, un-pivoting, etc.); see the sketch after this list
- Integrate and prepare complex, multi-sourced data efficiently
- Build scalable, reusable data models that can be used across multiple projects
- Automate tools and data flows using best practices, in collaboration with internal teams, IM architects, and IM consultants
- Migrate Alteryx workflows to Databricks solutions
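As an illustration of the pipeline and modelling work above, the minimal PySpark sketch below reads two sources, joins and pivots them, and writes a curated Delta table. All paths, table names, and columns (sales, customers, customer_id, region, month, revenue) are placeholders for illustration, not actual AkzoNobel objects.

```python
# Minimal sketch only: a Databricks/PySpark pipeline step with an extract,
# a join, a pivot, and a Delta write. All paths and column names are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Extract: read a raw sales table and a customer dimension (placeholder paths)
sales = spark.read.format("delta").load("/mnt/raw/sales")
customers = spark.read.format("csv").option("header", True).load("/mnt/raw/customers.csv")

# Transform: enrich sales with customer attributes (join), then pivot
# monthly revenue into one column per month for reporting
pivoted = (
    sales.join(customers, on="customer_id", how="left")
         .groupBy("customer_id", "region")
         .pivot("month")
         .agg(F.sum("revenue"))
)

# Load: persist the curated model to the data lake so multiple projects can reuse it
pivoted.write.format("delta").mode("overwrite").save("/mnt/curated/monthly_revenue")
```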
Improve and maintain existing solutions:
- Suggest technical improvements and automation ideas for existing data solutions
- Perform required changes and regular data load activities
- Test existing solutions and ensure compatibility when new software versions are released
- Improve and safeguard data solution system performance (latency, downtime)
- Educate and train a variety of stakeholders on dashboard technicalities and features
- Work with the data governance team to implement data quality checks and maintain data catalogs (a minimal sketch follows this list)
- Keep detailed technical documentation complete and up to date
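To make the data quality responsibility above concrete, the sketch below shows one possible check on a curated table; the path and the customer_id key column are assumptions for illustration, not the actual governance rules.

```python
# Minimal sketch only: two basic data quality checks (null and duplicate keys)
# on a curated table. The path and the customer_id key column are assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

df = spark.read.format("delta").load("/mnt/curated/monthly_revenue")

# Check 1: the business key must not contain nulls
null_keys = df.filter(F.col("customer_id").isNull()).count()

# Check 2: the business key must be unique
duplicate_keys = df.groupBy("customer_id").count().filter(F.col("count") > 1).count()

# Fail fast so bad data never reaches downstream dashboards
if null_keys > 0 or duplicate_keys > 0:
    raise ValueError(
        f"Data quality check failed: {null_keys} null keys, {duplicate_keys} duplicate keys"
    )
```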
Preferred candidate profile
Required skills
- Strong ETL and data extraction skills.
- Strong SQL and/or MySQL skills; ability to write effective queries involving multiple tables and subqueries (see the example after this list).
- Working knowledge of data mining principles: predictive analytics, mapping, and collecting data from multiple on-premises and cloud-based data sources.
- Results-driven and ability to handle multiple projects.
- Understanding of and experience using analytical concepts and statistical techniques: hypothesis development, designing tests/experiments, analyzing data.
- Strong problem solving, quantitative and analytical abilities.
- Understands sensitivity and security questions around data.
- Able to act in a complex multinational environment.
- Good communication and presentation skills.
- Strong interpersonal and time-management skills.
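For context on the SQL requirement above, the example below shows the kind of multi-table query with a subquery the role involves, run here through Spark SQL; the customers and sales tables and the revenue threshold are illustrative only.

```python
# Illustrative only: a multi-table query with a subquery, executed via Spark SQL.
# The customers and sales tables and the revenue threshold are made up.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

top_customers = spark.sql("""
    SELECT c.customer_id,
           c.region,
           s.total_revenue
    FROM customers c
    JOIN (                      -- subquery: aggregate revenue per customer
        SELECT customer_id, SUM(revenue) AS total_revenue
        FROM sales
        GROUP BY customer_id
    ) s
      ON s.customer_id = c.customer_id
    WHERE s.total_revenue > 100000   -- keep only high-value customers
    ORDER BY s.total_revenue DESC
""")
top_customers.show()
```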
Preferred skills
- Skills in Azure Data Factory, Azure, Databricks, R, Python, and various types of ETL tools
Good to have skills
- Skills with data visualization tools such as Microsoft Power BI and Tableau