Posted: 3 days ago
Work from Office
Full Time
Engineer, ETL - Gurgaon/Bangalore, India

AXA XL recognizes data and information as critical business assets, both in terms of managing risk and enabling new business opportunities. This data should not only be high quality, but also actionable - enabling AXA XL's executive leadership team to maximize benefits and facilitate sustained competitive advantage. Our Chief Data Office, also known as our Innovation, Data Intelligence & Analytics (IDA) team, is focused on driving innovation by optimizing how we leverage data to drive strategy and create a new business model - disrupting the insurance market.

As we develop an enterprise-wide data and digital strategy that moves us toward a greater focus on the use of data and data-driven insights, we are seeking a Data Engineer. The role will support the team's efforts to create, enhance, and stabilize the enterprise data lake through the development of data pipelines. It requires a team player who can work well with colleagues from other disciplines to deliver data in an efficient and strategic manner.

What you'll be DOING

What will your essential responsibilities include?

- Act as a data engineering expert and partner to Global Technology and data consumers in controlling the complexity and cost of the data platform, while enabling the performance, governance, and maintainability of the estate.
- Understand current and future data consumption patterns and architecture at a granular level, and partner with Architects to ensure optimal design of the data layers.
- Apply best practices in data architecture - for example, the balance between materialization and virtualization, the optimal level of de-normalization, caching and partitioning strategies, the choice of storage and querying technology, and performance tuning.
- Lead and execute hands-on research into new technologies, formulating frameworks to assess new technology against business benefit and its implications for data consumers.
- Act as a best-practice expert and blueprint creator for ways of working such as testing, logging, CI/CD, observability, and release, enabling rapid growth in the data inventory and utilization of the Data Science Platform.
- Design prototypes and work in a fast-paced, iterative solution delivery model.
- Design, develop, and maintain ETL pipelines using PySpark in Azure Databricks with Delta tables (a minimal illustrative sketch of such a pipeline appears after the skills lists below).
- Use Harness for the deployment pipeline.
- Monitor the performance of ETL jobs, resolve any issues that arise, and improve performance metrics as needed.
- Diagnose system performance issues related to data processing and implement solutions to address them.
- Collaborate with other teams to ensure successful integration of data pipelines into the larger system architecture.
- Maintain integrity and quality across all pipelines and environments.
- Understand and follow secure coding practices to ensure code is not vulnerable.

You will report to the Technical Lead.

What you will BRING

We're looking for someone who has these abilities and skills:

Required Skills and Abilities:
- Effective communication skills.
- Bachelor's degree in Computer Science, Mathematics, Statistics, Finance, a related technical field, or equivalent work experience.
- Relevant years of extensive work experience in various data engineering and modeling techniques (relational, data warehouse, semi-structured, etc.), application development, and advanced data querying.
- Relevant years of programming experience using Databricks.
- Relevant years of experience using the Microsoft Azure suite of products (ADF, Synapse, and ADLS).
- Solid knowledge of network and firewall concepts.
- Solid experience writing, optimizing, and analyzing SQL.
- Relevant years of experience with Python.
- Ability to break down complex data requirements and architect solutions into achievable targets.
- Robust familiarity with Software Development Life Cycle (SDLC) processes and workflows, especially Agile.
- Experience using Harness.
- Technical lead responsible for both individual and team deliveries.

Desired Skills and Abilities:
- Experience in big data migration projects.
- Experience with performance tuning on both database and big data platforms.
- Ability to interpret complex data requirements and architect solutions.
- Distinctive problem-solving and analytical skills combined with robust business acumen.
- Excellent grasp of the basics of Parquet files and Delta files.
- Effective knowledge of the Azure cloud computing platform.
- Familiarity with reporting software - Power BI is a plus.
- Familiarity with DBT is a plus.
- Passion for data and experience working within a data-driven organization; you care about what you do, and what we do.
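For illustration only, here is a minimal sketch of the kind of PySpark/Delta ETL pipeline described in the responsibilities above. It assumes an Azure Databricks notebook where the spark session is preconfigured and Delta Lake is available; the storage account, container, table name, and columns (policy_id, premium) are hypothetical.

# Minimal PySpark ETL sketch (illustrative only).
# Assumes a Databricks notebook where `spark` is already provided and Delta is available.
from pyspark.sql import functions as F

# Extract: read raw CSV files landed in ADLS (path is hypothetical).
raw = (
    spark.read
    .option("header", "true")
    .csv("abfss://raw@examplestorage.dfs.core.windows.net/policies/")
)

# Transform: basic cleansing, typing, and load metadata.
cleaned = (
    raw.dropDuplicates(["policy_id"])
       .filter(F.col("policy_id").isNotNull())
       .withColumn("premium", F.col("premium").cast("double"))
       .withColumn("load_ts", F.current_timestamp())
       .withColumn("load_date", F.to_date("load_ts"))
)

# Load: write to a partitioned Delta table in the lake.
(
    cleaned.write
    .format("delta")
    .mode("overwrite")
    .partitionBy("load_date")
    .saveAsTable("enterprise_lake.policies_curated")
)

In practice a job like this would be parameterized, scheduled, and released through the deployment pipeline (for example, Harness) mentioned above.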
XL India Business Services Pvt. Ltd