Data Engineer - Asset Lending

2 - 5 years

4 - 7 Lacs

Posted: 5 days ago | Platform: Naukri

Work Mode

Work from Office

Job Type

Full Time

Job Description

About the role:

The primary function of the role is to deliver high-quality data engineering solutions to business and end users across Asset Lending (Asset Finance, Working Capital and Asset Based Lending), either directly via self-service data products or by working closely with the Analytics team, providing modelled data warehouses on which they can build reporting and analytics. Reporting to the Head of ccc Technology, this role plays a crucial part in bridging the gap between business needs and the requirements of the data analytics team, and translating these into engineering delivery.

Key Responsibilities:

- Work closely with end users and Data Analysts to understand the business and their data requirements
- Carry out ad hoc data analysis and data wrangling using Synapse Analytics and Databricks
- Build dynamic, metadata-driven data ingestion patterns using Azure Data Factory and Databricks
- Build and maintain the Enterprise Data Warehouse using the Data Vault 2.0 methodology (see the illustrative T-SQL sketch after the requirements list below)
- Build and maintain business-focused data products and data marts
- Build and maintain Azure Analysis Services databases and cubes
- Share support and operational duties within the wider engineering and data teams
- Work with the Architecture and Engineering teams to deliver on these projects, and ensure that supporting code and infrastructure follow the best practices outlined by these teams
- Help define test criteria to establish clear conditions for success and ensure alignment with business objectives
- Manage user stories and acceptance criteria through to production and into day-to-day support
- Assist in the testing and validation of new requirements and processes to ensure they meet business needs

What are we looking for?

- Excellent data analysis and exploration using T-SQL
- Strong SQL programming (stored procedures, functions)
- Extensive experience with SQL Server and SSIS
- Knowledge and experience of data warehouse modelling methodologies (Kimball dimensional modelling, Data Vault 2.0)
- Experience in one or more of the following Azure services: Data Factory, Databricks, Synapse Analytics, ADLS Gen2
- Experience building robust and performant ETL processes
- Experience building and maintaining Analysis Services databases and cubes (both multidimensional and tabular)
- Experience using source control and ADO
- Understanding and experience of deployment pipelines
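For illustration only (not part of the original posting): a minimal T-SQL sketch of the kind of stored procedure the Data Vault 2.0 responsibility above typically involves, loading a hub table from a staging table. All schema, table and column names (stg.Customer, dv.HubCustomer, CustomerNumber) are hypothetical assumptions, not taken from the role.

-- Hypothetical Data Vault 2.0 hub load: insert only business keys not yet present in the hub.
CREATE PROCEDURE dv.LoadHubCustomer
    @LoadDate      DATETIME2,
    @RecordSource  NVARCHAR(100)
AS
BEGIN
    SET NOCOUNT ON;

    INSERT INTO dv.HubCustomer (CustomerHashKey, CustomerNumber, LoadDate, RecordSource)
    SELECT DISTINCT
        -- Hash of the business key, stored as a hex string (SHA2_256 is available from SQL Server 2012 onwards).
        CONVERT(CHAR(64), HASHBYTES('SHA2_256', UPPER(LTRIM(RTRIM(s.CustomerNumber)))), 2),
        s.CustomerNumber,
        @LoadDate,
        @RecordSource
    FROM stg.Customer AS s
    WHERE s.CustomerNumber IS NOT NULL
      AND NOT EXISTS (
            SELECT 1
            FROM dv.HubCustomer AS h
            WHERE h.CustomerNumber = s.CustomerNumber
          );
END;

The load is idempotent: rerunning it against the same staging data inserts nothing new, which is the usual expectation for Data Vault hub loads.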
