
Commercial Analytics - Data Engineer (Snowflake)

Experience

4 years

Salary

0 Lacs

Posted: 2 weeks ago | Platform: LinkedIn


Work Mode

On-site

Job Type

Full Time

Job Description

What You’ll Do

The Data Engineer will play a vital role in the Digital Finance and Innovation (DFI) team, contributing to the creation of advanced versions of the Commercial Data Hub. This role involves data management, data pipeline creation, ensuring data quality and integrity, supporting data analytics, and ensuring the performance and scalability of current and future data models. The Data Engineer will work closely with the data architecture and data platform team, the product owner team, and the project management team to develop finance data platforms using Snowflake in accordance with best practices.

Key Responsibilities

ETL Pipelines: Create ETL pipelines using PySpark and SQL in Snowflake (see the illustrative sketch after this description).
Data Models: Design and build new data models and optimize existing ones.
Pipeline Reliability: Ensure the reliability and availability of data pipelines.
Data Analysis: Perform data analysis for troubleshooting and exploration.
Documentation: Document code and processes.
Collaboration: Collaborate with the Product Owner and Scrum Master to understand requirements and convert them into technology deliverables.
Team Support: Assist other team members and contribute to the team's success.
Data Architecture: Design, create, deploy, and manage data architecture and data models.
Data Integration: Define how data will be stored, consumed, integrated, and managed within Snowflake or other data hub repositories.
Data Research: Conduct scalable data research and develop new data wrangling processes.
Data Pipelines: Develop and maintain scalable data pipelines and build new integrations to support increasing data volume and complexity.
Data Quality: Implement processes and systems to monitor data quality, ensuring production data is accurate and available for key stakeholders.
Testing: Write unit/integration tests and document work.
Root Cause Analysis: Perform root cause analysis on data-related issues and assist in their resolution.

Qualifications

Graduate degree in Computer Science/IT.
4+ years of relevant experience in a Data Engineer role using Snowflake, with strong Python skills.

Skills

Core Proficiency: Strong proficiency in SQL, Python, and Snowflake.
Data Modeling: Experience with data modeling and stored procedure development.
Large-Scale Data Hubs: Experience working on large-scale data hubs.
Analytical Skills: Solid analytical and problem-solving skills.
Communication: Excellent communication and collaboration skills, including the ability to interact with stakeholders and gather requirements.
Independence: Ability to work independently and as part of a team.
Snowflake and Azure: Basic knowledge of Snowflake and Azure Data Services, including Azure Data Factory and Azure SQL Database.
Finance Domain: Knowledge of the commercial finance domain is a big plus.
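
Illustrative Sketch

To give a concrete picture of the ETL responsibility described above, here is a minimal sketch of a PySpark-to-Snowflake load using the Snowflake Spark connector. It is not part of the posting: the connection options, file path, column names, and table name are all hypothetical placeholders, and it assumes the spark-snowflake connector package is available on the Spark classpath.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("commercial-data-hub-etl").getOrCreate()

# Hypothetical connection options for the Snowflake Spark connector.
sf_options = {
    "sfURL": "<account>.snowflakecomputing.com",
    "sfUser": "<user>",
    "sfPassword": "<password>",
    "sfDatabase": "FINANCE_DB",
    "sfSchema": "COMMERCIAL",
    "sfWarehouse": "ANALYTICS_WH",
}

# Extract: read a raw landing-zone file (placeholder path and schema).
raw = spark.read.option("header", True).csv("/landing/commercial/sales.csv")

# Transform: basic cleansing and a monthly aggregation per region.
monthly = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("amount").isNotNull())
       .groupBy(
           "region",
           F.trunc(F.to_date("order_date"), "month").alias("month"),
       )
       .agg(F.sum("amount").alias("total_amount"))
)

# Load: write the curated result into a Snowflake table.
(monthly.write
        .format("net.snowflake.spark.snowflake")
        .options(**sf_options)
        .option("dbtable", "MONTHLY_SALES")
        .mode("overwrite")
        .save())
```

In practice a pipeline like this would typically land into a staging table and merge into the target via Snowflake SQL rather than overwrite it, but the overwrite keeps the sketch short.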

Company

Eaton