Posted: 14 hours ago
Work from Office | Full Time
As a Finance Analytics Architect, the candidate will be responsible for working with finance analytics product owners to define data product architectures. An ideal candidate will possess a mix of technical knowledge and finance domain functional knowledge. Basic knowledge of the finance function in a corporate setting is a must-have.
The candidate will work with finance analytics product owners and analytics engineering leaders to define solution architectures for data products. This would include the data models, pipelines, security model and data product consumption methods. The candidate will apply best practices to create data architectures that are secure, scalable, cost-effective, efficient, reusable, and resilient. The candidate will participate in technical discussions, present their architectures to stakeholders for feedback, and incorporate their input. The candidate will offer design oversight and guidance during project execution, ensuring solutions align with strategic business and IT goals.
This role demands expertise in Snowflake's advanced features and cloud platforms, along with a passion for mentoring junior engineers.
Requirements:
Experience in building data warehouses, data lakes, and data lakehouses
Experience in architecting a data warehouse for at least one functional domain (e.g., Finance, Supply Chain)
Experience with or knowledge of finance data concepts such as accounts payable, accounts receivable, general ledger, and financial close
Experience or knowledge of Snowflake features and functionality.
Expertise in complex SQL, Python scripting, and performance tuning.
Understanding of Snowflake data engineering practices and dimensional modeling; ability to optimize data models for performance and scalability.
Experience with data security and data access controls in Snowflake.
Expertise in setting up security frameworks, overall governance, and compliance controls (e.g., SOX)
Advanced SQL skills for building monitoring queries in Snowflake; expertise in setting up resource monitors for compute usage, security, etc.
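As a rough illustration of the resource-monitor requirement above, a minimal Snowflake sketch might look like the following (the monitor and warehouse names are hypothetical, not part of this posting):

```sql
-- Hypothetical monitor capping a warehouse at 100 credits per month
CREATE RESOURCE MONITOR finance_wh_monitor
  WITH CREDIT_QUOTA = 100
  FREQUENCY = MONTHLY
  START_TIMESTAMP = IMMEDIATELY
  TRIGGERS ON 75 PERCENT DO NOTIFY     -- warn at 75% of quota
           ON 100 PERCENT DO SUSPEND;  -- suspend the warehouse at quota

-- Attach the monitor to an (assumed) warehouse
ALTER WAREHOUSE finance_wh SET RESOURCE_MONITOR = finance_wh_monitor;
```

In practice the quota, trigger thresholds, and suspend-versus-notify actions would be tuned per warehouse and workload.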
Participate in solution planning, incremental planning, product demos, and inspect and adapt events
Plan and develop the architectural runway for the listed products that supports desired business outcomes
Lead and participate in the planning, definition, development, and high-level design of solutions and architectural alternatives
Excellent analysis and documentation skills, including prototyping
Demonstrated ability to analyze and interpret complex business processes and systems and how these relate to data requirements
Extensive experience utilizing best practices in data engineering and data visualization.
Advanced experience in creating interactive analytics solutions using technologies like Power BI and Python.
Extensive experience with cloud platforms such as Azure, including cloud-based data storage and processing technologies.
Expertise in dimensional (Star, Snowflake, Data Vault) and transactional (3rd normal form) data modeling using OLTP, OLAP, NoSQL, and Big Data technologies. Familiarity with data frameworks and storage platforms like Cloudera, Databricks, Dataiku, Snowflake, dbt, Coalesce, and data mesh.
Experience developing and supporting data pipelines, including code, orchestration, quality, and observability.
Expert-level programming ability in multiple data manipulation languages (Python, Spark, SQL, PL/SQL). Intermediate ability to interact with batch (ETL/ELT), on-demand (SQL), and streaming (messaging, queues) data integration methodologies.
Intermediate experience with DevOps and CI/CD principles and tools. Proficiency in Azure Data Factory.
Experience with data governance frameworks and tools to ensure data quality, privacy, and compliance. Solid understanding of cybersecurity concepts such as encryption, hashing, certificates etc.
Strong analytical skills: critically evaluate data presented and mapped from multiple sources, reconcile data conflicts, decompose high-level data into details, and abstract low-level information into a general understanding.
Continually learn new modules, ETL tools, and programming techniques to provide value to the business by enhancing knowledge and capability.
Maintain awareness of new technologies that are relevant to our environment at Eaton.
Established as a key data leader at the enterprise level
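To make the dimensional-modeling requirement concrete, here is a minimal star-schema sketch for general-ledger balances; all table and column names are illustrative assumptions, not taken from this posting:

```sql
-- Hypothetical star schema: two dimensions and one fact table
CREATE TABLE dim_account (
    account_key   INTEGER      PRIMARY KEY,
    account_code  VARCHAR(20),
    account_type  VARCHAR(30)  -- e.g. 'Payable', 'Receivable'
);

CREATE TABLE dim_date (
    date_key      INTEGER      PRIMARY KEY,  -- e.g. 20240131
    fiscal_period VARCHAR(10)
);

-- Fact table holding one balance per account per date
CREATE TABLE fct_gl_balance (
    account_key   INTEGER REFERENCES dim_account (account_key),
    date_key      INTEGER REFERENCES dim_date (date_key),
    balance_amt   NUMBER(18,2)
);
```

Queries then join the fact table to its dimensions and aggregate, which is the access pattern dimensional models are optimized for.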
Eaton Technologies