Finance Analytics Architect

7 - 11 years

20 - 25 Lacs

Posted: 14 hours ago | Platform: Naukri


Work Mode

Work from Office

Job Type

Full Time

Job Description

Job Summary

As a Finance Analytics Architect, the candidate will be responsible for working with finance analytics product owners to define data product architectures. An ideal candidate will possess a mix of technical knowledge and finance domain functional knowledge. Basic knowledge of the finance function in a corporate setting is a must-have.

The candidate will work with finance analytics product owners and analytics engineering leaders to define solution architectures for data products, including data models, pipelines, the security model, and data product consumption methods. The candidate will apply best practices to create data architectures that are secure, scalable, cost-effective, efficient, reusable, and resilient. The candidate will participate in technical discussions, present their architectures to stakeholders for feedback, and incorporate their input. The candidate will offer design oversight and guidance during project execution, ensuring solutions align with strategic business and IT goals.
This role demands expertise in Snowflake's advanced features and cloud platforms, along with a passion for mentoring junior engineers.

Job Responsibilities
  • Work closely with product owners to drive synergies and define, implement, and support Eaton's data mesh strategy. As a partner to product owners, the incumbent is expected to help them articulate techno-functional requirements.
  • Work with project managers to identify stakeholders and define project plans (scope, schedule, and budget)
  • Work closely with data engineers, system architects, and product owners to drive synergies and define, implement, and support Eaton's data mesh strategy. Ensure data products are designed for scalability, supportability, and reusability.
  • Lead the design and architecture of data products and solutions that meet business needs and align with the overall data strategy. Create complex enterprise datasets that adhere to enterprise technology and data protection standards.
  • Deliver strategic infrastructure and data pipelines for optimal data extraction, transformation, and loading. Document solutions with architecture diagrams, dataflows, code comments, data lineage, entity relationship diagrams, and technical and business metadata.
  • Design, engineer, and orchestrate scalable, supportable, and reusable datasets. Manage non-functional requirements, technical specifications, and compliance.
  • Assess technical capabilities across Value Streams to facilitate the selection and alignment of technical solutions following enterprise guardrails. Execute proof of concepts (POCs) where applicable.
  • Oversee enterprise solutions for a wide range of data technology patterns and platforms. Collaborate with senior business stakeholders, functional analysts, and data scientists to deliver robust data solutions aligned with quality measures (availability, completeness, accuracy, etc.).
  • Support continuous integration and continuous delivery, maintaining architectural runways for a series of products within a Value Chain. Implement data governance frameworks and tools to ensure data quality, privacy, and compliance.
  • Develop and support advanced data solutions and tools. Leverage advanced data visualization tools like Power BI to enhance data insights. Manage data sourcing and consumption integration patterns from Eaton's data platform, Snowflake, ensuring seamless data flow and accessibility.
  • Accountable for end-to-end delivery of source data acquisition, complex transformation and orchestration pipelines, and front-end visualization.
  • Demonstrate strong communication and presentation skills; lead collaboration directly with business stakeholders to deliver rapid, incremental business value and outcomes.
Qualification

Requirements:

  • BE in Computer Science, Electrical, Electronics, or any other equivalent degree
  • 10+ years of experience
Skills

Experience in building data warehouses, data lakes, and data lakehouses
Experience in architecting a data warehouse for at least one functional domain (e.g., Finance, Supply Chain)
Experience with or knowledge of finance data concepts such as accounts payable, accounts receivable, general ledger, and financial close
Experience with or knowledge of Snowflake features and functionality
Expertise in complex SQL, Python scripting, and performance tuning
Understanding of Snowflake data engineering practices and dimensional modeling; ability to optimize data models for performance and scalability
Experience with data security and data access controls in Snowflake
Expertise in setting up security frameworks, overall governance, and compliance requirements (e.g., SOX)
Advanced SQL skills for building queries to create various monitors in Snowflake; expertise in setting up resource monitors for compute usage, security, etc.
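To make the resource-monitor skill concrete, a minimal Snowflake setup might look like the sketch below. This is an illustrative config fragment, not part of the role's actual environment; the monitor name, credit quota, and warehouse name are hypothetical:

```sql
-- Hypothetical example: cap a warehouse at 100 credits per month,
-- notify at 80% of the quota, and suspend the warehouse at 100%.
CREATE RESOURCE MONITOR finance_monthly_monitor
  WITH CREDIT_QUOTA = 100
  FREQUENCY = MONTHLY
  START_TIMESTAMP = IMMEDIATELY
  TRIGGERS
    ON 80 PERCENT DO NOTIFY
    ON 100 PERCENT DO SUSPEND;

-- Attach the monitor to a (hypothetical) finance warehouse.
ALTER WAREHOUSE finance_wh SET RESOURCE_MONITOR = finance_monthly_monitor;
```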

Participate in solution planning, incremental planning, product demos, and inspect and adapt events

Plan and develop the architectural runway for the listed products that supports desired business outcomes

Lead and participate in the planning, definition, development, and high-level design of solutions and architectural alternatives

Excellent analysis and documentation skills, including prototyping
Demonstrated ability to analyze and interpret complex business processes and systems and how these relate to data requirements
Extensive experience utilizing best practices in data engineering and data visualization.
Advanced experience in creating interactive analytics solutions using technologies like Power BI and Python.
Extensive experience with cloud platforms such as Azure, including cloud-based data storage and processing technologies.
Expertise in dimensional (Star, Snowflake, Data Vault) and transactional (third normal form) data modeling using OLTP, OLAP, NoSQL, and Big Data technologies. Familiarity with data frameworks and storage platforms like Cloudera, Databricks, Dataiku, Snowflake, dbt, Coalesce, and data mesh.
Experience developing and supporting data pipelines, including code, orchestration, quality, and observability.
Expert-level programming ability in multiple data manipulation languages (Python, Spark, SQL, PL/SQL). Intermediate ability to interact with batch (ETL/ELT), on-demand (SQL), and streaming (messaging, queues) data integration methodologies.
Intermediate experience with DevOps and CI/CD principles and tools. Proficiency in Azure Data Factory.
Experience with data governance frameworks and tools to ensure data quality, privacy, and compliance. Solid understanding of cybersecurity concepts such as encryption, hashing, and certificates.
Strong analytical skills: critically evaluate data presented and mapped from multiple sources, reconcile data conflicts, decompose high-level data into details, and abstract low-level information into a more general understanding.
Continually learn new modules, ETL tools, and programming techniques to provide value to the business by enhancing self-knowledge and capability.
Maintain awareness of new technologies that are relevant to the environment at Eaton.
Established as a key data leader at the enterprise level.
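As an illustration of the dimensional-modeling requirement above, a minimal finance star schema might look like the following sketch. All table and column names are hypothetical, chosen only to show a general-ledger fact keyed to conformed dimensions:

```sql
-- Illustrative star schema: a GL balance fact table joined to
-- account and date dimensions (names are hypothetical).
CREATE TABLE dim_account (
    account_key   INTEGER PRIMARY KEY,
    account_code  VARCHAR(20),
    account_type  VARCHAR(20)          -- e.g. asset, liability, expense
);

CREATE TABLE dim_date (
    date_key      INTEGER PRIMARY KEY, -- e.g. 20240131
    fiscal_period VARCHAR(10)
);

CREATE TABLE fact_gl_balance (
    date_key      INTEGER REFERENCES dim_date (date_key),
    account_key   INTEGER REFERENCES dim_account (account_key),
    amount        NUMBER(18,2)         -- posted balance in reporting currency
);
```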

Eaton Technologies

Appliances, Electrical, and Electronics Manufacturing

Dublin Ireland
