Data Modeller - 8+ yrs - Big4

10 - 17 years

13 - 23 Lacs

Pune, Delhi NCR, Hyderabad

Posted: 2 months ago | Platform: Naukri

Skills Required

Cloud Data Modeler, API, Data Modeling, Python, SQL

Work Mode

Work from Office

Job Type

Full Time

Job Description

Role: Data Modeler
Experience: 8+ yrs
Location: Open
Work Mode: Hybrid

Department: The regulatory compliance and tax reporting team builds complex IT systems that keep HSBC compliant with regulations from thousands of regulators around the globe. The team is dynamic, agile, and fast paced, and is at the forefront of adopting next-generation technologies such as Cloud, APIs, Artificial Intelligence and Machine Learning, Generative AI, IDP, Kubernetes, and RPA.

In this role, you will:

• Develop conceptual, logical, and physical data models and implement RDBMSs, operational data stores (ODS), data marts, and data platforms (a minimal star-schema sketch follows the requirements list below)
• Oversee and govern the expansion of the existing data architecture and optimize data query performance using best practices
• Implement business and IT data requirements through new data strategies and designs across all data platforms (relational, dimensional, and NoSQL) and data tools (reporting, visualization, analytics, and machine learning)
• Work with business and application/solution teams to implement data strategies, build data flows, and develop conceptual/logical/physical data models
• Define and govern data modeling and design standards, tools, best practices, and related development for enterprise data models
• Identify the architecture, infrastructure, and interfaces to data sources, tools supporting automated data loads, security concerns, analytic models, AI/ML models, and data visualization
• Engage in hands-on modeling, design, configuration, installation, performance tuning, and sandbox POCs
• Work proactively and independently to address project requirements, and articulate issues and challenges to reduce project delivery risks

To be successful in this role, you should meet the following requirements:

• Be capable of developing and configuring data pipelines on a variety of platforms and technologies (see the pipeline sketch below)
• Demonstrate strong experience writing complex SQL queries for data analysis on databases such as SQL Server, Oracle, and Hive
• Possess the following technical skills: SQL, Python, PySpark, Hive, ETL, Unix, and Control-M (or similar scheduling tools)
• Experience with GCP (particularly Airflow, Dataproc, and BigQuery) is an advantage
• Have experience creating solutions that power AI/ML models and generative AI
• Be able to work independently on specialized assignments within the context of project deliverables
• Take ownership of providing solutions and tools that iteratively increase engineering efficiency
• Be capable of creating designs that embed standard processes, systems, and operational models into the BAU approach for end-to-end execution of data pipelines
• Demonstrate problem-solving and analytical abilities, including the ability to critically evaluate information gathered from multiple sources, reconcile conflicts, decompose high-level information into details, and apply sound business and technical domain knowledge
• Communicate openly and honestly, with strong oral, written, and visual communication and presentation skills; the ability to communicate effectively at a global level is paramount
• Deliver materials of the highest quality to management against tight deadlines
• Work effectively under pressure with competing and rapidly changing priorities
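To make the modeling responsibilities concrete, here is a minimal sketch of the physical side of a dimensional model: a toy star schema for a filings data mart, built with Python's standard sqlite3 module. Every table, column, and value is invented for illustration; the posting does not specify any schema.

    # Illustrative sketch only: a toy star schema (one dimension, one fact).
    # All names and data are hypothetical.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    cur = conn.cursor()

    # Dimension table: one row per reporting jurisdiction.
    cur.execute("""
        CREATE TABLE dim_jurisdiction (
            jurisdiction_id   INTEGER PRIMARY KEY,
            jurisdiction_name TEXT NOT NULL
        )
    """)

    # Fact table: one row per filed report, keyed to the dimension.
    cur.execute("""
        CREATE TABLE fact_filing (
            filing_id       INTEGER PRIMARY KEY,
            jurisdiction_id INTEGER NOT NULL
                REFERENCES dim_jurisdiction(jurisdiction_id),
            filed_on        TEXT NOT NULL,
            record_count    INTEGER NOT NULL
        )
    """)

    cur.execute("INSERT INTO dim_jurisdiction VALUES (1, 'UK')")
    cur.execute("INSERT INTO fact_filing VALUES (1, 1, '2024-01-31', 1200)")

    # A typical analysis query: aggregate the fact, grouped by the dimension.
    cur.execute("""
        SELECT d.jurisdiction_name, SUM(f.record_count)
        FROM fact_filing f
        JOIN dim_jurisdiction d USING (jurisdiction_id)
        GROUP BY d.jurisdiction_name
    """)
    print(cur.fetchall())  # [('UK', 1200)]
    conn.close()

Separating the fact table from its dimensions is what makes the "dimensional" platforms and data marts mentioned above queryable for reporting and analytics.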
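Similarly, a minimal sketch of the kind of scheduled pipeline the requirements describe, assuming Apache Airflow 2.x (GCP's Cloud Composer is managed Airflow). The DAG name and shell commands are placeholders, not anything specified by the posting.

    # Illustrative sketch only: a two-step daily pipeline as an Airflow DAG.
    # Requires apache-airflow 2.x; all names here are hypothetical.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    with DAG(
        dag_id="daily_compliance_extract",  # hypothetical pipeline name
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",         # run once per day
        catchup=False,                      # do not backfill missed runs
    ) as dag:
        extract = BashOperator(
            task_id="extract",
            bash_command="echo 'pull source data'",    # placeholder step
        )
        load = BashOperator(
            task_id="load",
            bash_command="echo 'load into warehouse'",  # placeholder step
        )
        extract >> load  # extract must complete before load starts

The `>>` dependency operator is what distinguishes a DAG-based scheduler like Airflow from a plain cron job: Control-M, mentioned in the skills list, expresses the same ordering through job conditions.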

Saieesh Data Systems

Technology / Data Management

Metropolis

50-100 Employees

52 Jobs
