Experience: 10 - 17 years
Salary: 13 - 23 Lacs
Location: Pune, Delhi NCR, Hyderabad
Posted: 2 months ago
Work Mode: Work from Office
Employment Type: Full Time
Role: Data Modeler
Experience: 8+ years
Location: Open
Work Mode: Hybrid

Job Description

Department: The regulatory compliance and tax reporting team builds complex IT systems that ensure HSBC remains compliant with regulations from thousands of regulators around the globe. The team is dynamic, agile, fast-paced and at the forefront of adopting next-generation technologies such as Cloud, APIs, Artificial Intelligence and Machine Learning, Generative AI, IDP, Kubernetes and RPA.

In this role, you will:
• Be responsible for developing conceptual, logical, and physical data models, and for implementing RDBMSs, operational data stores (ODS), data marts, and data platforms
• Oversee and govern the expansion of the existing data architecture and optimize data query performance using best practices
• Implement business and IT data requirements through new data strategies and designs across all data platforms (relational, dimensional, and NoSQL) and data tools (reporting, visualization, analytics, and machine learning)
• Work with business and application/solution teams to implement data strategies, build data flows, and develop conceptual/logical/physical data models
• Define and govern data modeling and design standards, tools, best practices, and related development for enterprise data models
• Identify the architecture, infrastructure, and interfaces to data sources, tools supporting automated data loads, security concerns, analytic models, AI/ML models, and data visualization
• Engage in hands-on modeling, design, configuration, installation, performance tuning, and sandbox POCs
• Work proactively and independently to address project requirements and articulate issues and challenges to reduce project delivery risks

To be successful in this role, you should meet the following requirements:
• Be capable of developing and configuring data pipelines on a variety of platforms and technologies
• Demonstrate strong experience in writing complex SQL queries to perform data analysis on databases such as SQL Server, Oracle and Hive (an illustrative sketch of this kind of work follows this list)
• Possess the following technical skills: SQL, Python, PySpark, Hive, ETL, Unix, Control-M (or similar scheduling tools)
• Experience with GCP (particularly Airflow, Dataproc and BigQuery) is an advantage
• Have experience creating solutions that power AI/ML models and generative AI
• Be able to work independently on specialized assignments within the context of project deliverables
• Take ownership of providing solutions and tools that iteratively increase engineering efficiency
• Be capable of creating designs that embed standard processes, systems and operational models into the BAU approach for end-to-end execution of data pipelines
• Demonstrate problem-solving and analytical abilities, including the ability to critically evaluate information gathered from multiple sources, reconcile conflicts, decompose high-level information into details, and apply sound business and technical domain knowledge
• Communicate openly and honestly, with strong oral, written and visual communication and presentation skills; the ability to communicate effectively at a global level is paramount
• Deliver materials of the highest quality to management against tight deadlines
• Work effectively under pressure with competing and rapidly changing priorities
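The requirements above combine complex SQL analysis with PySpark and Hive as the processing layer. The sketch below is purely illustrative and not part of the posting: the table and column names (txn_db.transactions, account_id, txn_ts, amount) are hypothetical, and it simply shows the flavour of work described, a window-function query over a Hive table executed through PySpark and written back for downstream reporting.

```python
# Illustrative sketch only; table/column names are hypothetical assumptions.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("latest-account-balance")   # hypothetical job name
    .enableHiveSupport()                 # allow reading/writing Hive tables
    .getOrCreate()
)

# Window-function query of the kind the role calls "complex SQL":
# a running balance per account plus a recency rank per transaction.
report = spark.sql("""
    SELECT account_id,
           txn_ts,
           amount,
           SUM(amount)  OVER (PARTITION BY account_id ORDER BY txn_ts)      AS running_balance,
           ROW_NUMBER() OVER (PARTITION BY account_id ORDER BY txn_ts DESC) AS recency_rank
    FROM txn_db.transactions
""")

# Keep only the most recent row per account, i.e. the latest balance.
latest = report.where("recency_rank = 1")

# Persist the result for downstream reporting/visualization tools.
latest.write.mode("overwrite").saveAsTable("txn_db.latest_account_balance")
```

In a production setting, a job like this would typically be scheduled through Control-M or an Airflow DAG, in line with the scheduling tools listed in the requirements.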
Saieesh Data Systems