10.0 - 14.0 years
0 Lacs
Hyderabad, Telangana
On-site
As a Vice President - Data Modeller at HSBC, you will play a crucial role in organizing and transforming complex Finance data into actionable insights through data model structures. With a strong analytical mindset and expertise in a range of data modelling techniques and tools, you will design and implement efficient data models that meet the data needs of the Finance business within a global financial institution.

Your responsibilities will include developing conceptual, logical, and application data models aligned with HSBC's Future State Architecture; supporting Finance teams in migrating to target-state data models; creating data modelling schemas in line with Enterprise data models; and contributing to the continuous improvement of the data modelling estate while ensuring compliance with regulatory standards. You will serve as a subject matter expert in data modelling, collaborate with cross-functional teams and stakeholders, maintain data modelling documentation, translate Finance business requirements into data modelling solutions, and audit data models to ensure alignment with Enterprise data models and architecture principles.

To excel in this role, you should have a minimum of 10 years' experience in data management and modelling within the Financial Services sector, preferably in a Treasury/Finance function. Experience in a large, global banking environment, knowledge of Agile and Scrum methodologies, and familiarity with data governance principles are essential. Exposure to leading data modelling tools, cloud solutions, big data technologies, and ETL architectures will be advantageous.

Join HSBC and be part of a culture that values professional development, flexible working, and inclusivity. Your contributions as a Vice President - Data Modeller will be instrumental in driving the organization's strategic data needs and supporting Finance business objectives.
HSBC is committed to creating a workplace where every employee is respected, valued, and provided with opportunities to grow. Your personal data will be handled in accordance with the Bank's Privacy Statement, reflecting our dedication to maintaining a secure and inclusive environment for all employees.
Posted 1 week ago
6.0 - 9.0 years
18 - 27 Lacs
Bangalore Rural, Gurugram, Bengaluru
Work from Office
Data Modelling: Star/Snowflake schema, normalisation/denormalisation. Snowflake: schema design, performance tuning, Time Travel, Streams & Tasks, secure & materialised views. SQL & scripting: advanced SQL (CTEs, window functions), automation & optimisation.
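The advanced SQL skills named above (CTEs and window functions) can be illustrated with a small, hypothetical example. Snowflake itself is not assumed available here, so the sketch uses Python's built-in sqlite3, whose window-function syntax is the same for this case; the table and data are invented for illustration:

```python
import sqlite3

# Hypothetical sales table, invented purely to demonstrate the SQL features.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, amount INTEGER);
    INSERT INTO sales VALUES
        ('north', 100), ('north', 200), ('south', 50), ('south', 300);
""")

# The CTE (WITH clause) filters rows; RANK() OVER (...) is the window function,
# ranking each region's sales independently via PARTITION BY.
query = """
WITH big_sales AS (
    SELECT region, amount FROM sales WHERE amount >= 100
)
SELECT region, amount,
       RANK() OVER (PARTITION BY region ORDER BY amount DESC) AS rnk
FROM big_sales
ORDER BY region, rnk;
"""
rows = conn.execute(query).fetchall()
```

Here `rows` comes back as `[('north', 200, 1), ('north', 100, 2), ('south', 300, 1)]`: the south row with amount 50 is dropped by the CTE, and ranks restart per region because of the `PARTITION BY`.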
Posted 1 week ago
4.0 - 9.0 years
10 - 20 Lacs
Pune
Hybrid
Hi, Greetings! This is regarding a job opportunity for the position of Data Modeller with a US-based MNC in the healthcare domain. This opportunity is on the direct payroll of the US-based MNC.

Job Location: Pune, Mundhwa
Mode of work: Hybrid (3 days work from office)
Shift timings: 1pm to 10pm

About the Company: The global MNC is a mission-driven startup transforming the healthcare payer industry. With deep expertise in cloud-enabled technologies and knowledge of the healthcare industry, it has built an innovative data integration and management platform that gives healthcare payers access to data that has historically been siloed and inaccessible. As a result, these payers can ingest and manage all the information they need to transform their business, supporting their analytical, operational, and financial needs through the platform. Since its founding in 2017, the company has built a highly successful SaaS business, raising over $81 million from top-tier VC firms with profound expertise in the healthcare and technology industries. We are solving massive, complex problems in an industry ready for disruption. We're building powerful momentum and would love for you to be a part of it!

Interview process: 5 rounds of interview (4 rounds of technical interview, 1 round of HR or fitment discussion)

Job Description: Data Modeller

About the Role: We're seeking a Data Modeler to join our global data modeling team. You'll play a key role in translating business requirements into conceptual and logical data models that support both operational and analytical use cases.
This is a high-impact opportunity to work with cutting-edge technologies and contribute to the evolution of healthcare data platforms.

What You'll Do:
- Design and build conceptual and logical data models aligned with enterprise architecture and healthcare standards.
- Perform data profiling and apply data integrity principles using SQL.
- Collaborate with cross-functional teams to ensure models meet client and business needs.
- Use tools like Erwin, ER/Studio, DBT, or similar for enterprise data modeling.
- Maintain metadata, business glossaries, and data dictionaries.
- Support client implementation teams with data model expertise.

What We're Looking For:
- 2+ years of experience in data modeling and cloud-based data engineering.
- Proficiency in enterprise data modeling tools (Erwin, ER/Studio, DBSchema).
- Experience with Databricks, Snowflake, and data lakehouse architectures.
- Strong SQL skills and familiarity with schema evolution and data versioning.
- Deep understanding of healthcare data domains (Claims, Enrollment, Provider, FHIR, HL7, etc.).
- Excellent collaboration and communication skills.

In case you have any query, please feel free to contact me via the email, WhatsApp, or phone number mentioned below.

Thanks & Regards,
Priyanka Das
Email: priyanka.das@dctinc.com
Contact Number: 74399 37568
Posted 2 weeks ago
7.0 - 12.0 years
30 - 45 Lacs
Hyderabad, Gurugram, Bengaluru
Work from Office
Senior Data Modeller - Telecom Domain

Job Location: Anywhere in India (preferred locations: Gurugram, Noida, Hyderabad, Bangalore)
Experience: 7+ years
Domain: Telecommunications

Job Summary: We are hiring a Senior Data Modeller with strong telecom domain expertise. You will design and standardize enterprise-wide data models across domains such as Customer, Product, Billing, and Network, ensuring alignment with TM Forum standards (SID, eTOM). You'll collaborate with cross-functional teams to translate business needs into scalable, governed data structures that support analytics, ML, and digital transformation.

Key Responsibilities:
- Design logical/physical data models for telecom domains
- Align models with TM Forum SID, eTOM, ODA, and data mesh principles
- Develop schemas (normalized, star, snowflake) based on business needs
- Maintain data lineage, metadata, and version control
- Collaborate with engineering teams on Azure and Databricks implementations
- Tag data for privacy, compliance (GDPR), and data quality

Required Skills:
- 7+ years in data modelling, 3+ years in the telecom domain
- Proficient in TM Forum standards and telecom business processes
- Hands-on with data modeling tools (SSAS, dbt, Informatica)
- Expertise in SQL, metadata documentation, and schema design
- Cloud experience: Azure Synapse, Databricks, Snowflake
- Experience with CRM, billing, network usage, and campaign data models
- Familiarity with data mesh, domain-driven design, and regulatory frameworks

Education: Bachelor's or Master's in CS, Telecom Engineering, or a related field

Please go through the JD and, if you are interested, kindly share your updated resume along with the following details:
- Current CTC (fixed plus variable)
- Offer in hand (fixed plus variable)
- Expected CTC
- Notice period
- A few points on relevant skills and experience

Email: sp@intellisearchonline.net
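The star-schema design work described in this posting can be sketched with a minimal, hypothetical fact/dimension pair. All table and column names below are illustrative inventions (loosely themed on the Billing and Customer domains mentioned above), not taken from any actual employer schema; sqlite3 stands in for a real warehouse:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# A minimal star schema: one fact table referencing two dimension tables.
# Dimensions hold descriptive attributes; the fact holds measures plus keys.
conn.executescript("""
    CREATE TABLE dim_customer (
        customer_key  INTEGER PRIMARY KEY,
        customer_name TEXT,
        segment       TEXT
    );
    CREATE TABLE dim_date (
        date_key  INTEGER PRIMARY KEY,
        full_date TEXT,
        month     TEXT
    );
    CREATE TABLE fact_billing (
        billing_id   INTEGER PRIMARY KEY,
        customer_key INTEGER REFERENCES dim_customer(customer_key),
        date_key     INTEGER REFERENCES dim_date(date_key),
        amount       REAL
    );
    INSERT INTO dim_customer VALUES (1, 'Acme Telecom', 'enterprise');
    INSERT INTO dim_date     VALUES (20240101, '2024-01-01', '2024-01');
    INSERT INTO fact_billing VALUES (1, 1, 20240101, 499.0);
""")

# The typical star-schema query shape: join the fact out to its dimensions,
# then aggregate the measure by dimension attributes.
total = conn.execute("""
    SELECT d.month, c.segment, SUM(f.amount)
    FROM fact_billing f
    JOIN dim_customer c ON f.customer_key = c.customer_key
    JOIN dim_date     d ON f.date_key = d.date_key
    GROUP BY d.month, c.segment
""").fetchone()
```

A snowflake schema differs only in that the dimensions themselves are further normalized (e.g. `segment` split into its own table); the fact table is unchanged.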
Posted 3 weeks ago
10.0 - 14.0 years
30 - 45 Lacs
Pune, Bengaluru
Work from Office
About Position:
Role: Data Modeller
Location: Pune, Bangalore
Experience: 10-14 years
Job Type: Full Time Employment

What You'll Do:
- Data Architecture Design: Develop and maintain data architecture strategies and frameworks, including data models, data flow diagrams, and data integration processes to support business objectives.
- Data Modeling: Create and maintain logical and physical data models, ensuring they align with business requirements and data governance standards.
- Database Design: Design and optimize database schemas, ensuring performance, scalability, and security are considered in database design and implementation.
- Data Integration: Oversee the integration of data from various sources, including data warehouses, data lakes, and third-party applications, to ensure a unified and accurate data environment.
- Stakeholder Collaboration: Work closely with business stakeholders, data engineers, and analysts to gather requirements, understand data needs, and translate them into technical specifications.
- Data Governance: Implement data governance practices to ensure data quality, consistency, and compliance with industry regulations and company policies.
- Performance Optimization: Monitor and optimize the performance of data systems, identifying and resolving issues related to data quality, access, and integration.
- Documentation: Maintain comprehensive documentation of data architecture, data models, and integration processes to support knowledge sharing and future development.

Expertise You'll Bring:
- Experience: 10-14 years in data architecture, data modeling, and database design, with a strong understanding of data integration and management.
- Technical Skills (must have): Familiarity with the Azure cloud platform; proficiency in data modeling tools (e.g., Erwin, IBM InfoSphere Data Architect), database management systems (e.g., SQL Server, Oracle, MySQL), and ETL tools.
- Analytical Skills: Strong analytical and problem-solving skills, with the ability to translate complex business requirements into effective data solutions.
- Communication Skills: Excellent verbal and written communication skills, with the ability to present technical information to non-technical stakeholders and collaborate effectively with cross-functional teams.
- Attention to Detail: Detail-oriented with a strong focus on data accuracy, consistency, and quality.
- Experience with data warehousing concepts and technologies.
- Understanding of data privacy and security best practices.

Benefits:
- Competitive salary and benefits package
- Culture focused on talent development, with quarterly promotion cycles and company-sponsored higher education and certifications
- Opportunity to work with cutting-edge technologies
- Employee engagement initiatives such as project parties, flexible work hours, and Long Service awards
- Annual health check-ups
- Insurance coverage: group term life, personal accident, and Mediclaim hospitalization for self, spouse, two children, and parents

Inclusive Environment: Persistent Ltd. is dedicated to fostering diversity and inclusion in the workplace. We invite applications from all qualified individuals, including those with disabilities, and regardless of gender or gender preference. We offer hybrid work options and flexible working hours to accommodate various needs and preferences. Our office is equipped with accessible facilities, including adjustable workstations, ergonomic chairs, and assistive technologies to support employees with physical disabilities. If you are a person with disabilities and have specific requirements, please inform us during the application process or at any time during your employment. We are committed to creating an inclusive environment where all employees can thrive.
Our company fosters a values-driven and people-centric work environment that enables our employees to:
- Accelerate growth, both professionally and personally
- Impact the world in powerful, positive ways, using the latest technologies
- Enjoy collaborative innovation, with diversity and work-life wellbeing at the core
- Unlock global opportunities to work and learn with the industry's best

Let's unleash your full potential at Persistent. "Persistent is an Equal Opportunity Employer and prohibits discrimination and harassment of any kind."
Posted 3 weeks ago
6.0 - 8.0 years
10 - 18 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
Role & responsibilities:
- Capable of developing/configuring data pipelines in a variety of platforms and technologies
- Possesses the following technical skills: SQL, Python, PySpark, Hive, ETL, Unix, Control-M (or similar scheduling tools)
- Strong experience in writing complex SQL queries to perform data analysis on databases: SQL Server, Oracle, Hive, etc.
- Experience with GCP (particularly Airflow, Dataproc, BigQuery) is an advantage
- Experience creating solutions that power AI/ML models and generative AI
- Ability to work independently on specialized assignments within the context of project deliverables
- Takes ownership of providing solutions and tools that iteratively increase engineering efficiency
- Capable of creating designs that help embed standard processes, systems, and operational models into the BAU approach for end-to-end execution of data pipelines
- Demonstrates problem-solving and analytical abilities, including the ability to critically evaluate information gathered from multiple sources, reconcile conflicts, decompose high-level information into details, and apply sound business and technical domain knowledge
- Communicates openly and honestly using sophisticated oral, written, and visual communication and presentation skills; the ability to communicate efficiently at a global level is paramount
- Ability to deliver materials of the highest quality to management against tight deadlines
- Ability to work effectively under pressure with competing and rapidly changing priorities
Posted 3 weeks ago
5.0 - 9.0 years
20 - 30 Lacs
Pune, Bengaluru
Hybrid
Job role & responsibilities:
- Understand operational needs by collaborating with specialized teams
- Support key business operations, including architecture design and improvements, data integrity, and building data models
- Design and implement agile, scalable, and cost-efficient solutions
- Lead a team of developers; run sprint planning and execution to ensure timely deliveries

Technical skills, qualifications, and experience required:
- Proficient in data modelling, with 5-10 years of experience
- Experience with data modeling tools (Erwin); building ER diagrams
- Hands-on experience with Erwin / Visio
- Hands-on expertise in entity-relationship, dimensional, and NoSQL modelling
- Familiarity with manipulating datasets using Python
- Exposure to Azure cloud services (Azure Data Factory, Azure DevOps, and Databricks)
- Exposure to UML tools such as Erwin / Visio
- Familiarity with tools such as Azure DevOps, Jira, and GitHub
- Analytical approaches using IE or other common notations
- Strong hands-on experience with SQL scripting
- Bachelor's/Master's degree in Computer Science or a related field
- Experience leading agile scrum, sprint planning, and review sessions
- Good communication and interpersonal skills; able to coordinate between business stakeholders and engineers
- Strong results orientation and time management
- A true team player, comfortable working in a global team
- Ability to establish relationships with stakeholders quickly in order to collaborate on use cases
- Autonomy, curiosity, and innovation capability
- Comfortable working in a multidisciplinary team within a fast-paced environment

* Immediate joiners will be preferred; outstation candidates will not be considered
Posted 4 weeks ago
4.0 - 9.0 years
20 - 30 Lacs
Pune, Bengaluru
Hybrid
Job role & responsibilities:
- Understand operational needs by collaborating with specialized teams
- Support key business operations, including architecture design and improvements, data integrity, and building data models
- Design and implement agile, scalable, and cost-efficient solutions
- Lead a team of developers; run sprint planning and execution to ensure timely deliveries

Technical skills, qualifications, and experience required:
- Proficient in data modelling, with 5-10 years of experience
- Experience with data modeling tools (Erwin); building ER diagrams
- Hands-on experience with Erwin / Visio
- Hands-on expertise in entity-relationship, dimensional, and NoSQL modelling
- Familiarity with manipulating datasets using Python
- Exposure to Azure cloud services (Azure Data Factory, Azure DevOps, and Databricks)
- Exposure to UML tools such as Erwin / Visio
- Familiarity with tools such as Azure DevOps, Jira, and GitHub
- Analytical approaches using IE or other common notations
- Strong hands-on experience with SQL scripting
- Bachelor's/Master's degree in Computer Science or a related field
- Experience leading agile scrum, sprint planning, and review sessions
- Good communication and interpersonal skills; able to coordinate between business stakeholders and engineers
- Strong results orientation and time management
- A true team player, comfortable working in a global team
- Ability to establish relationships with stakeholders quickly in order to collaborate on use cases
- Autonomy, curiosity, and innovation capability
- Comfortable working in a multidisciplinary team within a fast-paced environment

* Immediate joiners will be preferred; outstation candidates will not be considered
Posted 1 month ago
9.0 - 14.0 years
22 - 35 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
Job Description:
- Design and develop conceptual, logical, and physical data models for enterprise and application-level databases.
- Translate business requirements into well-structured data models that support analytics, reporting, and operational systems.
- Define and maintain data standards, naming conventions, and metadata for consistency across systems.
- Collaborate with data architects, engineers, and analysts to implement models into databases and data warehouses/lakes.
- Analyze existing data systems and provide recommendations for optimization, refactoring, and improvements.
- Create entity relationship diagrams (ERDs) and data flow diagrams to document data structures and relationships.
- Support data governance initiatives including data lineage, quality, and cataloging.
- Review and validate data models with business and technical stakeholders.
- Provide guidance on normalization, denormalization, and performance tuning of database designs.
- Ensure models comply with organizational data policies, security, and regulatory requirements.

Confirmation Questions:
1. This is a permanent employment opportunity with Pyramid IT. Are you fine with this? (Y/N)
2. This position may require working in shifts. Are you fine with this? (Y/N)
3. Current CTC:
4. Expected CTC:
5. Total Experience | Relevant Experience:
6. Official Notice Period:
7. If selected, how soon can you join (in days)?
8. Do you have a Notice Period Buyout option? (Y/N)
9. Preferred day & time for a telephonic/F2F interview on weekdays:
10. Are you currently holding any other offer? (Y/N)
11. Reason for job change:
12. PAN Card Number:
13. Aadhar Number:
14. Alternate Contact Number:

Your prompt response along with your updated profile will be highly appreciated. Looking forward to your confirmation.

Thanks & Regards,
Tagore Laya
Sr. Executive - Resourcing
M: 7842100374
E: Tagore.Laya@pyramidconsultinginc.com
www.pyramidci.com
Posted 1 month ago
4.0 - 9.0 years
20 - 30 Lacs
Pune, Bengaluru
Hybrid
Job role & responsibilities:
- Understand operational needs by collaborating with specialized teams
- Support key business operations, including architecture design and improvements, data integrity, and building data models
- Design and implement agile, scalable, and cost-efficient solutions
- Lead a team of developers; run sprint planning and execution to ensure timely deliveries

Technical skills, qualifications, and experience required:
- Proficient in data modelling, with 4-10 years of experience
- Experience with data modeling tools (Erwin); building ER diagrams
- Hands-on experience with Erwin / Visio
- Hands-on expertise in entity-relationship, dimensional, and NoSQL modelling
- Familiarity with manipulating datasets using Python
- Exposure to Azure cloud services (Azure Data Factory, Azure DevOps, and Databricks)
- Exposure to UML tools such as Erwin / Visio
- Familiarity with tools such as Azure DevOps, Jira, and GitHub
- Analytical approaches using IE or other common notations
- Strong hands-on experience with SQL scripting
- Bachelor's/Master's degree in Computer Science or a related field
- Experience leading agile scrum, sprint planning, and review sessions
- Good communication and interpersonal skills; able to coordinate between business stakeholders and engineers
- Strong results orientation and time management
- A true team player, comfortable working in a global team
- Ability to establish relationships with stakeholders quickly in order to collaborate on use cases
- Autonomy, curiosity, and innovation capability
- Comfortable working in a multidisciplinary team within a fast-paced environment

* Immediate joiners will be preferred
Posted 1 month ago
7.0 - 12.0 years
15 - 30 Lacs
Pune, Chennai
Work from Office
Data Modeler

Primary Skills: Data modeling (conceptual, logical, physical); relational, dimensional, and data vault modeling; ERwin / IBM InfoSphere; SQL (Oracle, SQL Server, PostgreSQL, DB2); banking domain data knowledge (Retail, Corporate, Risk, Compliance); data governance (BCBS 239, AML, GDPR); data warehouse/lake design; Azure cloud.

Secondary Skills: MDM, metadata management, real-time modeling (payments/fraud), big data (Hadoop, Spark), streaming platforms (Confluent), M&A data integration, data cataloguing, documentation, regulatory trend awareness.

Soft Skills: Attention to detail, documentation, time management, and teamwork.
Posted 1 month ago
6.0 - 10.0 years
20 - 30 Lacs
Pune
Hybrid
Role purpose:
- Strong understanding of end-to-end impact assessment across all subject areas
- Creating detailed data architecture documentation, including data models, data flow diagrams, and technical specifications
- Creating and maintaining data models for databases, data warehouses, and data lakes, defining relationships between data entities to optimize data retrieval and analysis
- Designing and implementing data pipelines to integrate data from multiple sources, ensuring data consistency and quality across systems
- Collaborating with business stakeholders to define the overall data strategy, aligning data needs with business requirements
- Supporting migration of new and changed software; elaborating and performing production checks
- Effectively communicating complex data concepts to both technical and non-technical stakeholders
- GCP knowledge/experience with Cloud Composer, BigQuery, Pub/Sub, and Cloud Functions

Preferred candidate profile: Data Modeller, Data Architecture
Experience range: 6+ years
Location: Pune (Hybrid)
Posted 1 month ago
9.0 - 14.0 years
20 - 32 Lacs
Hyderabad, Bengaluru, Delhi / NCR
Hybrid
Data Architect/Data Modeler role:

Scope & Responsibilities:
- Enterprise data architecture for a functional domain or a product group
- Designs and governs delivery of the domain data architecture; ensures delivery as per design
- Ensures consistency of approach in the data modelling of the domain's different solutions
- Designs and ensures delivery of data integration across the solutions of the domain

General Expertise:
- Critical: Methodology expertise on data architecture and modeling, from business requirements and functional specifications to data modeling
- Critical: Data warehousing and business intelligence data product modeling (Inmon/Kimball/Data Vault/Codd modeling patterns)
- Business/functional knowledge of the domain: business terminology, knowledge of business processes related to the domain, awareness of key principles and objectives, business trends and evolution
- Master data management / data management and stewardship process awareness
- Data persistency technologies: SQL (ANSI-2003 for structured relational data querying; ANSI-2023 for XML, JSON, and property graph querying), Snowflake specifics, database structures for performance optimization
- NoSQL and other data persistency technologies awareness
- Proficient level of business English and technical writing
- Nice to have: Project delivery expertise through agile approaches and methodologies (Scrum, SAFe 5.0, product-based organization)

Technical Stack expertise:
- SAP PowerDesigner modeling (CDM, LDM, PDM)
- Snowflake general concepts, specifically DDL & DML, Snowsight, and Data Exchange/Data Sharing concepts
- AWS S3 & Athena (as a query user)
- Confluence & Jira (as a contributing user)
- Nice to have: Bitbucket (as a basic user)
Posted 2 months ago
5.0 - 10.0 years
15 - 27 Lacs
Pune, Chennai, Bengaluru
Hybrid
Roles and Responsibilities:
- Review, refine, and maintain data models for enterprise applications.
- Add or remove fields in data structures, ensuring forward compatibility and minimal disruption to downstream systems.
- Collaborate with data architects, engineers, and business analysts to gather requirements and translate them into effective data designs.
- Ensure consistency, accuracy, and integrity across all data models and documentation.
- Communicate effectively with business stakeholders to understand requirements and present data modeling concepts.
- Maintain data model documentation using modeling tools like ER/Studio.
- Provide recommendations on data modeling best practices and standards.
- Support integration with SAP data models and APIs where applicable.
- Work with Azure data services such as Azure Data Lake, Azure Synapse, etc.

Must-Have Skills:
- Proven experience in data modeling, including creating and modifying data structures.
- Strong understanding of forward compatibility and version control for data changes.
- Excellent communication skills and the ability to engage with business stakeholders.
- Basic working knowledge of Azure (e.g., Azure Data Lake, Azure SQL, Synapse).
- Solid understanding of relational databases and enterprise data architecture principles.

Good-to-Have Skills:
- Experience with ER/Studio or similar data modeling tools (ERwin, PowerDesigner).
- Exposure to or experience with SAP data models and API design/integration.
- Understanding of data governance and metadata management.
- Familiarity with Agile methodology and tools like JIRA or Confluence.

Skills: Data Modeller, API, Azure, Communication
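The point in this posting about adding fields with forward compatibility can be sketched as follows. The idea: introduce the new column as nullable or with a default, so writers and readers that predate the change keep working. The `member` table and `plan_code` column are hypothetical names invented for the sketch, and sqlite3 stands in for the production database:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE member (member_id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO member VALUES (1, 'Jane')")

# Forward-compatible schema change: the new column carries a DEFAULT, so
# existing rows are backfilled and old-style INSERTs still succeed.
conn.execute("ALTER TABLE member ADD COLUMN plan_code TEXT DEFAULT 'UNKNOWN'")

# An old writer, unaware of the new column, continues to work unchanged:
conn.execute("INSERT INTO member (member_id, name) VALUES (2, 'Ravi')")

rows = conn.execute(
    "SELECT member_id, name, plan_code FROM member ORDER BY member_id"
).fetchall()
```

Both rows end up with `plan_code = 'UNKNOWN'`: the pre-existing row via backfill, the new row via the default. Dropping a field safely is the mirror image: stop writing it first, confirm no downstream reader depends on it, then remove it.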
Posted 2 months ago
6 - 10 years
20 - 30 Lacs
Pune
Hybrid
Experience Role purpose: Strong understanding of end-to-end impact assessment across all subject areas. Creating detailed data architecture documentation, including data models, data flow diagrams, and technical specifications Creating and maintaining data models for databases, data warehouses, and data lakes, defining relationships between data entities to optimize data retrieval and analysis. Designing and implementing data pipelines to integrate data from multiple sources, ensuring data consistency and quality across systems. Collaborating with business stakeholders to define the overall data strategy, aligning data needs with business requirements. Support migration of new & changed software, elaborate and perform production checks Need to effectively communicate complex data concepts to both technical and non-technical stakeholders. GCP Knowledge/exp with Cloud Composer, BigQuery, Pub/Sub, Cloud Functions. Preferred candidate profile :- Data Modeller, Data Architeture Architecture Experance Ranges :- 6+ Years Location :- Pune(Hybrid)
Posted 2 months ago