10.0 years
0 Lacs
pune, maharashtra, india
Remote
Immediate Joiners Only

We're Hiring: Data Modeler | Asset Management Domain | Pune/Bangalore (Hybrid)
Location: Pune / Bangalore
Work Mode: Hybrid (3 days WFO, 2 days WFH)
Shift: UK hours (9 AM – 5 PM GMT)
Experience: 10+ years (Level 3)

About the Role
We are seeking an experienced Data Modeler / Data Architect to join our team in the Asset Management / Security Servicing space. You will play a key role in building and managing data models (conceptual, logical, physical) and ensuring trusted, high-quality, business-aligned data solutions on Snowflake and Azure platforms.

Key Responsibilities
• Create and manage conceptual, logical & physical data models across multiple domains (accounts, holdings, transactions).
• Design and optimize Snowflake data solutions (clustering, partitioning, schema best practices).
• Drive data architecture & governance, ensuring data is trusted, understood, and easy to use.
• Collaborate with Data Product Owners, engineers, and business stakeholders to deliver scalable solutions.
• Document and present architecture/design decisions, ensuring alignment with enterprise standards.

Must-Have Skills
• 10+ years in Enterprise Data Architecture, Data Modelling, Database Engineering.
• Strong expertise in OLTP/OLAP design, Data Warehouse, ELT/ETL processes.
• Proven experience in Snowflake (data sharing, streams, tasks, schema design).
• Proficiency with enterprise modelling tools (Erwin, PowerDesigner, IBM InfoSphere).
• Hands-on with Azure Data Factory, Synapse, Databricks, Cosmos DB.
• Excellent SQL skills (performance tuning, query optimization, indexing).
• Knowledge of ontology & taxonomy design and regulatory/financial data standards (BIAN, ACORD, ESG).
• Strong communication skills with the ability to work across tech & business teams.

Nice to Have
• 3+ years in the security servicing / asset management / investment domain.
• Familiarity with MDM, data vault, information architecture frameworks.
• Exposure to Python / R for data analysis.
• Strong presentation skills (architecture diagrams, stakeholder discussions).

Company Profile
Creospan is a subsidiary of Creospan Inc., headquartered in Chicago, IL. Since our inception in 1999, we have grown into a leading technology consulting and solutions provider, serving clients across the Telecom, Technology, Manufacturing, Ecommerce, Insurance, Banking, Transportation, and Healthcare industries. Our team specializes in emerging technologies, helping clients build next-generation products that drive innovation. For more details, visit our website: www.creospan.com
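The Snowflake features this posting names (clustering keys, streams, tasks) can be illustrated with a small sketch. The following Python helper generates hypothetical DDL for an asset-management "holdings" table; the table, warehouse, and target names are invented for illustration, while the DDL shapes follow Snowflake's documented CREATE TABLE ... CLUSTER BY, CREATE STREAM, and CREATE TASK syntax.

```python
# Sketch: Snowflake DDL for a hypothetical "holdings" table, illustrating the
# clustering, stream, and task features named in the posting. Table, column,
# and warehouse names are invented for illustration.

def holdings_ddl(database: str, schema: str) -> list[str]:
    """Return DDL statements for a clustered table plus a stream and a task."""
    fq = f"{database}.{schema}"
    return [
        # Cluster large fact-style tables on the columns most often filtered on.
        f"""CREATE TABLE {fq}.holdings (
            account_id   NUMBER       NOT NULL,
            security_id  NUMBER       NOT NULL,
            as_of_date   DATE         NOT NULL,
            quantity     NUMBER(18,4),
            market_value NUMBER(18,2)
        ) CLUSTER BY (as_of_date, account_id)""",
        # A stream captures row-level changes for downstream ELT.
        f"CREATE STREAM {fq}.holdings_stream ON TABLE {fq}.holdings",
        # A task periodically consumes the stream (target table assumed to exist).
        f"""CREATE TASK {fq}.merge_holdings
            WAREHOUSE = transform_wh
            SCHEDULE = '60 MINUTE'
        AS
            INSERT INTO {fq}.holdings_history
            SELECT * FROM {fq}.holdings_stream""",
    ]

statements = holdings_ddl("ASSET_MGMT", "CORE")
for stmt in statements:
    print(stmt.splitlines()[0])
```

In practice the clustering key choice would follow the table's actual query patterns; the pair used here is only a placeholder.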
Posted 1 week ago
10.0 - 12.0 years
3 Lacs
pune
On-site
Date: Sep 6, 2025 | Job Requisition Id: 62541 | Location: Hyderabad, TG, IN; Pune, MH, IN; Indore, IN

YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation. At YASH, we're a cluster of the brightest stars working with cutting-edge technologies. Our purpose is anchored in a single truth – bringing real positive change in an increasingly virtual world – and it drives us beyond generational gaps and disruptions of the future.

We are looking to hire Data Modeling Professionals in the following areas:

Experience: 10-12 Years

Job Description
• Design and develop conceptual, logical, and physical data models for OLTP and OLAP systems.
• Collaborate with business analysts and data engineers to understand data requirements.
• Ensure models support data governance, data quality, and data integration standards.
• Maintain and update data models as business needs evolve.
• Create and manage metadata repositories and data dictionaries.
• Optimize data models for performance, scalability, and security.
• Work with database administrators to implement models in relational and NoSQL databases.
• Support ETL processes and data warehousing initiatives.
• Proven experience in data modeling tools (e.g., Erwin, SAP PowerDesigner, IBM InfoSphere).
• Strong understanding of relational databases, dimensional modeling, and normalization techniques.
• Experience with SQL, data warehousing, and ETL tools.
• Familiarity with cloud platforms (e.g., AWS, Azure) is a plus.
• Excellent communication and documentation skills.

Required Technical/Functional Competencies
Domain/Industry Knowledge: Basic knowledge of the customer's business processes and relevant technology platform or product. Able to prepare process maps, workflows, business cases, and simple business models in line with customer requirements with assistance from SMEs, and apply industry standards/practices in implementation with guidance from experienced team members.
Requirement Gathering and Analysis: Working knowledge of requirement management and requirement analysis processes, tools & methodologies. Able to analyse the impact of a requested change/enhancement/defect fix and identify dependencies or interrelationships among requirements & transition requirements for the engagement.
Product/Technology Knowledge: Working knowledge of technology product/platform standards and specifications. Able to implement code or configure/customize products and provide inputs in design and architecture adhering to industry standards/practices in implementation. Analyze various frameworks/tools, review the code, and provide feedback on improvement opportunities.
Architecture Tools and Frameworks: Working knowledge of industry architecture tools & frameworks. Able to identify the pros/cons of available tools & frameworks in the market, use them as per customer requirements, and explore new tools/frameworks for implementation.
Architecture Concepts and Principles: Working knowledge of architectural elements, SDLC, and methodologies. Able to provide architectural design/documentation at an application or functional capability level, implement architectural patterns in solutions & engagements, and communicate architecture direction to the business.
Analytics Solution Design: Knowledge of statistical & machine learning techniques like classification, linear regression modelling, clustering & decision trees. Able to identify the cause of errors and their potential solutions.
Tools & Platform Knowledge: Familiar with a wide range of mainstream commercial & open-source data science/analytics software tools, their constraints, advantages, disadvantages, and areas of application.

Required Behavioral Competencies
Accountability: Takes responsibility for and ensures accuracy of own work, as well as the work and deadlines of the team.
Collaboration: Shares information within the team, participates in team activities, asks questions to understand other points of view.
Agility: Demonstrates readiness for change, asking questions and determining how changes could impact own work.
Customer Focus: Identifies trends and patterns emerging from customer preferences and works towards customizing/refining existing services to exceed customer needs and expectations.
Communication: Targets communications for the appropriate audience, clearly articulating and presenting his/her position or decision.
Drives Results: Sets realistic stretch goals for self & others to achieve and exceed defined goals/targets.
Resolves Conflict: Displays sensitivity in interactions and strives to understand others' views and concerns.

Certifications: Mandatory

At YASH, you are empowered to create a career that will take you to where you want to go while working in an inclusive team environment. We leverage career-oriented skilling models and optimize our collective intelligence, aided with technology, for continuous learning, unlearning, and relearning at a rapid pace and scale. Our Hyperlearning workplace is grounded upon four principles:
• Flexible work arrangements, free spirit, and emotional positivity
• Agile self-determination, trust, transparency, and open collaboration
• All support needed for the realization of business goals
• Stable employment with a great atmosphere and ethical corporate culture
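As a rough illustration of the dimensional modeling this role calls for, the sketch below joins a toy fact table to a dimension on a surrogate key in plain Python. The tables, keys, and figures are invented for illustration; a real star schema would live in the warehouse, not in application code.

```python
# Sketch: a minimal star schema. The dimension carries descriptive attributes;
# the fact table stores surrogate keys plus measures. All names and values
# are invented for illustration.

dim_customer = {
    1: {"customer_name": "Acme Corp", "segment": "Enterprise"},
    2: {"customer_name": "Beta LLC",  "segment": "SMB"},
}

fact_sales = [
    {"customer_key": 1, "order_date": "2025-09-01", "amount": 1200.0},
    {"customer_key": 2, "order_date": "2025-09-02", "amount":  300.0},
    {"customer_key": 1, "order_date": "2025-09-03", "amount":  450.0},
]

def revenue_by_segment(facts, dim):
    """Join facts to the dimension on the surrogate key and aggregate."""
    totals: dict[str, float] = {}
    for row in facts:
        segment = dim[row["customer_key"]]["segment"]
        totals[segment] = totals.get(segment, 0.0) + row["amount"]
    return totals

print(revenue_by_segment(fact_sales, dim_customer))
# → {'Enterprise': 1650.0, 'SMB': 300.0}
```

The same join-then-aggregate shape is what a BI query against a real star schema would express in SQL.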
Posted 1 week ago
10.0 - 12.0 years
0 Lacs
hyderabad, telangana, india
On-site
YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation. At YASH, we're a cluster of the brightest stars working with cutting-edge technologies. Our purpose is anchored in a single truth – bringing real positive change in an increasingly virtual world – and it drives us beyond generational gaps and disruptions of the future.

We are looking to hire Data Modeling Professionals in the following areas:

Experience: 10-12 Years

Job Description
• Design and develop conceptual, logical, and physical data models for OLTP and OLAP systems.
• Collaborate with business analysts and data engineers to understand data requirements.
• Ensure models support data governance, data quality, and data integration standards.
• Maintain and update data models as business needs evolve.
• Create and manage metadata repositories and data dictionaries.
• Optimize data models for performance, scalability, and security.
• Work with database administrators to implement models in relational and NoSQL databases.
• Support ETL processes and data warehousing initiatives.
• Proven experience in data modeling tools (e.g., Erwin, SAP PowerDesigner, IBM InfoSphere).
• Strong understanding of relational databases, dimensional modeling, and normalization techniques.
• Experience with SQL, data warehousing, and ETL tools.
• Familiarity with cloud platforms (e.g., AWS, Azure) is a plus.
• Excellent communication and documentation skills.

Required Technical/Functional Competencies
Domain/Industry Knowledge: Basic knowledge of the customer's business processes and relevant technology platform or product. Able to prepare process maps, workflows, business cases, and simple business models in line with customer requirements with assistance from SMEs, and apply industry standards/practices in implementation with guidance from experienced team members.
Requirement Gathering and Analysis: Working knowledge of requirement management and requirement analysis processes, tools & methodologies. Able to analyse the impact of a requested change/enhancement/defect fix and identify dependencies or interrelationships among requirements & transition requirements for the engagement.
Product/Technology Knowledge: Working knowledge of technology product/platform standards and specifications. Able to implement code or configure/customize products and provide inputs in design and architecture adhering to industry standards/practices in implementation. Analyze various frameworks/tools, review the code, and provide feedback on improvement opportunities.
Architecture Tools and Frameworks: Working knowledge of industry architecture tools & frameworks. Able to identify the pros/cons of available tools & frameworks in the market, use them as per customer requirements, and explore new tools/frameworks for implementation.
Architecture Concepts and Principles: Working knowledge of architectural elements, SDLC, and methodologies. Able to provide architectural design/documentation at an application or functional capability level, implement architectural patterns in solutions & engagements, and communicate architecture direction to the business.
Analytics Solution Design: Knowledge of statistical & machine learning techniques like classification, linear regression modelling, clustering & decision trees. Able to identify the cause of errors and their potential solutions.
Tools & Platform Knowledge: Familiar with a wide range of mainstream commercial & open-source data science/analytics software tools, their constraints, advantages, disadvantages, and areas of application.

Required Behavioral Competencies
Accountability: Takes responsibility for and ensures accuracy of own work, as well as the work and deadlines of the team.
Collaboration: Shares information within the team, participates in team activities, asks questions to understand other points of view.
Agility: Demonstrates readiness for change, asking questions and determining how changes could impact own work.
Customer Focus: Identifies trends and patterns emerging from customer preferences and works towards customizing/refining existing services to exceed customer needs and expectations.
Communication: Targets communications for the appropriate audience, clearly articulating and presenting his/her position or decision.
Drives Results: Sets realistic stretch goals for self & others to achieve and exceed defined goals/targets.
Resolves Conflict: Displays sensitivity in interactions and strives to understand others' views and concerns.

Certifications: Mandatory

At YASH, you are empowered to create a career that will take you to where you want to go while working in an inclusive team environment. We leverage career-oriented skilling models and optimize our collective intelligence, aided with technology, for continuous learning, unlearning, and relearning at a rapid pace and scale. Our Hyperlearning workplace is grounded upon four principles:
• Flexible work arrangements, free spirit, and emotional positivity
• Agile self-determination, trust, transparency, and open collaboration
• All support needed for the realization of business goals
• Stable employment with a great atmosphere and ethical corporate culture
Posted 1 week ago
12.0 - 16.0 years
9 - 13 Lacs
hyderabad, bengaluru
Work from Office
Roles and Responsibilities:
• Individual contributor/developing architect. The main scope of work is to provide solution architecture development, consultancy, and assurance to projects, making sure applications are well designed and conform to Shell standards and reference/segment architectures.
• Works as an autonomous IT Architect, translates the guidelines and standards into practice, and solves common technical challenges.
• Works within given frameworks, company standards, and policies under defined direction while receiving occasional guidance on techniques. Often a team member, or working on smaller architect tasks with some oversight.
• Makes use of the standard architecture tools (IBM System Architect, PowerDesigner, and/or ProVision).
• Stakeholders are Project Managers and Business Teams (JG2/3).
• Understands the IT Strategic Roadmap and applies it within the context of their organizational assignment.
• Uses the related reference architectures and roadmaps for Cloud, Mobile, Integration, Open Source, Big Data, etc.

Mandatory skills:
• Strong stakeholder management, proactive communication, engagement across teams.
• Strong communication skills (written and verbal); communicates viewpoints and decisions using content appropriate for the audience.
• Business skills: applies industry sector knowledge, establishes linkage between business
Posted 1 week ago
10.0 years
0 Lacs
bengaluru, karnataka, india
On-site
About the Company
Birlasoft is a powerhouse where domain expertise, enterprise solutions, and digital technologies converge to redefine business processes. We take pride in our consultative and design-thinking approach, driving societal progress by enabling our customers to run businesses with unmatched efficiency and innovation. As part of the CKA Birla Group, a multibillion-dollar enterprise, we boast a 12,500+ professional team committed to upholding the Group's 162-year legacy. Our core values prioritize Diversity, Equity, and Inclusion (DEI) initiatives, along with Corporate Social Responsibility (CSR) activities, demonstrating our dedication to building inclusive and sustainable communities. Join us in shaping a future where technology seamlessly aligns with purpose.

Senior Data Modeler
The Sr. Data Architect is responsible for leading architecture capabilities for the Data Governance Organization. This role includes creating, maintaining, and communicating conceptual and logical enterprise data models and data flow diagrams. The Sr. Data Architect will have a deep understanding of the organization's Customer, Supplier, Product, Financial, and Employee master data and the data lifecycle of this content as it exists in ancillary applications, and will provide guidance on software and process modifications to improve controls and simplify master data management. This role will work closely with Data Analysts and Stewards to inventory, define, classify, and measure data quality of organizational data assets, and partner with Business Owners and IT Leads to simplify and streamline the data environment.

What You'll Be Doing:
• Understand and document current end-to-end data flow diagrams of business subject areas and re-architect data ingestion processes into relevant applications
• Collaborate with Data Integration Engineers and Business Owners to standardize B2B and B2C data exchange formats
• Lead the development and implementation of data standards and best practices for data governance, data management, and data quality
• Assist Data Governance Organization team members in the cataloging, defining, securing, and measuring of organizational data and information assets
• Provide guidance on security, privacy, data management, and regulatory compliance around our data assets
• Provide technical vision, leadership, and guidance to architects, analysts, and stewards on the team
• Perform other duties as assigned

What You'll Bring:
• 10+ years of experience as a data professional
• 5+ years working with multiple database management systems (SQL Server, Oracle, Snowflake, etc.)
• 3+ years serving explicitly in a Data Architect or Data Modeler role
• 3+ years utilizing data modeling tools (ERwin, ER/Studio, PowerDesigner, etc.)
• 3+ years creating conceptual, logical, and physical data models
• Bachelor's degree in a relevant area
• Knowledge of enterprise-level business function/capability and data modeling
• Prior experience in a complex, highly integrated services environment
• Prior experience with Master Data Management, Metadata Management, and/or canonical modeling
• Familiarity with industry-standard and healthcare-standard data formats (ISO, ANSI X12, HL7, etc.)
• Familiarity with services-based development and technologies (e.g., SOAP, XML, XSD, JSON)
• Familiarity with NoSQL platforms
• Excellent written and oral communication skills
• Proficiency in MS Office (Word, Excel, PowerPoint, Visio)
• Excellent analytical and problem-solving skills
• Ability to work well in extended project teams with little oversight
Posted 1 week ago
4.0 years
0 Lacs
gurugram, haryana, india
On-site
We are hiring for two requisitions: an experienced Snowflake Architect with strong Azure and ADF expertise to design and deliver enterprise-grade data platforms, and a Senior Developer with 4-5 years of similar experience. The candidate will be responsible for architecting scalable cloud data warehouse solutions, integrating with Azure services, and ensuring performance, governance, and security across the data ecosystem.

Key Responsibilities
• Design and implement Snowflake-based Data Warehouse solutions on Azure Cloud.
• Lead end-to-end data architecture, including data ingestion, transformation, and orchestration using Azure Data Factory (ADF).
• Define data modeling standards (dimensional, Data Vault, 3NF) and implement best practices for Snowflake.
• Establish data governance, security, and compliance frameworks with Azure AD, RBAC, and data masking policies.
• Optimize query performance, cost management, and scalability in Snowflake.
• Integrate Snowflake with Azure services (ADF, ADLS Gen2, Synapse, Logic Apps, Functions, Event Hub, Purview).
• Mentor data engineers and developers in Azure + Snowflake best practices.
• Drive adoption of DevOps and CI/CD pipelines for Snowflake and ADF workflows.
• Partner with stakeholders to define analytics, BI, and reporting solutions using Power BI or other tools.

Required Skills & Qualifications
• Strong expertise in Snowflake architecture, design, and implementation.
• Hands-on experience with Azure Data Factory (ADF): pipelines, triggers, monitoring, and error handling.
• Solid understanding of the Azure ecosystem: ADLS Gen2, Azure Synapse, Key Vault, Purview, Databricks (preferred).
• Proficiency in SQL, ELT/ETL design, and performance optimization in Snowflake.
• Experience with data modeling tools (Erwin, dbt, PowerDesigner, etc.).
• Strong understanding of data governance, lineage, and security frameworks in Azure.
• Good knowledge of Python, PowerShell, or Scala for scripting and automation.
• Excellent problem-solving, communication, and stakeholder management skills.

Nice-to-Have
• SnowPro Core or Advanced Architect certification.
• Azure Data Engineer / Azure Solution Architect certification.
• Exposure to real-time streaming (Event Hub, Kafka).
• Experience integrating with Power BI or Tableau for reporting.
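The governance bullet about RBAC and data masking can be made concrete with a small sketch. The helper below generates a hypothetical Snowflake dynamic data-masking policy; the policy and role names are invented for illustration, while the CREATE MASKING POLICY shape follows Snowflake's documented syntax.

```python
# Sketch: generating Snowflake dynamic data-masking DDL of the kind the
# governance bullet describes. Policy and role names are invented.

def masking_policy_ddl(policy: str, unmasked_roles: list[str]) -> str:
    """Build a CREATE MASKING POLICY statement that reveals the value only
    to the listed roles and masks it for everyone else."""
    roles = ", ".join(f"'{r}'" for r in unmasked_roles)
    return (
        f"CREATE MASKING POLICY {policy} AS (val STRING) RETURNS STRING ->\n"
        f"  CASE\n"
        f"    WHEN CURRENT_ROLE() IN ({roles}) THEN val\n"
        f"    ELSE '***MASKED***'\n"
        f"  END"
    )

ddl = masking_policy_ddl("pii.email_mask", ["DATA_STEWARD", "COMPLIANCE"])
print(ddl)
```

Once created, such a policy would be attached to a column with ALTER TABLE ... MODIFY COLUMN ... SET MASKING POLICY; the roles themselves come from the Azure AD / RBAC mapping the posting mentions.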
Posted 2 weeks ago
8.0 - 12.0 years
0 Lacs
karnataka
On-site
As a Data Engineering Lead, you will be responsible for developing and implementing data engineering projects, including enterprise data hubs and Big Data platforms. Your role will involve defining reference data architecture, leveraging cloud-native data platforms in the AWS or Microsoft stack, and staying updated on the latest data trends, like data fabric and data mesh. You will play a key role in leading the Center of Excellence (COE) and influencing client revenues through innovative data and analytics solutions.

Your responsibilities will include guiding a team of data engineers, overseeing the design and deployment of data solutions, and strategizing new data services and offerings. Collaborating with client teams to understand their business challenges, you will develop tailored data solutions and lead client engagements from project initiation to deployment. Building strong relationships with key clients and stakeholders, you will also create reusable methodologies, pipelines, and models for more efficient data science projects.

Your expertise in data architecture solutions, data governance, and data modeling will ensure compliance with regulatory standards and support effective data management processes. You will be proficient in various data integration tools, cloud computing platforms, programming languages, data visualization tools, and big data technologies to process and analyze large volumes of data.

In addition to technical skills, you will demonstrate strong people and interpersonal skills by managing a high-performing team, fostering a culture of innovation, and collaborating with cross-functional teams. Candidates for this role should have 10+ years of experience in information technology, with a focus on data engineering and architecture, along with a degree in a relevant field like computer science, data science, or engineering.
Candidates should also possess experience in managing data projects and creating data and analytics solutions, and have a good understanding of data visualization, reporting tools, and normalizing data as per key KPIs and metrics. Strong problem-solving, communication, and collaboration skills are essential for success in this role.
Posted 2 weeks ago
7.0 - 11.0 years
0 Lacs
noida, uttar pradesh
On-site
Are you the TYPE Monotype is looking for?

Monotype, named "One of the Most Innovative Companies in Design" by Fast Company, brings brands to life through type and technology that consumers engage with daily. With a rich legacy spanning hundreds of years, Monotype features renowned typefaces such as Helvetica, Futura, and Times New Roman, among others. Our focus includes providing a unique service that enhances accessibility to fonts for creative professionals in our increasingly digital world. We collaborate with major global brands and individual creatives, offering a diverse range of solutions to facilitate the design of beautiful brand experiences.

Monotype Solutions India serves as a strategic center of excellence for Monotype and has been recognized as a certified Great Place to Work for three consecutive years. This fast-growing center covers various domains including Product Development, Product Management, Experience Design, User Research, Market Intelligence, Artificial Intelligence, Machine Learning, Innovation, Customer Success, Enterprise Business Solutions, and Sales.

As part of our team, you will have the opportunity to work on developing next-generation features, products, and services. Collaborating closely with a cross-functional team of engineers, you will focus on microservices and event-driven architectures, contributing to the architecture, design, and development of new features while identifying technical risks and providing alternate solutions to challenges. The role also involves mentoring and leading team members in overcoming technical hurdles.

We are seeking individuals who possess:
- A Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- A minimum of 7-9 years of professional experience, with at least 5 years specializing in data architecture.
- Proven expertise in designing and implementing data models, including ER diagrams, dimensional modeling, and normalization techniques.
- Strong proficiency in relational databases (SQL Server, Oracle, PostgreSQL) and NoSQL databases (MongoDB, Cassandra).
- Knowledge of data modeling tools like ERwin, PowerDesigner, or similar tools.
- Familiarity with cloud data platforms and services (AWS, Azure, GCP).
- Strong analytical and problem-solving skills, with a creative and innovative approach.
- Excellent communication and stakeholder management skills.

You will have the opportunity to:
- Collaborate with global teams to build scalable web-based applications.
- Partner closely with the engineering team to adhere to best practices and standards.
- Provide reliable solutions using sound problem-solving techniques.
- Work with the broader team to maintain high-performance, flexible, and scalable web-based applications.
- Achieve engineering excellence by implementing standard practices and standards.
- Perform technical root cause analysis and suggest corrective actions.

Benefits include:
- Hybrid work arrangements and competitive paid time off programs.
- Comprehensive medical insurance coverage.
- Competitive compensation with corporate bonus programs.
- A creative, innovative, and global working environment.
- Highly engaged Events Committee.
- Reward & Recognition Programs.
- Professional onboarding program with targeted training.
- Development and advancement opportunities.
- Retirement planning options and more.
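The normalization techniques this role asks for can be shown on a toy example. The sketch below splits a denormalized orders table into separate customer and order relations, removing the repeated customer attributes that cause update anomalies; all names and data are invented for illustration.

```python
# Sketch: normalizing a tiny denormalized orders table. Repeated customer
# attributes move into their own relation; orders keep only the foreign key.
# Data and names are invented for illustration.

denormalized = [
    {"order_id": 100, "customer_id": 7, "customer_city": "Pune",  "total": 250.0},
    {"order_id": 101, "customer_id": 7, "customer_city": "Pune",  "total":  80.0},
    {"order_id": 102, "customer_id": 9, "customer_city": "Noida", "total": 120.0},
]

def normalize(rows):
    """Split into customers (one row per customer) and orders (FK only)."""
    customers = {}
    orders = []
    for r in rows:
        # Each customer's attributes are stored exactly once.
        customers[r["customer_id"]] = {"city": r["customer_city"]}
        orders.append({"order_id": r["order_id"],
                       "customer_id": r["customer_id"],
                       "total": r["total"]})
    return customers, orders

customers, orders = normalize(denormalized)
print(len(customers), len(orders))  # 2 3
```

After the split, changing a customer's city is a single-row update instead of one update per historical order, which is the anomaly normalization is meant to remove.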
Posted 2 weeks ago
3.0 years
0 Lacs
noida
On-site
Engineering At Innovaccer
With every line of code, we accelerate our customers' success, turning complex challenges into innovative solutions. Collaboratively, we transform each data point we gather into valuable insights for our customers. Join us and be part of a team that's turning dreams of better healthcare into reality, one line of code at a time. Together, we're shaping the future and making a meaningful impact on the world.

About the Role
We are looking for a highly skilled Senior Data Modeler to join our Gravity project, an enterprise-scale healthcare data platform. This role requires a balance of deep data modeling expertise with hands-on CI/CD implementation experience. As a Senior Data Modeler, you will own the design and evolution of conforming data models across healthcare domains (RCM, PHM, LIS, TMS, etc.), while also enabling automated schema lifecycle management using tools like Liquibase, GitLab, Argo CD, and Snowflake/Databricks CI/CD pipelines.

A Day in the Life
• Design and maintain enterprise-grade data models for healthcare domains (claims, payments, denials, encounters, labs, member, provider).
• Translate business and regulatory requirements into logical and physical data models in Snowflake/Databricks.
• Implement schema change automation using Liquibase (Pro/Community), Git-based version control, and CI/CD pipelines.
• Collaborate with data engineers and DBREs to integrate models into automated deployment workflows.
• Define DDL standards, naming conventions, referential integrity rules, and tagging strategies.
• Partner with architecture leadership to ensure models are consistent across domains (RCM, PHM, LIS, Finance).
• Conduct gap analysis against ERP/RCM source systems (Epic, Oracle Cerner, SAP, Meditech, Workday, etc.) and suggest enhancements.
• Provide best practices for versioning, branching, and release management of data models.

What You Need
• 3+ years of experience in data modeling, ideally with exposure to healthcare data.
• Proficiency in SQL and modeling tools (ER/Studio, ERwin, PowerDesigner, etc.).
• Hands-on experience with Liquibase for schema change management on Snowflake/Databricks/SQL platforms.
• Strong background in CI/CD pipelines (GitLab, Jenkins, Argo CD, or similar) for database deployments.
• Proven ability to work with enterprise RCM/ERP source systems (Epic Resolute, Oracle Cerner, SAP, Workday, Meditech).
• Understanding of healthcare data standards (HL7 FHIR, X12 837/835, ICD-10, CPT/HCPCS).
• Solid grasp of referential integrity, metadata tagging, data lineage, and governance frameworks.

Here's What We Offer
• Generous Leaves: Enjoy generous leave benefits of up to 40 days.
• Parental Leave: Leverage one of the industry's best parental leave policies to spend time with your new addition.
• Sabbatical: Want to focus on skill development, pursue an academic career, or just take a break? We've got you covered.
• Health Insurance: We offer comprehensive health insurance to support you and your family, covering medical expenses related to illness, disease, or injury, extending support to the family members who matter most.
• Care Program: Whether it's a celebration or a time of need, we've got you covered with care vouchers to mark major life events. Through our Care Vouchers program, employees receive thoughtful gestures for significant personal milestones and moments of need.
• Financial Assistance: Life happens, and when it does, we're here to help. Our financial assistance policy offers support through salary advances and personal loans for genuine personal needs, ensuring help is there when you need it most.

Innovaccer is an equal-opportunity employer. We celebrate diversity, and we are committed to fostering an inclusive and diverse workplace where all employees, regardless of race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability, age, marital status, or veteran status, feel valued and empowered.
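The schema-change-automation requirement above can be sketched with Liquibase's formatted-SQL changelog convention. The helper below emits one changeset; the author, id, and table names are invented for illustration, while the --liquibase/--changeset/--rollback comment syntax follows Liquibase's documented formatted-SQL changelog format.

```python
# Sketch: emitting a Liquibase "formatted SQL" changelog entry of the kind the
# posting's schema-change-automation bullet describes. Author, id, and table
# names are invented for illustration.

def formatted_sql_changeset(author: str, change_id: str,
                            sql: str, rollback_sql: str) -> str:
    """Render one changeset in Liquibase's formatted-SQL changelog style."""
    return (
        "--liquibase formatted sql\n"
        f"--changeset {author}:{change_id}\n"
        f"{sql.strip()};\n"
        f"--rollback {rollback_sql.strip()};\n"
    )

changelog = formatted_sql_changeset(
    "jdoe",
    "2025-09-add-denial-reason",
    "ALTER TABLE claims ADD COLUMN denial_reason VARCHAR(64)",
    "ALTER TABLE claims DROP COLUMN denial_reason",
)
print(changelog)
```

In a CI/CD pipeline like the one described, a file of such changesets would be committed to Git and applied by `liquibase update` against the target Snowflake/Databricks environment, with the rollback comment enabling automated reversal.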
Disclaimer : Innovaccer does not charge fees or require payment from individuals or agencies for securing employment with us. We do not guarantee job spots or engage in any financial transactions related to employment. If you encounter any posts or requests asking for payment or personal information, we strongly advise you to report them immediately to our HR department at px@innovaccer.com. Additionally, please exercise caution and verify the authenticity of any requests before disclosing personal and confidential information, including bank account details. About Innovaccer Innovaccer activates the flow of healthcare data, empowering providers, payers, and government organizations to deliver intelligent and connected experiences that advance health outcomes. The Healthcare Intelligence Cloud equips every stakeholder in the patient journey to turn fragmented data into proactive, coordinated actions that elevate the quality of care and drive operational performance. Leading healthcare organizations like CommonSpirit Health, Atlantic Health, and Banner Health trust Innovaccer to integrate a system of intelligence into their existing infrastructure, extending the human touch in healthcare. For more information, visit www.innovaccer.com. Check us out on YouTube, Glassdoor, LinkedIn, Instagram, and the Web.
Posted 2 weeks ago
3.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Engineering At Innovaccer
With every line of code, we accelerate our customers' success, turning complex challenges into innovative solutions. Collaboratively, we transform each data point we gather into valuable insights for our customers. Join us and be part of a team that's turning dreams of better healthcare into reality, one line of code at a time. Together, we're shaping the future and making a meaningful impact on the world.

About The Role
We are looking for a highly skilled Senior Data Modeler to join our Gravity project, an enterprise-scale healthcare data platform. This role requires a balance of deep data modeling expertise and hands-on CI/CD implementation experience. As a Senior Data Modeler, you will own the design and evolution of conforming data models across healthcare domains (RCM, PHM, LIS, TMS, etc.), while also enabling automated schema lifecycle management using tools like Liquibase, GitLab, Argo CD, and Snowflake/Databricks CI/CD pipelines.

A Day in the Life
Design and maintain enterprise-grade data models for healthcare domains (claims, payments, denials, encounters, labs, member, provider)
Translate business and regulatory requirements into logical and physical data models in Snowflake/Databricks
Implement schema change automation using Liquibase (Pro/Community), Git-based version control, and CI/CD pipelines
Collaborate with data engineers and DBREs to integrate models into automated deployment workflows
Define DDL standards, naming conventions, referential integrity rules, and tagging strategies
Partner with architecture leadership to ensure models are consistent across domains (RCM, PHM, LIS, Finance)
Conduct gap analysis against ERP/RCM source systems (Epic, Oracle Cerner, SAP, Meditech, Workday, etc.) and suggest enhancements
Provide best practices for versioning, branching, and release management of data models

What You Need
3+ years of experience in data modeling, ideally with exposure to healthcare data
Proficiency in SQL and modeling tools (ER/Studio, ERwin, PowerDesigner, etc.)
Hands-on experience with Liquibase for schema change management in Snowflake/Databricks/SQL platforms
Strong background in CI/CD pipelines (GitLab, Jenkins, Argo CD, or similar) for database deployments
Proven ability to work with enterprise RCM/ERP source systems (Epic Resolute, Oracle Cerner, SAP, Workday, Meditech)
Understanding of healthcare data standards (HL7 FHIR, X12 837/835, ICD-10, CPT/HCPCS)
Solid grasp of referential integrity, metadata tagging, data lineage, and governance frameworks

Here's What We Offer
Generous Leaves: Enjoy generous leave benefits of up to 40 days
Parental Leave: Leverage one of the industry's best parental leave policies to spend time with your new addition
Sabbatical: Want to focus on skill development, pursue an academic career, or just take a break? We've got you covered
Health Insurance: We offer comprehensive health insurance to support you and your family, covering medical expenses related to illness, disease, or injury, extending support to the family members who matter most
Care Program: Whether it's a celebration or a time of need, we've got you covered with care vouchers to mark major life events. Through our Care Vouchers program, employees receive thoughtful gestures for significant personal milestones and moments of need
Financial Assistance: Life happens, and when it does, we're here to help. Our financial assistance policy offers support through salary advances and personal loans for genuine personal needs, ensuring help is there when you need it most

Innovaccer is an equal-opportunity employer. We celebrate diversity, and we are committed to fostering an inclusive and diverse workplace where all employees, regardless of race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability, age, marital status, or veteran status, feel valued and empowered.

Disclaimer: Innovaccer does not charge fees or require payment from individuals or agencies for securing employment with us. We do not guarantee job spots or engage in any financial transactions related to employment. If you encounter any posts or requests asking for payment or personal information, we strongly advise you to report them immediately to our HR department at px@innovaccer.com. Additionally, please exercise caution and verify the authenticity of any requests before disclosing personal and confidential information, including bank account details.

About Innovaccer
Innovaccer activates the flow of healthcare data, empowering providers, payers, and government organizations to deliver intelligent and connected experiences that advance health outcomes. The Healthcare Intelligence Cloud equips every stakeholder in the patient journey to turn fragmented data into proactive, coordinated actions that elevate the quality of care and drive operational performance. Leading healthcare organizations like CommonSpirit Health, Atlantic Health, and Banner Health trust Innovaccer to integrate a system of intelligence into their existing infrastructure, extending the human touch in healthcare. For more information, visit www.innovaccer.com. Check us out on YouTube, Glassdoor, LinkedIn, Instagram, and the Web.
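The Liquibase requirement in the posting above refers to versioned schema changesets that CI/CD pipelines apply in order. A minimal sketch of a SQL-formatted changelog follows; the author id, changeset name, and table are hypothetical, and the Python wrapper only writes the file so the snippet stays self-contained:

```python
# A minimal sketch of a SQL-formatted Liquibase changelog (hypothetical
# author/table names). Liquibase tracks each changeset by author:id, applies
# pending ones in order, and `--rollback` documents how to undo the change.
changelog = """--liquibase formatted sql

--changeset dmodeler:claims-001
CREATE TABLE claims (
    claim_id    VARCHAR(36)   NOT NULL PRIMARY KEY,
    member_id   VARCHAR(36)   NOT NULL,
    billed_amt  NUMBER(12,2),
    created_at  TIMESTAMP_NTZ DEFAULT CURRENT_TIMESTAMP
);
--rollback DROP TABLE claims;
"""

with open("changelog.sql", "w") as f:
    f.write(changelog)
```

In a pipeline, `liquibase update` would then run this file against the target Snowflake/Databricks schema, recording the applied changeset so it is never re-run.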
Posted 2 weeks ago
12.0 years
0 Lacs
india
On-site
Job Title: Data Modeler

About Us
Capco, a Wipro company, is a global technology and management consulting firm. We were awarded Consultancy of the Year at the British Bank Awards and ranked among the Top 100 Best Companies for Women in India 2022 by Avtar & Seramount. With a presence across 32 cities globally, we support 100+ clients across the banking, financial, and energy sectors. We are recognized for our deep transformation execution and delivery.

WHY JOIN CAPCO?
You will work on engaging projects with the largest international and local banks, insurance companies, payment service providers, and other key players in the industry. These are projects that will transform the financial services industry.

MAKE AN IMPACT
Innovative thinking, delivery excellence, and thought leadership to help our clients transform their business. Together with our clients and industry partners, we deliver disruptive work that is changing energy and financial services.

#BEYOURSELFATWORK
Capco has a tolerant, open culture that values diversity, inclusivity, and creativity.

CAREER ADVANCEMENT
With no forced hierarchy at Capco, everyone has the opportunity to grow as we grow, taking their career into their own hands.

DIVERSITY & INCLUSION
We believe that diversity of people and perspective gives us a competitive advantage.

Requirements
Job Title: Data Modeler
Experience: 12+ years
Location: Hyderabad
Approach problems with an open mind and challenge assumptions to ensure appropriately pragmatic, clean designs.

Roles & Responsibilities
A minimum of 10 years' experience in data management and modelling solutions working as a Data Modeller within the Financial Services sector is essential, preferably in a Treasury/Finance function and/or related front office environment. A proven track record of working in a large, global banking environment is desirable.
Demonstrated experience in designing data modelling solutions (conceptual, logical, and application/messaging) with corresponding phasing, transitions, and migrations where necessary.
Good understanding of managing the 'data as a product (asset)' principle across enterprise domains and technology landscapes.
Good understanding of architectural domains (business, data, application, and technology).
Good communication skills with the ability to influence and present data models (as well as concepts) to technology and business stakeholders.
Good collaboration skills with the ability to demonstrate experience achieving outcomes in a matrixed environment, partnering with data modellers from other domains to build and join shared and reusable data assets.
Experience of working with Agile and Scrum in a large, scalable Agile environment, including participation and progress reporting in daily stand-ups.
Experience with data standards, data governance, data strategy, and data lineage would be advantageous in this role.
Knowledge of reference/master data management.
Cloud exposure to solutions implemented in GCP, AWS, or Azure would be beneficial, as would exposure to big data solutions.
Experience working with leading data modelling tools and producing modelling documentation using tools such as Visual Paradigm, ERwin, PowerDesigner, ER Studio, etc.
Knowledge of data modelling standards and modelling technical documentation using Entity Relationship Diagrams (ERD), Unified Modelling Language (UML), or BIAN.
Results-oriented, with the ability to produce solutions that deliver organisational benefit.
Understanding of issue and data quality management, prioritisation, business case development, remediation planning, and tactical or strategic solution delivery.
Exposure to data governance initiatives such as lineage, masking, retention policy, and data quality.
Strong analytical and problem-solving skills, with the ability to work unsupervised and take ownership of key deliverables.
Exposure to ETL architectures and tools, including data virtualisation and integration with APIs, is desirable.
Posted 3 weeks ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Description
Some careers have more impact than others. If you're looking for a career where you can make a real impression, join HSBC and discover how valued you'll be. HSBC is one of the largest banking and financial services organisations in the world, with operations in 62 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions.

We are currently seeking an experienced professional to join our team in the role of Assistant Vice President - Data Modeller.

Principal Responsibilities
The FDS Data Team are seeking to recruit a Data Modeller with a passion for organising and transforming complex Finance data into actionable insights represented within data model structures that are fit for purpose. The role requires a strong analytical mindset and a good understanding of various data modelling techniques and tools, with a proven track record. The individual should have exposure to designing and implementing efficient data models that cater to the data sourcing, storage, and usage needs of the Finance business and/or front-to-back business domains within a global financial institution.

Support the design and development of FDS conceptual, logical, and application data models as per HSBC's Future State Architecture (Data Asset Strategy), and work across Finance business teams to drive understanding, interpretation, design, and implementation.
Support Finance business and change teams in migrating to target state data models and Data Asset delivery, driving improvement on current feeds and data issues.
Develop data modelling schemas aligned with Enterprise data models and supporting Finance Data Assets.
Contribute to FDS program model development planning and scheduling.
Continuously improve the FDS data modelling estate, adhering to risk, control, security, and regulatory compliance standards.
Advise on and support Finance modelling data requirements that support new use cases and data changes.
Serve as FDS data modelling subject matter expert.
Seek opportunities to simplify, automate, rationalise, and improve the efficiency of Finance IT and modelling solutions.
Update and maintain the key FDS modelling artefacts (i.e., Confluence, SharePoint, documents, reports, roadmap, and other domain artefacts).
Provide data modelling and technical advice as well as maintain ongoing relationships.
Provide feedback in a timely manner to ensure that model development or modification meets the business need.

Requirements
A minimum of 5 years' experience in data management and modelling solutions working as a Data Modeller within the Financial Services sector is essential, preferably in a Treasury/Finance function and/or related front office environment.
Good communication skills with the ability to influence and present data models (as well as concepts) to technology and business stakeholders.
Good collaboration skills with the ability to demonstrate experience achieving outcomes in a matrixed environment, partnering with data modellers from other domains to build and join shared and reusable data assets.
Experience of working with Agile and Scrum in a large, scalable Agile environment, including participation and progress reporting in daily stand-ups.
Experience working with leading data modelling tools and producing modelling documentation using tools such as Visual Paradigm, ERwin, PowerDesigner, ER Studio, etc.
Knowledge of data modelling standards and modelling technical documentation using Entity Relationship Diagrams (ERD), Unified Modelling Language (UML), or BIAN.
Results-oriented, with the ability to produce solutions that deliver organisational benefit.
Understanding of issue and data quality management, prioritisation, business case development, remediation planning, and tactical or strategic solution delivery.
Exposure to data governance initiatives such as lineage, masking, retention policy, and data quality.
Strong analytical and problem-solving skills, with the ability to work unsupervised and take ownership of key deliverables.

You'll achieve more at HSBC
HSBC is an equal opportunity employer committed to building a culture where all employees are valued and respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working, and opportunities to grow within an inclusive and diverse environment. We encourage applications from all suitably qualified persons irrespective of, but not limited to, their gender or genetic information, sexual orientation, ethnicity, religion, social status, medical care leave requirements, political affiliation, disability, color, national origin, or veteran status. We consider all applications based on merit and suitability for the role.

Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website.

Issued by HSBC Electronic Data Processing (India) Private Ltd
Posted 1 month ago
0.0 - 12.0 years
0 Lacs
Karnataka
On-site
Job Description
We are looking for a skilled Senior Data Modeller to design, develop, and maintain robust data models that support enterprise-scale data management, analytics, and reporting solutions. The role requires skill in translating complex business requirements into scalable and efficient conceptual, logical, and physical data models across transactional and analytical systems; proficiency in data warehousing, dimensional modelling (star and snowflake schemas), and metadata management, with hands-on experience in a wide range of tools and platforms including Erwin, PowerDesigner, dbt, any RDBMS, MongoDB, and cloud environments (AWS/Azure/GCP); and a proven ability to collaborate across business, engineering, and analytics teams to ensure data integrity, performance, and governance.

Key Responsibilities:
Design and Develop Data Models: Create conceptual, logical, and physical data models to represent business entities and relationships. Define data structures that support both operational (OLTP) and analytical (OLAP) use cases.
Collaborate with Stakeholders: Work closely with business analysts, data architects, data engineers, and BI teams to understand data requirements. Translate business needs into scalable, maintainable data architecture.
Ensure Data Quality and Consistency: Define and enforce naming standards, data types, constraints, and normalization rules. Help implement master data and reference data structures for consistency across systems.
Support Data Governance: Participate in or lead efforts around data governance, data lineage, and metadata management. Maintain data dictionaries and documentation for modelled data assets. Classify PCI datasets with the right labels and policies around them. Ensure GDPR compliance is in place at the data model level.
Optimize Data Performance: Design models for high performance and scalability, considering indexing, partitioning, and access patterns. Work with DBAs and engineers to tune queries and ensure model efficiency.
Model for Data Warehousing & BI: Create dimensional models (e.g., star and snowflake schemas) for use in data warehouses, data marts, and BI tools. Support ETL/ELT design by providing source-to-target mappings and transformation logic.
Maintain and Evolve Models: Version-control and manage changes to data models as business requirements evolve. Conduct impact analysis for changes to existing data models or systems.

Requirements
Experience and Skills:
8 to 12 years of experience in the design, development, and maintenance of different data models
Experience in building conceptual, logical & physical data models, including dimensional modelling (star, snowflake)
Experience in building data warehouses & assisting in ETL design
Proven experience in data governance & quality standards
Proven experience in master & reference data modelling
Strong hands-on experience with MongoDB CRUD and aggregation pipelines, and experience handling data using JavaScript
Strong hands-on SQL (complex queries), dbt, and Python (for data transformation)
Experience with data modeling tools such as Erwin, ER/Studio, PowerDesigner, or Lucidchart (any one)
Databases: MongoDB and any RDBMS are mandatory
Cloud: Expertise in data services (AWS, Azure, GCP)

Organisation: Facctum IT Solutions India Private Limited
Job Type: Full time
Industry: IT Services
Work Experience: 5-10 Years
Date Opened: 08/07/2025
City: Bengaluru
State/Province: Karnataka
Country: India
Zip/Postal Code: 560002
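The dimensional models this posting calls for, a central fact table joined to conformed dimensions, can be sketched with an in-memory database. Table and column names here are invented for illustration, not taken from any real warehouse:

```python
import sqlite3

# Minimal star schema: one fact table keyed to two dimension tables
# (hypothetical names, illustration only).
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, cal_date TEXT);
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE fact_sales (
    date_key    INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    amount      REAL
);
INSERT INTO dim_date    VALUES (20250101, '2025-01-01');
INSERT INTO dim_product VALUES (1, 'Widget');
INSERT INTO fact_sales  VALUES (20250101, 1, 9.50), (20250101, 1, 0.50);
""")

# Typical BI query shape: aggregate the fact, label with dimension attributes.
row = con.execute("""
    SELECT d.cal_date, p.name, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_date d    ON d.date_key = f.date_key
    JOIN dim_product p ON p.product_key = f.product_key
    GROUP BY d.cal_date, p.name
""").fetchone()
print(row)  # ('2025-01-01', 'Widget', 10.0)
```

The design choice a modeller makes here is grain: each fact row is one sale line, and every query rolls up from that grain through the dimension keys.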
Posted 1 month ago
10.0 - 14.0 years
0 Lacs
Dehradun, Uttarakhand
On-site
You should have familiarity with modern storage formats like Parquet and ORC. Your responsibilities will include designing and developing conceptual, logical, and physical data models to support enterprise data initiatives. You will build, maintain, and optimize data models within Databricks Unity Catalog, developing efficient data structures using Delta Lake to optimize performance, scalability, and reusability. Collaboration with data engineers, architects, analysts, and stakeholders is essential to ensure data model alignment with ingestion pipelines and business goals. You will translate business and reporting requirements into a robust data architecture using best practices in data warehousing and Lakehouse design. Additionally, maintaining comprehensive metadata artifacts such as data dictionaries, data lineage, and modeling documentation is crucial. Enforcing and supporting data governance, data quality, and security protocols across data ecosystems will be part of your role. You will continuously evaluate and improve modeling processes.

The ideal candidate will have 10+ years of hands-on experience in data modeling in Big Data environments. Expertise in OLTP, OLAP, dimensional modeling, and enterprise data warehouse practices is required. Proficiency in modeling methodologies including Kimball, Inmon, and Data Vault is expected. Hands-on experience with modeling tools like ER/Studio, ERwin, PowerDesigner, SQLDBM, dbt, or Lucidchart is preferred. Proven experience in Databricks with Unity Catalog and Delta Lake is necessary, along with a strong command of SQL and Apache Spark for querying and transformation. Experience with the Azure Data Platform, including Azure Data Factory, Azure Data Lake Storage, Azure Synapse Analytics, and Azure SQL Database, is beneficial. Exposure to Azure Purview or similar data cataloging tools is a plus. Strong communication and documentation skills are required, with the ability to work in cross-functional agile environments.

Qualifications for this role include a Bachelor's or Master's degree in Computer Science, Information Systems, Data Engineering, or a related field. Certifications such as Microsoft DP-203: Data Engineering on Microsoft Azure are desirable. Experience working in agile/scrum environments and exposure to enterprise data security and regulatory compliance frameworks (e.g., GDPR, HIPAA) are also advantageous.
Posted 1 month ago
10.0 years
0 Lacs
Pune, Maharashtra, India
Remote
About the Company
Creospan is a growing tech collective of makers, shakers, and problem solvers, offering solutions today that will propel businesses into a better tomorrow. "Tomorrow's ideas, built today!" In addition to working alongside equally brilliant and motivated developers, our consultants appreciate the opportunity to learn and apply new skills and methodologies to different clients and industries.

Job Title: Data Modeler
Location: Pune (relocation from anywhere in India is considerable; the strong preference is Pune)
Hybrid: 3 days WFO & 2 days WFH
Shift timings: UK working hours (9 AM – 5 PM GMT)
Notice period: Immediate
Gap: Up to 3 months (strictly not more than that)

Project Overview:
Creation and management of business data models in all their forms, including conceptual models, logical data models, and physical data models (relational database designs, message models, and others). The role calls for an expert-level understanding of relational database concepts, dimensional database concepts, database architecture and design, and ontology and taxonomy design; a background working with key data domains such as account, holding, and transactions within the security servicing or asset management space; expertise in designing data-driven solutions on Snowflake for complex business needs; and knowledge of the entire application lifecycle, including design, development, deployment, operation, and maintenance in an Agile and DevOps culture.

Role:
This person strengthens the impact of, and provides recommendations on, data models and architecture that will need to be available and shared consistently across the TA organization, through the identification, definition, and analysis of how data-related assets aid business outcomes. The Data Modeler/Architect is responsible for making data trusted, understood, and easy to use. They will be responsible for the entire lifecycle of the data architectural assets, from design and development to deployment, operation, and maintenance, with a focus on automation and quality.

Must Have Skills:
10+ years of experience in enterprise-level Data Architecture, Data Modelling, and Database Engineering
Expertise in OLAP & OLTP design, Data Warehouse solutions, and ELT/ETL processes
Proficiency in data modelling concepts and practices such as normalization, denormalization, and dimensional modelling (Star Schema, Snowflake Schema, Data Vault, Medallion Data Lake)
Experience with Snowflake-specific features, including clustering, partitioning, and schema design best practices
Proficiency in enterprise modelling tools: Erwin, PowerDesigner, IBM InfoSphere, etc.
Strong experience with Microsoft Azure data pipelines (Data Factory, Synapse, SQL DB, Cosmos DB, Databricks)
Familiarity with Snowflake's native tools and services, including Snowflake Data Sharing, Snowflake Streams & Tasks, and Snowflake Secure Data Sharing
Strong knowledge of SQL performance tuning, query optimization, and indexing strategies
Strong verbal and written communication skills for collaborating with both technical teams and business stakeholders
Working knowledge of BIAN, ACORD, and ESG risk data integration

Nice to Haves:
At least 3+ years of security servicing or asset management/investment experience is highly desired
Understanding of the software development life cycle, including planning, development, quality assurance, change management, and release management
Strong problem-solving skills and the ability to troubleshoot complex issues
Excellent communication and collaboration skills to work effectively in a team environment, including experience communicating with technical and non-technical teams
Self-motivated, with the ability to work independently with minimal supervision
Deep understanding of data and information architecture, especially in the asset management space
Familiarity with MDM, data vault, and data warehouse design and implementation techniques
Business domain, data/content, and process understanding (which are more important than technical skills)
Being techno-functional is a plus
Good presentation skills in creating data architecture diagrams
Data modelling and information classification expertise at the project and enterprise level
Understanding of common information architecture frameworks and information models
Experience with distributed data and analytics platforms in cloud and hybrid environments, and an understanding of a variety of data access and analytic approaches (for example, microservices and event-based architectures)
Knowledge of problem analysis, structured analysis and design, and programming techniques (Python, R)
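The indexing-strategy skill named above can be illustrated with SQLite's query planner: the same point lookup moves from a full table scan to an index search once an index exists. This is a sketch only; real tuning on Snowflake or Azure uses those platforms' own plan tooling, and the table here is invented:

```python
import sqlite3

# Hypothetical holdings table with enough rows to make the plan meaningful.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE holdings (account_id TEXT, isin TEXT, qty REAL)")
con.executemany("INSERT INTO holdings VALUES (?, ?, ?)",
                [(f"A{i}", f"ISIN{i}", 1.0) for i in range(1000)])

def plan(sql):
    # The last column of an EXPLAIN QUERY PLAN row describes the access path.
    return con.execute("EXPLAIN QUERY PLAN " + sql).fetchone()[3]

q = "SELECT qty FROM holdings WHERE account_id = 'A500'"
before = plan(q)   # e.g. "SCAN holdings" - a full table scan
con.execute("CREATE INDEX ix_holdings_acct ON holdings(account_id)")
after = plan(q)    # e.g. "SEARCH holdings USING INDEX ix_holdings_acct ..."
print(before)
print(after)
```

The same reasoning, check the access path before and after adding a key, carries over to clustering keys in Snowflake, where the analogue of the index is how micro-partitions are pruned.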
Posted 1 month ago
8.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
We're Hiring – Data Modelers
Preferred Location: Noida (Primary) | Bangalore | Pan India

Experience Required:
~8+ years in data modeling, data warehousing, and ETL
~Strong experience with ER Studio, data visualization, and SQL-based RDBMS
~Familiarity with data governance; cloud platforms (AWS/Azure/GCP) a plus
~Knowledge of modeling methodologies (ER, Dimensional, Relational)
~Tools: ERwin, PowerDesigner, Lucidchart, Visio

Key Responsibilities:
~Develop conceptual, logical, and physical data models for data lakes, warehouses, and analytics
~Translate business requirements into efficient data structures
~Collaborate with business, engineering, and analytics teams on requirements and delivery
~Drive data governance, data quality assessments, and metadata management
~Optimize performance with indexing, partitioning, and tuning strategies
~Maintain documentation (ERDs, data dictionaries, flow diagrams)
Posted 1 month ago
2.0 - 8.0 years
0 Lacs
Hyderabad, Telangana
On-site
As a Senior Principal Technologist (Data Architect) at Medtronic, you will be responsible for delivering data architecture solutions that align with business capability needs and enterprise standards. In this role, you will collaborate with Enterprise Solution Architects, Business Solution Architects, Technical Architects, and external service providers to ensure that data and information models and technologies are in line with architecture strategies and Medtronic's standards. Your role will involve working with Business Analysts to review business capability needs, define requirements, conduct data analysis, develop data models, write technical specifications, and collaborate with development teams to ensure the successful delivery of designs.

Your technical expertise will be crucial in leveraging tools such as the webMethods suite, Informatica, ETL tools, Kafka, and data transformation techniques to design and implement robust integration solutions. You will oversee the implementation of integration solutions, ensuring they meet technical specifications, quality standards, and best practices. Additionally, you will lead continuous improvement initiatives to enhance integration processes, troubleshoot and resolve integration-related issues, mentor junior team members, collaborate with vendors, optimize performance, and contribute to documentation and knowledge management efforts.

To be successful in this role, you should have at least 8 years of IT experience with a Bachelor's degree in Engineering, an MCA, or an MSc. You should also have experience in relevant architecture disciplines (integrations, data, services, infrastructure); Oracle, SAP, or big data platforms; Informatica; PowerDesigner; Python coding; and Snowflake. Specialized knowledge in enterprise-class architecture concepts, data integration, data modeling methodologies, cloud-based solutions, and data governance would be advantageous.

It would be beneficial to have a high degree of learning agility, experience with large enterprise systems, technical modeling and design skills, awareness of architecture frameworks, and strong leadership, teamwork, analytical, and communication skills. Experience in the medical device industry or other regulated industries, as well as the ability to work independently and collaboratively, would also be valuable.

At Medtronic, we offer a competitive salary, a flexible benefits package, and a commitment to recognizing and supporting the contributions of our employees. Our mission is to alleviate pain, restore health, and extend life by boldly addressing the most challenging health problems. As part of our global team of passionate individuals, you will have the opportunity to engineer real solutions for real people and contribute to our mission of making healthcare technology accessible to all. Join us at Medtronic and be part of a team dedicated to innovation, collaboration, and making a meaningful impact on global healthcare technology.
Posted 1 month ago
10.0 years
0 Lacs
Dehradun, Uttarakhand, India
On-site
Key Responsibilities
Familiarity with modern storage formats like Parquet and ORC.
Design and develop conceptual, logical, and physical data models to support enterprise data initiatives.
Build, maintain, and optimize data models within Databricks Unity Catalog.
Develop efficient data structures using Delta Lake, optimizing for performance, scalability, and reusability.
Collaborate with data engineers, architects, analysts, and stakeholders to ensure data model alignment with ingestion pipelines and business goals.
Translate business and reporting requirements into robust data architecture using best practices in data warehousing and Lakehouse design.
Maintain comprehensive metadata artifacts including data dictionaries, data lineage, and modeling documentation.
Enforce and support data governance, data quality, and security protocols across data ecosystems.
Continuously evaluate and improve modeling processes.

Required Skills and Experience
10+ years of hands-on experience in data modeling in Big Data environments.
Expertise in OLTP, OLAP, dimensional modeling, and enterprise data warehouse practices.
Proficient in modeling methodologies including Kimball, Inmon, and Data Vault.
Hands-on experience with modeling tools such as ER/Studio, ERwin, PowerDesigner, SQLDBM, dbt, or Lucidchart.
Proven experience in Databricks with Unity Catalog and Delta Lake.
Strong command of SQL and Apache Spark for querying and transformation.
Hands-on experience with the Azure Data Platform, including Azure Data Factory, Azure Data Lake Storage, Azure Synapse Analytics, and Azure SQL Database.
Exposure to Azure Purview or similar data cataloging tools.
Strong communication and documentation skills, with the ability to work in cross-functional agile environments.

Qualifications
Bachelor's or Master's degree in Computer Science, Information Systems, Data Engineering, or a related field.
Certifications such as Microsoft DP-203: Data Engineering on Microsoft Azure.
Experience working in agile/scrum environments.
Exposure to enterprise data security and regulatory compliance frameworks (e.g., GDPR, HIPAA) is a plus.
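The data dictionaries and metadata artefacts this posting asks the modeller to maintain are often kept machine-readable so a pipeline can validate them before deployment. A minimal sketch follows; the table, columns, and classification labels are invented, and this is not a Databricks or Purview API:

```python
# Hypothetical data-dictionary entries. A tiny check enforces that every
# column carries a description and a classification tag before release.
data_dictionary = {
    "fact_encounters": {
        "encounter_id": {"type": "STRING", "description": "Surrogate key",
                         "classification": "internal"},
        "member_ssn":   {"type": "STRING", "description": "Member SSN",
                         "classification": "pii"},
    }
}

def undocumented(dd):
    """Return table.column names missing a description or classification."""
    return [f"{table}.{col}"
            for table, cols in dd.items()
            for col, meta in cols.items()
            if not meta.get("description") or not meta.get("classification")]

print(undocumented(data_dictionary))  # [] - every column passes
```

Wiring a check like this into CI turns the documentation requirement from a convention into a gate: a model change that adds an untagged column fails the build.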
Posted 1 month ago
15.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
P3-C3-TSTS
15+ years of senior data modeling experience; Erwin data modelling tool
Experience with AWS and Redshift database models is a plus
Experience designing the L2 layer for a large data transformation program
Experience working as a Lead Data Modeler and with senior stakeholders
Knowledge of relational databases and data architecture computer systems, including SQL
Experience in ER modeling, big data, enterprise data, and physical data models
Experience working with large data sets
Banking experience is a must; finance data experience and Anaplan experience are a plus
Familiarity with data modeling software such as SAP PowerDesigner, Microsoft Visio, or erwin Data Modeler
Excellent presentation, communication, and organizational skills
Strong attention to detail
Ability to work in a fast-paced environment
Posted 1 month ago
10.0 years
6 - 8 Lacs
Hyderābād
On-site
Job description
Some careers have more impact than others. If you’re looking for a career where you can make a real impression, join HSBC and discover how valued you’ll be. HSBC is one of the largest banking and financial services organisations in the world, with operations in 62 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions.

We are currently seeking an experienced professional to join our team in the role of Vice President - Data Modeller
Business: Finance
Location: Hyderabad/Bangalore/Chennai/Pune/Gurgaon

Principal responsibilities
The FDS Data Team are seeking to recruit a Data Modeller with a passion for organising and transforming complex Finance data into actionable insights represented within data model structures that are fit for purpose. The role requires a strong analytical mindset and a good understanding of various data modelling techniques and tools, with a proven track record. The individual should have exposure to designing and implementing efficient data models that cater to the data sourcing, storage, and usage needs of the Finance business and/or Front-to-Back business domains within a global financial institution.
- Support the design and development of FDS conceptual, logical, and application data models as per HSBC's Future State Architecture (Data Asset Strategy), and work across Finance business teams to drive understanding, interpretation, design, and implementation.
- Support Finance business and change teams to migrate to target state data models and Data Asset delivery, driving improvement on current feeds and data issues.
- Develop data modelling schemas aligned with Enterprise data models and supporting Finance Data Assets.
- Contribute to FDS program model development planning and scheduling.
- Continuously improve the FDS data modelling estate, adhering to risk, controls, security, and regulatory compliance standards.
- Advise on and support Finance data modelling requirements for new use cases and data changes.
- Serve as the FDS data modelling subject matter expert.
- Support Finance adoption and implementation of FDS data models to meet HSBC's strategic data needs.
- Participate in data modelling and data architecture governance forums.
- Coordinate and collaborate with cross-functional teams, change delivery teams, technology teams, and subject matter expert stakeholders.
- Create and maintain a range of data modelling documents, including model requirements, data flow diagrams, data catalogues, data definitions, design specifications, data models, traceability matrices, data quality rules, and more.
- Translate Finance business requirements into the data modelling solution and Finance Data Assets.
- Conduct continuous audits of data models and refine them whenever required, including reporting any challenges, issues, or risks to senior management.
- Ensure Finance data models align with Enterprise data models and adhere to Enterprise Architecture principles and standards.
- Seek opportunities to simplify, automate, rationalise, and improve the efficiency of Finance IT and modelling solutions.
- Update and maintain key FDS modelling artefacts (i.e., Confluence, SharePoint, documents, reports, roadmap, and other domain artefacts).
- Provide data modelling and technical advice, and maintain ongoing relationships.
- Provide feedback in a timely manner to ensure that model development or modification meets the business need.
- Communicate FDS data modelling solutions to both technical and non-technical audiences, ensuring the communication style is appropriate for the intended audience.
Requirements
- Minimum of 10 years' experience in data management and modelling solutions working as a Data Modeller within the Financial Services sector is essential, preferably in a Treasury/Finance function or a related front office environment.
- A proven track record working in a large, global banking environment is desirable.
- Demonstrated experience designing data modelling solutions (conceptual, logical, and application/messaging) with corresponding phasing, transitions, and migrations where necessary.
- Good understanding of managing 'data as a product (asset)' across enterprise domains and technology landscapes.
- Good understanding of architectural domains (business, data, application, and technology).
- Experience working with Agile and Scrum in a large, scalable Agile environment.
- Knowledge of reference/master data management.
- Good collaboration skills, with demonstrated experience achieving outcomes in a matrixed environment, partnering with data modellers from other domains to build and join shared, reusable data assets.

Technical Skills
- Experience with data standards, data governance, data strategy, and data lineage would be advantageous in this role.
- Cloud exposure to solutions implemented in GCP, AWS, or Azure would be beneficial, as would exposure to big data solutions.
- Experience with leading data modelling tools and modelling documentation, using tools such as Visual Paradigm, ERwin, PowerDesigner, ER Studio, etc.
- Knowledge of data modelling standards and modelling technical documentation using Entity Relationship Diagrams (ERD), Unified Modelling Language (UML), or BIAN.
- Understanding of issue and data quality management, prioritisation, business case development, remediation planning, and tactical or strategic solution delivery.
- Exposure to data governance initiatives such as lineage, masking, retention policy, and data quality.
- Strong analytical and problem-solving skills, with the ability to work unsupervised and take ownership of key deliverables.
- Exposure to ETL architectures and tools, including data virtualisation and integration with APIs, is desirable.

You’ll achieve more when you join HSBC. HSBC is committed to building a culture where all employees are valued, respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working, and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website. ***Issued by HSBC Electronic Data Processing (India) Private LTD***
Posted 1 month ago
6.0 - 8.0 years
0 Lacs
Hyderābād
On-site
Job Description: About Us At Bank of America, we are guided by a common purpose to help make financial lives better through the power of every connection. Responsible Growth is how we run our company and how we deliver for our clients, teammates, communities and shareholders every day. One of the keys to driving Responsible Growth is being a great place to work for our teammates around the world. We’re devoted to being a diverse and inclusive workplace for everyone. We hire individuals with a broad range of backgrounds and experiences and invest heavily in our teammates and their families by offering competitive benefits to support their physical, emotional, and financial well-being. Bank of America believes both in the importance of working together and offering flexibility to our employees. We use a multi-faceted approach for flexibility, depending on the various roles in our organization. Working at Bank of America will give you a great career with opportunities to learn, grow and make an impact, along with the power to make a difference. Join us! Global Business Services Global Business Services delivers Technology and Operations capabilities to Lines of Business and Staff Support Functions of Bank of America through a centrally managed, globally integrated delivery model and globally resilient operations. Global Business Services is recognized for flawless execution, sound risk management, operational resiliency, operational excellence and innovation. In India, we are present in five locations and operate as BA Continuum India Private Limited (BACI), a non-banking subsidiary of Bank of America Corporation and the operating company for India operations of Global Business Services. Process Overview* Data Management Technology and Enterprise Architecture (DMTEA) - DQCE GT GO GSEP group is responsible for ensuring technology provides accurate, timely, and complete Global Technology data; governs common DQCE processes. 
GT Data Modelling under the DQCE portfolio is responsible for Global Technology (GT) Logical Data Architecture and Model Governance, identifying and compiling use cases that reflect GT data needs, simplifying GT data sourcing strategies, managing the GT Conceptual and Logical Data Models, including alignment to data families and domains, and GT ADS and Platform product management.

Job Description*
This job is responsible for defining an architectural vision and solution that supports the strategic outcomes of the Business Products and Services. Key responsibilities include defining the target operating environment, designing for client resiliency, assisting with solution design, and defining non-functional requirements. Job expectations include working with stakeholders and service providers aligned to the Business' strategic objectives, evaluating the impact of strategic design decisions, and contributing to the architecture roadmap. Specifically, this role will be responsible for producing and maintaining conceptual, logical, and physical data models, and the physical data architecture of operational data stores, data warehouses, and data marts. Participate in the design, development, and implementation of architectural deliverables, including components of the assessment and optimization of system design and review of user requirements.

Responsibilities*
- Collaborate with Stakeholders: Work closely with business representatives, data architects, analysts, and development teams to gather, analyze, and translate business requirements into data models.
- Design Data Models: Create conceptual, logical, and physical data models that define the structure, relationships, and rules within databases (including relational, dimensional, and NoSQL databases).
- Maintain and Optimize Models: Develop, update, and maintain data models, schemas, data dictionaries, and documentation to reflect current and future business needs.
- Ensure Data Integrity and Quality: Implement best practices for data governance, data quality, and metadata management to ensure consistency, accuracy, and security of data.
- Performance Optimization: Troubleshoot, optimize, and tune database performance, including query optimization, indexing strategies, and data integration processes.
- Data Integration and Migration: Assist in data migration projects, integration of new systems, and the implementation of data warehousing and business intelligence solutions.
- Develop Standards: Define and enforce data modeling standards, naming conventions, and coding practices to maintain consistency across the organization.
- Evaluate and Improve Systems: Review existing data systems for variances, discrepancies, and inefficiencies; propose and implement improvements.
- Reverse Engineering: Extract and document data models from existing legacy systems when necessary.
- Support Data Governance: Contribute to data governance initiatives and collaborate with IT leadership on enterprise data strategy.

Requirements*
- Minimum of 6-8 years in Data Modeling, Data Warehousing, Dimensional and Relational Modeling, Semantic Modeling, and Metadata Management.
- Works across the business, operations, and technology to create the solution intent and architectural vision for complex solutions, and prioritizes functional and non-functional requirements into a technology backlog to enable the technology roadmap and functionality to support evolving capabilities and services.
- Contributes to the creation of the architecture roadmap of defined domains (Business, Application, Data, and Technology) in support of the product roadmap and the development of best practices, including standardized templates.
- Clarifies the architecture, assists with system design to support implementation, and provides solution options to resolve any architectural impediments.
- Facilitates solution-driven discussions, leads the design of complex architectures, and finds creative solutions through knowledge of the domain, practical experiments, and proofs of concept, while ensuring architecture is flexible, modular, and adaptable.
- Ability to educate and present on data management standards, data model practices, standardization strategies, and best practices to create innovative solutions.
- Supports the team as needed to select the technology stack required for solutions and helps select preferred technology products.
- Performs design and code reviews to ensure all non-functional requirements are sufficiently met (for example, security, performance, maintainability, scalability, usability, and reliability).
- Extensive experience with Sybase PowerDesigner or Erwin.
- Experience in database development, design, and best practices.

Education*
Graduation / Post Graduation

Certifications, if any:

Experience Range*
10-15 Years

Foundational Skills*
Data Modeling, Regulatory Compliance, Solution Design

Desired Skills*
Certifications in the Financial domain

Work Timings*
11:30 AM to 08:30 PM IST

Job Location*
Chennai, Hyderabad
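The query-optimization and indexing-strategy work this posting describes can be illustrated minimally. The sketch below uses SQLite's EXPLAIN QUERY PLAN as a stand-in for a commercial database's query planner; the table, column, and index names are illustrative assumptions only, and the exact plan wording varies by engine version.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE orders (order_id INTEGER, customer_id INTEGER, total REAL)")
cur.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                [(i, i % 100, float(i)) for i in range(1000)])

def plan(sql):
    # EXPLAIN QUERY PLAN rows carry the planner's description in column 3,
    # e.g. "SCAN orders" vs. "SEARCH orders USING INDEX ...".
    return " ".join(row[3] for row in cur.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT total FROM orders WHERE customer_id = 42"
before = plan(query)   # no index on customer_id yet: full table scan

cur.execute("CREATE INDEX idx_orders_customer ON orders(customer_id)")
after = plan(query)    # now satisfied via idx_orders_customer

print(before)
print(after)
```

Checking the plan before and after adding an index is the basic loop of the tuning work described above: confirm the predicate columns are indexed, then verify the planner actually uses the index.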
Posted 1 month ago
6.0 years
0 Lacs
Delhi, India
Remote
Job Title: Senior Data Modeler
Experience Required: 6+ Years
Location: Remote
Employment Type: Full-time / Contract (Remote)
Domain: Data Engineering / Analytics / Data Warehousing

Job Summary:
We are seeking an experienced and detail-oriented Data Modeler with a strong background in conceptual, logical, and physical data modeling. The ideal candidate will have in-depth knowledge of Snowflake architecture, data modeling best practices (Star/Snowflake schema), and advanced SQL scripting. You will be responsible for designing robust, scalable data models and working closely with data engineers, analysts, and business stakeholders.

Key Responsibilities:
1. Data Modeling:
- Design conceptual, logical, and physical data models.
- Create and maintain Star and Snowflake schemas for analytical reporting.
- Perform normalization and denormalization based on performance and reporting requirements.
- Work closely with business stakeholders to translate requirements into optimized data structures.
- Maintain data model documentation and the data dictionary.
2. Snowflake Expertise:
- Design and implement Snowflake schemas with optimal partitioning and clustering strategies.
- Perform performance tuning for complex queries and storage optimization.
- Implement Time Travel, Streams, and Tasks for data recovery and pipeline automation.
- Manage and secure data using Secure Views and Materialized Views.
- Optimize usage of Virtual Warehouses and storage costs.
3. SQL & Scripting:
- Write and maintain advanced SQL queries, including Common Table Expressions (CTEs), window functions, and recursive queries.
- Build automation scripts for data loading, transformation, and validation.
- Troubleshoot and optimize SQL queries for performance and accuracy.
- Support data migration and integration projects.

Required Skills & Qualifications:
- 6+ years of experience in Data Modeling and Data Warehouse design.
- Proven experience with the Snowflake platform (min. 2 years).
- Strong hands-on experience in dimensional modeling (Star/Snowflake schemas).
- Expertise in SQL and scripting for automation and performance optimization.
- Familiarity with tools like Erwin, PowerDesigner, or similar data modeling tools.
- Experience working in Agile/Scrum environments.
- Strong analytical and problem-solving skills.
- Excellent communication and stakeholder engagement skills.

Preferred Skills (Nice to Have):
- Experience with ETL/ELT tools like dbt, Informatica, Talend, etc.
- Exposure to cloud platforms like AWS, Azure, or GCP.
- Familiarity with Data Governance and Data Quality frameworks.
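The advanced SQL constructs listed above (CTEs, window functions, recursive queries) can each be shown in a few lines. The example below uses Python's built-in sqlite3 module as a stand-in for Snowflake (SQLite has supported window functions since 3.25); the table names and data are illustrative assumptions.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE trades (account TEXT, trade_date TEXT, amount REAL)")
cur.executemany("INSERT INTO trades VALUES (?, ?, ?)", [
    ("A", "2024-01-01", 100.0),
    ("A", "2024-01-02", 50.0),
    ("B", "2024-01-01", 75.0),
])

# CTE + window function: running total of trade amounts per account.
running = cur.execute("""
    WITH ordered AS (
        SELECT account, trade_date, amount FROM trades
    )
    SELECT account, trade_date,
           SUM(amount) OVER (PARTITION BY account ORDER BY trade_date)
               AS running_total
    FROM ordered
    ORDER BY account, trade_date
""").fetchall()
print(running)
# [('A', '2024-01-01', 100.0), ('A', '2024-01-02', 150.0), ('B', '2024-01-01', 75.0)]

# Recursive CTE: generate a small date spine, a common modeling utility
# for filling gaps in time-series reporting.
spine = cur.execute("""
    WITH RECURSIVE dates(d) AS (
        SELECT '2024-01-01'
        UNION ALL
        SELECT date(d, '+1 day') FROM dates WHERE d < '2024-01-03'
    )
    SELECT d FROM dates
""").fetchall()
print(spine)  # [('2024-01-01',), ('2024-01-02',), ('2024-01-03',)]
```

The same patterns carry over to Snowflake SQL largely unchanged; only the date functions and session setup differ.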
Posted 1 month ago
6.0 years
12 - 18 Lacs
Delhi, India
Remote
Skills: Data Modeling, Snowflake, Schemas, Star Schema Design, SQL, Data Integration

Job Title: Senior Data Modeler
Experience Required: 6+ Years
Location: Remote
Employment Type: Full-time / Contract (Remote)
Domain: Data Engineering / Analytics / Data Warehousing

Job Summary
We are seeking an experienced and detail-oriented Data Modeler with a strong background in conceptual, logical, and physical data modeling. The ideal candidate will have in-depth knowledge of Snowflake architecture, data modeling best practices (Star/Snowflake schema), and advanced SQL scripting. You will be responsible for designing robust, scalable data models and working closely with data engineers, analysts, and business stakeholders.

Key Responsibilities
Data Modeling:
- Design conceptual, logical, and physical data models.
- Create and maintain Star and Snowflake schemas for analytical reporting.
- Perform normalization and denormalization based on performance and reporting requirements.
- Work closely with business stakeholders to translate requirements into optimized data structures.
- Maintain data model documentation and the data dictionary.
Snowflake Expertise:
- Design and implement Snowflake schemas with optimal partitioning and clustering strategies.
- Perform performance tuning for complex queries and storage optimization.
- Implement Time Travel, Streams, and Tasks for data recovery and pipeline automation.
- Manage and secure data using Secure Views and Materialized Views.
- Optimize usage of Virtual Warehouses and storage costs.
SQL & Scripting:
- Write and maintain advanced SQL queries, including Common Table Expressions (CTEs), window functions, and recursive queries.
- Build automation scripts for data loading, transformation, and validation.
- Troubleshoot and optimize SQL queries for performance and accuracy.
- Support data migration and integration projects.

Required Skills & Qualifications
- 6+ years of experience in Data Modeling and Data Warehouse design.
- Proven experience with the Snowflake platform (min. 2 years).
- Strong hands-on experience in dimensional modeling (Star/Snowflake schemas).
- Expertise in SQL and scripting for automation and performance optimization.
- Familiarity with tools like Erwin, PowerDesigner, or similar data modeling tools.
- Experience working in Agile/Scrum environments.
- Strong analytical and problem-solving skills.
- Excellent communication and stakeholder engagement skills.

Preferred Skills (Nice to Have)
- Experience with ETL/ELT tools like dbt, Informatica, Talend, etc.
- Exposure to cloud platforms like AWS, Azure, or GCP.
- Familiarity with Data Governance and Data Quality frameworks.
Posted 1 month ago
10.0 years
0 Lacs
Dehradun, Uttarakhand, India
On-site
We are looking for a skilled Data Modeller with strong experience in the Big Data ecosystem, particularly in the Azure Data Platform and Databricks environment. The ideal candidate should have a deep understanding of data modelling principles and hands-on expertise in building models in modern data architectures such as Unity Catalog and Delta Lake.

Key Responsibilities:
- Design and develop conceptual, logical, and physical data models to support enterprise analytics and reporting needs.
- Build and manage data models in Unity Catalog within the Databricks environment.
- Work across teams to model and structure data in Delta Lake and optimize for performance and reusability.
- Collaborate with data engineers, architects, and analysts to ensure models align with data ingestion, transformation, and business reporting workflows.
- Translate business requirements into scalable and efficient data designs using best practices in data warehousing and Lakehouse architecture.
- Maintain comprehensive documentation, including data dictionaries, data lineage, and metadata.
- Implement and support data governance, data quality, and security controls across datasets and platforms.

Qualifications and Skills:
- 10+ years of hands-on data modelling experience in the Big Data ecosystem, with a strong understanding of OLTP, OLAP, and dimensional modelling.
- Hands-on experience with data modelling techniques such as Kimball, Inmon, Data Vault, and dimensional modelling.
- Strong proficiency in data modeling tools (e.g., ER/Studio, ERwin, PowerDesigner, dbt, SQLDBM, or Lucidchart).
- Experience building and maintaining data models using Unity Catalog in Databricks.
- Proven experience working with the Azure Data Platform, including Azure Data Factory, Azure Data Lake, Azure Synapse Analytics, and Azure SQL Database.
- Strong proficiency in SQL and Apache Spark for data transformation and querying.
- Familiarity with Delta Lake, Parquet, and modern data storage formats.
- Knowledge of data cataloging tools such as Azure Purview is a plus.
- Excellent problem-solving skills and the ability to work in agile, fast-paced environments.
- Strong communication skills to articulate data concepts to technical and non-technical stakeholders.

Preferred Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- Relevant certifications such as DP-203 (Azure Data Engineering).

About Us:
We're an international team who specialize in building technology products and then helping brands grow with multi-channel demand generation marketing. We have in-house experience working for Fortune companies, e-commerce brands, and technology SaaS companies. We have assisted over a dozen billion-dollar companies with consulting, technology, operations, and digital agency capabilities in managing their unique brand online. We have a fun and friendly work culture that encourages employees personally and professionally. EbizON has many values that are important to our success as a company: integrity, creativity, innovation, mindfulness, and teamwork. We thrive on the idea of making life better for people by providing them with peace of mind. The people here love what they do because everyone, from management on down, understands how much it means to live up to one's ideals, which makes every day feel less stressful knowing each person has somebody cheering them on.

Equal Opportunity Employer: EbizON is committed to providing equal opportunity for all employees, and we will consider any qualified applicant without regard to race or other prohibited characteristics.

Flexible Timings: Flexible working hours are the new normal. We at EbizON believe in giving employees the freedom to choose when and how to work. It helps them thrive and also balance their lives better.

Global Clients Exposure: Our goal is to provide excellent customer service, and we want our employees to work closely with clients from around the world.
That's why you'll find us working closely with clients from around the world through Microsoft Teams, Zoom, and other video conferencing tools.

Retreats & Celebrations: With annual retreats, quarterly town halls, and festive celebrations, we have a lot of opportunities to get together.
Posted 1 month ago
10.0 years
0 Lacs
Hyderābād
On-site
Job description
Some careers have more impact than others. If you’re looking for a career where you can make a real impression, join HSBC and discover how valued you’ll be. HSBC is one of the largest banking and financial services organisations in the world, with operations in 62 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions.

We are currently seeking an experienced professional to join our team in the role of Chief Data Architect
Business: Finance Function

Principal responsibilities
The Financial Data Service (FDS) Team are seeking to recruit a Data Modeller with a passion for organising and transforming complex Finance data into actionable insights represented within data model structures that are fit for purpose. The role requires a strong analytical mindset and a good understanding of various data modelling techniques and tools, with a proven track record. The individual should have exposure to designing and implementing efficient data models that cater to the data sourcing, storage, and usage needs of the Finance business and/or Front-to-Back business domains within a global financial institution.
- Support the design and development of FDS conceptual, logical, and application data models as per HSBC's Future State Architecture (Data Asset Strategy), and work across Finance business teams to drive understanding, interpretation, design, and implementation.
- Support Finance business and change teams to migrate to target state data models and Data Asset delivery, driving improvement on current feeds and data issues.
- Develop data modelling schemas aligned with Enterprise data models and supporting Finance Data Assets.
- Contribute to FDS program model development planning and scheduling.
- Continuously improve the FDS data modelling estate, adhering to risk, controls, security, and regulatory compliance standards.
- Advise on and support Finance data modelling requirements for new use cases and data changes.
- Serve as the FDS data modelling subject matter expert.
- Support Finance adoption and implementation of FDS data models to meet HSBC's strategic data needs.
- Participate in data modelling and data architecture governance forums.
- Coordinate and collaborate with cross-functional teams, change delivery teams, technology teams, and subject matter expert stakeholders.
- Create and maintain a range of data modelling documents, including model requirements, data flow diagrams, data catalogues, data definitions, design specifications, data models, traceability matrices, data quality rules, and more.
- Translate Finance business requirements into the data modelling solution and Finance Data Assets.
- Conduct continuous audits of data models and refine them whenever required, including reporting any challenges, issues, or risks to senior management.
- Ensure Finance data models align with Enterprise data models and adhere to Enterprise Architecture principles and standards.

Requirements
- Minimum of 10 years' experience in data management and modelling solutions working as a Data Modeller within the Financial Services sector is essential, preferably in a Treasury/Finance function or a related front office environment.
- A proven track record working in a large, global banking environment is desirable.
- Demonstrated experience designing data modelling solutions (conceptual, logical, and application/messaging) with corresponding phasing, transitions, and migrations where necessary.
- Good understanding of managing 'data as a product (asset)' across enterprise domains and technology landscapes.
- Good understanding of architectural domains (business, data, application, and technology).
- Good communication skills, with the ability to influence and present data models (as well as concepts) to technology and business stakeholders.
- Experience with leading data modelling tools and modelling documentation, using tools such as Visual Paradigm, ERwin, PowerDesigner, ER Studio, etc.
- Good collaboration skills, with demonstrated experience achieving outcomes in a matrixed environment, partnering with data modellers from other domains to build and join shared, reusable data assets.
- Experience working with Agile and Scrum in a large, scalable Agile environment, including participation and progress reporting in daily standups.
- Cloud exposure to solutions implemented in GCP (Google Cloud Platform), AWS (Amazon Web Services), or Azure would be beneficial, as would exposure to big data solutions.
- Exposure to data governance initiatives such as lineage, masking, retention policy, and data quality.

You’ll achieve more at HSBC. HSBC is an equal opportunity employer committed to building a culture where all employees are valued, respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working, and opportunities to grow within an inclusive and diverse environment. We encourage applications from all suitably qualified persons irrespective of, but not limited to, their gender or genetic information, sexual orientation, ethnicity, religion, social status, medical care leave requirements, political affiliation, disability, color, national origin, or veteran status. We consider all applications based on merit and suitability to the role. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website. ***Issued By HSBC Electronic Data Processing (India) Private LTD***
Posted 1 month ago