8 - 13 years
15 - 30 Lacs
Pune, Trivandrum
Work from Office
Role & responsibilities • 8+ years of experience in enterprise data models, data structures, data engineering, or database design; management experience is a plus • Proficiency in data modeling for data from different domains (Data Vault, 3NF, Snowflake, Star) • Proficient in SQL and RDBMS (e.g., MySQL, PostgreSQL, Oracle, SQL Server, Synapse) • Knowledge of Synapse Analytics and large data models • Strong experience in data warehousing and big data technologies; Data Mesh/Data Product experience a plus • Familiarity with cloud data services and infrastructure (Azure, AWS) • Knowledge of data modeling tools (e.g., ERwin) • Excellent communication skills for both technical and non-technical audiences • Bachelor's degree or higher in Computer Science, Engineering, Information Systems, or a related field.
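The modeling paradigms this posting names (Data Vault, 3NF, Snowflake, Star) differ mainly in how far reference data is normalized away from the facts. As a minimal illustration of the star-schema end of that spectrum (all table and column names here are hypothetical, not from any posting), a central fact table joins directly to denormalized dimension tables:

```python
import sqlite3

# A minimal star schema: one additive fact table surrounded by denormalized
# dimensions. Names are illustrative only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_customer (
    customer_key  INTEGER PRIMARY KEY,
    customer_name TEXT,
    region        TEXT       -- denormalized: region sits on the dimension row
);
CREATE TABLE dim_date (
    date_key  INTEGER PRIMARY KEY,
    full_date TEXT,
    year      INTEGER,
    month     INTEGER
);
CREATE TABLE fact_sales (
    sales_key    INTEGER PRIMARY KEY,
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    date_key     INTEGER REFERENCES dim_date(date_key),
    amount       REAL       -- additive measure
);
""")
conn.execute("INSERT INTO dim_customer VALUES (1, 'Acme', 'West')")
conn.execute("INSERT INTO dim_date VALUES (20240101, '2024-01-01', 2024, 1)")
conn.execute("INSERT INTO fact_sales VALUES (1, 1, 20240101, 99.5)")

# Typical analytic query: aggregate the fact by a dimension attribute.
row = conn.execute("""
    SELECT c.region, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_customer c ON f.customer_key = c.customer_key
    GROUP BY c.region
""").fetchone()
print(row)  # ('West', 99.5)
```

A snowflake schema would normalize `region` out into its own table; 3NF and Data Vault models normalize further still, trading query simplicity for flexibility and auditability.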
Posted 2 months ago
3 - 6 years
4 - 7 Lacs
Bengaluru
Work from Office
Description (internal template fields)
Local skills: Erwin Data Modeler
Languages required: English
Remote work possibility: No
Global Grade: D
Named Job Posting (needs SCSC approval if yes): No
Level, Global Role Family, Local Role Name, Role Rarity: to be defined
Posted 2 months ago
6 - 10 years
11 - 15 Lacs
Bengaluru
Work from Office
Description (internal template fields)
Local skills: data architecture; AWS
Languages required: English
Remote work possibility: No
Global Grade: D
Named Job Posting (needs SCSC approval if yes): No
Level, Global Role Family, Local Role Name, Role Rarity: to be defined
Posted 2 months ago
11 - 15 years
40 - 45 Lacs
Hyderabad
Work from Office
Software Engineering Advisor - Data Governance, Data Model, Data Migration, Automation

Position Overview: The job profile for this position is Software Engineering Advisor, a Band 4 Contributor Career Track role. Excited to grow your career? We value our talented employees, and whenever possible strive to help one of our associates grow professionally before recruiting new talent to our open positions. If the open position you see is right for you, we encourage you to apply! Our people make all the difference in our success.

We are looking for exceptional Data Model, Data Governance, and Data Migration experts, including expertise in automation, in our PBM Plus Technology organization. This role requires ensuring data integrity, efficiency, and compliance across the project, as well as designing and implementing robust data governance frameworks, leading data migration projects, and developing automation scripts to enhance data processing and management. It involves working with critical data across the customer, provider, claims, and benefits domains to ensure comprehensive data solutions and high-quality deliverables, deploying on-premises and/or on AWS infrastructure using the technologies listed below. You will work closely with subject matter experts, developers, and business stakeholders to ensure that application solutions meet business/customer requirements.

Responsibilities:
Data Governance: Design and implement comprehensive data governance frameworks and policies. Ensure adherence to data governance standards and best practices across the organization. Collaborate with data stewards and stakeholders to enforce data policies and procedures.
Data Modeling: Develop and maintain logical and physical data models for enterprise data warehouses, data lakes, and data marts. Ensure data models are optimized for performance and scalability. Document data models and maintain metadata repositories.
Data Migration: Lead data migration projects, ensuring data accuracy, consistency, and completeness. Develop and execute data migration strategies and plans. Perform data extraction, transformation, and loading (ETL) using industry-standard tools.
Automation: Develop automation scripts and tools to streamline data processing and management tasks. Implement automated data quality checks and validations. Continuously improve and optimize data automation processes.
Collaboration: Work closely with data architects, data analysts, and other IT teams to ensure seamless data integration and consistency. Provide technical guidance and support to junior team members. Support other product delivery partners in the successful build, test, and release of solutions. Work with distributed requirements and technical stakeholders to complete shared design and development. Work with both onsite (Scrum Master, Product, QA, and Developers) and offshore team members to properly define testable scenarios based on requirements/acceptance criteria. Be part of a fast-moving development team using agile methodologies and working with the latest tools and open-source technologies. Understand the business and the application architecture end to end. Solve problems by crafting software solutions using maintainable and modular code. Participate in daily team standup meetings, giving and receiving updates on the current backlog and challenges. Participate in code reviews. Ensure code quality and deliverables. Provide impact analysis for new requirements or changes. Responsible for low-level design with the team.

Qualifications - Required Skills: Extensive experience in data modeling, governance, and migration; proficient in data management tools (e.g., Erwin).
Technology Stack: Python, PySpark, Lambda, AWS Glue, Redshift, Athena, SQL. Proficient with the SAM (Serverless Application Model) framework, with a strong command of Lambda functions using Java/Python and programming and scripting skills (Python, SQL, shell scripting). Proficient in internal integration within the AWS ecosystem using Lambda functions, leveraging services such as EventBridge, S3, SQS, SNS, and others. Experienced in internal integration within AWS using DynamoDB with Lambda functions, demonstrating the ability to architect and implement robust serverless applications. CI/CD experience: must have GitHub experience. Recognized internally as the go-to person for the most complex software engineering assignments.

Required Experience & Education: 11+ years of experience. Experience with vendor management in an onshore/offshore model, including managing SLAs and contracts with third-party vendors. Proven experience with architecture, design, and development of large-scale enterprise application solutions. College degree (Bachelor's) in a related technical/business area, or equivalent work experience. Industry certifications (e.g., CDMP, DGSP).
Posted 2 months ago
5 - 10 years
22 - 37 Lacs
Pune, Hyderabad
Hybrid
Greetings from InfoVision! We are looking to fill a Data Modeler position; the main skills and details are below. Please apply if you feel the role is a good fit.

Role: Data Modeler
Position: Full-time permanent role
Work mode: Hybrid (4 days WFO)
Job locations: Pune and Hyderabad

About InfoVision: InfoVision, founded in 1995, is a leading global IT services company offering enterprise digital transformation and modernization solutions across business verticals. We partner with our clients in driving innovation, rethinking workflows, and transforming experiences so businesses can stay ahead in a rapidly changing world. We help shape a bold new era of technology-led disruption, accelerating digital with quality, agility, and integrity. We have helped more than 75 global leaders across the Telecom, Retail, Banking, Healthcare, and Technology industries deliver excellence for their customers. InfoVision's global presence enables us to offer offshore, nearshore, and onshore solutions for our customers. With our world-class infrastructure for employees and people-centric policies, InfoVision is one of the highest-rated digital services companies on Glassdoor. We encourage our employees to thrive and are committed to providing a work environment that fosters an entrepreneurial mindset, nurtures inclusivity, values integrity, and accelerates your career by creating opportunities for promising growth.

Job Summary: We are looking for a Data Modeler to design and optimize data models supporting automotive industry analytics and reporting. The ideal candidate will work with SAP ECC as a primary data source, leveraging Databricks and Azure Cloud to design scalable and efficient data architectures. This role involves developing logical and physical data models, ensuring data consistency, and collaborating with data engineers, business analysts, and domain experts to enable high-quality analytics solutions.
Key Responsibilities:
Data Modeling & Architecture: Design and maintain conceptual, logical, and physical data models for structured and unstructured data.
SAP ECC Data Integration: Define data structures for extracting, transforming, and integrating SAP ECC data into Azure Databricks.
Automotive Domain Modeling: Develop and optimize industry-specific data models covering customer, vehicle, material, and location data.
Databricks & Delta Lake Optimization: Design efficient data models for Delta Lake storage and Databricks processing.
Performance Tuning: Optimize data structures, indexing, and partitioning strategies for performance and scalability.
Metadata & Data Governance: Implement data standards, data lineage tracking, and governance frameworks to maintain data integrity and compliance.
Collaboration: Work closely with business stakeholders, data engineers, and data analysts to align models with business needs.
Documentation: Create and maintain data dictionaries, entity-relationship diagrams (ERDs), and transformation logic documentation.

Skills & Qualifications:
Data Modeling Expertise: Strong experience in dimensional modeling, 3NF, and hybrid modeling approaches.
Automotive Industry Knowledge: Understanding of customer, vehicle, material, and dealership data models.
SAP ECC Data Structures: Hands-on experience with SAP ECC tables, business objects, and extraction processes.
Azure & Databricks Proficiency: Experience working with Azure Data Lake, Databricks, and Delta Lake for large-scale data processing.
SQL & Database Management: Strong skills in SQL, T-SQL, or PL/SQL, with a focus on query optimization and indexing.
ETL & Data Integration: Experience collaborating with data engineering teams on data transformation and ingestion processes.
Data Governance & Quality: Understanding of data governance principles, lineage tracking, and master data management (MDM).
Strong Documentation Skills: Ability to create ER diagrams, data dictionaries, and transformation rules.

Preferred Qualifications:
Experience with data modeling tools such as Erwin, Lucidchart, or dbt.
Knowledge of Databricks Unity Catalog and Azure Synapse Analytics.
Familiarity with Kafka/Event Hubs for real-time data streaming.
Exposure to Power BI/Tableau for data visualization and reporting.
Posted 2 months ago
3 - 6 years
7 - 11 Lacs
Hyderabad
Work from Office
Data Modeling Advisor

Position Summary: The Health Services Data Design and Metadata Management team is hiring an Architecture Senior Advisor to work across all projects. The work involves understanding and driving data design best practices, including data modeling, mapping, and analysis, and helping others apply them across strategic data assets. The data models are wide-ranging and must include the appropriate metadata to support and improve our data intelligence. Data design centers around standard health care data (eligibility, claim, clinical, and provider data) across structured and unstructured data platforms.

Job Description & Responsibilities:
Perform data analysis, data modeling, and data mapping following industry and Evernorth data design standards for analytics/data warehouses and operational data stores across various DBMS types, including Teradata, Oracle, cloud platforms, Hadoop, Databricks, and data lakes.
Perform data analysis, profiling, and validation, contributing to data quality efforts to understand data characteristics and ensure data correctness/condition for use.
Participate in and coordinate data model metadata development processes to support ongoing development efforts (data dictionary, NSM, and FET files), maintenance of data model/data mapping metadata, and linking of our data design metadata to business terms, data models, mapping documents, ETL jobs, and data model governance operations (policies, standards, best practices).
Facilitate and actively participate in data model/data mapping reviews and audits, fostering collaborative working relationships and partnerships with multidisciplinary teams.
Provide guidance, mentoring, and training as needed in data modeling, data lineage, DDL code, and the associated toolsets (Erwin Data Modeler, Erwin Web Portal, Erwin Model Mart, Erwin Data Intelligence Suite, Alation).
Assist with the creation, documentation, and maintenance of Evernorth data design standards and best practices involving data modeling, data mapping, and metadata capture, including data sensitivity, data quality rules, and reference data usage.
Develop and facilitate strong partnerships and working relationships with Data Governance, delivery, and other data partners.
Continuously improve operational processes for data design metadata management for global and strategic data.
Interact with business stakeholders and IT in defining and managing data design.
Coordinate, collaborate, and innovate with solution verticals, Data Lake teams, and IT & business portfolios to ensure alignment of data design metadata and related information with ongoing programs (cyber risk and security) and development efforts.

Experience Required: 11 to 13 years' experience with data modeling (logical/physical data models, canonical structures, etc.) and SQL code.
Experience Desired: Subject-matter-expert-level experience preferred. Experience executing data model/data lineage governance across business and technical data. Experience utilizing data model/lineage/mapping/analysis management tools for business, technical, and operational metadata (Erwin Data Modeler, Erwin Web Portal, Erwin Model Mart, Erwin Data Intelligence Suite, Alation). Experience working in an Agile delivery environment (Jira, Confluence, SharePoint, Git, etc.).
Education and Training Required: Advanced degree in Computer Science or a related discipline and at least six, typically eight or more, years' experience in all phases of data modeling, data warehousing, data mining, data entity analysis, logical database design, and relational database definition, or an equivalent combination of education and work experience.
Primary Skills: Physical Data Modeling, Data Warehousing, Metadata, Reference Data, Data Mapping, Data Mining, Teradata, Data Quality, Excellent Communication Skills, Data Analysis, Oracle, Data Governance, Database Management Systems, Jira, DDL, Data Integration, Microsoft SharePoint, Database Modeling, Confluence, Agile, Marketing Analysis, Operations, Data Lineage, Data Warehouses, Documentation, Big Data, Web Portal, Maintenance, Erwin, SQL, Unstructured Data, Audit, Git, Pharmacy, DBMS, Databricks, AWS
Posted 2 months ago
6 - 11 years
25 - 35 Lacs
Pune, Delhi NCR, India
Hybrid
Exp: 6 to 10 yrs
Role: Data Modeller
Position: Permanent
Job locations: Delhi/NCR (Remote), Pune (Remote), Chennai (Hybrid), Hyderabad (Hybrid), Bangalore (Hybrid)

Experience & Skills:
6+ years of experience with strong data modeling and data warehousing skills.
Able to suggest modeling approaches for a given problem.
Significant experience in one or more RDBMS (Oracle, DB2, and SQL Server).
Real-time experience working with OLAP & OLTP database models (dimensional models).
Comprehensive understanding of Star schema, Snowflake schema, and Data Vault modelling, as well as any ETL tool, data governance, and data quality.
A keen eye for analyzing data and comfort with following agile methodology.
A good understanding of any of the cloud services (Azure, AWS & GCP) is preferred.
Posted 2 months ago
10 - 18 years
30 - 45 Lacs
Pune, Delhi NCR, India
Hybrid
Exp: 10 to 18 years
Role: Data Modeling Architect / Senior Architect / Principal Architect
Position: Permanent
Locations: Hyderabad (Hybrid), Bangalore (Hybrid), Chennai (Hybrid), Pune (Remote till office opens), Delhi NCR (Remote till office opens)

JD:
10+ years of experience in data warehousing & data modeling - dimensional/relational/physical/logical.
Decent SQL knowledge.
Able to suggest modeling approaches and solutions for a given problem.
Significant experience in one or more RDBMS (Oracle, DB2, and SQL Server).
Real-time experience working with OLAP & OLTP database models (dimensional models).
Comprehensive understanding of Star schema, Snowflake schema, and Data Vault modelling, as well as any ETL tool, data governance, and data quality.
A keen eye for analyzing data and comfort with following agile methodology.
A good understanding of any of the cloud services (Azure, AWS & GCP) is preferred.
Enthusiasm to coach team members, collaborate with various stakeholders across the organization, and take complete ownership of deliverables.
Good experience in stakeholder management.
Decent communication skills and experience leading a team.
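A recurring design decision in the dimensional warehousing work these roles describe is how a dimension tracks history. One common answer (sketched here with hypothetical table and column names) is a Type 2 slowly changing dimension: instead of overwriting an attribute, the current row is closed out and a new versioned row is opened:

```python
import sqlite3

# Type 2 slowly changing dimension (SCD2): history is preserved by versioning
# rows with validity dates rather than updating attributes in place.
# All names are illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("""
CREATE TABLE dim_customer (
    customer_key INTEGER PRIMARY KEY AUTOINCREMENT,  -- surrogate key
    customer_id  TEXT,     -- natural/business key
    city         TEXT,
    valid_from   TEXT,
    valid_to     TEXT,     -- NULL marks the current row
    is_current   INTEGER
)
""")

def scd2_update(conn, customer_id, city, as_of):
    """Close the current row (if any), then insert a new current version."""
    conn.execute(
        "UPDATE dim_customer SET valid_to = ?, is_current = 0 "
        "WHERE customer_id = ? AND is_current = 1",
        (as_of, customer_id),
    )
    conn.execute(
        "INSERT INTO dim_customer (customer_id, city, valid_from, valid_to, is_current) "
        "VALUES (?, ?, ?, NULL, 1)",
        (customer_id, city, as_of),
    )

scd2_update(conn, "C42", "Pune", "2023-01-01")
scd2_update(conn, "C42", "Hyderabad", "2024-06-01")  # customer relocated

rows = conn.execute(
    "SELECT city, is_current FROM dim_customer "
    "WHERE customer_id = 'C42' ORDER BY customer_key"
).fetchall()
print(rows)  # [('Pune', 0), ('Hyderabad', 1)]
```

Fact rows reference the surrogate key of whichever dimension version was current at load time, which is what lets historical reports reproduce the world as it was.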
Posted 2 months ago
3 - 7 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Apache Kafka
Good-to-have skills: NA
Minimum 3 year(s) of experience is required.
Educational Qualification: Graduation

Summary: As a Data Modeler, you will be responsible for working with key business representatives, data owners, end users, application designers, and data architects to model current and new data using Apache Kafka. Your typical day will involve designing and implementing data models, ensuring data quality and integrity, and collaborating with cross-functional teams to deliver impactful data-driven solutions.

Roles & Responsibilities:
Design and implement data models using Apache Kafka, ensuring data quality and integrity.
Collaborate with cross-functional teams, including key business representatives, data owners, end users, application designers, and data architects, to model current and new data.
Develop and maintain data dictionaries, data flow diagrams, and other documentation related to data modeling.
Ensure compliance with data security and privacy policies and regulations, including GDPR and CCPA.

Professional & Technical Skills:
Must-have skills: Experience with Apache Kafka.
Good-to-have skills: Experience with other data modeling tools and technologies, such as ERwin or ER/Studio.
Strong understanding of data modeling concepts and techniques, including conceptual, logical, and physical data models.
Experience with data analysis and profiling tools, such as Talend or Informatica.
Solid grasp of SQL and other database technologies, including Oracle, MySQL, and SQL Server.

Additional Information: The candidate should have a minimum of 3 years of experience in Apache Kafka. The ideal candidate will possess a strong educational background in computer science, information systems, or a related field, along with a proven track record of delivering impactful data-driven solutions. This position is based at our Bengaluru office.
Qualification: Graduation
Posted 2 months ago
10 - 15 years
17 - 32 Lacs
Pune, Bengaluru, Hyderabad
Hybrid
The Data Architect / Data Modeler will be expected to leverage their knowledge of data modeling best practices along with cross-industry data expertise and data modeling tool expertise. Looking for early joiners.
Demonstrable experience in developing, validating, publishing, and maintaining logical data models, with exposure to or experience in developing, validating, publishing, and maintaining physical data models.
Demonstrable experience using data modeling tools, e.g., ERwin.
Evaluate existing data models and physical databases for variances and discrepancies.
Experience with managing metadata for data models.
Demonstrable experience in developing, publishing, and maintaining all documentation for data models.
Posted 2 months ago
4 - 9 years
6 - 11 Lacs
Pune
Work from Office
As a Data Engineer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include:
Implementing and validating predictive models, as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques.
Designing and implementing enterprise search applications such as Elasticsearch and Splunk for client requirements.
Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours.
Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results.

Required education: Bachelor's degree
Preferred education: Master's degree

Required technical and professional expertise:
4+ years of experience in data modelling and data architecture.
Proficiency in data modelling tools (ERwin, IBM InfoSphere Data Architect) and database management systems.
Familiarity with different data models: relational, dimensional, and NoSQL databases.
Understanding of business processes and how data supports business decision making.
Strong understanding of database design principles, data warehousing concepts, and data governance practices.

Preferred technical and professional experience:
Excellent analytical and problem-solving skills with keen attention to detail.
Ability to work collaboratively in a team environment and manage multiple projects simultaneously.
Knowledge of programming languages such as SQL.
Posted 2 months ago
5 - 10 years
5 - 9 Lacs
Hyderabad
Work from Office
Project Role: Database Administrator
Project Role Description: Administer, develop, test, or demonstrate databases. Perform many related database functions across one or more teams or clients, including designing, implementing, and maintaining new databases, backup/recovery, and configuration management. Install database management systems (DBMS) and provide input for modification of procedures and documentation used for problem resolution and day-to-day maintenance.
Must-have skills: Data Modeling Techniques and Methodologies
Good-to-have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: Graduate

Data Modelling: Collaborate with cross-functional teams to understand business requirements and translate them into effective and scalable data models. Develop and maintain data models using industry-leading practices, with a strong emphasis on Data Mesh and Data Vault 2.0 methodologies. Ensure that data models align with standards and guidelines defined by data architects and are adaptable to the evolving needs of the business. Responsible for the development of conceptual, logical, and physical data models and the implementation of Data Mesh and Data Fabric on target platforms (Google BigQuery) using ERWIN.
Domain Expertise: Acquire a deep understanding of various business domains and their associated data, processes, and systems, ensuring that data models reflect the domain-specific context and requirements.
Data Mesh Implementation: Work closely with Data Mesh architecture principles to ensure decentralised ownership and a domain-oriented approach to data. Define and implement data products, aligning with the Data Mesh principles of domain-driven, decentralized data ownership. Ensure that data is structured to conform easily to the security controls and obligations that relate to it.
Data Vault 2.0 Implementation: Design and implement Data Vault 2.0-compliant data warehouses and hubs. Ensure that the Data Vault model provides flexibility, scalability, and resilience in handling complex and evolving business requirements. Ensure that every artifact built is optimised and monitored and that cost is always considered. Support, guide, and mentor team members in the domain.
Collaboration: Prior experience working in an agile squad environment with minimal supervision. Provide expert technical advice, presentations, and education to audiences (technical and business) within Enterprise Data and Architectures and within the business, including data stewards and enterprise architects, regarding enterprise conformance and Data Vault modelling concepts. Collaborate with solution architects, data engineers, data scientists, and other stakeholders to understand data usage patterns, deal with production and data quality issues, and optimize data models for performance. Provide guidance and support to development teams in the implementation of data models within the Data Mesh and Data Vault 2.0 frameworks.
Documentation: Create and maintain comprehensive documentation of data models, ensuring it is accessible to relevant stakeholders. Keep abreast of industry trends, emerging technologies, and best practices related to data modelling and integration. Create and maintain artefacts relating to data models (e.g., DDLs, data mappings, DMLs, data dictionaries, change registers, etc.).
Other skills beneficial for the role: Certification in Data Vault 2.0 or related technologies. Experience with tools such as Apache Kafka, Apache Flink, or similar data streaming platforms. Familiarity with Google Cloud Platform services or AWS platform services with respect to data and AI/ML. Proficiency and experience with Erwin Data Modeler. Experience or exposure to data catalogues such as Collibra and Ab Initio would be highly beneficial.
Qualification: Graduate
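The Data Vault 2.0 structures this posting asks for separate business keys (hubs), relationships (links), and descriptive history (satellites). A minimal sketch of a hub plus satellite, with the DV2 convention of hashing the business key into a surrogate hash key (all names, and the choice of MD5, are illustrative assumptions):

```python
import hashlib
import sqlite3

# Minimal Data Vault 2.0 shapes: the hub holds only the business key and load
# metadata; the satellite holds descriptive attributes over time, keyed by
# (hash key, load date). Names are illustrative only.
def hash_key(business_key: str) -> str:
    # DV2-style surrogate: hash of the normalized business key.
    return hashlib.md5(business_key.strip().upper().encode()).hexdigest()

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE hub_customer (
    hub_customer_hk TEXT PRIMARY KEY,   -- hash of the business key
    customer_id     TEXT UNIQUE,        -- business key
    load_date       TEXT,
    record_source   TEXT
);
CREATE TABLE sat_customer_details (
    hub_customer_hk TEXT REFERENCES hub_customer(hub_customer_hk),
    load_date       TEXT,
    name            TEXT,
    city            TEXT,
    PRIMARY KEY (hub_customer_hk, load_date)
);
""")
hk = hash_key("C42")
conn.execute("INSERT INTO hub_customer VALUES (?, 'C42', '2024-01-01', 'CRM')", (hk,))
conn.execute("INSERT INTO sat_customer_details VALUES (?, '2024-01-01', 'Acme', 'Pune')", (hk,))
conn.execute("INSERT INTO sat_customer_details VALUES (?, '2024-06-01', 'Acme', 'Hyderabad')", (hk,))

# Current view of the customer = latest satellite row for the hub key.
current = conn.execute(
    "SELECT city FROM sat_customer_details "
    "WHERE hub_customer_hk = ? ORDER BY load_date DESC LIMIT 1",
    (hk,),
).fetchone()
print(current)  # ('Hyderabad',)
```

Because satellites are insert-only, the model keeps full history and stays resilient to changing source attributes, which is the flexibility the posting emphasises.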
Posted 2 months ago
10 - 15 years
12 - 17 Lacs
Mumbai
Work from Office
About the company: With over 2.5 crore customers, over 5,000 distribution points, and nearly 2,000 branches, IndusInd Bank is a universal bank with a widespread banking footprint across the country. IndusInd offers a wide array of products and services for individuals and corporates, including microfinance, personal loans, personal and commercial vehicle loans, credit cards, and SME loans. Over the years, IndusInd has grown ceaselessly and dynamically, driven by zeal to offer our customers banking services at par with the highest quality standards in the industry. IndusInd is a pioneer in digital-first solutions that bring together the power of a next-gen digital product stack, customer excellence, and the trust of an established bank.

Job Purpose: To implement data modeling solutions; to design data flows and structures that reduce data redundancy and improve data movement among systems, defining data lineage; to work in the Azure Data Warehouse; and to work with large volumes of data integration.

Experience: With overall experience between 10 and 15 years, the applicant must have a minimum of 8 to 11 years of hardcore professional experience in data modeling for a large data warehouse with multiple sources.

Technical Skills:
Expertise in core data modeling principles/methods, including conceptual, logical & physical data models.
Ability to utilize BI tools like Power BI, Tableau, etc. to represent insights.
Experience in translating/mapping relational data models into XML and schemas.
Expert knowledge of metadata management and relational & data modeling tools like ER/Studio, Erwin, or others.
Hands-on relational, dimensional, and/or analytical experience (using RDBMS, dimensional, NoSQL, ETL, and data ingestion protocols).
Very strong SQL query skills and expertise in performance tuning of SQL queries.
Ability to analyse source systems and create source-to-target mappings.
Ability to understand the business use case and create data models or joined data in the data warehouse.
Preferred: experience in the banking domain and in building data models/marts for various banking functions.
Good to have: knowledge of Azure PowerShell scripting or Python scripting for data transformation in ADF; SSIS, SSAS, and BI tools like Power BI; Azure PaaS components like Azure Data Factory, Azure Databricks, Azure Data Lake, Azure Synapse (DWH), PolyBase, ExpressRoute tunneling, etc.; API integration.

Responsibility:
Understand the existing data model, existing data warehouse design, and functional domain subject areas of data, documenting the as-is architecture and the proposed one.
Understand the existing ETL process and various sources, analyzing and documenting the best approach to design the logical data model where required.
Work with the development team to implement the proposed data model as a physical data model and build data flows.
Work with the development team to optimize the database structure, applying best-practice optimization methods.
Analyze, document, and implement re-use of the data model for new initiatives.
Interact with stakeholders, users, and other IT teams to understand the ecosystem and analyze it for solutions.
Work on user requirements and create queries for consumption views for users from the existing DW data.
Train and lead a small team of data engineers.

Qualifications: Bachelor's in Computer Science or equivalent. Should have certification in Data Modeling and Data Analysis. Good to have Azure Fundamentals and Azure Engineer certifications (AZ-900 or DP-200/201).

Behavioral Competencies: Excellent problem-solving and time management skills. Strong analytical thinking. Excellent communication skills; process-oriented with a flexible execution mindset. Strategic thinking with a research and development mindset. Clear and demonstrative communication. Efficiently identifies and solves issues. Identifies, tracks, and escalates risks in a timely manner.

Selection Process: Interested candidates must apply through the listing on Jigya. Only applications received through Jigya will be evaluated further. Shortlisted candidates may need to appear for an online assessment and/or a technical screening interview administered by Jigya on behalf of IndusInd Bank. Candidates selected after the screening rounds will be processed further by IndusInd Bank.
Posted 2 months ago
1 - 5 years
8 - 14 Lacs
Jaipur
Work from Office
Key Responsibilities:
Design & Implement MDM Solutions: Develop and implement Master Data Management (MDM) strategies, ensuring clean, accurate, and consistent master data across the organization.
Data Architecture & Modeling: Define data models, architecture, and integration patterns for MDM solutions, supporting data governance and business intelligence.
Integration & Data Flow Management: Collaborate with IT and business teams to integrate MDM with enterprise systems (ERP, CRM, data warehouses).
Governance & Compliance: Establish data governance policies, metadata management, and data lineage, and ensure compliance with GDPR, CCPA, and other regulatory requirements.
Performance Optimization: Optimize data architecture, ensuring high availability, scalability, and security of MDM solutions.
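The MDM strategy described above ultimately reduces to consolidating duplicate source records into a single "golden record". A toy sketch of survivorship logic, assuming hypothetical field names and a simple most-recent-non-null-wins rule (real MDM platforms apply far richer, configurable rules):

```python
from datetime import date

# The same customer as seen by two source systems; field names and the
# survivorship rule below are illustrative assumptions only.
records = [
    {"source": "CRM", "name": "A. Sharma", "email": None,
     "phone": "98xxxxxx01", "updated": date(2024, 3, 1)},
    {"source": "ERP", "name": "Anita Sharma", "email": "a.sharma@example.com",
     "phone": None, "updated": date(2024, 5, 10)},
]

def golden_record(dupes):
    """Per field: take the non-null value from the most recently updated record."""
    ordered = sorted(dupes, key=lambda r: r["updated"], reverse=True)
    golden = {}
    for field in ("name", "email", "phone"):
        # Walk records newest-first and keep the first non-null value.
        golden[field] = next((r[field] for r in ordered if r[field] is not None), None)
    return golden

print(golden_record(records))
# {'name': 'Anita Sharma', 'email': 'a.sharma@example.com', 'phone': '98xxxxxx01'}
```

Note how the merged record mixes sources: name and email survive from the newer ERP record, while the phone falls back to the older CRM record because ERP has none.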
Posted 2 months ago
1 - 5 years
8 - 14 Lacs
Surat
Work from Office
Key Responsibilities:
- Design & Implement MDM Solutions: Develop and implement Master Data Management (MDM) strategies, ensuring clean, accurate, and consistent master data across the organization.
- Data Architecture & Modeling: Define data models, architecture, and integration patterns for MDM solutions, supporting data governance and business intelligence.
- Integration & Data Flow Management: Collaborate with IT and business teams to integrate MDM with enterprise systems (ERP, CRM, data warehouses).
- Governance & Compliance: Establish data governance policies, metadata management, and data lineage, and ensure compliance with GDPR, CCPA, and other regulatory requirements.
- Performance Optimization: Optimize data architecture, ensuring high availability, scalability, and security of MDM solutions.
Posted 2 months ago
2 - 7 years
4 - 9 Lacs
Karnataka
Work from Office
Experience: 7-15 yrs. Job Location: Pan India (except Gurgaon). Notice Period: Only immediate joiners.
JD: Dimensional Data Modeler who has worked on SAP source systems and is strong in SQL; PL/SQL is a plus. Hands-on with data modeling tools like Erwin Data Modeler, working with database engineers to create optimal physical data models of datasets, then creating and maintaining data maps and system-interrelationship diagrams for data domains and systems. Define and govern data modeling and design standards, tools, and best practices, and make recommendations for standardization and proper data usage.
Posted 2 months ago
5 - 9 years
12 - 16 Lacs
Delhi NCR, Mumbai, Bengaluru
Work from Office
Key Responsibilities:
- Develop APIs and microservices using Spring Boot.
- Implement integrations using Apigee for API management.
- Work with Pivotal Cloud Foundry (PCF) and manage deployments.
- Leverage both AWS and Azure for cloud integration tasks.
- Create and manage data models using tools like Erwin, Visio, or Lucidchart.
Required Skills:
- 5+ years of experience in integration development.
- Proficiency in Spring Boot and Apigee.
- Expertise in Pivotal Cloud Foundry (PCF).
- Strong knowledge of AWS and Azure.
- Experience with data modeling tools (Erwin, Visio, Lucidchart).
Location: Chennai, Hyderabad, Kolkata, Pune, Ahmedabad, Remote
Posted 2 months ago
5 - 8 years
20 - 22 Lacs
Hyderabad
Work from Office
- Experience in data modeling: designing, implementing, and maintaining data models to support data quality, performance, and scalability.
- Proven experience as a Data Modeler, working with data analysts, data architects, and business stakeholders to ensure data models are aligned with business requirements.
- Expertise in Azure, Databricks, data warehousing, and ERwin is required; SAP knowledge is a must.
- Strong knowledge of data modeling principles and techniques (e.g., ERD, UML).
- Proficiency with data modeling tools (e.g., ER/Studio, ERwin, IBM Data Architect).
- Experience with relational databases (e.g., SQL Server, Oracle, MySQL) and NoSQL databases (e.g., MongoDB, Cassandra).
- Solid understanding of data warehousing, ETL processes, and data integration.
- Familiarity with big data technologies (e.g., Hadoop, Spark) is a plus.
- Excellent analytical, problem-solving, and communication skills.
- Ability to work effectively in a collaborative, fast-paced environment.
Posted 2 months ago
3 - 5 years
5 - 7 Lacs
Chennai
Work from Office
Job Title: Data Modeler
Location: Chennai (work from office only)
Experience: 8-10 years
Primary Skills: data lake, SQL, data modeling, data migration, Azure, AWS
Domain Experience: Banking
Job Summary: As a Data Modeler, you will be responsible for designing, implementing, and optimizing data models that support the organization's information management and business intelligence needs. You will play a key role in transforming data into meaningful insights by designing data models that ensure efficient storage, retrieval, and integrity of data across various platforms.
Key Responsibilities:
- Design and develop data models to support the organization's data and business intelligence requirements.
- Collaborate with data architects, data engineers, and stakeholders to ensure data models align with business requirements.
- Optimize and tune data models for performance and scalability.
- Ensure data accuracy, consistency, and integrity by implementing data quality and governance standards.
- Participate in data migration and integration projects, ensuring seamless data flow across systems.
Qualifications: Bachelor's or master's degree in Computer Science, Information Technology, or a related field.
Technical Skills:
- Expertise in data modeling tools (e.g., ERwin, PowerDesigner, SQL Developer Data Modeler).
- Proficiency in SQL and database management systems (e.g., Oracle, SQL Server, Snowflake).
- Understanding of data warehousing and ETL processes.
- Familiarity with data governance and data quality management tools and practices.
Soft Skills:
- Strong analytical and problem-solving skills.
- Effective communication and collaboration skills.
- Attention to detail and commitment to data accuracy.
Good to Have:
- Experience with cloud platforms (e.g., AWS, Azure, GCP).
- Knowledge of big data technologies (e.g., Hadoop, Spark).
- Exposure to machine learning and advanced analytics.
Work Experience: 5+ years of experience in data modeling, data architecture, or related roles.
Compensation & Benefits: Competitive salary and annual performance-based bonuses. Comprehensive health and optional parental insurance. Retirement savings and tax savings plans.
Key Performance Indicators (KPIs):
- Accuracy and efficiency of data models
- Data model performance and query optimization
- Compliance with data governance standards
- Time to deliver data models for new requirements
- Stakeholder satisfaction and collaboration effectiveness
Key Result Areas (KRAs):
- Data Model Design: Develop accurate and efficient data models that support business processes.
- Data Quality: Ensure data models meet high standards for data accuracy and quality.
- Collaboration: Work effectively with cross-functional teams to gather requirements and deliver solutions.
- Performance Optimization: Continuously improve data model performance for faster data retrieval.
- Documentation and Compliance: Maintain comprehensive documentation and adhere to data governance policies.
Contact: hr@bigtappanalytics.com
Posted 2 months ago
7 - 12 years
25 - 35 Lacs
Chennai, Hyderabad, Noida
Work from Office
Experience designing and optimizing data models supporting automotive industry analytics and reporting. Work with SAP ECC as a primary data source, leveraging Databricks and Azure Cloud to design scalable and efficient data architectures.
Posted 3 months ago
5 - 8 years
12 - 22 Lacs
Chennai, Bengaluru, Gurgaon
Work from Office
Design & manage logical & physical data models using Erwin Data Modeler or a similar tool to ensure scalable & optimized database structures. Automate database provisioning, patching, & maintenance workflows using Infrastructure as Code (IaC).
Required Candidate profile: 5+ years of work experience in database designing/modelling. Experience with AWS, PostgreSQL, MS SQL, and AWS Aurora is a must. Database performance & query optimization.
Posted 3 months ago
9 - 14 years
18 - 25 Lacs
Delhi NCR, Mumbai, Bengaluru
Work from Office
Job Description: We are seeking a skilled Data Modeler to join our dynamic HR Data Team. The ideal candidate will have extensive experience in data modeling and database design, with proficiency in data modeling tools such as Erwin Data Modeler. This role requires a strong understanding of data architecture principles, excellent analytical and problem-solving abilities, and effective communication skills.
Key Responsibilities:
- Collect data from multiple documentation sources and systems.
- Create standardized data models for the HR Data Team using the Erwin Data Modeling tool.
- Develop Learn Data Models and Skills Data Models as MVP1, then expand to other HR functional areas.
- Harmonize data models with Alation to ensure consistency and synergy.
- Catalog and establish harmonization across data models to streamline processes.
Job Requirements:
- Proven experience in data modeling and database design.
- Proficiency in data modeling tools, particularly Erwin Data Modeler.
- Strong knowledge of SQL.
- Deep understanding of data architecture principles.
- Strong analytical and problem-solving skills.
- Exceptional attention to detail.
- Effective communication skills, capable of conveying complex technical information to both technical and non-technical stakeholders.
Preferred Qualifications:
- Experience harmonizing data models with tools like Alation.
- Familiarity with HR functional areas and related data models.
What We Offer:
- Competitive salary and benefits package.
- Opportunity to work with a talented team of professionals.
- A collaborative and innovative work environment.
- Career growth and development opportunities.
If you are passionate about data modeling, we would love to hear from you.
Location: Bengaluru, Chennai, Pune, Kolkata, Ahmedabad, Mumbai, Hyderabad, Delhi NCR
Experience: 9-17 yrs
Posted 3 months ago
10 - 15 years
20 - 30 Lacs
Hyderabad
Work from Office
Experience needed: 12+ years. Type: Full-Time. Mode: 100% WFO (Monday to Friday). Shift: General. Location: Hyderabad, India.
Job Summary: We are seeking an experienced Data Modeller Lead to design, develop, and manage data models that support business intelligence, analytics, and data governance initiatives. The ideal candidate will have a strong background in data architecture, data modeling (conceptual, logical, and physical), and database design. They will collaborate with stakeholders across the organization to ensure data consistency, quality, and compliance with industry standards.
Skills and Experience:
- 12+ years of experience in data analysis, business systems, or a similar role.
- Strong leadership qualities, project/task management skills, and analytical skills; able to work with distributed teams.
- Work with business stakeholders, SMEs, and technical teams to develop business & technical requirements.
- Experience analyzing complex data systems and mapping them to Azure Data Lake.
- Create data transformation and mapping documents in collaboration with the client.
- Good understanding of DB principles and hands-on experience with advanced SQL to perform complex analysis.
- Good understanding of data movement and optimization of ETL processing of vast datasets within specified SLAs.
- Expert knowledge of data modeling concepts.
- Experience with data modeling; generate metadata to support relational & non-relational database implementations.
- Experience building logical and physical data models.
- Documentation of data platforms for easy business consumption.
- Strong written and verbal communication skills, adjusting style and vocabulary to the audience's role/function and technical familiarity.
- Knowledge and understanding of financial services, preferably related to equipment financing.
Posted 3 months ago
12 - 17 years
14 - 19 Lacs
Hyderabad
Work from Office
Project Role: Data Architect
Project Role Description: Define the data requirements and structure for the application. Model and design the application data structure, storage, and integration.
Must-have skills: AWS Architecture
Good-to-have skills: Database Architecture
Minimum experience required: 12 years
Educational Qualification: 15 years of full-time education
Summary: As a Data Architect, you will be responsible for defining the data requirements and structure for the application. You will model and design the application data structure, storage, and integration. Your typical day will involve collaborating with teams to ensure data integrity, designing data models, and providing solutions to data-related problems across multiple teams.
Roles & Responsibilities:
- Act as an SME in data architecture.
- Collaborate with and manage the team to perform effectively.
- Take responsibility for team decisions and for ensuring data integrity.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems that apply across multiple teams.
- Design and develop data models for efficient storage and retrieval.
- Ensure data quality and integrity through data validation and cleansing.
- Implement data integration strategies and solutions.
Professional & Technical Skills:
- Must-have: Proficiency in AWS Architecture.
- Good-to-have: Experience with Database Architecture.
- Strong understanding of data architecture principles and best practices.
- Experience designing and implementing scalable data solutions on AWS.
- Knowledge of data modeling techniques and tools.
- Familiarity with data governance and security.
- Hands-on experience with data integration and ETL processes.
- Ability to analyze complex data requirements and design appropriate solutions.
Additional Information:
- The candidate should have a minimum of 12 years of experience in AWS Architecture.
- This position is based at our Hyderabad office.
Qualifications: 15 years of full-time education
Posted 3 months ago
7 - 12 years
30 - 45 Lacs
Pune, Bengaluru, Gurgaon
Work from Office
Responsibilities:
- Design conceptual, logical, and physical data models for a unified data platform.
- Define data product structures and ensure alignment with business needs.
- Collaborate with data engineers and architects to optimize database performance.
- Work on data governance, metadata management, and data lineage.
- Support testing and validation of data models with stakeholders.
Required Skills:
- Expertise in data modeling tools (Erwin, ER/Studio, or equivalent).
- Strong understanding of relational and NoSQL databases.
- Experience with Azure-based data solutions and ADLS Gen2.
- Knowledge of healthcare data and interoperability standards.
Posted 3 months ago