4 - 9 years
6 - 10 Lacs
Pune
Work from Office
As a Data Engineer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include:
Implementing and validating predictive models, as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques
Designing and implementing enterprise search applications such as Elasticsearch and Splunk to meet client requirements
Working in an Agile, collaborative environment, partnering with scientists, engineers, consultants and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours
Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results (a brief cleansing sketch follows this description)

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
4+ years of experience in data modelling and data architecture
Proficiency in data modelling tools such as ERwin and IBM InfoSphere Data Architect, and in database management systems
Familiarity with different data models such as relational, dimensional and NoSQL databases
Understanding of business processes and how data supports business decision making
Strong understanding of database design principles, data warehousing concepts, and data governance practices
Excellent analytical and problem-solving skills with a keen attention to detail

Preferred technical and professional experience:
Strong verbal and written communication skills, with the ability to explain complex concepts to non-technical stakeholders
Ability to work collaboratively in a team environment and manage multiple projects simultaneously
Knowledge of programming languages such as SQL
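Illustration only, not part of the posting above: a minimal sketch of the "cleanse and integrate data in an efficient and reusable manner" responsibility, written as a small reusable PySpark function. The file paths and column names are hypothetical.

```python
from pyspark.sql import SparkSession, DataFrame
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("cleanse-and-integrate-sketch").getOrCreate()

def cleanse(df: DataFrame, key: str) -> DataFrame:
    """Reusable cleansing step: trim string columns, drop rows missing the key, dedupe on the key."""
    string_cols = [name for name, dtype in df.dtypes if dtype == "string"]
    for col_name in string_cols:
        df = df.withColumn(col_name, F.trim(F.col(col_name)))
    return df.dropna(subset=[key]).dropDuplicates([key])

# Hypothetical raw extracts from two source systems.
crm = cleanse(spark.read.parquet("/data/raw/crm_customers"), key="customer_id")
erp = cleanse(spark.read.parquet("/data/raw/erp_customers"), key="customer_id")

# Integrate the two sources into a single conformed customer view.
customers = crm.join(erp.select("customer_id", "credit_limit"), on="customer_id", how="left")
customers.write.mode("overwrite").parquet("/data/curated/customers")
```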
Posted 1 month ago
2 - 5 years
7 - 11 Lacs
Mumbai
Work from Office
What you'll do
As a Data Engineer – Data Modeling, you will be responsible for:

Data Modeling & Schema Design
Developing conceptual, logical, and physical data models to support enterprise data requirements.
Designing schema structures for Apache Iceberg tables on Cloudera Data Platform (a hedged DDL sketch follows this description).
Collaborating with ETL developers and data engineers to optimize data models for efficient ingestion and retrieval.

Data Governance & Quality Assurance
Ensuring data accuracy, consistency, and integrity across data models.
Supporting data lineage and metadata management to enhance data traceability.
Implementing naming conventions, data definitions, and standardization in collaboration with governance teams.

ETL & Data Pipeline Support
Assisting in the migration of data from IIAS to Cloudera Data Lake by designing efficient data structures.
Working with Denodo for data virtualization, ensuring optimized data access across multiple sources.
Collaborating with teams using Talend Data Quality (DQ) tools to ensure high-quality data in the models.

Collaboration & Documentation
Working closely with business analysts, architects, and reporting teams to understand data requirements.
Maintaining data dictionaries, entity relationships, and technical documentation for data models.
Supporting data visualization and analytics teams by designing reporting-friendly data models.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
4-7 years of experience in data modeling, database design, and data engineering.
Hands-on experience with ERwin Data Modeler for creating and managing data models.
Strong knowledge of relational databases (PostgreSQL) and big data platforms (Cloudera, Apache Iceberg).
Proficiency in SQL and NoSQL database concepts.
Understanding of data governance, metadata management, and data security principles.
Familiarity with ETL processes and data pipeline optimization.
Strong analytical, problem-solving, and documentation skills.

Preferred technical and professional experience:
Experience working on Cloudera migration projects.
Exposure to Denodo for data virtualization and Talend DQ for data quality management.
Knowledge of Kafka, Airflow, and PySpark for data processing.
Familiarity with GitLab, Sonatype Nexus, and CheckMarx for CI/CD and security compliance.
Certifications in Data Modeling, Cloudera Data Engineering, or IBM Data Solutions.
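Purely as an illustration of the kind of schema work described above (not part of the role description), here is a minimal sketch of defining a partitioned Apache Iceberg table through Spark SQL. The catalog, table, and column names are hypothetical, and it assumes a SparkSession already configured with an Iceberg catalog on the Cloudera/Spark cluster.

```python
from pyspark.sql import SparkSession

# Assumes an existing Spark environment where an Iceberg catalog
# (here hypothetically named "lake") has already been configured.
spark = SparkSession.builder.appName("iceberg-model-sketch").getOrCreate()

# Physical model for a hypothetical customer entity, partitioned by load date
# so that incremental ingestion and time-bounded retrieval stay efficient.
spark.sql("""
    CREATE TABLE IF NOT EXISTS lake.crm.customer (
        customer_id   BIGINT,
        full_name     STRING,
        segment       STRING,
        created_at    TIMESTAMP,
        load_date     DATE
    )
    USING iceberg
    PARTITIONED BY (load_date)
""")

# A data-dictionary style check of the resulting physical structure.
spark.sql("DESCRIBE TABLE lake.crm.customer").show(truncate=False)
```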
Posted 1 month ago
2 - 5 years
7 - 11 Lacs
Mumbai
Work from Office
Who you are
A highly skilled Data Engineer specializing in Data Modeling, with experience in designing, implementing, and optimizing data structures that support the storage, retrieval and processing of data for large-scale enterprise environments. You have expertise in conceptual, logical, and physical data modeling, along with a deep understanding of ETL processes, data lake architectures, and modern data platforms, and you are proficient in ERwin, PostgreSQL, Apache Iceberg, Cloudera Data Platform, and Denodo. Your ability to work with cross-functional teams, data architects, and business stakeholders ensures that data models align with enterprise data strategies and support analytical use cases effectively.

What you'll do
As a Data Engineer – Data Modeling, you will be responsible for:

Data Modeling & Architecture
Designing and developing conceptual, logical, and physical data models to support data migration from IIAS to Cloudera Data Lake.
Creating and optimizing data models for structured, semi-structured, and unstructured data stored in Apache Iceberg tables on Cloudera.
Establishing data lineage and metadata management for the new data platform.
Implementing Denodo-based data virtualization models to ensure seamless data access across multiple sources.

Data Governance & Quality
Ensuring data integrity, consistency, and compliance with regulatory standards, including banking/regulatory guidelines.
Implementing Talend Data Quality (DQ) solutions to maintain high data accuracy.
Defining and enforcing naming conventions, data definitions, and business rules for structured and semi-structured data.

ETL & Data Pipeline Optimization
Supporting the migration of ETL workflows from IBM DataStage to PySpark, ensuring models align with the new ingestion framework.
Collaborating with data engineers to define schema evolution strategies for Iceberg tables (a hedged sketch follows this description).
Ensuring performance optimization for large-scale data processing on Cloudera.

Collaboration & Documentation
Working closely with business analysts, architects, and developers to translate business requirements into scalable data models.
Documenting the data dictionary, entity relationships, and mapping specifications for data migration.
Supporting reporting and analytics teams (Qlik Sense/Tableau) by providing well-structured data models.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
4-7 years of experience in data modeling, database design, and data engineering.
Hands-on experience with ERwin Data Modeler for creating and managing data models.
Strong knowledge of relational databases (PostgreSQL) and big data platforms (Cloudera, Apache Iceberg).
Proficiency in SQL and NoSQL database concepts.
Understanding of data governance, metadata management, and data security principles.
Familiarity with ETL processes and data pipeline optimization.
Strong analytical, problem-solving, and documentation skills.

Preferred technical and professional experience:
Experience working on Cloudera migration projects.
Exposure to Denodo for data virtualization and Talend DQ for data quality management.
Knowledge of Kafka, Airflow, and PySpark for data processing.
Familiarity with GitLab, Sonatype Nexus, and CheckMarx for CI/CD and security compliance.
Certifications in Data Modeling, Cloudera Data Engineering, or IBM Data Solutions.
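Hedged illustration only: a minimal sketch of what an Iceberg schema-evolution step and a DataStage-to-PySpark style transformation might look like. The catalog, table, and column names are hypothetical, it assumes a SparkSession with an Iceberg catalog already configured, and it is not taken from the actual migration project.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("schema-evolution-sketch").getOrCreate()

# Iceberg supports in-place schema evolution; adding a column is a metadata-only
# change, so existing data files do not need to be rewritten.
spark.sql("ALTER TABLE lake.crm.customer ADD COLUMN kyc_status STRING")

# A transformation that might previously have lived in an ETL tool, re-expressed
# in PySpark: standardise names and derive a simple flag before loading.
src = spark.table("lake.staging.customer_raw")  # hypothetical staging table
clean = (
    src.withColumn("full_name", F.initcap(F.trim("full_name")))
       .withColumn("kyc_status", F.when(F.col("kyc_verified"), "VERIFIED")
                                   .otherwise("PENDING"))
)
clean.writeTo("lake.crm.customer").append()  # Iceberg write via the DataFrame V2 API
```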
Posted 1 month ago
2 - 5 years
7 - 11 Lacs
Mumbai
Work from Office
Who you are
A highly skilled Data Engineer specializing in Data Modeling, with experience in designing, implementing, and optimizing data structures that support the storage, retrieval and processing of data for large-scale enterprise environments. You have expertise in conceptual, logical, and physical data modeling, along with a deep understanding of ETL processes, data lake architectures, and modern data platforms, and you are proficient in ERwin, PostgreSQL, Apache Iceberg, Cloudera Data Platform, and Denodo. Your ability to work with cross-functional teams, data architects, and business stakeholders ensures that data models align with enterprise data strategies and support analytical use cases effectively.

What you'll do
As a Data Engineer – Data Modeling, you will be responsible for:

Data Modeling & Architecture
Designing and developing conceptual, logical, and physical data models to support data migration from IIAS to Cloudera Data Lake.
Creating and optimizing data models for structured, semi-structured, and unstructured data stored in Apache Iceberg tables on Cloudera.
Establishing data lineage and metadata management for the new data platform.
Implementing Denodo-based data virtualization models to ensure seamless data access across multiple sources.

Data Governance & Quality
Ensuring data integrity, consistency, and compliance with regulatory standards, including banking/regulatory guidelines.
Implementing Talend Data Quality (DQ) solutions to maintain high data accuracy.
Defining and enforcing naming conventions, data definitions, and business rules for structured and semi-structured data.

ETL & Data Pipeline Optimization
Supporting the migration of ETL workflows from IBM DataStage to PySpark, ensuring models align with the new ingestion framework.
Collaborating with data engineers to define schema evolution strategies for Iceberg tables.
Ensuring performance optimization for large-scale data processing on Cloudera.

Collaboration & Documentation
Working closely with business analysts, architects, and developers to translate business requirements into scalable data models.
Documenting the data dictionary, entity relationships, and mapping specifications for data migration.
Supporting reporting and analytics teams (Qlik Sense/Tableau) by providing well-structured data models.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
Experience in Cloudera migration projects in the banking or financial sector.
Knowledge of PySpark, Kafka, Airflow, and cloud-native data processing.
Experience with Talend DQ for data quality monitoring.

Preferred technical and professional experience:
Experience in Cloudera migration projects in the banking or financial sector.
Knowledge of PySpark, Kafka, Airflow, and cloud-native data processing.
Experience with Talend DQ for data quality monitoring.
Familiarity with graph databases (DGraph Enterprise) for data relationships.
Experience with GitLab, Sonatype Nexus, and CheckMarx for CI/CD and security compliance.
IBM, Cloudera, or AWS/GCP certifications in Data Engineering or Data Modeling.
Posted 1 month ago
4 - 6 years
12 - 15 Lacs
Hyderabad
Remote
Job Summary
We are looking for a Data Modeler to design and optimize data models supporting automotive industry analytics and reporting. The ideal candidate will work with SAP ECC as a primary data source, leveraging Databricks and Azure Cloud to design scalable and efficient data architectures. This role involves developing logical and physical data models, ensuring data consistency, and collaborating with data engineers, business analysts, and domain experts to enable high-quality analytics solutions.

Key Responsibilities:
1. Data Modeling & Architecture: Design and maintain conceptual, logical, and physical data models for structured and unstructured data.
2. SAP ECC Data Integration: Define data structures for extracting, transforming, and integrating SAP ECC data into Azure Databricks.
3. Automotive Domain Modeling: Develop and optimize industry-specific data models covering customer, vehicle, material, and location data.
4. Databricks & Delta Lake Optimization: Design efficient data models for Delta Lake storage and Databricks processing (a hedged sketch follows this listing).
5. Performance Tuning: Optimize data structures, indexing, and partitioning strategies for performance and scalability.
6. Metadata & Data Governance: Implement data standards, data lineage tracking, and governance frameworks to maintain data integrity and compliance.
7. Collaboration: Work closely with business stakeholders, data engineers, and data analysts to align models with business needs.
8. Documentation: Create and maintain data dictionaries, entity-relationship diagrams (ERDs), and transformation logic documentation.

Skills & Qualifications:
Data Modeling Expertise: Strong experience in dimensional modeling, 3NF, and hybrid modeling approaches.
Automotive Industry Knowledge: Understanding of customer, vehicle, material, and dealership data models.
SAP ECC Data Structures: Hands-on experience with SAP ECC tables, business objects, and extraction processes.
Azure & Databricks Proficiency: Experience working with Azure Data Lake, Databricks, and Delta Lake for large-scale data processing.
SQL & Database Management: Strong skills in SQL, T-SQL, or PL/SQL, with a focus on query optimization and indexing.
ETL & Data Integration: Experience collaborating with data engineering teams on data transformation and ingestion processes.
Data Governance & Quality: Understanding of data governance principles, lineage tracking, and master data management (MDM).
Strong Documentation Skills: Ability to create ER diagrams, data dictionaries, and transformation rules.

Preferred Qualifications:
Experience with data modeling tools such as Erwin, Lucidchart, or DBT.
Knowledge of Databricks Unity Catalog and Azure Synapse Analytics.
Familiarity with Kafka/Event Hub for real-time data streaming.
Exposure to Power BI/Tableau for data visualization and reporting.
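The following is a hedged, minimal sketch of the Delta Lake modeling step named in item 4: writing a partitioned Delta table and compacting it for query performance. Table paths, names, and columns are hypothetical; it assumes a Databricks (or Delta-enabled Spark) environment, and OPTIMIZE ... ZORDER BY as shown is a Databricks feature.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("delta-model-sketch").getOrCreate()

# Hypothetical vehicle fact data extracted upstream (for example from an SAP ECC staging layer).
vehicle_df = spark.table("staging.vehicle_movements")

# Physical design choice: partition by event date so partition pruning works
# for time-bounded reporting queries.
(vehicle_df.write
    .format("delta")
    .mode("overwrite")
    .partitionBy("event_date")
    .saveAsTable("analytics.fact_vehicle_movements"))

# Databricks-specific maintenance: compact small files and co-locate rows by a
# frequently filtered key to speed up point lookups.
spark.sql("OPTIMIZE analytics.fact_vehicle_movements ZORDER BY (vehicle_id)")
```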
Posted 1 month ago
12 - 22 years
35 - 60 Lacs
Chennai
Hybrid
Warm Greetings from SP Staffing Services Private Limited!!

We have an urgent opening with our CMMI Level 5 client for the below position. Please send your updated profile if you are interested.

Relevant Experience: 8 - 24 Yrs
Location: Pan India

Job Description:
The Data Modeler will be responsible for the design, development, and maintenance of data models and standards for enterprise data platforms.
Build dimensional data models applying best practices and providing business insights.
Build data warehouses and data marts (on Cloud) while performing data profiling and quality analysis.
Identify business needs and translate business requirements into conceptual, logical, physical and semantic models, covering multi-dimensional (star, snowflake), normalized/denormalized, and Data Vault 2.0 models for the project. Knowledge of Snowflake and dbt is an added advantage.
Create and maintain the Source to Target Data Mapping document for the project, including documentation of all entities, attributes, data relationships, primary and foreign key structures, allowed values, codes, business rules, glossary terms, etc.
Develop best practices for standard naming conventions and coding practices to ensure consistency of data models.
Gather and publish data dictionaries: maintain data models, capture data models from existing databases, and record descriptive information.
Work with the development team to implement data strategies, build data flows and develop conceptual data models.
Create logical and physical data models using best practices to ensure high data quality and reduced redundancy.
Optimize and update logical and physical data models to support new and existing projects.
Data profiling, business domain modeling, logical data modeling, and physical dimensional data modeling and design.
Data design and performance optimization for large Data Warehouse solutions.
Understanding data: profiling and analysis of metadata (formats, definitions, valid values, boundaries) and relationships/usage.
Create relational and dimensional structures for large (multi-terabyte) operational, analytical, warehouse and BI systems (a dimensional-model sketch follows this listing).
Should have good verbal and written communication skills.

If interested, please forward your updated resume to sankarspstaffings@gmail.com / Sankar@spstaffing.in

With Regards,
Sankar G
Sr. Executive - IT Recruitment
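Not part of the posting: a minimal, hypothetical sketch of the kind of dimensional (star) structure the description refers to, expressed as Spark SQL DDL from Python so it matches the other sketches on this page. Table and column names are invented, and warehouse-specific DDL (Snowflake or another cloud DW) would differ in detail.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("star-schema-sketch").getOrCreate()
spark.sql("CREATE DATABASE IF NOT EXISTS dw")

# A conformed dimension: descriptive attributes keyed by a surrogate key,
# with effective dates to support slowly changing history.
spark.sql("""
    CREATE TABLE IF NOT EXISTS dw.dim_customer (
        customer_key   BIGINT,
        customer_id    STRING,
        customer_name  STRING,
        segment        STRING,
        effective_from DATE,
        effective_to   DATE,
        is_current     BOOLEAN
    ) USING parquet
""")

# The fact table holds measures plus foreign keys to the dimensions,
# the classic star-schema shape used for retrospective analysis.
spark.sql("""
    CREATE TABLE IF NOT EXISTS dw.fact_sales (
        date_key      INT,
        customer_key  BIGINT,
        product_key   BIGINT,
        quantity      INT,
        net_amount    DECIMAL(18, 2)
    ) USING parquet
""")
```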
Posted 1 month ago
12 - 22 years
35 - 60 Lacs
Kolkata
Hybrid
Warm Greetings from SP Staffing Services Private Limited!!

We have an urgent opening with our CMMI Level 5 client for the below position. Please send your updated profile if you are interested.

Relevant Experience: 8 - 24 Yrs
Location: Pan India

Job Description:
The Data Modeler will be responsible for the design, development, and maintenance of data models and standards for enterprise data platforms.
Build dimensional data models applying best practices and providing business insights.
Build data warehouses and data marts (on Cloud) while performing data profiling and quality analysis.
Identify business needs and translate business requirements into conceptual, logical, physical and semantic models, covering multi-dimensional (star, snowflake), normalized/denormalized, and Data Vault 2.0 models for the project. Knowledge of Snowflake and dbt is an added advantage.
Create and maintain the Source to Target Data Mapping document for the project, including documentation of all entities, attributes, data relationships, primary and foreign key structures, allowed values, codes, business rules, glossary terms, etc.
Develop best practices for standard naming conventions and coding practices to ensure consistency of data models.
Gather and publish data dictionaries: maintain data models, capture data models from existing databases, and record descriptive information.
Work with the development team to implement data strategies, build data flows and develop conceptual data models.
Create logical and physical data models using best practices to ensure high data quality and reduced redundancy.
Optimize and update logical and physical data models to support new and existing projects.
Data profiling, business domain modeling, logical data modeling, and physical dimensional data modeling and design.
Data design and performance optimization for large Data Warehouse solutions.
Understanding data: profiling and analysis of metadata (formats, definitions, valid values, boundaries) and relationships/usage.
Create relational and dimensional structures for large (multi-terabyte) operational, analytical, warehouse and BI systems.
Should have good verbal and written communication skills.

If interested, please forward your updated resume to sankarspstaffings@gmail.com / Sankar@spstaffing.in

With Regards,
Sankar G
Sr. Executive - IT Recruitment
Posted 1 month ago
12 - 22 years
35 - 60 Lacs
Noida
Hybrid
Warm Greetings from SP Staffing Services Private Limited!!

We have an urgent opening with our CMMI Level 5 client for the below position. Please send your updated profile if you are interested.

Relevant Experience: 8 - 24 Yrs
Location: Pan India

Job Description:
The Data Modeler will be responsible for the design, development, and maintenance of data models and standards for enterprise data platforms.
Build dimensional data models applying best practices and providing business insights.
Build data warehouses and data marts (on Cloud) while performing data profiling and quality analysis.
Identify business needs and translate business requirements into conceptual, logical, physical and semantic models, covering multi-dimensional (star, snowflake), normalized/denormalized, and Data Vault 2.0 models for the project. Knowledge of Snowflake and dbt is an added advantage.
Create and maintain the Source to Target Data Mapping document for the project, including documentation of all entities, attributes, data relationships, primary and foreign key structures, allowed values, codes, business rules, glossary terms, etc.
Develop best practices for standard naming conventions and coding practices to ensure consistency of data models.
Gather and publish data dictionaries: maintain data models, capture data models from existing databases, and record descriptive information.
Work with the development team to implement data strategies, build data flows and develop conceptual data models.
Create logical and physical data models using best practices to ensure high data quality and reduced redundancy.
Optimize and update logical and physical data models to support new and existing projects.
Data profiling, business domain modeling, logical data modeling, and physical dimensional data modeling and design.
Data design and performance optimization for large Data Warehouse solutions.
Understanding data: profiling and analysis of metadata (formats, definitions, valid values, boundaries) and relationships/usage.
Create relational and dimensional structures for large (multi-terabyte) operational, analytical, warehouse and BI systems.
Should have good verbal and written communication skills.

If interested, please forward your updated resume to sankarspstaffings@gmail.com / Sankar@spstaffing.in

With Regards,
Sankar G
Sr. Executive - IT Recruitment
Posted 1 month ago
5 - 10 years
14 - 24 Lacs
Bengaluru
Work from Office
Design and manage data models using Erwin Data Modeler. Database performance and query optimization; HA/DR; implement AWS Aurora; database automation; integrate real-time monitoring with AWS CloudWatch, Prometheus, and Grafana dashboards (a brief monitoring sketch follows).

Required Candidate profile: 6-8 years of work experience as a database administrator; Bachelor's/Master's in CS; expertise in PostgreSQL, Oracle DB, AWS RDS, AWS Aurora, AWS Multi-AZ, Prometheus, Grafana.
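As a hedged illustration of the monitoring-integration part of this role (not an instruction from the employer), the sketch below pulls a standard RDS/Aurora CPU metric from AWS CloudWatch with boto3. The instance identifier is hypothetical, and AWS credentials and region are assumed to be configured in the environment.

```python
from datetime import datetime, timedelta, timezone
import boto3

# Assumes AWS credentials and region are already configured (env vars, profile, or IAM role).
cloudwatch = boto3.client("cloudwatch")

end = datetime.now(timezone.utc)
start = end - timedelta(hours=1)

# Average CPU utilisation over the last hour for a hypothetical Aurora instance,
# the kind of signal that would also feed Prometheus/Grafana dashboards.
resp = cloudwatch.get_metric_statistics(
    Namespace="AWS/RDS",
    MetricName="CPUUtilization",
    Dimensions=[{"Name": "DBInstanceIdentifier", "Value": "my-aurora-writer"}],
    StartTime=start,
    EndTime=end,
    Period=300,            # 5-minute datapoints
    Statistics=["Average"],
)

for point in sorted(resp["Datapoints"], key=lambda p: p["Timestamp"]):
    print(point["Timestamp"], round(point["Average"], 2))
```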
Posted 1 month ago
2 - 5 years
6 - 10 Lacs
Pune
Work from Office
As a Data Engineer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include:
Implementing and validating predictive models, as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques
Designing and implementing enterprise search applications such as Elasticsearch and Splunk to meet client requirements
Working in an Agile, collaborative environment, partnering with scientists, engineers, consultants and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours
Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
4+ years of experience in data modelling and data architecture
Proficiency in data modelling tools such as ERwin and IBM InfoSphere Data Architect, and in database management systems
Familiarity with different data models such as relational, dimensional and NoSQL databases
Understanding of business processes and how data supports business decision making
Strong understanding of database design principles, data warehousing concepts, and data governance practices
Excellent analytical and problem-solving skills with a keen attention to detail

Preferred technical and professional experience:
Strong verbal and written communication skills, with the ability to explain complex concepts to non-technical stakeholders
Ability to work collaboratively in a team environment and manage multiple projects simultaneously
Knowledge of programming languages such as SQL
Posted 1 month ago
6 - 10 years
20 - 32 Lacs
Pune, Bengaluru
Hybrid
Job role & responsibilities:
Understanding operational needs by collaborating with specialized teams.
Supporting key business operations. This involves supporting architecture design and improvements, ensuring data integrity, building data models, and designing and implementing agile, scalable, and cost-efficient solutions.
Leading a team of developers, running sprint planning and execution to ensure timely deliveries.

Technical skills, qualification and experience required:
Proficient in data modelling, with 6-8 years of experience in data modelling.
Experience with data modelling tools (ERwin), including building ER diagrams; hands-on experience with the ERwin / Visio tools.
Hands-on expertise in entity-relationship, dimensional and NoSQL modelling.
Familiarity with manipulating datasets using Python.
Exposure to Azure cloud services (Azure Data Factory, Azure DevOps and Databricks).
Exposure to UML tools like ERwin/Visio.
Familiarity with tools such as Azure DevOps, Jira and GitHub.
Analytical approaches using IE or other common notations.
Strong hands-on experience in SQL scripting.
Bachelor's/Master's degree in Computer Science or a related field.
Experience leading agile scrum, sprint planning and review sessions.
Good communication and interpersonal skills; able to coordinate between business stakeholders and engineers.
Strong results-orientation and time management.
A true team player who is comfortable working in a global team.
Ability to establish relationships with stakeholders quickly in order to collaborate on use cases.
Autonomy, curiosity and innovation capability.
Comfortable working in a multidisciplinary team within a fast-paced environment.
* Immediate joiners will be preferred.
Posted 2 months ago
5 - 8 years
25 - 35 Lacs
Chennai, Bengaluru, Hyderabad
Work from Office
At least 2 projects delivered as a Data Modeler.
Discover, analyze, and scope data requirements; create high-level process models to represent processes for the area under analysis.
Analyze existing business system data, data profiling, and source analysis.
Create detailed mapping documents covering source-to-target mapping, transformation logic and harmonization.
Design data architecture and strategy; employ entity-relationship modeling with the intent of producing a design best suited for retrospective analysis of data.
Determine the right set of facts and surrounding dimensions suited to perform the desired type of analysis or queries against the data (a query sketch follows this listing).
Arrive at data solution diagrams, including data flow diagrams, use case models, etc.
Skilled in analyzing existing business system data, data profiling and source analysis.
Expert in using SQL.
Expert in understanding business needs and translating those needs into an effective model.
Expert in entity-relationship modeling, with specific expertise in star/snowflake schema modeling.
Experience implementing the IBM UDMH model (atomic or dimensional).
Exposure to one or more enterprise-level modeling tools (IDA, ERwin, ER/Studio, etc.).
Healthcare domain knowledge is a must; exposure to HL7, FHIR, OMOP will be a plus.
Advanced understanding of data normalization and denormalization techniques.
Expert in metadata management and tools.
Advanced experience in Master Data Management/systems and Reference Data Management/systems.
Advanced experience with modeling process creation and implementation.
Location: Remote, Kolkata, Delhi, NCR, Ahmedabad, Mumbai, Pune, Noida.
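A hedged, generic illustration of the "right set of facts and surrounding dimensions" idea, not tied to any client data: a retrospective query that rolls a hypothetical healthcare fact table up along two dimensions. Table and column names are invented and assumed to exist in the Spark catalog.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("dimensional-query-sketch").getOrCreate()

# Monthly encounter counts and billed amounts by provider specialty: the fact
# grain (one row per encounter) is rolled up along the date and provider dimensions.
result = spark.sql("""
    SELECT d.year_month,
           p.specialty,
           COUNT(*)             AS encounter_count,
           SUM(f.billed_amount) AS total_billed
    FROM   dw.fact_encounter  f
    JOIN   dw.dim_date        d ON f.date_key = d.date_key
    JOIN   dw.dim_provider    p ON f.provider_key = p.provider_key
    GROUP  BY d.year_month, p.specialty
    ORDER  BY d.year_month, p.specialty
""")
result.show()
```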
Posted 2 months ago
2 - 5 years
4 - 7 Lacs
Kochi
Work from Office
Translates business needs into a data model, providing expertise on data modeling tools and techniques for designing data models for applications and related systems. Skills include logical and physical data modeling, and knowledge of ERwin, MDM, and/or ETL. Data modeling is a process used to define and analyze the data requirements needed to support the business processes within the scope of corresponding information systems in organizations.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
Translates business needs into a data model, providing expertise on data modeling tools and techniques for designing data models for applications and related systems. Skills include logical and physical data modeling, and knowledge of ERwin, MDM, and/or ETL. Data modeling is a process used to define and analyze the data requirements needed to support the business processes within the scope of corresponding information systems in organizations; the process therefore involves professional data modelers working closely with business stakeholders, as well as potential users of the information system. There are three different types of data models produced while progressing from requirements to the actual database to be used for the information system.

Preferred technical and professional experience:
Translates business needs into a data model, providing expertise on data modeling tools and techniques for designing data models for applications and related systems. Skills include logical and physical data modeling, and knowledge of ERwin, MDM, and/or ETL. Data modeling is a process used to define and analyze the data requirements needed to support the business processes within the scope of corresponding information systems in organizations.
Posted 2 months ago
4 - 9 years
6 - 10 Lacs
Pune
Work from Office
As a Data Engineer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include:
Implementing and validating predictive models, as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques
Designing and implementing enterprise search applications such as Elasticsearch and Splunk to meet client requirements
Working in an Agile, collaborative environment, partnering with scientists, engineers, consultants and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours
Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
4+ years of experience in data modelling and data architecture
Proficiency in data modelling tools such as ERwin and IBM InfoSphere Data Architect, and in database management systems
Familiarity with different data models such as relational, dimensional and NoSQL databases
Understanding of business processes and how data supports business decision making
Strong understanding of database design principles, data warehousing concepts, and data governance practices
Excellent analytical and problem-solving skills with a keen attention to detail

Preferred technical and professional experience:
Strong verbal and written communication skills, with the ability to explain complex concepts to non-technical stakeholders
Ability to work collaboratively in a team environment and manage multiple projects simultaneously
Knowledge of programming languages such as SQL
Posted 2 months ago
5 - 8 years
30 - 35 Lacs
Chennai, Bengaluru, Hyderabad
Work from Office
At least 2 projects delivered as a Data Modeler.
Discover, analyze, and scope data requirements; create high-level process models to represent processes for the area under analysis.
Analyze existing business system data, data profiling, and source analysis.
Create detailed mapping documents covering source-to-target mapping, transformation logic and harmonization.
Design data architecture and strategy; employ entity-relationship modeling with the intent of producing a design best suited for retrospective analysis of data.
Determine the right set of facts and surrounding dimensions suited to perform the desired type of analysis or queries against the data.
Arrive at data solution diagrams, including data flow diagrams, use case models, etc.
Skilled in analyzing existing business system data, data profiling and source analysis.
Expert in using SQL.
Expert in understanding business needs and translating those needs into an effective model.
Expert in entity-relationship modeling, with specific expertise in star/snowflake schema modeling.
Experience implementing the IBM UDMH model (atomic or dimensional).
Exposure to one or more enterprise-level modeling tools (IDA, ERwin, ER/Studio, etc.).
Healthcare domain knowledge is a must; exposure to HL7, FHIR, OMOP will be a plus.
Advanced understanding of data normalization and denormalization techniques.
Expert in metadata management and tools.
Advanced experience in Master Data Management/systems and Reference Data Management/systems.
Advanced experience with modeling process creation and implementation.
Location: Chennai, Hyderabad, Bengaluru, Noida, Gurugram, Pune, Ahmedabad, Mumbai, Delhi, NCR, Kolkata, Remote / work from home.
Posted 2 months ago
16 - 22 years
40 - 55 Lacs
Chennai, Bengaluru, Hyderabad
Hybrid
Role & responsibilities
• Minimum 15 years of experience
• Deep understanding of data architecture principles, data modelling, data integration, data governance, and data management technologies.
• Experience in data strategies and developing logical and physical data models on RDBMS, NoSQL, and cloud-native databases.
• Decent experience in one or more RDBMS systems (such as Oracle, DB2, SQL Server)
• Good understanding of Relational, Dimensional and Data Vault modelling
• Experience in implementing 2 or more data models in a database with data security and access controls.
• Good experience in OLTP and OLAP systems
• Excellent data analysis skills with demonstrable knowledge of standard datasets and sources.
• Good experience with one or more cloud DWs (e.g. Snowflake, Redshift, Synapse)
• Experience on one or more cloud platforms (e.g. AWS, Azure, GCP)
• Understanding of DevOps processes
• Hands-on experience in one or more data modelling tools
• Good understanding of one or more ETL tools and data ingestion frameworks
• Understanding of Data Quality and Data Governance
• Good understanding of NoSQL databases and modelling techniques
• Good understanding of one or more business domains
• Understanding of the Big Data ecosystem
• Understanding of industry data models
• Hands-on experience in Python
• Experience in leading large and complex teams
• Good understanding of agile methodology
• Extensive expertise in leading data transformation initiatives, driving cultural change, and promoting a data-driven mindset across the organization.
• Excellent communication skills
• Understand the business requirements and translate them into conceptual, logical and physical data models.
• Work as a principal advisor on data architecture across various data requirements: aggregation, data lake, data models, data warehouse, etc.
• Lead cross-functional teams, define data strategies, and leverage the latest technologies in data handling.
• Define and govern data architecture principles, standards, and best practices to ensure consistency, scalability, and security of data assets across projects.
• Suggest the best modelling approach to the client based on their requirements and target architecture.
• Analyze and understand the datasets and guide the team in creating Source to Target Mapping and data dictionaries, capturing all relevant details (a data-dictionary sketch follows this listing).
• Profile the datasets to generate relevant insights.
• Optimize the data models and work with the data engineers to define the ingestion logic, ingestion frequency and data consumption patterns.
• Establish data governance practices, including data quality, metadata management, and data lineage, to ensure data accuracy, reliability, and compliance.
• Drive automation in modelling activities.
• Collaborate with business stakeholders, data owners, business analysts and architects to design and develop the next-generation data platform.
• Closely monitor project progress and provide regular updates to the leadership teams on milestones, impediments, etc.
• Guide/mentor team members, and review artifacts.
• Contribute to the overall data strategy and roadmaps.
• Propose and execute technical assessments and proofs of concept to promote innovation in the data space.
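Purely illustrative and not from the posting: one possible way to gather a simple data dictionary programmatically from a Spark catalog, assuming the databases and tables already exist in the environment; the database name used here is hypothetical.

```python
import csv
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("data-dictionary-sketch").getOrCreate()

rows = []
database = "dw"  # hypothetical database to document
for table in spark.catalog.listTables(database):
    for col in spark.catalog.listColumns(table.name, database):
        rows.append({
            "table": table.name,
            "column": col.name,
            "data_type": col.dataType,
            "nullable": col.nullable,
            "description": col.description or "",
        })

# Publish as a flat data dictionary that analysts can review and enrich.
with open("data_dictionary.csv", "w", newline="") as fh:
    writer = csv.DictWriter(
        fh, fieldnames=["table", "column", "data_type", "nullable", "description"]
    )
    writer.writeheader()
    writer.writerows(rows)
```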
Posted 2 months ago
7 - 12 years
20 - 30 Lacs
Pune, Bengaluru, Hyderabad
Hybrid
Job Description: (Senior) Data Modeler

The data modeler designs, implements, and documents data architectures and data models for solutions, which include the use of application, relational, dimensional, and NoSQL databases. These solutions support enterprise information management, business intelligence, operations, machine learning, data science, and other business interests.

The successful candidate will:
1. Be responsible for the development of the conceptual, logical, and physical data models, and for oversight of the implementation of RDBMS, operational data stores (ODS), application databases, data marts, and data lakes on target platforms (SQL/NoSQL/cloud/on-prem).
2. Oversee and govern the expansion of existing data architecture and the optimization of data query performance via best practices. The candidate must be able to work independently and collaboratively.

Responsibilities
Implement business and IT data requirements through new data strategies and designs across all data platforms (application, relational, dimensional, and NoSQL) and data tools (reporting, visualization, analytics, operations and machine learning).
Work with business and application/solution teams to implement data strategies, build data flows, and develop conceptual/logical/physical data models.
Define and govern data modelling and design standards, tools, best practices, and related development for enterprise data models.
Identify the architecture, infrastructure, and interfaces to data sources, tools supporting automated data loads, security concerns, analytic models, and data visualisations.
Hands-on modelling, design, configuration, installation, performance tuning, and sandbox POCs.
Work proactively and independently to address project requirements and articulate issues/challenges to reduce project delivery risks.

Skills
Strong communication skills and good spoken English.
3+ years of hands-on relational, dimensional, and/or analytic experience (using RDBMS, dimensional and NoSQL data platform technologies, and ETL and data ingestion protocols). Experience with MongoDB would be an advantage (a document-model sketch follows this description).
Good knowledge of metadata management, data modelling, and related tools (Erwin, Visual Paradigm, or others) required.
Experience with data warehouse, data lake, and enterprise big data platforms in multi-data-centre contexts is desired.
Experience in team management, communication, and presentation.
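A hedged aside, not from the job post: since the role spans relational and NoSQL platforms and mentions MongoDB as an advantage, here is a tiny sketch of a denormalized document model via pymongo. The connection string, collection, and fields are hypothetical and assume a locally reachable MongoDB instance.

```python
from pymongo import MongoClient

# Hypothetical local instance; in practice the URI would come from configuration.
client = MongoClient("mongodb://localhost:27017")
db = client["sales"]

# Document model: the order embeds its line items (denormalized), a deliberate
# modeling choice that trades update flexibility for read locality.
db.orders.insert_one({
    "order_id": "SO-1001",
    "customer": {"customer_id": "C-42", "name": "Acme Ltd"},
    "lines": [
        {"sku": "P-100", "qty": 2, "net_amount": 59.80},
        {"sku": "P-205", "qty": 1, "net_amount": 19.99},
    ],
    "status": "OPEN",
})

# Secondary index to support a common access path in the model.
db.orders.create_index([("customer.customer_id", 1), ("status", 1)])
```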
Posted 2 months ago
7 - 12 years
9 - 14 Lacs
Gurgaon
Work from Office
Project Role: Data Modeler
Project Role Description: Work with key business representatives, data owners, end users, application designers and data architects to model current and new data.
Must have skills: Data Modeling Techniques and Methodologies
Good to have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years of full-time education

Summary:
As a Data Modeler, you will collaborate with key stakeholders, including business representatives, data owners, and architects, to model current and new data, contributing to data architecture decisions and solutions.

Roles & Responsibilities:
- Expected to be an SME; collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Lead data modeling efforts for various projects.
- Develop and maintain data models for databases.
- Ensure data integrity and quality in all data modeling activities.
- Provide guidance and support to junior data modelers.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Data Modeling Techniques and Methodologies.
- Strong understanding of database management systems.
- Experience with data modeling tools such as ERwin or PowerDesigner.
- Knowledge of data normalization and denormalization techniques.
- Hands-on experience with SQL and data querying.
- Good To Have Skills: Experience with data governance practices.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Data Modeling Techniques and Methodologies.
- This position is based at our Gurugram office.
- 15 years of full-time education is required.

Qualifications: 15 years of full-time education
Posted 2 months ago
5 - 10 years
20 - 25 Lacs
Pune, Hyderabad
Hybrid
Job Title: Databricks Data Modeler
Location: Pune / Hyderabad

Job Summary
We are looking for a Data Modeler to design and optimize data models supporting automotive industry analytics and reporting. The ideal candidate will work with SAP ECC as a primary data source, leveraging Databricks and Azure Cloud to design scalable and efficient data architectures. This role involves developing logical and physical data models, ensuring data consistency, and collaborating with data engineers, business analysts, and domain experts to enable high-quality analytics solutions.

Key Responsibilities:
Data Modeling & Architecture: Design and maintain conceptual, logical, and physical data models for structured and unstructured data.
SAP ECC Data Integration: Define data structures for extracting, transforming, and integrating SAP ECC data into Azure Databricks.
Automotive Domain Modeling: Develop and optimize industry-specific data models covering customer, vehicle, material, and location data.
Databricks & Delta Lake Optimization: Design efficient data models for Delta Lake storage and Databricks processing.
Performance Tuning: Optimize data structures, indexing, and partitioning strategies for performance and scalability.
Metadata & Data Governance: Implement data standards, data lineage tracking, and governance frameworks to maintain data integrity and compliance.
Collaboration: Work closely with business stakeholders, data engineers, and data analysts to align models with business needs.
Documentation: Create and maintain data dictionaries, entity-relationship diagrams (ERDs), and transformation logic documentation.

Skills & Qualifications:
Data Modeling Expertise: Strong experience in dimensional modeling, 3NF, and hybrid modeling approaches.
Automotive Industry Knowledge: Understanding of customer, vehicle, material, and dealership data models.
SAP ECC Data Structures: Hands-on experience with SAP ECC tables, business objects, and extraction processes.
Azure & Databricks Proficiency: Experience working with Azure Data Lake, Databricks, and Delta Lake for large-scale data processing.
SQL & Database Management: Strong skills in SQL, T-SQL, or PL/SQL, with a focus on query optimization and indexing.
ETL & Data Integration: Experience collaborating with data engineering teams on data transformation and ingestion processes.
Data Governance & Quality: Understanding of data governance principles, lineage tracking, and master data management (MDM).
Strong Documentation Skills: Ability to create ER diagrams, data dictionaries, and transformation rules.

Preferred Qualifications:
Experience with data modeling tools such as Erwin, Lucidchart, or DBT.
Knowledge of Databricks Unity Catalog and Azure Synapse Analytics.
Familiarity with Kafka/Event Hub for real-time data streaming.
Exposure to Power BI/Tableau for data visualization and reporting.
Posted 2 months ago
8 - 13 years
12 - 17 Lacs
Mysore
Hybrid
Data Modeler: ERwin, Informatica, SQL. Should be aware of data modeling concepts. Should have exposure to the ERwin tool. Good to have knowledge of SQL and Unix scripts.
Posted 2 months ago
7 - 12 years
10 - 18 Lacs
Bengaluru
Work from Office
Partner with business analysts, product managers, and business users to gather requirements and translate them into scalable data solutions.
Posted 2 months ago
12 - 15 years
25 - 35 Lacs
Chennai, Pune, Bengaluru
Hybrid
Good to have hands-on experience with data modelling tools such as Erwin.
3. Should have good knowledge of the provider domain.
4. Should have strong experience with Snowflake and must have executed development and migration projects involving Snowflake.
Posted 2 months ago
5 - 8 years
15 - 27 Lacs
Jaipur
Hybrid
As a Senior Consultant in our Cloud Engineering Team, you'll build and nurture positive working relationships with teams and clients, with the intention of exceeding client expectations.

Your main responsibilities will be:
Work on the design and development of new databases and data model changes.
Work on database deployments and upgrade procedures.
Develop and document best practices for database installation, administration, troubleshooting and performance tuning.
Collaborate with Change Management, Quality Assurance, Technical Documentation and Project Management for new releases and hotfixes.
Provide technical support: triaging issues, identifying root causes, and deploying fixes and/or workarounds.
Peer review and verify code for other database developers.
Partner with developers to review data model changes and provide recommendations based on best practices.
Improve existing processes and provide innovative ways to automate routines.
Maintain and update the database dictionary and release artifacts with each release.

Desired qualifications
5+ years' experience in data modeling and ERD using ERwin or other data modeling tools.
5+ years' experience working with high-volume OLTP and OLAP systems.
Excellent T-SQL and PL/SQL script writing and debugging skills (an indexing sketch follows this description).
Experience with database upgrade tools, deployments and automation.
Experience troubleshooting issues with database performance and functionality.
Ability to work with source control tools like Git and SVN, and project tracking with JIRA.
Ability to engage with emotional intelligence as required to avoid conflicts.
High-energy professional who thrives working in a fast-paced environment.
Strong team player with customer focus and attitude.
Bachelor's degree in Engineering or Computer Science.

Location and way of working
Work location: Jaipur
This profile does not involve extensive travel for work.
Hybrid is our default way of working. Each domain has customised the hybrid approach to their unique needs.
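An illustrative, self-contained sketch of the performance-tuning theme above (checking whether a query uses an index), using Python's built-in sqlite3 so it runs anywhere; the actual role targets T-SQL/PL/SQL engines, where the equivalent would be the engine's own execution-plan tooling. Table and data are invented.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# A tiny OLTP-style table with some synthetic rows.
cur.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, status TEXT)")
cur.executemany(
    "INSERT INTO orders (customer_id, status) VALUES (?, ?)",
    [(i % 500, "OPEN" if i % 3 else "CLOSED") for i in range(10_000)],
)

query = "SELECT COUNT(*) FROM orders WHERE customer_id = ? AND status = 'OPEN'"

# Plan before indexing: a full table scan.
print(cur.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchall())

# Composite index matching the query's filter columns.
cur.execute("CREATE INDEX ix_orders_customer_status ON orders (customer_id, status)")

# Plan after indexing: the optimizer should now use the index.
print(cur.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchall())
conn.close()
```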
Posted 2 months ago
6 - 11 years
15 - 20 Lacs
Mumbai
Work from Office
1. Architecture Review: Conduct comprehensive assessments of existing Data, AI, and Automation architecture landscapes across various financial institutions and regulators. Identify gaps compared to leading architecture patterns and best practices.
2. Gap Analysis and Pitch Development: Construct detailed pitches for transformation programs aimed at bridging identified gaps. Articulate the value proposition of proposed solutions to stakeholders.
3. Technical Solution Architecture: Lead the formulation of technical solution architectures for proposals and pitches. Ensure that solutions are innovative, scalable, and aligned with industry standards, while also establishing differentiators against competitors.
4. Technology Stack Evaluation: Evaluate current technology stacks used by banks and assist in selecting appropriate products and partner ecosystems that enhance the overall architecture.
5. Execution Oversight: Oversee the implementation of solution architectures during the execution phase. Review progress against architectural designs and ensure adherence to established guidelines and standards.
6. Stakeholder Collaboration: Collaborate with cross-functional teams, including business analysts, developers, and project managers, to ensure alignment between business needs and technical solutions.
7. Documentation and Reporting: Maintain clear documentation of architectural designs, decisions made, and execution progress. Provide regular updates to stakeholders on project status and any challenges encountered.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
1. Educational Background: A bachelor's or master's degree in Computer Science, Information Technology, Engineering, or a related field is required.
2. Experience: At least 15 years of experience in IT consulting or a related field, with a strong focus on conceptualizing solution architecture in the banking sector.
3. Banking Knowledge: Expertise in leading or crafting analytical solutions or products within the banking sector.
4. Skills:
a. Proficiency in designing scalable architectures that leverage Data, AI, and automation technologies.
b. Strong understanding of cloud computing platforms (e.g., AWS, Azure, GCP) and their application in banking solutions.
c. Experience with various architectural scenarios in banking:
i. Very large-scale data management and high-performing solution architectures.
ii. Low latency, near-real-time and real-time processing (a streaming sketch follows this description).
iii. High reliability.
d. Familiarity with programming languages, API libraries and communication protocols in banking.
5. Professional Skills:
a. Excellent skills with a strong ability to identify gaps in existing architectures.
b. Strong communication skills to effectively convey complex technical concepts to non-technical stakeholders.
c. Capability to assess, guide and course-correct engagement execution decisions pertaining to solution architecture.
d. Understanding of regulatory guidelines on banking system interoperability and security.

Preferred technical and professional experience:
1. Familiarity with emerging trends in banking such as digital banking, embedded banking, and regulatory compliance requirements.
2. Certifications: Relevant certifications such as TOGAF (The Open Group Architecture Framework), AWS Certified Solutions Architect, or similar credentials that demonstrate expertise in architecture design.
3. Experience with Analytical Solutions: Prior experience in leading analytical solutions or products within the banking industry is highly desirable.
4. Understanding of Security Principles: Knowledge of security frameworks relevant to banking applications to ensure compliance with regulatory standards.
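A hedged, generic sketch of the "near-real-time processing" scenario mentioned above, not tied to any bank's stack: reading a Kafka topic with Spark Structured Streaming and counting events per minute. The broker address and topic are hypothetical, and running it requires the spark-sql-kafka connector package on the cluster.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Assumes the spark-sql-kafka-0-10 connector is available on the cluster.
spark = SparkSession.builder.appName("streaming-sketch").getOrCreate()

# Hypothetical payments topic on a hypothetical broker.
events = (
    spark.readStream.format("kafka")
         .option("kafka.bootstrap.servers", "broker:9092")
         .option("subscribe", "payments")
         .load()
         .selectExpr("CAST(value AS STRING) AS payload", "timestamp")
)

# Near-real-time aggregation: events per one-minute window, with a watermark
# so late data is bounded and state can be cleaned up.
counts = (
    events.withWatermark("timestamp", "5 minutes")
          .groupBy(F.window("timestamp", "1 minute"))
          .count()
)

query = (
    counts.writeStream.outputMode("update")
          .format("console")
          .start()
)
query.awaitTermination()
```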
Posted 2 months ago
4 - 9 years
6 - 10 Lacs
Pune
Work from Office
As a Data Engineer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include:
Implementing and validating predictive models, as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques
Designing and implementing enterprise search applications such as Elasticsearch and Splunk to meet client requirements
Working in an Agile, collaborative environment, partnering with scientists, engineers, consultants and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours
Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
4+ years of experience in data modelling and data architecture
Proficiency in data modelling tools such as ERwin and IBM InfoSphere Data Architect, and in database management systems
Familiarity with different data models such as relational, dimensional and NoSQL databases
Understanding of business processes and how data supports business decision making
Strong understanding of database design principles, data warehousing concepts, and data governance practices
Excellent analytical and problem-solving skills with a keen attention to detail

Preferred technical and professional experience:
Strong verbal and written communication skills, with the ability to explain complex concepts to non-technical stakeholders
Ability to work collaboratively in a team environment and manage multiple projects simultaneously
Knowledge of programming languages such as SQL
Posted 2 months ago