7.0 - 12.0 years
15 - 30 Lacs
Pune, Chennai
Work from Office
Data Modeler. Primary Skills: Data modeling (conceptual, logical, physical), relational/dimensional/data vault modeling, ERwin/IBM InfoSphere, SQL (Oracle, SQL Server, PostgreSQL, DB2), banking domain data knowledge (Retail, Corporate, Risk, Compliance), data governance (BCBS 239, AML, GDPR), data warehouse/lake design, Azure cloud. Secondary Skills: MDM, metadata management, real-time modeling (payments/fraud), big data (Hadoop, Spark), streaming platforms (Confluent), M&A data integration, data cataloguing, documentation, regulatory trend awareness. Soft skills: Attention to detail, documentation, time management, and teamwork.
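As an illustration of the dimensional modeling skill this posting lists, here is a minimal, hypothetical star-schema sketch using Python's built-in sqlite3. All table and column names are invented for illustration; a real banking warehouse would target Oracle, SQL Server, PostgreSQL, or DB2 as noted above.

```python
import sqlite3

# Minimal, hypothetical retail-banking star schema: one fact table keyed to
# two dimensions by surrogate keys. Illustrative only, not a production model.
ddl = """
CREATE TABLE dim_customer (
    customer_sk   INTEGER PRIMARY KEY,      -- surrogate key
    customer_id   TEXT NOT NULL,            -- natural/business key
    segment       TEXT,                     -- e.g. Retail, Corporate
    risk_rating   TEXT
);
CREATE TABLE dim_date (
    date_sk       INTEGER PRIMARY KEY,      -- e.g. 20240131
    calendar_date TEXT NOT NULL
);
CREATE TABLE fact_account_balance (
    customer_sk   INTEGER NOT NULL REFERENCES dim_customer(customer_sk),
    date_sk       INTEGER NOT NULL REFERENCES dim_date(date_sk),
    balance_amt   REAL NOT NULL,
    PRIMARY KEY (customer_sk, date_sk)
);
"""

conn = sqlite3.connect(":memory:")
conn.executescript(ddl)
print("star schema created:",
      [r[0] for r in conn.execute("SELECT name FROM sqlite_master WHERE type='table'")])
```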
Posted 4 days ago
6.0 - 10.0 years
20 - 30 Lacs
Pune
Hybrid
Role purpose: Strong understanding of end-to-end impact assessment across all subject areas. Creating detailed data architecture documentation, including data models, data flow diagrams, and technical specifications. Creating and maintaining data models for databases, data warehouses, and data lakes, defining relationships between data entities to optimize data retrieval and analysis. Designing and implementing data pipelines to integrate data from multiple sources, ensuring data consistency and quality across systems. Collaborating with business stakeholders to define the overall data strategy, aligning data needs with business requirements. Supporting migration of new and changed software, and elaborating and performing production checks. Effectively communicating complex data concepts to both technical and non-technical stakeholders. GCP knowledge/experience with Cloud Composer, BigQuery, Pub/Sub, and Cloud Functions. Preferred candidate profile: Data Modeller, Data Architecture. Experience range: 6+ years. Location: Pune (Hybrid).
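To give a concrete flavour of the GCP stack named above (Pub/Sub and BigQuery), below is a rough Python sketch of publishing a change event and validating a load. The project, topic, and table names are placeholders, and it assumes the google-cloud-pubsub and google-cloud-bigquery client libraries are installed and credentials are configured.

```python
# Requires: pip install google-cloud-pubsub google-cloud-bigquery
# Project, topic, and table names below are placeholders.
import json
from google.cloud import bigquery, pubsub_v1

PROJECT = "my-gcp-project"  # placeholder

# Publish a change event for downstream ingestion (e.g. a Cloud Function subscriber).
publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path(PROJECT, "customer-events")  # placeholder topic
future = publisher.publish(topic_path, json.dumps({"customer_id": "C123"}).encode("utf-8"))
print("published message id:", future.result())

# Query the curated model in BigQuery to validate row counts after a load.
bq = bigquery.Client(project=PROJECT)
rows = bq.query(
    "SELECT COUNT(*) AS row_count FROM `my-gcp-project.curated.customer`"  # placeholder table
).result()
for row in rows:
    print("rows loaded:", row.row_count)
```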
Posted 1 week ago
9.0 - 14.0 years
20 - 32 Lacs
Hyderabad, Bengaluru, Delhi / NCR
Hybrid
Data Architect/Data Modeler role. Scope & Responsibilities: Enterprise data architecture for a functional domain or for a product group. Designs and governs delivery of the domain data architecture and ensures delivery as per design. Ensures consistency in approach for the data modelling of the different solutions of the domain. Designs and ensures delivery of data integration across the solutions of the domain. General Expertise: Critical: Methodology expertise on data architecture and modelling, from business requirements and functional specifications to data models. Critical: Data warehousing and business intelligence data product modelling (Inmon/Kimball/Data Vault/Codd modelling patterns). Business/functional knowledge of the domain: business terminology, knowledge of the domain's business processes, and awareness of key principles, objectives, business trends, and evolution. Awareness of master data management and data management/stewardship processes. Data persistence technologies knowledge: SQL (ANSI SQL:2003 for structured relational data querying and SQL:2023 for XML, JSON, and property graph querying), Snowflake specifics, and database structures for performance optimization. Awareness of NoSQL and other data persistence technologies. Proficient level of business English and technical writing. Nice to have: Project delivery expertise through agile approaches and methodologies: Scrum, SAFe 5.0, product-based organization. Technical stack expertise: SAP PowerDesigner modelling (CDM, LDM, PDM); Snowflake general concepts, specifically DDL & DML, Snowsight, and Data Exchange/Data Sharing concepts; AWS S3 & Athena (as a query user); Confluence & Jira (as a contributing user). Nice to have: Bitbucket (as a basic user).
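As a sketch of the Data Vault pattern and Snowflake DDL mentioned above, the following Python snippet creates a hypothetical hub and satellite through the Snowflake Python connector. The account details and object names are placeholders and are not part of the posting.

```python
# Requires: pip install snowflake-connector-python
# Account, credentials, and object names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",   # placeholder
    user="my_user",         # placeholder
    password="***",         # placeholder
    warehouse="WH_DEV",
    database="RAW_VAULT",
    schema="DV",
)
cur = conn.cursor()

# Minimal Data Vault pattern: a hub for the business key plus one satellite
# holding descriptive attributes with load metadata. Illustrative only.
cur.execute("""
CREATE TABLE IF NOT EXISTS HUB_CUSTOMER (
    HUB_CUSTOMER_HK  VARCHAR NOT NULL PRIMARY KEY,  -- hash of the business key
    CUSTOMER_BK      VARCHAR NOT NULL,              -- business key
    LOAD_DTS         TIMESTAMP_NTZ NOT NULL,
    RECORD_SOURCE    VARCHAR NOT NULL
)""")
cur.execute("""
CREATE TABLE IF NOT EXISTS SAT_CUSTOMER_DETAILS (
    HUB_CUSTOMER_HK  VARCHAR NOT NULL,
    LOAD_DTS         TIMESTAMP_NTZ NOT NULL,
    SEGMENT          VARCHAR,
    COUNTRY          VARCHAR,
    HASH_DIFF        VARCHAR,
    PRIMARY KEY (HUB_CUSTOMER_HK, LOAD_DTS)
)""")
conn.close()
```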
Posted 2 weeks ago
5.0 - 10.0 years
15 - 27 Lacs
Pune, Chennai, Bengaluru
Hybrid
Roles and Responsibilities: Review, refine, and maintain data models for enterprise applications. Add or remove fields in data structures, ensuring forward compatibility and minimal disruption to downstream systems. Collaborate with data architects, engineers, and business analysts to gather requirements and translate them into effective data designs. Ensure consistency, accuracy, and integrity across all data models and documentation. Communicate effectively with business stakeholders to understand requirements and present data modeling concepts. Maintain data model documentation using modeling tools like ER/Studio. Provide recommendations on data modeling best practices and standards. Support integration with SAP data models and APIs where applicable. Work with Azure data services such as Azure Data Lake, Azure Synapse, etc. Must-Have Skills: Proven experience in data modeling, including creating and modifying data structures. Strong understanding of forward compatibility and version control for data changes. Excellent communication skills and the ability to engage with business stakeholders. Basic working knowledge of Azure (e.g., Azure Data Lake, Azure SQL, Synapse). Solid understanding of relational databases and enterprise data architecture principles. Good-to-Have Skills: Experience with ER/Studio or similar data modeling tools (ERwin, PowerDesigner). Exposure to or experience with SAP data models and API design/integration. Understanding of data governance and metadata management. Familiarity with Agile methodology and tools like JIRA or Confluence. Skills: Data Modeller, API, Azure, Communication.
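The forward-compatibility requirement above can be illustrated with a small, hypothetical example: adding a nullable column with a default rather than renaming or dropping fields, so existing readers and writers keep working. The sketch below uses Python's built-in sqlite3 purely for illustration; the table and column names are invented.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customer (customer_id TEXT PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO customer VALUES ('C1', 'Asha')")

# Forward-compatible change: add a new, nullable column with a default instead of
# renaming or dropping existing fields, so older readers and writers keep working.
conn.execute("ALTER TABLE customer ADD COLUMN loyalty_tier TEXT DEFAULT 'STANDARD'")

# Existing rows report the default; older INSERTs that omit the column still succeed.
conn.execute("INSERT INTO customer (customer_id, name) VALUES ('C2', 'Ravi')")
for row in conn.execute("SELECT customer_id, name, loyalty_tier FROM customer"):
    print(row)
```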
Posted 3 weeks ago
6 - 10 years
20 - 30 Lacs
Pune
Hybrid
Role purpose: Strong understanding of end-to-end impact assessment across all subject areas. Creating detailed data architecture documentation, including data models, data flow diagrams, and technical specifications. Creating and maintaining data models for databases, data warehouses, and data lakes, defining relationships between data entities to optimize data retrieval and analysis. Designing and implementing data pipelines to integrate data from multiple sources, ensuring data consistency and quality across systems. Collaborating with business stakeholders to define the overall data strategy, aligning data needs with business requirements. Supporting migration of new and changed software, and elaborating and performing production checks. Effectively communicating complex data concepts to both technical and non-technical stakeholders. GCP knowledge/experience with Cloud Composer, BigQuery, Pub/Sub, and Cloud Functions. Preferred candidate profile: Data Modeller, Data Architecture. Experience range: 6+ years. Location: Pune (Hybrid).
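Since this role also names Cloud Composer, here is a minimal sketch of the kind of Apache Airflow DAG that Composer runs, assuming Airflow 2.x with the Google provider package installed. The DAG id, schedule, and SQL are placeholders invented for illustration.

```python
# A minimal Apache Airflow DAG of the kind that runs on Cloud Composer.
# Requires Airflow 2.x with apache-airflow-providers-google installed.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="refresh_customer_model",  # placeholder
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Rebuild a curated table from the raw layer each day (placeholder SQL).
    rebuild_curated = BigQueryInsertJobOperator(
        task_id="rebuild_curated_customer",
        configuration={
            "query": {
                "query": "CREATE OR REPLACE TABLE curated.customer AS "
                         "SELECT * FROM raw.customer_events",
                "useLegacySql": False,
            }
        },
    )
```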
Posted 1 month ago
6 - 10 years
20 - 32 Lacs
Pune, Bengaluru
Hybrid
Job role & responsibilities: Understanding operational needs by collaborating with specialized teams and supporting key business operations. This involves supporting architecture design and improvements, understanding data integrity, building data models, and designing and implementing agile, scalable, and cost-efficient solutions. Leading a team of developers and running sprint planning and execution to ensure timely deliveries. Technical skills, qualifications, and experience required: Proficient in data modelling, with 6-8 years of experience. Experience with data modeling tools and building ER diagrams; hands-on experience with the ERwin/Visio tools. Hands-on expertise in entity-relationship, dimensional, and NoSQL modelling. Familiarity with manipulating datasets using Python. Exposure to Azure cloud services (Azure Data Factory, Azure DevOps, and Databricks). Exposure to UML tools like ERwin/Visio. Familiarity with tools such as Azure DevOps, Jira, and GitHub. Analytical approaches using IE or other common notations. Strong hands-on experience with SQL scripting. Bachelor's/Master's degree in Computer Science or a related field. Experience leading agile scrum, sprint planning, and review sessions. Good communication and interpersonal skills to coordinate between business stakeholders and engineers. Strong results orientation and time management. A true team player who is comfortable working in a global team. Ability to establish relationships with stakeholders quickly in order to collaborate on use cases. Autonomy, curiosity, and innovation capability. Comfortable working in a multidisciplinary team within a fast-paced environment. Immediate joiners will be preferred.
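As a small example of the dataset manipulation in Python that this posting mentions, the following pandas sketch profiles a hypothetical accounts table before modelling. Column names and values are invented for illustration.

```python
# A small example of dataset manipulation in Python using pandas.
import pandas as pd

accounts = pd.DataFrame(
    {"account_id": ["A1", "A2", "A3"],
     "customer_id": ["C1", "C1", "C2"],
     "balance": [1200.0, 300.0, 9800.5]}
)

# Profile the data before modelling: per-customer aggregates and null checks.
summary = (
    accounts.groupby("customer_id", as_index=False)
            .agg(accounts_held=("account_id", "count"),
                 total_balance=("balance", "sum"))
)
print(summary)
print("nulls per column:\n", accounts.isna().sum())
```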
Posted 2 months ago
8 - 13 years
12 - 17 Lacs
Mysore
Hybrid
Data Modeler: ERwin, Informatica, SQL. Should be aware of data modeling concepts and have exposure to the ERwin tool. Knowledge of SQL and Unix scripts is good to have.
Posted 2 months ago
6 - 10 years
20 - 30 Lacs
Pune
Hybrid
Role & responsibilities: Role purpose: Strong understanding of end-to-end impact assessment across all subject areas. Creating detailed data architecture documentation, including data models, data flow diagrams, and technical specifications. Creating and maintaining data models for databases, data warehouses, and data lakes, defining relationships between data entities to optimize data retrieval and analysis. Designing and implementing data pipelines to integrate data from multiple sources, ensuring data consistency and quality across systems. Collaborating with business stakeholders to define the overall data strategy, aligning data needs with business requirements. Supporting migration of new and changed software, and elaborating and performing production checks. Effectively communicating complex data concepts to both technical and non-technical stakeholders. GCP knowledge/experience with Cloud Composer, BigQuery, Pub/Sub, and Cloud Functions. Preferred candidate profile: Data Modeller, Data Architecture, Teradata SQL. Perks and benefits: 25 LPA to 30 LPA. Experience range: 6+ years. Location: Pune (Hybrid).
Posted 2 months ago
2 - 6 years
4 - 8 Lacs
Kolkata
Work from Office
Inspira Enterprise India Pvt. Ltd. is looking for a Data Modeller to join our dynamic team and embark on a rewarding career journey. As a Data Modeler, you will be responsible for designing and implementing data models, ensuring the integrity and performance of databases, and collaborating with other teams to understand data requirements. Your role is pivotal in creating efficient and effective data solutions that align with business objectives. Key Responsibilities: Data Modeling: Develop, design, and maintain conceptual, logical, and physical data models based on business requirements. Ensure that data models are scalable, flexible, and support future business needs. Database Design: Collaborate with database administrators and developers to implement and optimize database structures. Design and implement indexing strategies to improve database performance. Requirements Analysis: Work closely with business analysts and stakeholders to understand data requirements and translate them into data models. Documentation: Create and maintain comprehensive documentation for data models, ensuring clarity and accessibility for other team members. Data Governance: Implement and enforce data governance policies and best practices to ensure data quality and consistency. Collaborate with data stewards to define and manage data standards. Data Integration: Collaborate with ETL (Extract, Transform, Load) developers to ensure smooth data integration processes. Design and optimize data integration workflows. Data Quality Assurance: Implement data quality checks and validation processes to ensure the accuracy and reliability of data. Collaboration: Work closely with cross-functional teams, including business analysts, data scientists, and software developers, to ensure seamless integration of data models into applications.
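Two of the responsibilities above, indexing strategies and data quality checks, can be sketched briefly in Python with the built-in sqlite3 module. The table, columns, and checks are hypothetical illustrations, not the company's actual standards.

```python
import sqlite3

# Hypothetical illustration: an index for a frequent lookup column, plus simple
# data quality checks (completeness of a mandatory measure and key uniqueness).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id TEXT PRIMARY KEY, customer_id TEXT, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [("O1", "C1", 250.0), ("O2", "C2", None), ("O3", "C1", 75.5)])

# Index the column that downstream queries filter on.
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")

# Data quality checks.
null_amounts = conn.execute("SELECT COUNT(*) FROM orders WHERE amount IS NULL").fetchone()[0]
dup_keys = conn.execute(
    "SELECT COUNT(*) FROM (SELECT order_id FROM orders GROUP BY order_id HAVING COUNT(*) > 1)"
).fetchone()[0]
print(f"rows with missing amount: {null_amounts}, duplicate keys: {dup_keys}")
```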
Posted 2 months ago
11 - 14 years
13 - 16 Lacs
Bengaluru
Work from Office
Knowledge of Oracle, LDAP, Cassandra, MongoDB, and the data flow components outlined in the target architecture is needed. The Architect/MongoDB Data Modeler will take end-to-end ownership of the data model.
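To illustrate the MongoDB document-modelling side of this role, here is a hedged pymongo sketch contrasting embedded and referenced data. The connection string, database, and field names are placeholders, and it assumes a locally reachable MongoDB instance.

```python
# Requires: pip install pymongo, plus a reachable MongoDB instance.
# Connection string, database, and field names are placeholders.
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")  # placeholder URI
db = client["modeling_demo"]

# Document-model sketch: embed the small, tightly coupled address sub-document,
# but reference accounts by id so they can grow independently of the customer.
db.customers.insert_one({
    "_id": "C123",
    "name": "Asha Rao",
    "address": {"city": "Bengaluru", "pincode": "560001"},   # embedded
    "account_ids": ["ACC-9001", "ACC-9002"],                 # referenced
})

# Index the fields the access patterns filter on, mirroring the query workload.
db.customers.create_index("account_ids")
print(db.customers.find_one({"_id": "C123"}))
```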
Posted 3 months ago
5 - 8 years
5 - 11 Lacs
Mumbai
Work from Office
Role & responsibilities: The Denodo Lead drives the semantic layer build for each data product on Denodo. This is a combination of a Data Modeler and BA role: identify the data elements, CDEs, and transformations, and apply user access/security to each data element to build the unified semantic layer and the views on top of it. Denodo Developers will follow the best practices laid out by the Denodo Lead and build virtualization layers based on the requirements, data products, and transformations gathered during the requirements phase.
Posted 3 months ago
8 - 12 years
10 - 15 Lacs
Mumbai
Hybrid
Role & responsibilities: 8+ years of experience with data analytics, data modeling or data architecture, and database design. Experience in data modeling or data architecture for transactional, operational reporting, and analytical (EDW, Data Lake, NoSQL) solutions. Experience with dimensional modeling/data warehouse modeling: data modeling, data warehousing, dimensional modeling, data modeling for big data, and metadata management. Experience using Erwin, TOAD, or any other data modeling tool. Understanding of enterprise and reporting modeling concepts, including dimensional modeling, snowflake schemas, slowly changing dimensions, schema-on-read, irregular dimensions, and surrogate, compound, and intelligent keys. Experience with capacity planning, database scripting, and package deployment. Good knowledge of data replication methodology. Good knowledge of data warehouses, data marts, and data lakes. Experience with AWS and cloud-based databases and data warehouses. Strong knowledge of Structured Query Language (SQL) and its use in data access and analysis. Excellent communication, problem-solving, organizational, and analytical skills. Strong communication and interpersonal skills.
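One of the concepts listed above, slowly changing dimensions, is sketched below as a minimal Type 2 pattern using Python's sqlite3. Table and column names are invented for illustration.

```python
import sqlite3
from datetime import date

# Minimal Type 2 slowly changing dimension: close the current row and insert a
# new version whenever a tracked attribute changes.
conn = sqlite3.connect(":memory:")
conn.execute("""
CREATE TABLE dim_customer (
    customer_sk   INTEGER PRIMARY KEY AUTOINCREMENT,
    customer_id   TEXT NOT NULL,
    segment       TEXT,
    valid_from    TEXT NOT NULL,
    valid_to      TEXT,                 -- NULL means current version
    is_current    INTEGER NOT NULL
)""")
conn.execute("INSERT INTO dim_customer (customer_id, segment, valid_from, valid_to, is_current) "
             "VALUES ('C1', 'Retail', '2023-01-01', NULL, 1)")

def apply_scd2_change(customer_id: str, new_segment: str, change_date: str) -> None:
    """Expire the current row and add a new one reflecting the changed attribute."""
    conn.execute("UPDATE dim_customer SET valid_to = ?, is_current = 0 "
                 "WHERE customer_id = ? AND is_current = 1", (change_date, customer_id))
    conn.execute("INSERT INTO dim_customer (customer_id, segment, valid_from, valid_to, is_current) "
                 "VALUES (?, ?, ?, NULL, 1)", (customer_id, new_segment, change_date))

apply_scd2_change("C1", "Corporate", str(date(2024, 6, 1)))
for row in conn.execute("SELECT customer_id, segment, valid_from, valid_to, is_current FROM dim_customer"):
    print(row)
```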
Posted 3 months ago
5 - 8 years
5 - 11 Lacs
Bengaluru
Work from Office
Role & responsibilities: The Denodo Lead drives the semantic layer build for each data product on Denodo. This is a combination of a Data Modeler and BA role: identify the data elements, CDEs, and transformations, and apply user access/security to each data element to build the unified semantic layer and the views on top of it. Denodo Developers will follow the best practices laid out by the Denodo Lead and build virtualization layers based on the requirements, data products, and transformations gathered during the requirements phase.
Posted 3 months ago
5 - 8 years
5 - 11 Lacs
Hyderabad
Work from Office
Role & responsibilities: The Denodo Lead drives the semantic layer build for each data product on Denodo. This is a combination of a Data Modeler and BA role: identify the data elements, CDEs, and transformations, and apply user access/security to each data element to build the unified semantic layer and the views on top of it. Denodo Developers will follow the best practices laid out by the Denodo Lead and build virtualization layers based on the requirements, data products, and transformations gathered during the requirements phase.
Posted 3 months ago
8 - 12 years
10 - 15 Lacs
Hyderabad
Hybrid
Role & responsibilities: 8+ years of experience with data analytics, data modeling or data architecture, and database design. Experience in data modeling or data architecture for transactional, operational reporting, and analytical (EDW, Data Lake, NoSQL) solutions. Experience with dimensional modeling/data warehouse modeling: data modeling, data warehousing, dimensional modeling, data modeling for big data, and metadata management. Experience using Erwin, TOAD, or any other data modeling tool. Understanding of enterprise and reporting modeling concepts, including dimensional modeling, snowflake schemas, slowly changing dimensions, schema-on-read, irregular dimensions, and surrogate, compound, and intelligent keys. Experience with capacity planning, database scripting, and package deployment. Good knowledge of data replication methodology. Good knowledge of data warehouses, data marts, and data lakes. Experience with AWS and cloud-based databases and data warehouses. Strong knowledge of Structured Query Language (SQL) and its use in data access and analysis. Excellent communication, problem-solving, organizational, and analytical skills. Strong communication and interpersonal skills.
Posted 3 months ago
7 - 12 years
14 - 24 Lacs
Hyderabad
Hybrid
Job Location: Hyderabad. Mode of Work: Hybrid. Key skills to work on: Data modelling, financial data, data modelling tools, and financial data modelling. Domain: Banking/BFSI only. Principal responsibilities: The jobholder will continually reassess the operational risks associated with the role and inherent in the business, taking account of changing legal and regulatory requirements, operating procedures and practices, management restructurings, and the impact of innovative technology. This will be achieved by ensuring all actions take into account the likelihood of operational risk events, and by addressing any areas of concern in conjunction with line management and/or the appropriate department. The role will implement the Operational Risk control framework per the BRCM's Three Lines of Defence. Qualifications: A minimum of 5 years' experience of data management and modelling solutions working as a Data Modeller within the financial services sector is essential, preferably in a Treasury/Finance function and/or related front-office environment. Good understanding of managing the 'data as a product (asset)' principle across enterprise domains, technology landscapes, and architectural domains (business, data, application, and technology). Experience working with Agile and Scrum in a large, scalable Agile environment, including participation and progress reporting in daily standups. Data standards, data governance, data strategy, and data lineage would be advantageous in this role. Cloud exposure to solutions implemented in GCP, AWS, or Azure would be beneficial, and exposure to big data solutions would be advantageous. Experience working with leading data modelling tools and producing modelling documentation using tools such as Visual Paradigm, ERwin, PowerDesigner, ER Studio, etc. Knowledge of reference/master data management, data modelling standards, and modelling technical documentation using Entity Relationship Diagrams (ERD), Unified Modelling Language (UML), or BIAN. Interested candidates can share their profile with gramashetty@allegisglobalsolutions.com. Regards, Gopala BR, HR Talent Acquisition.
Posted 3 months ago
6 - 9 years
15 - 18 Lacs
Chennai, Bengaluru
Work from Office
Job Title: Data Engineer. Location: Bangalore, Karnataka. Experience: 6 to 9 years. Salary: 17.50 LPA. Employment Type: Full-time. Skills: Data modeling, SQL, Snowflake, Python, AWS, NoSQL. Good to have: NoSQL data modelling. Job Summary: As a Data Engineer, you will design, develop, and manage data pipelines that support analytics and business intelligence (BI) initiatives. You will be responsible for ensuring data quality, efficiency, and scalability in a cloud-native environment. You will collaborate with cross-functional teams including Data Scientists, BI Analysts, and Software Engineers to optimize data flows and implement effective data solutions. Our team is growing, and we are looking for a Data Engineer with experience in SQL, Snowflake, Python, AWS, and NoSQL databases to help us scale our data infrastructure and support business intelligence initiatives. Key Responsibilities: Design and implement efficient, scalable data models in Snowflake and other cloud data platforms. Build, maintain, and optimize data pipelines to support data ingestion, transformation, and storage. Develop and manage ETL processes using Python, SQL, and Snowflake. Collaborate with stakeholders to identify data requirements and create data solutions. Work with NoSQL and SQL databases (such as MongoDB, Cassandra, PostgreSQL) for handling diverse data types. Deploy and manage data workflows on AWS, utilizing services such as S3, Redshift, and Lambda. Perform data validation and ensure data integrity, consistency, and accuracy. Assist in troubleshooting and resolving data-related issues. Provide insights and recommendations for process optimization and automation. Skills and Qualifications: 6 to 9 years of experience in Data Engineering or similar roles. Strong hands-on experience with Snowflake, SQL, Python, and AWS services (Redshift, S3, Lambda). Expertise in designing data models for both structured and unstructured data. Experience with NoSQL databases (MongoDB, Cassandra, etc.) is a plus. Strong knowledge of data warehousing concepts and data integration patterns. Hands-on experience in building scalable data pipelines using ETL frameworks. Proficiency in data pipeline orchestration tools like Airflow is a plus. Strong problem-solving skills and the ability to work independently and as part of a team. Good communication skills to work with cross-functional teams.
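As a rough sketch of the ETL responsibilities described above, the following Python example extracts a raw CSV from S3 with boto3, applies a light pandas transformation, and writes a curated Parquet file back to the lake for downstream Snowflake/Redshift loading. The bucket, keys, and columns are placeholders, and it assumes boto3, pandas, and pyarrow are installed with AWS credentials configured.

```python
# Requires: pip install boto3 pandas pyarrow, plus AWS credentials configured.
# Bucket, key, and column names are placeholders for illustration only.
import io

import boto3
import pandas as pd

s3 = boto3.client("s3")
BUCKET = "example-data-lake"  # placeholder

# Extract: pull a raw CSV drop from the landing zone.
raw_obj = s3.get_object(Bucket=BUCKET, Key="landing/orders/2024-06-01.csv")
orders = pd.read_csv(io.BytesIO(raw_obj["Body"].read()))

# Transform: basic cleansing and typing before it reaches the warehouse model.
orders = orders.dropna(subset=["order_id"]).drop_duplicates(subset=["order_id"])
orders["order_date"] = pd.to_datetime(orders["order_date"]).dt.date

# Load: write the curated file back to the lake for Snowflake/Redshift ingestion
# (e.g. via COPY INTO or an external stage).
buffer = io.BytesIO()
orders.to_parquet(buffer, index=False)
s3.put_object(Bucket=BUCKET, Key="curated/orders/2024-06-01.parquet", Body=buffer.getvalue())
```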
Posted 3 months ago
3 - 5 years
6 - 8 Lacs
Mumbai
Work from Office
This role plays a key part in designing and maintaining data models that align with the organization's business requirements, enhance data quality, and facilitate efficient data processing and analysis. Job Description - Grade Specific: The role is focused on leading and managing data modeling activities within the organization, driving strategic initiatives, ensuring quality and consistency in data models, and collaborating with stakeholders to support organizational goals.
Posted 3 months ago
5 - 10 years
12 - 22 Lacs
Bengaluru
Work from Office
Design and optimize PostgreSQL and BigQuery databases. Maintain and enhance data models for new and existing implementations. Apply FIMR and ACORD enterprise data models for insurance and wellness.
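A small, hypothetical illustration of the PostgreSQL optimization work mentioned above: create an index for a frequent lookup and inspect the query plan via psycopg2. The DSN, table, and column names are placeholders and assume an existing claims table.

```python
# Requires: pip install psycopg2-binary, plus a reachable PostgreSQL instance.
# Connection details and table/column names are placeholders.
import psycopg2

conn = psycopg2.connect("dbname=claims user=app password=*** host=localhost")  # placeholder DSN
cur = conn.cursor()

# A common optimization step: index the column used in frequent policy lookups,
# then confirm the planner uses it.
cur.execute("CREATE INDEX IF NOT EXISTS idx_claims_policy_id ON claims (policy_id)")
conn.commit()

cur.execute("EXPLAIN SELECT * FROM claims WHERE policy_id = %s", ("P-1001",))
for (plan_line,) in cur.fetchall():
    print(plan_line)
```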
Posted 3 months ago