7.0 - 9.0 years
5 - 9 Lacs
Visakhapatnam
Work from Office
Key Responsibilities : Design and develop data models to support business intelligence and analytics solutions. Work with Erwin or Erwin Studio to create, manage, and optimize conceptual, logical, and physical data models. Implement Dimensional Modelling techniques for data warehousing and reporting. Collaborate with business analysts, data engineers, and stakeholders to gather and understand data requirements. Ensure data integrity, consistency, and compliance with Banking domain standards. Work with Snowflake to develop and optimize cloud-based data models. Write and execute complex SQL queries for data analysis and validation. Identify and resolve data quality issues and inconsistencies. Required Skills & Qualifications : 7+ years of experience in Data Modelling and Data Analysis. Strong expertise in Erwin or Erwin Studio for data modeling. Experience with Dimensional Modelling and Data Warehousing (DWH) concepts. Proficiency in Snowflake and SQL for data management and querying. Previous experience in the Banking domain is mandatory. Strong analytical and problem-solving skills. Ability to work independently in a remote environment. Excellent verbal and written communication skills.
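To ground the dimensional-modelling and Snowflake SQL skills this posting asks for, here is a minimal star-schema sketch; every table and column name below is illustrative, not taken from any employer's actual model.

```sql
-- Kimball-style star schema: one fact table with surrogate foreign keys into
-- conformed dimensions. Types follow Snowflake conventions, but the shape is
-- portable to any warehouse.
CREATE TABLE dim_customer (
    customer_key   INTEGER      PRIMARY KEY,  -- surrogate key
    customer_id    VARCHAR(20),               -- natural/business key from source
    customer_name  VARCHAR(100),
    segment        VARCHAR(50)
);

CREATE TABLE dim_date (
    date_key       INTEGER      PRIMARY KEY,  -- e.g. 20250131
    full_date      DATE,
    fiscal_quarter VARCHAR(10)
);

CREATE TABLE fact_transaction (
    transaction_id VARCHAR(30),
    customer_key   INTEGER REFERENCES dim_customer (customer_key),
    date_key       INTEGER REFERENCES dim_date (date_key),
    amount         NUMBER(18, 2),
    currency_code  CHAR(3)
);
```

A typical validation query then joins the fact to its dimensions and aggregates measures by dimension attributes, which is exactly the reporting workload such a model is meant to keep simple and fast.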
Posted 1 week ago
7.0 - 9.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Key Responsibilities : Design and develop data models to support business intelligence and analytics solutions. Work with Erwin or Erwin Studio to create, manage, and optimize conceptual, logical, and physical data models. Implement Dimensional Modelling techniques for data warehousing and reporting. Collaborate with business analysts, data engineers, and stakeholders to gather and understand data requirements. Ensure data integrity, consistency, and compliance with Banking domain standards. Work with Snowflake to develop and optimize cloud-based data models. Write and execute complex SQL queries for data analysis and validation. Identify and resolve data quality issues and inconsistencies. Required Skills & Qualifications : 7+ years of experience in Data Modelling and Data Analysis. Strong expertise in Erwin or Erwin Studio for data modeling. Experience with Dimensional Modelling and Data Warehousing (DWH) concepts. Proficiency in Snowflake and SQL for data management and querying. Previous experience in the Banking domain is mandatory. Strong analytical and problem-solving skills. Ability to work independently in a remote environment. Excellent verbal and written communication skills.
Posted 1 week ago
7.0 - 9.0 years
5 - 9 Lacs
Nashik
Work from Office
Key Responsibilities : Design and develop data models to support business intelligence and analytics solutions. Work with Erwin or Erwin Studio to create, manage, and optimize conceptual, logical, and physical data models. Implement Dimensional Modelling techniques for data warehousing and reporting. Collaborate with business analysts, data engineers, and stakeholders to gather and understand data requirements. Ensure data integrity, consistency, and compliance with Banking domain standards. Work with Snowflake to develop and optimize cloud-based data models. Write and execute complex SQL queries for data analysis and validation. Identify and resolve data quality issues and inconsistencies. Required Skills & Qualifications : 7+ years of experience in Data Modelling and Data Analysis. Strong expertise in Erwin or Erwin Studio for data modeling. Experience with Dimensional Modelling and Data Warehousing (DWH) concepts. Proficiency in Snowflake and SQL for data management and querying. Previous experience in the Banking domain is mandatory. Strong analytical and problem-solving skills. Ability to work independently in a remote environment. Excellent verbal and written communication skills.
Posted 1 week ago
7.0 - 9.0 years
5 - 9 Lacs
Pune
Work from Office
Key Responsibilities : Design and develop data models to support business intelligence and analytics solutions. Work with Erwin or Erwin Studio to create, manage, and optimize conceptual, logical, and physical data models. Implement Dimensional Modelling techniques for data warehousing and reporting. Collaborate with business analysts, data engineers, and stakeholders to gather and understand data requirements. Ensure data integrity, consistency, and compliance with Banking domain standards. Work with Snowflake to develop and optimize cloud-based data models. Write and execute complex SQL queries for data analysis and validation. Identify and resolve data quality issues and inconsistencies. Required Skills & Qualifications : 7+ years of experience in Data Modelling and Data Analysis. Strong expertise in Erwin or Erwin Studio for data modeling. Experience with Dimensional Modelling and Data Warehousing (DWH) concepts. Proficiency in Snowflake and SQL for data management and querying. Previous experience in the Banking domain is mandatory. Strong analytical and problem-solving skills. Ability to work independently in a remote environment. Excellent verbal and written communication skills.
Posted 1 week ago
7.0 - 9.0 years
5 - 9 Lacs
Ludhiana
Work from Office
Key Responsibilities : Design and develop data models to support business intelligence and analytics solutions. Work with Erwin or Erwin Studio to create, manage, and optimize conceptual, logical, and physical data models. Implement Dimensional Modelling techniques for data warehousing and reporting. Collaborate with business analysts, data engineers, and stakeholders to gather and understand data requirements. Ensure data integrity, consistency, and compliance with Banking domain standards. Work with Snowflake to develop and optimize cloud-based data models. Write and execute complex SQL queries for data analysis and validation. Identify and resolve data quality issues and inconsistencies. Required Skills & Qualifications : 7+ years of experience in Data Modelling and Data Analysis. Strong expertise in Erwin or Erwin Studio for data modeling. Experience with Dimensional Modelling and Data Warehousing (DWH) concepts. Proficiency in Snowflake and SQL for data management and querying. Previous experience in the Banking domain is mandatory. Strong analytical and problem-solving skills. Ability to work independently in a remote environment. Excellent verbal and written communication skills.
Posted 1 week ago
7.0 - 9.0 years
5 - 9 Lacs
Hyderabad
Work from Office
Key Responsibilities : Design and develop data models to support business intelligence and analytics solutions. Work with Erwin or Erwin Studio to create, manage, and optimize conceptual, logical, and physical data models. Implement Dimensional Modelling techniques for data warehousing and reporting. Collaborate with business analysts, data engineers, and stakeholders to gather and understand data requirements. Ensure data integrity, consistency, and compliance with Banking domain standards. Work with Snowflake to develop and optimize cloud-based data models. Write and execute complex SQL queries for data analysis and validation. Identify and resolve data quality issues and inconsistencies. Required Skills & Qualifications : 7+ years of experience in Data Modelling and Data Analysis. Strong expertise in Erwin or Erwin Studio for data modeling. Experience with Dimensional Modelling and Data Warehousing (DWH) concepts. Proficiency in Snowflake and SQL for data management and querying. Previous experience in the Banking domain is mandatory. Strong analytical and problem-solving skills. Ability to work independently in a remote environment. Excellent verbal and written communication skills.
Posted 1 week ago
7.0 - 9.0 years
5 - 9 Lacs
Lucknow
Work from Office
Key Responsibilities : Design and develop data models to support business intelligence and analytics solutions. Work with Erwin or Erwin Studio to create, manage, and optimize conceptual, logical, and physical data models. Implement Dimensional Modelling techniques for data warehousing and reporting. Collaborate with business analysts, data engineers, and stakeholders to gather and understand data requirements. Ensure data integrity, consistency, and compliance with Banking domain standards. Work with Snowflake to develop and optimize cloud-based data models. Write and execute complex SQL queries for data analysis and validation. Identify and resolve data quality issues and inconsistencies. Required Skills & Qualifications : 7+ years of experience in Data Modelling and Data Analysis. Strong expertise in Erwin or Erwin Studio for data modeling. Experience with Dimensional Modelling and Data Warehousing (DWH) concepts. Proficiency in Snowflake and SQL for data management and querying. Previous experience in the Banking domain is mandatory. Strong analytical and problem-solving skills. Ability to work independently in a remote environment. Excellent verbal and written communication skills.
Posted 1 week ago
8.0 - 12.0 years
30 - 35 Lacs
Hyderabad
Work from Office
Overview: As Senior Analyst, Data Modeling, your focus would be to partner with D&A Data Foundation team members to create data models for Global projects. This would include independently analyzing project data needs, identifying data storage and integration needs/issues, and driving opportunities for data model reuse, satisfying project requirements. The role will advocate Enterprise Architecture, Data Design, and D&A standards and best practices. You will be performing all aspects of Data Modeling, working closely with the Data Governance, Data Engineering, and Data Architects teams. As a member of the data modeling team, you will create data models for very large and complex data applications in public cloud environments, directly impacting the design, architecture, and implementation of PepsiCo's flagship data products around topics like revenue management, supply chain, manufacturing, and logistics. The primary responsibilities of this role are to work with data product owners, data management owners, and data engineering teams to create physical and logical data models with an extensible philosophy to support future, unknown use cases with minimal rework. You'll be working in a hybrid environment with in-house, on-premises data sources as well as cloud and remote systems. You will establish data design patterns that will drive flexible, scalable, and efficient data models to maximize value and reuse. Responsibilities: Complete conceptual, logical and physical data models for any supported platform, including SQL Data Warehouse, EMR, Spark, Databricks, Snowflake, Azure Synapse, or other cloud data warehousing technologies. Govern data design/modeling documentation of metadata (business definitions of entities and attributes) and construction of database objects, for baseline and investment-funded projects, as assigned. Provide and/or support data analysis, requirements gathering, solution development, and design reviews for enhancements to, or new, applications/reporting. Support assigned project contractors (both on- & off-shore), orienting new contractors to standards, best practices, and tools. Contribute to project cost estimates, working with senior members of the team to evaluate the size and complexity of the changes or new development. Ensure physical and logical data models are designed with an extensible philosophy to support future, unknown use cases with minimal rework. Develop a deep understanding of the business domain and enterprise technology inventory to craft a solution roadmap that achieves business objectives and maximizes reuse. Partner with IT, data engineering, and other teams to ensure the enterprise data model incorporates key dimensions needed for proper management: business and financial policies, security, local-market regulatory rules, consumer privacy-by-design principles (PII management), all linked across fundamental identity foundations. Drive collaborative reviews of design, code, data, and security feature implementations performed by data engineers to drive data product development. Assist with data planning, sourcing, collection, profiling, and transformation. Create source-to-target mappings for ETL and BI developers (a brief SQL illustration follows this listing). Show expertise for data at all levels: low-latency, relational, and unstructured data stores; analytical and data lakes; data streaming (consumption/production) and data in-transit. Develop reusable data models based on cloud-centric, code-first approaches to data management and cleansing. Partner with the Data Governance team to standardize their classification of unstructured data into standard structures for data discovery and action by business customers and stakeholders.
Support data lineage and mapping of source system data to canonical data stores for research, analysis, and productization. Qualifications: 8+ years of overall technology experience, including at least 4+ years of data modeling and systems architecture. 3+ years of experience with data lake infrastructure, data warehousing, and data analytics tools. 4+ years of experience developing enterprise data models. Experience building solutions in the retail or supply chain space. Expertise in data modeling tools (ER/Studio, Erwin, IDM/ARDM models). Experience with integration of multi-cloud services (Azure) with on-premises technologies. Experience with data profiling and data quality tools like Apache Griffin, Deequ, and Great Expectations. Experience building/operating highly available, distributed systems for data extraction, ingestion, and processing of large data sets. Experience with at least one MPP database technology such as Redshift, Synapse, Teradata, or Snowflake. Experience with version control systems like GitHub and deployment & CI tools. Experience with Azure Data Factory, Databricks, and Azure Machine Learning is a plus. Experience with metadata management, data lineage, and data glossaries is a plus. Working knowledge of agile development, including DevOps and DataOps concepts. Familiarity with business intelligence tools (such as Power BI).
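The source-to-target mapping responsibility above is often easiest to see as ELT SQL, where each projected column documents one mapping rule. The sketch below is a generic illustration under assumed schema, table, and sequence names, not PepsiCo's actual pipeline.

```sql
-- Source-to-target mapping expressed as an ELT statement: rename, cast, derive,
-- and tag lineage while loading from staging into the warehouse model.
INSERT INTO dw.customer_account (account_key, account_number, open_date, balance_usd, source_system)
SELECT
    seq_account_key.NEXTVAL,            -- surrogate key (assumed sequence)
    TRIM(src.acct_no),                  -- direct map with cleanup
    TO_DATE(src.open_dt, 'YYYYMMDD'),   -- type-conversion rule
    src.balance_amt * fx.usd_rate,      -- derived measure
    'CRM_EXTRACT'                       -- hard-coded lineage tag
FROM staging.customer_accounts src
JOIN staging.fx_rates fx
  ON fx.currency_code = src.currency_code;
```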
Posted 1 week ago
8.0 - 12.0 years
32 - 37 Lacs
Hyderabad
Work from Office
Job Overview: As Senior Analyst, Data Modeling, your focus would be to partner with D&A Data Foundation team members to create data models for Global projects. This would include independently analyzing project data needs, identifying data storage and integration needs/issues, and driving opportunities for data model reuse, satisfying project requirements. The role will advocate Enterprise Architecture, Data Design, and D&A standards and best practices. You will be performing all aspects of Data Modeling, working closely with the Data Governance, Data Engineering, and Data Architects teams. As a member of the data modeling team, you will create data models for very large and complex data applications in public cloud environments, directly impacting the design, architecture, and implementation of PepsiCo's flagship data products around topics like revenue management, supply chain, manufacturing, and logistics. The primary responsibilities of this role are to work with data product owners, data management owners, and data engineering teams to create physical and logical data models with an extensible philosophy to support future, unknown use cases with minimal rework. You'll be working in a hybrid environment with in-house, on-premises data sources as well as cloud and remote systems. You will establish data design patterns that will drive flexible, scalable, and efficient data models to maximize value and reuse. Responsibilities: Complete conceptual, logical and physical data models for any supported platform, including SQL Data Warehouse, EMR, Spark, Databricks, Snowflake, Azure Synapse, or other cloud data warehousing technologies. Govern data design/modeling documentation of metadata (business definitions of entities and attributes) and construction of database objects, for baseline and investment-funded projects, as assigned. Provide and/or support data analysis, requirements gathering, solution development, and design reviews for enhancements to, or new, applications/reporting. Support assigned project contractors (both on- & off-shore), orienting new contractors to standards, best practices, and tools. Contribute to project cost estimates, working with senior members of the team to evaluate the size and complexity of the changes or new development. Ensure physical and logical data models are designed with an extensible philosophy to support future, unknown use cases with minimal rework. Develop a deep understanding of the business domain and enterprise technology inventory to craft a solution roadmap that achieves business objectives and maximizes reuse. Partner with IT, data engineering, and other teams to ensure the enterprise data model incorporates key dimensions needed for proper management: business and financial policies, security, local-market regulatory rules, consumer privacy-by-design principles (PII management), all linked across fundamental identity foundations. Drive collaborative reviews of design, code, data, and security feature implementations performed by data engineers to drive data product development. Assist with data planning, sourcing, collection, profiling, and transformation. Create source-to-target mappings for ETL and BI developers. Show expertise for data at all levels: low-latency, relational, and unstructured data stores; analytical and data lakes; data streaming (consumption/production) and data in-transit. Develop reusable data models based on cloud-centric, code-first approaches to data management and cleansing.
Partner with the Data Governance team to standardize their classification of unstructured data into standard structures for data discovery and action by business customers and stakeholders. Support data lineage and mapping of source system data to canonical data stores for research, analysis, and productization. Qualifications: 8+ years of overall technology experience, including at least 4+ years of data modeling and systems architecture. 3+ years of experience with data lake infrastructure, data warehousing, and data analytics tools. 4+ years of experience developing enterprise data models. Experience building solutions in the retail or supply chain space. Expertise in data modeling tools (ER/Studio, Erwin, IDM/ARDM models). Experience with integration of multi-cloud services (Azure) with on-premises technologies. Experience with data profiling and data quality tools like Apache Griffin, Deequ, and Great Expectations. Experience building/operating highly available, distributed systems for data extraction, ingestion, and processing of large data sets. Experience with at least one MPP database technology such as Redshift, Synapse, Teradata, or Snowflake. Experience with version control systems like GitHub and deployment & CI tools. Experience with Azure Data Factory, Databricks, and Azure Machine Learning is a plus. Experience with metadata management, data lineage, and data glossaries is a plus. Working knowledge of agile development, including DevOps and DataOps concepts. Familiarity with business intelligence tools (such as Power BI).
Posted 1 week ago
5.0 - 8.0 years
4 - 8 Lacs
Pune
Work from Office
Job Information: Job Opening ID ZR_1769_JOB | Date Opened 23/03/2023 | Industry Technology | Work Experience 5-8 years | Job Title IBM DataStage | City Pune | Province Maharashtra | Country India | Postal Code 411013 | Number of Positions 2. 5 years of DataStage experience. Strong data warehousing knowledge and readiness to provide guidance to junior members of the team. Must have good communication skills, as the role demands a lot of interaction with the US business team as well as IT stakeholders. Must be able to work independently and handle multiple concurrent tasks, with an ability to prioritize and manage tasks effectively. Experience in developing DataStage jobs and deploying them through the SDLC cycle. Knowledge of data modeling, database design, and the data warehousing ecosystem. Ability to work independently and collaborate with others at all levels of technical understanding. Analyzing organizational data requirements and reviewing/understanding logical and physical Data Flow Diagrams and Entity Relationship Diagrams using tools such as Visio and Erwin. Designing and building scalable DataStage solutions. Updating data within repositories and data warehouses. Assisting project leaders in determining project timelines and objectives. Monitoring jobs and identifying bottlenecks in the data processing pipeline. Testing and troubleshooting problems in system designs and processes. Proficiency in SQL or another relevant coding language. Great communication and reasoning skills, including the ability to make a strong case for technology choices. 5+ years of experience in testing, debugging, and troubleshooting support and development issues.
Posted 1 week ago
6.0 - 10.0 years
3 - 6 Lacs
Chennai
Work from Office
Job Information: Job Opening ID ZR_2412_JOB | Date Opened 04/02/2025 | Industry IT Services | Work Experience 6-10 years | Job Title Data Modeller | City Chennai | Province Tamil Nadu | Country India | Postal Code 600001 | Number of Positions 1. Skill set required: GCP, Data Modelling (OLTP, OLAP), indexing, DBSchema, CloudSQL, BigQuery. Data Modeller - Hands-on data modelling for OLTP and OLAP systems. In-depth knowledge of conceptual, logical, and physical data modelling. Strong understanding of indexing, partitioning, and data sharding, with practical experience of having done the same. Strong understanding of the variables impacting database performance for near-real-time reporting and application interaction. Should have working experience with at least one data modelling tool, preferably DBSchema. Functional knowledge of the mutual fund industry is a plus. Good understanding of GCP databases like AlloyDB, CloudSQL, and BigQuery.
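Since this listing pairs OLTP/OLAP modelling with BigQuery, partitioning, and indexing, a small BigQuery DDL sketch may help; the dataset, table, and column names are assumptions chosen to echo the mutual-fund context.

```sql
-- In BigQuery there are no conventional indexes; date partitioning plus
-- clustering is the usual physical-design lever for near-real-time reporting.
CREATE TABLE fund_analytics.fact_nav_snapshot
(
    snapshot_date DATE           NOT NULL,
    scheme_code   STRING         NOT NULL,
    folio_id      STRING,
    nav           NUMERIC(18, 6),
    aum_inr       NUMERIC(20, 2)
)
PARTITION BY snapshot_date
CLUSTER BY scheme_code, folio_id
OPTIONS (description = 'Daily NAV snapshots for OLAP reporting');
```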
Posted 1 week ago
8.0 - 12.0 years
3 - 6 Lacs
Bengaluru
Work from Office
Job Information: Job Opening ID ZR_2385_JOB | Date Opened 23/10/2024 | Industry IT Services | Work Experience 8-12 years | Job Title Data Modeller | City Bangalore South | Province Karnataka | Country India | Postal Code 560066 | Number of Positions 1. Locations: Pune/Bangalore/Hyderabad/Indore. Contract duration: 6 months. Responsibilities: Be responsible for the development of the conceptual, logical, and physical data models, and the implementation of RDBMS, operational data store (ODS), data marts, and data lakes on target platforms. Implement business and IT data requirements through new data strategies and designs across all data platforms (relational and dimensional - must; NoSQL - optional) and data tools (reporting, visualization, analytics, and machine learning). Work with business and application/solution teams to implement data strategies, build data flows, and develop conceptual/logical/physical data models. Define and govern data modeling and design standards, tools, best practices, and related development for enterprise data models. Identify the architecture, infrastructure, and interfaces to data sources, tools supporting automated data loads, security concerns, analytic models, and data visualization. Hands-on modeling, design, configuration, installation, performance tuning, and sandbox POC. Work proactively and independently to address project requirements and articulate issues/challenges to reduce project delivery risks. Must have a payments background. Skills: Hands-on relational, dimensional, and/or analytic experience (using RDBMS, dimensional, and NoSQL data platform technologies, and ETL and data ingestion protocols). Experience with data warehouse, data lake, and enterprise big data platforms in multi-data-center contexts required. Good knowledge of metadata management, data modeling, and related tools (Erwin, ER/Studio, or others) required. Experience in team management, communication, and presentation. Experience with Erwin, Visio, or any other relevant tool.
Posted 1 week ago
3.0 - 5.0 years
12 - 16 Lacs
Chennai
Work from Office
G6 Cloud Data Architect (Snowflake + dbt); this is a proactive hiring position.
Posted 1 week ago
8.0 - 11.0 years
35 - 37 Lacs
Kolkata, Ahmedabad, Bengaluru
Work from Office
Dear Candidate, We are hiring a Data Warehouse Architect to design scalable, high-performance data warehouse solutions for analytics and reporting. Perfect for engineers experienced with large-scale data systems. Key Responsibilities: Design and maintain enterprise data warehouse architecture. Optimize ETL/ELT pipelines and data modeling (star/snowflake schemas). Ensure data quality, security, and performance. Work with BI teams to support analytics and reporting needs. Required Skills & Qualifications: Proficiency with SQL and data warehousing tools (Snowflake, Redshift, BigQuery, etc.). Experience with ETL frameworks (Informatica, Apache NiFi, dbt, etc.). Strong understanding of dimensional modeling and OLAP. Bonus: Knowledge of cloud data platforms and orchestration tools (Airflow). Note: If interested, please share your updated resume and preferred time for a discussion. If shortlisted, our HR team will contact you. Kandi Srinivasa, Delivery Manager, Integra Technologies
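As a quick illustration of the star vs. snowflake trade-off named above, the sketch below shows the same product dimension both ways; all names are hypothetical.

```sql
-- Star form: one wide, denormalized dimension (fewer joins, some redundancy).
CREATE TABLE dim_product_star (
    product_key    INTEGER PRIMARY KEY,
    product_name   VARCHAR(100),
    category_name  VARCHAR(50),
    category_group VARCHAR(50)
);

-- Snowflake form: category attributes are normalized into their own table
-- and resolved with an extra join at query time.
CREATE TABLE dim_category (
    category_key   INTEGER PRIMARY KEY,
    category_name  VARCHAR(50),
    category_group VARCHAR(50)
);

CREATE TABLE dim_product_snowflake (
    product_key    INTEGER PRIMARY KEY,
    product_name   VARCHAR(100),
    category_key   INTEGER REFERENCES dim_category (category_key)
);
```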
Posted 1 week ago
5.0 - 7.0 years
6 - 10 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
5+ years of experience in data modelling: designing, implementing, and maintaining data models to support data quality, performance, and scalability. Proven experience as a Data Modeler, having worked with data analysts, data architects, and business stakeholders to ensure data models are aligned to business requirements. Expertise in data modeling tools (ER/Studio, Erwin, IDM/ARDM models) and in CPG/Manufacturing/Sales/Finance/Supplier/Customer domains. Experience with at least one MPP database technology such as Databricks Lakehouse, Redshift, Synapse, Teradata, or Snowflake. Experience with version control systems like GitHub and deployment & CI tools. Experience with metadata management, data lineage, and data glossaries is a plus. Working knowledge of agile development, including DevOps and DataOps concepts. Working knowledge of SAP data models, particularly in the context of HANA and S/4HANA, and of retail data like IRI and Nielsen. Must have: Data Modeling, Data Modeling Tool experience, SQL. Nice to have: SAP HANA, Data Warehouse, Databricks, CPG.
Posted 1 week ago
6.0 - 9.0 years
10 - 20 Lacs
Hyderabad
Hybrid
Job Title: Data Modeller. Location: Hyderabad (Hybrid). Criteria: Proficient in relational data modeling and business analysis, with metadata management experience. Skilled in developing conceptual, logical, and physical data models for varied environments. Familiarity with data modeling tools like ER/Studio, EA Sparx, and Collibra. Experience in creating and governing data models, including financial and reference models. Ability to coordinate and resolve model discrepancies across platforms with a focus on integration. Relevant Experience: 6-9 years. Technical Requirements: Experience in relational/normalized data modelling; strong business analysis skills; proficiency in data analysis; ability to develop conceptual, logical, and physical data models to support business requirements; experience in developing enterprise data models for OLTP and Analytics/AI/ML environments; familiarity with data modelling tools such as ER/Studio, EA Sparx, and Collibra; knowledge of metadata modelling, management, and tooling; implementation experience of physical models for interfaces, data stores, and cloud-based solutions; participation in governing data models; familiarity with finance and reference data models such as BIAN, ISO 20022, FIBO, and FSLDM. Functional Requirements: Analyzing data requirements; creating/modifying entities and attributes in the BDM; identifying reference domains; peer reviewing BDM changes made by other Data Modelers; reviewing physical data models to ensure alignment with logical definitions; uploading extension attributes into Collibra; coordinating modelling efforts across the Platform; resolving any overlap/duplication/contradiction in the Platform BDM; adopting and enforcing standards and guidelines for the BDM; preparing the BDM for integration with the GDM; attending cross-Platform sessions to coordinate at an enterprise level.
Posted 1 week ago
5.0 - 10.0 years
20 - 30 Lacs
Hyderabad
Remote
Hiring for a top MNC for a Data Modeler position (long-term contract - 2+ years). The Data Modeler designs and implements data models for Microsoft Fabric and Power BI, supporting the migration from Oracle/Informatica. This offshore role ensures optimized data structures for performance and reporting needs. The successful candidate will bring expertise in data modeling and a collaborative approach. Responsibilities: Develop conceptual, logical, and physical data models for Microsoft Fabric and Power BI solutions. Implement data models for relational, dimensional, and data lake environments on target platforms. Collaborate with the Offshore Data Engineer and Onsite Data Modernization Architect to ensure model alignment. Define and govern data modeling standards, tools, and best practices. Optimize data structures for query performance and scalability. Provide updates on modeling progress and dependencies to the Offshore Project Manager. Skills: Bachelor's or master's degree in computer science, data science, or a related field. 5+ years of data modeling experience with relational and NoSQL platforms. Proficiency with modeling tools (e.g., Erwin, ER/Studio) and SQL. Experience with Microsoft Fabric, data lakes, and BI data structures. Strong analytical and communication skills for team collaboration. Attention to detail with a focus on performance and consistency. Experience in team management, communication, and presentation.
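One concrete artifact a Fabric/Power BI data modeler typically owns is a conformed date dimension. The T-SQL sketch below is illustrative only: the object names are assumptions, and the availability of each construct in a given Fabric workload should be verified.

```sql
-- Date dimension for a Power BI star schema, populated with a recursive CTE.
CREATE TABLE dbo.dim_date (
    date_key     INT         NOT NULL,  -- yyyymmdd surrogate
    full_date    DATE        NOT NULL,
    year_number  SMALLINT    NOT NULL,
    month_number TINYINT     NOT NULL,
    month_name   VARCHAR(12) NOT NULL
);

WITH dates AS (
    SELECT CAST('2025-01-01' AS DATE) AS d
    UNION ALL
    SELECT DATEADD(DAY, 1, d) FROM dates WHERE d < '2025-12-31'
)
INSERT INTO dbo.dim_date (date_key, full_date, year_number, month_number, month_name)
SELECT YEAR(d) * 10000 + MONTH(d) * 100 + DAY(d),
       d,
       YEAR(d),
       MONTH(d),
       DATENAME(MONTH, d)
FROM dates
OPTION (MAXRECURSION 366);
```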
Posted 2 weeks ago
9.0 - 14.0 years
20 - 32 Lacs
Hyderabad, Bengaluru, Delhi / NCR
Hybrid
Data Architect/Data Modeler role. Scope & Responsibilities: Enterprise data architecture for a functional domain or for a product group. Designs and governs delivery of the domain data architecture / ensures delivery as per design. Ensures consistency in approach for the data modelling of the different solutions of the domain. Designs and ensures delivery of data integration across the solutions of the domain. General Expertise - Critical: methodology expertise on data architecture and modeling, from business requirements and functional specifications to data modeling. Critical: data warehousing and business intelligence data product modeling (Inmon/Kimball/Data Vault/Codd modeling patterns). Business/functional knowledge of the domain; this requires understanding of business terminology, knowledge of business processes related to the domain, and awareness of key principles and objectives, business trends, and evolution. Awareness of master data management / data management and stewardship processes. Knowledge of data persistency technologies: SQL (ANSI SQL:2003 for structured relational data querying and ANSI SQL:2023 for XML, JSON, and property graph querying), Snowflake specifics, and database structures for performance optimization. Awareness of NoSQL and other data persistency technologies. Proficient level of business English and technical writing. Nice to have: project delivery expertise through an agile approach, with experience/knowledge of methodologies such as Scrum, SAFe 5.0, and product-based organization. Technical Stack expertise: SAP PowerDesigner modeling (CDM, LDM, PDM); Snowflake general concepts, specifically DDL & DML, Snowsight, and Data Exchange/Data Sharing concepts; AWS S3 & Athena (as a query user); Confluence & Jira (as a contributing user). Nice to have: Bitbucket (as a basic user).
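Because this role lists Data Vault among its modelling patterns and Snowflake DDL in its stack, here is a minimal hub-and-satellite sketch in Snowflake-flavoured SQL; the entity and column names are invented for illustration.

```sql
-- Data Vault basics: the hub carries the business key, the satellite carries
-- descriptive history; links (not shown) would relate hubs to one another.
CREATE TABLE hub_customer (
    customer_hk   BINARY(32)    PRIMARY KEY,  -- hash of the business key
    customer_bk   VARCHAR(40)   NOT NULL,     -- business key from the source
    load_dts      TIMESTAMP_NTZ NOT NULL,
    record_source VARCHAR(50)   NOT NULL
);

CREATE TABLE sat_customer_profile (
    customer_hk   BINARY(32)    NOT NULL REFERENCES hub_customer (customer_hk),
    load_dts      TIMESTAMP_NTZ NOT NULL,
    customer_name VARCHAR(100),
    segment       VARCHAR(50),
    hash_diff     BINARY(32),                 -- change-detection hash
    PRIMARY KEY (customer_hk, load_dts)
);
```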
Posted 2 weeks ago
15.0 - 20.0 years
13 - 18 Lacs
Bengaluru
Work from Office
Project Role: Data Architect. Project Role Description: Define the data requirements and structure for the application. Model and design the application data structure, storage, and integration. Must-have skills: Data Engineering. Good-to-have skills: NA. Minimum 5 year(s) of experience is required. Educational Qualification: 15 years full-time education. Summary: As a Data Architect, you will define the data requirements and structure for the application. Your typical day will involve modeling and designing the application data structure, storage, and integration, ensuring that the data architecture aligns with the overall business objectives and technical specifications. You will collaborate with various teams to ensure that the data architecture is robust, scalable, and efficient, while also addressing any challenges that arise during the development process. Your role will be pivotal in shaping the data landscape of the organization, enabling data-driven decision-making and fostering innovation through effective data management practices. Roles & Responsibilities: Expected to be an SME. Collaborate and manage the team to perform. Responsible for team decisions. Engage with multiple teams and contribute on key decisions. Provide solutions to problems for their immediate team and across multiple teams. Facilitate knowledge sharing and mentoring within the team to enhance overall performance. Continuously assess and improve data architecture practices to align with industry standards. Professional & Technical Skills: Must-have skills: Proficiency in Data Engineering. Strong understanding of data modeling techniques and best practices. Experience with data integration tools and ETL processes. Familiarity with cloud-based data storage solutions and architectures. Ability to design and implement data governance frameworks. Additional Information: The candidate should have a minimum of 5 years of experience in Data Engineering. This position is based at our Bengaluru office. A 15-year full-time education is required.
Posted 2 weeks ago
15.0 - 20.0 years
13 - 18 Lacs
Bengaluru
Work from Office
Project Role: Data Architect. Project Role Description: Define the data requirements and structure for the application. Model and design the application data structure, storage, and integration. Must-have skills: Data Engineering. Good-to-have skills: NA. Minimum 7.5 year(s) of experience is required. Educational Qualification: 15 years full-time education. Summary: As a Data Architect, you will define the data requirements and structure for the application. Your typical day will involve modeling and designing the application data structure, storage, and integration, ensuring that the data architecture aligns with the overall business objectives and technical specifications. You will collaborate with various teams to ensure that the data architecture is robust, scalable, and efficient, while also addressing any challenges that arise during the development process. Your role will be pivotal in guiding the team towards best practices in data management and architecture, ultimately contributing to the success of the application and the organization as a whole. Roles & Responsibilities: Expected to be an SME. Collaborate and manage the team to perform. Responsible for team decisions. Engage with multiple teams and contribute on key decisions. Provide solutions to problems for their immediate team and across multiple teams. Facilitate knowledge sharing sessions to enhance team capabilities. Develop and maintain documentation related to data architecture and design. Professional & Technical Skills: Must-have skills: Proficiency in Data Engineering. Strong understanding of data modeling techniques and best practices. Experience with data integration tools and ETL processes. Familiarity with cloud data platforms and services. Ability to design and implement data storage solutions. Additional Information: The candidate should have a minimum of 7.5 years of experience in Data Engineering. This position is based at our Bengaluru office. A 15-year full-time education is required.
Posted 2 weeks ago
15.0 - 25.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Data Modeler. Project Role Description: Work with key business representatives, data owners, end users, application designers, and data architects to model current and new data. Must-have skills: Data Architecture Principles. Good-to-have skills: NA. Minimum 15 year(s) of experience is required. Educational Qualification: 15 years full-time education. Summary: As a Data Modeler, you will collaborate with key stakeholders, including business representatives, data owners, and architects, to model existing and new data, ensuring alignment with business requirements and data architecture principles. Roles & Responsibilities: Expected to be an SME with deep knowledge and experience. Should have influencing and advisory skills. Responsible for team decisions. Engage with multiple teams and contribute on key decisions. Expected to provide solutions to problems that apply across multiple teams. Lead data modeling efforts for various projects. Develop and maintain data models for databases. Ensure data models are aligned with business requirements and data architecture principles. Professional & Technical Skills: Must-have skills: Proficiency in Data Architecture Principles. Strong understanding of data modeling concepts. Experience with data modeling tools such as ERwin or PowerDesigner. Knowledge of database management systems like Oracle, SQL Server, or MySQL. Hands-on experience in designing and implementing data models. Ability to analyze complex data requirements and translate them into data models. Additional Information: The candidate should have a minimum of 15 years of experience in Data Architecture Principles. This position is based at our Bengaluru office. A 15-year full-time education is required.
Posted 2 weeks ago
5.0 - 10.0 years
4 - 8 Lacs
Chennai, Guindy
Work from Office
Data Modeller - Chennai (Guindy), India - Information Technology - 17074. Overview: A Data Modeller is responsible for designing, implementing, and managing data models that support the strategic and operational needs of an organization. This role involves translating business requirements into data structures, ensuring consistency, accuracy, and efficiency in data storage and retrieval processes. Responsibilities: Develop and maintain conceptual, logical, and physical data models. Collaborate with business analysts, data architects, and stakeholders to gather data requirements. Translate business needs into efficient database designs. Optimize and refine existing data models to support analytics and reporting. Ensure data models support data governance, quality, and security standards. Work closely with database developers and administrators on implementation. Document data models, metadata, and data flows. Requirements: Bachelor's or Master's degree in Computer Science, Information Systems, Data Science, or a related field. Data modeling tools: ER/Studio, ERwin, SQL Developer Data Modeler, or similar. Database technologies: proficiency in SQL and familiarity with databases like Oracle, SQL Server, MySQL, and PostgreSQL. Data warehousing: experience with dimensional modeling, star and snowflake schemas. ETL processes: knowledge of Extract, Transform, Load processes and tools. Cloud platforms: familiarity with cloud data services (e.g., AWS Redshift, Azure Synapse, Google BigQuery). Metadata management & data governance: understanding of data cataloging and governance principles. Strong analytical and problem-solving skills. Excellent communication skills to work with business stakeholders and technical teams. Ability to document models clearly and explain complex data relationships. 5+ years in data modeling, data architecture, or related roles. Experience working in Agile or DevOps environments is often preferred. Understanding of normalization/denormalization. Experience with business intelligence and reporting tools. Familiarity with master data management (MDM) principles.
Posted 2 weeks ago
7.0 - 11.0 years
13 - 18 Lacs
Hyderabad
Work from Office
Project Role: Data Architect. Project Role Description: Define the data requirements and structure for the application. Model and design the application data structure, storage, and integration. Must-have skills: Microsoft Power Business Intelligence (BI). Good-to-have skills: NA. Minimum 5 year(s) of experience is required. Educational Qualification: 15 years full-time education. Summary: As a Data Architect, you will define the data requirements and structure for the application. Your typical day will involve modeling and designing the application data structure, storage, and integration, ensuring that the data architecture aligns with business objectives and supports efficient data management and retrieval processes. You will collaborate with various stakeholders to understand their data needs and translate them into effective data solutions, contributing to the overall success of the projects you are involved in. Roles & Responsibilities: Expected to be an SME. Collaborate and manage the team to perform. Responsible for team decisions. Engage with multiple teams and contribute on key decisions. Provide solutions to problems for their immediate team and across multiple teams. Facilitate knowledge sharing sessions to enhance team capabilities. Develop and maintain documentation related to data architecture and design. Professional & Technical Skills: Must-have skills: Proficiency in Microsoft Power Business Intelligence (BI). Strong understanding of data modeling techniques and best practices. Experience with data integration tools and ETL processes. Familiarity with cloud data storage solutions and architectures. Ability to analyze and optimize data workflows for performance. Additional Information: The candidate should have a minimum of 5 years of experience in Microsoft Power Business Intelligence (BI). This position is based in Hyderabad. A 15-year full-time education is required.
Posted 2 weeks ago
4.0 - 6.0 years
6 - 8 Lacs
Bengaluru
Work from Office
Synechron is seeking a skilled Oracle/SQL Developer with expertise in Informatica Intelligent Cloud Services (IICS) to join our data engineering team. The successful candidate will be responsible for designing, developing, and maintaining robust database solutions and ETL workflows that support business intelligence and data integration initiatives. This role is vital in enabling data-driven decision-making by ensuring high-quality, reliable, and efficient data processing across the organization. Software Requirements - Required: Oracle SQL (latest stable versions or as specified by project); Informatica Intelligent Cloud Services (IICS) for data integration and ETL processes; SQL development tools (e.g., SQL Developer, TOAD, or similar); version control systems (e.g., Git); data modeling and diagramming tools (e.g., ERwin, Visio). Preferred: data warehousing tools and technologies; familiarity with cloud platforms (AWS, Azure, GCP) for deployment considerations; data governance and data quality tools. Overall Responsibilities: Design, develop, and optimize database structures using Oracle SQL to ensure maximum performance and reliability. Develop, modify, and troubleshoot ETL workflows using IICS to extract, transform, and load data from various sources. Collaborate with business analysts, data architects, and stakeholders to interpret requirements into technical specifications and data models. Monitor, troubleshoot, and refine ETL processes to ensure data accuracy, integrity, and timeliness. Document database schemas, ETL workflows, and data integration processes for operational clarity and future reference. Conduct code reviews and promote best practices in data development and implementation. Stay current with emerging data management tools, techniques, and industry standards to continuously improve data solutions. Support data governance and quality initiatives, ensuring compliance with organizational standards. Performance outcomes: reliable and high-performance database solutions ensuring rapid data retrieval and processing; effective ETL workflows with minimal downtime and data inconsistencies; clear documentation and adherence to best practices facilitating easy knowledge transfer and maintenance; positive stakeholder feedback on data accuracy and timeliness. Technical Skills (by category) - Programming languages: mandatory - SQL, PL/SQL (Oracle); preferred - data scripting languages such as Python and shell scripting for automation. Databases/data management: Oracle Database (latest versions preferred); data modeling, indexing, and performance optimization; understanding of data warehousing concepts, star schema, and data mart design. Cloud technologies: cloud data integration and deployment experience (AWS, Azure, GCP) is a plus. Frameworks and libraries: familiarity with ETL frameworks, especially the Informatica IICS platform, and related tools. Development tools & methodologies: SQL development environments (e.g., Oracle SQL Developer, TOAD); version control with Git, GitHub, or equivalent; Agile and Scrum methodologies for project management and delivery. Security protocols: basic understanding of data security, encryption, and governance strategies. Experience Requirements: 4+ years of experience in Oracle SQL development and database administration. Proven experience with ETL design, development, and management using Informatica Intelligent Cloud Services (IICS). Familiarity with data warehousing concepts, including data modeling and data quality standards. Experience with data integration involving multiple data sources and complex workflows. Knowledge of Agile project environments is a plus. Alternative pathways: candidates with extensive hands-on experience with Oracle databases and ETL development using other tools, who are willing to adapt to IICS, may also be considered. Day-to-Day Activities: Develop and optimize SQL and PL/SQL scripts for data extraction, transformation, and loading (a generic example follows this listing). Design and implement scalable ETL workflows within IICS, monitoring their performance and integrity. Collaborate with analysts and stakeholders to gather and refine data requirements. Troubleshoot issues within database schemas and ETL processes, providing timely resolutions. Participate in code reviews, ensuring adherence to coding standards and best practices. Maintain detailed documentation for all data schemas, workflows, and procedures. Conduct regular reviews of data quality and implement enhancements as needed. Stay informed of industry best practices and advances in database and data integration technologies. Qualifications: Bachelor's degree in Computer Science, Information Technology, or a related field. Hands-on experience with Informatica Intelligent Cloud Services (IICS) and ETL development. Certifications such as Oracle OCA/OCP or Informatica IICS certifications are a plus. Knowledge of data warehousing, data modeling, and data governance best practices. Continuous learning mindset and ability to adapt to evolving technology landscapes. Professional Competencies: Strong analytical and problem-solving skills, with attention to detail. Effective communication skills for collaborating with technical and non-technical stakeholders. Ability to work independently and as part of a team to meet deadlines and project goals. Adaptability to rapidly changing environments and new tools or methodologies. Commitment to high-quality work and process improvement. Strong organizational skills for managing multiple tasks concurrently.
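For the Oracle SQL/PL-SQL and IICS load work described above, an incremental upsert is a representative unit of ETL logic; the MERGE below is a generic Oracle-style sketch with assumed schema and column names.

```sql
-- Upsert a customer dimension from a staging delta: update matched rows,
-- insert new ones, and stamp audit columns.
MERGE INTO dw.dim_customer tgt
USING stg.customer_delta src
   ON (tgt.customer_id = src.customer_id)
WHEN MATCHED THEN
    UPDATE SET tgt.customer_name = src.customer_name,
               tgt.segment       = src.segment,
               tgt.updated_at    = SYSDATE
WHEN NOT MATCHED THEN
    INSERT (customer_id, customer_name, segment, created_at)
    VALUES (src.customer_id, src.customer_name, src.segment, SYSDATE);
```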
Posted 2 weeks ago
1.0 - 5.0 years
4 - 7 Lacs
Bengaluru
Work from Office
We are looking for an experienced and highly skilled Data Architect with 5-7 years of expertise in Data Architecture, Data Governance, and Data Modeling to join our growing team. This role involves designing and implementing comprehensive data solutions, ensuring data quality, consistency, and governance across the organization. You will collaborate with various stakeholders to create robust data architectures, enforce data governance policies, and develop high-quality data models that support business intelligence and analytics. Responsibilities: Design and implement enterprise-level data architectures to support business operations, analytics, and reporting needs. Ensure scalability, reliability, and security in data architecture solutions, optimizing for performance and efficiency. Work with data engineering teams to select the right data platforms, technologies, and methodologies for system design and development. Oversee the integration of various data sources into a centralized data repository, ensuring efficient data flow across systems. Develop and implement data governance frameworks and policies to ensure high-quality data management across the organization. Define and enforce standards for data security, privacy, and compliance (e.g., GDPR, CCPA). Ensure data ownership, stewardship, and accountability are clearly defined. Monitor data usage and quality, identifying areas for improvement and ensuring data consistency across systems. Collaborate with business and IT teams to ensure data is trustworthy, accessible, and compliant with organizational policies. Create and maintain logical, physical, and conceptual data models for transactional and analytical databases. Develop complex data models that reflect business processes, ensuring alignment between technical and business needs. Use industry-standard methodologies to ensure data models are scalable, flexible, and efficient. Collaborate with cross-functional teams to validate data models and ensure alignment with business requirements and data strategy. Document and maintain metadata, schema definitions, and data dictionaries to ensure consistency and clarity. Design, implement, and optimize data architectures to support data integration, data analytics, and business intelligence. Lead efforts to establish data governance processes, policies, and practices across the organization. Implement data quality and consistency frameworks to ensure the integrity of the data across all platforms and systems. Work with stakeholders to gather and analyze business requirements and translate them into effective data architecture and modeling solutions. Provide expertise in selecting the appropriate tools and technologies for data storage, retrieval, and integration. Collaborate with data engineers, data analysts, and other teams to ensure effective data architecture implementation and governance. Document data models, architecture designs, and governance frameworks to maintain clarity and alignment with organizational objectives. Keep up to date with emerging data management trends and technologies, recommending innovative solutions to improve data quality and management. Requirements: Bachelor's degree in Computer Science, Information Technology, Data Science, or a related field (Master's degree preferred). 5-6 years of proven experience in Data Architecture, Data Governance, and Data Modeling. Strong hands-on experience with data modeling tools such as Erwin, IBM InfoSphere, Microsoft Visio, or similar tools.
In-depth knowledge of data governance frameworks, including best practices and standards (e.g., DAMA-DMBOK). Proficient in data architecture principles, including designing scalable and high-performance data storage systems. Experience with SQL and NoSQL databases, data warehousing, ETL processes, and cloud-based platforms (AWS, Azure, or Google Cloud). Strong understanding of data privacy, security, and compliance standards (e.g., GDPR, HIPAA, CCPA). Familiarity with big data technologies like Hadoop, Spark, or Kafka is a plus. Excellent communication skills to present complex data architecture concepts to both technical and non-technical stakeholders.
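Since the role spans data governance, metadata, and privacy compliance, one lightweight practice worth showing is keeping the data dictionary next to the model itself via COMMENT statements (supported, with small syntax differences, by Oracle, PostgreSQL, and Snowflake); the objects named here are hypothetical.

```sql
-- Schema documentation and PII flags captured as in-database comments, which
-- catalog tools can then harvest for the data dictionary.
COMMENT ON TABLE dw.dim_customer IS
    'Conformed customer dimension; one row per customer surrogate key.';

COMMENT ON COLUMN dw.dim_customer.customer_key IS
    'Surrogate key generated at load time; carries no business meaning.';

COMMENT ON COLUMN dw.dim_customer.email IS
    'PII - subject to GDPR/CCPA retention and masking policies.';
```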
Posted 2 weeks ago