7.0 - 12.0 years
25 - 35 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
Role - Data Modeler / Senior Data Modeler
Exp - 5 to 12 Yrs
Locs - Hyderabad, Pune, Bengaluru
Position - Permanent

Must-have skills:
- Strong SQL
- Strong Data Warehousing skills
- ER/Relational/Dimensional Data Modeling
- Data Vault Modeling
- OLAP, OLTP
- Schemas & Data Marts

Good-to-have skills:
- Data Vault
- ERwin / ER Studio
- Cloud Platforms (AWS or Azure)
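The dimensional-modeling skills this listing asks for (star schemas, facts, dimensions, OLAP-style rollups) can be illustrated with a minimal sketch. All table and column names below are invented for illustration, not taken from the listing.

```python
import sqlite3

# Minimal star schema: one fact table joined to two dimension tables.
# Table/column names (dim_customer, fact_sales, ...) are illustrative only.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.executescript("""
CREATE TABLE dim_customer (
    customer_key INTEGER PRIMARY KEY,
    customer_name TEXT,
    region TEXT
);
CREATE TABLE dim_date (
    date_key INTEGER PRIMARY KEY,   -- surrogate key, e.g. 20240115
    full_date TEXT,
    year INTEGER
);
CREATE TABLE fact_sales (
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    date_key INTEGER REFERENCES dim_date(date_key),
    amount REAL
);
""")

cur.executemany("INSERT INTO dim_customer VALUES (?, ?, ?)",
                [(1, "Acme", "South"), (2, "Globex", "North")])
cur.executemany("INSERT INTO dim_date VALUES (?, ?, ?)",
                [(20240115, "2024-01-15", 2024)])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                [(1, 20240115, 100.0), (2, 20240115, 250.0), (1, 20240115, 50.0)])

# A typical OLAP-style rollup: total sales by region.
rows = cur.execute("""
    SELECT c.region, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_customer c ON c.customer_key = f.customer_key
    GROUP BY c.region
    ORDER BY c.region
""").fetchall()
print(rows)  # [('North', 250.0), ('South', 150.0)]
```

The same fact/dimension split is what tools like ERwin or ER Studio would capture as a physical dimensional model.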
Posted 1 week ago
1.0 - 3.0 years
3 - 6 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
Locations - Pune/Bangalore/Hyderabad/Indore
Contract duration - 6 months

Responsibilities:
- Develop conceptual, logical, and physical data models, and implement RDBMS, operational data stores (ODS), data marts, and data lakes on target platforms.
- Implement business and IT data requirements through new data strategies and designs across all data platforms (relational & dimensional - must; NoSQL - optional) and data tools (reporting, visualization, analytics, and machine learning).
- Work with business and application/solution teams to implement data strategies, build data flows, and develop conceptual/logical/physical data models.
- Define and govern data modeling and design standards, tools, best practices, and related development for enterprise data models.
- Identify the architecture, infrastructure, and interfaces to data sources, tools supporting automated data loads, security concerns, analytic models, and data visualization.
- Hands-on modeling, design, configuration, installation, performance tuning, and sandbox POCs.
- Work proactively and independently to address project requirements, and articulate issues/challenges to reduce project delivery risks.
- Must have a payments background.

Skills:
- Hands-on relational, dimensional, and/or analytic experience (using RDBMS, dimensional, and NoSQL data platform technologies, and ETL and data ingestion protocols).
- Experience with data warehouse, data lake, and enterprise big data platforms in multi-data-center contexts required.
- Good knowledge of metadata management, data modeling, and related tools (Erwin, ER Studio, or others) required.
- Experience in team management, communication, and presentation.
- Experience with Erwin, Visio, or any other relevant tool.
Posted 1 week ago
8.0 - 12.0 years
3 - 6 Lacs
Bengaluru
Work from Office
Job Information:
- Job Opening ID: ZR_2385_JOB
- Date Opened: 23/10/2024
- Industry: IT Services
- Work Experience: 8-12 years
- Job Title: Data Modeller
- City: Bangalore South
- Province: Karnataka
- Country: India
- Postal Code: 560066
- Number of Positions: 1

Locations - Pune/Bangalore/Hyderabad/Indore
Contract duration - 6 months

Responsibilities:
- Develop conceptual, logical, and physical data models, and implement RDBMS, operational data stores (ODS), data marts, and data lakes on target platforms.
- Implement business and IT data requirements through new data strategies and designs across all data platforms (relational & dimensional - must; NoSQL - optional) and data tools (reporting, visualization, analytics, and machine learning).
- Work with business and application/solution teams to implement data strategies, build data flows, and develop conceptual/logical/physical data models.
- Define and govern data modeling and design standards, tools, best practices, and related development for enterprise data models.
- Identify the architecture, infrastructure, and interfaces to data sources, tools supporting automated data loads, security concerns, analytic models, and data visualization.
- Hands-on modeling, design, configuration, installation, performance tuning, and sandbox POCs.
- Work proactively and independently to address project requirements, and articulate issues/challenges to reduce project delivery risks.
- Must have a payments background.

Skills:
- Hands-on relational, dimensional, and/or analytic experience (using RDBMS, dimensional, and NoSQL data platform technologies, and ETL and data ingestion protocols).
- Experience with data warehouse, data lake, and enterprise big data platforms in multi-data-center contexts required.
- Good knowledge of metadata management, data modeling, and related tools (Erwin, ER Studio, or others) required.
- Experience in team management, communication, and presentation.
- Experience with Erwin, Visio, or any other relevant tool.
Posted 1 week ago
6.0 - 10.0 years
12 - 18 Lacs
Hyderabad
Hybrid
Data Modeller - Role & Responsibilities

Role Overview:
We are seeking a talented and forward-thinking Data Modeler for one of the large financial services GCCs based in Hyderabad. Responsibilities include designing and implementing data models, ensuring data integrity and optimization, collaborating with stakeholders, supporting data queries, and ensuring data governance and compliance.

Technical Requirements:
- Experience in relational/normalized data modelling
- Strong business analysis skills
- Proficiency in data analysis
- Ability to develop conceptual, logical, and physical data models to support business requirements
- Experience in developing enterprise data models for OLTP and Analytics/AI/ML environments
- Familiarity with data modelling tools such as ER Studio, EA Sparx, and Collibra
- Knowledge of metadata modelling, management, and tooling
- Implementation experience with physical models for interfaces, data stores, and cloud-based solutions
- Participation in governing data models
- Familiarity with finance and reference data models such as BIAN, ISO 20022, FIBO, and FSLDM

Functional Requirements:
- Analyzing data requirements
- Creating/modifying entities and attributes in the BDM
- Identifying reference domains
- Peer reviewing BDM changes made by other data modelers
- Reviewing physical data models to ensure alignment with logical definitions
- Uploading extension attributes into Collibra
- Coordinating modelling efforts across the Platform
- Resolving any overlap, duplication, or contradiction in the Platform BDM
- Adopting and enforcing standards and guidelines for the BDM
- Preparing the BDM for integration with the GDM
- Attending cross-Platform sessions to coordinate at an enterprise level

This role offers a compelling opportunity for a seasoned Data Modeler to drive transformative cloud initiatives within the financial sector, delivering innovative cloud solutions that align with business imperatives and regulatory requirements.
Qualification: Engineering Graduate / Postgraduate

Criteria:
- Proficient in relational data modeling and business analysis, with metadata management experience
- Skilled in developing conceptual, logical, and physical data models for varied environments
- Familiarity with data modeling tools like ER Studio, EA Sparx, and Collibra
- Experience in creating and governing data models, including financial and reference models
- Ability to coordinate and resolve model discrepancies across platforms with a focus on integration

Relevant Experience: 6-9 years
Posted 1 week ago
5.0 - 7.0 years
6 - 10 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
- 5+ years of experience in data modelling: designing, implementing, and maintaining data models to support data quality, performance, and scalability.
- Proven experience as a Data Modeler, working with data analysts, data architects, and business stakeholders to ensure data models are aligned with business requirements.
- Expertise in data modeling tools (ER/Studio, Erwin), IDM/ARDM models, and CPG/Manufacturing/Sales/Finance/Supplier/Customer domains.
- Experience with at least one MPP database technology such as Databricks Lakehouse, Redshift, Synapse, Teradata, or Snowflake.
- Experience with version control systems like GitHub and deployment & CI tools.
- Experience with metadata management, data lineage, and data glossaries is a plus.
- Working knowledge of agile development, including DevOps and DataOps concepts.
- Working knowledge of SAP data models, particularly in the context of HANA and S/4HANA, and retail data like IRI and Nielsen.

Must Have: Data Modeling, Data Modeling Tool experience, SQL
Nice to Have: SAP HANA, Data Warehouse, Databricks, CPG
Posted 1 week ago
6.0 - 9.0 years
10 - 20 Lacs
Hyderabad
Hybrid
Job Title: Data Modeller
Location: Hyderabad (Hybrid)

Criteria:
- Proficient in relational data modeling and business analysis, with metadata management experience
- Skilled in developing conceptual, logical, and physical data models for varied environments
- Familiarity with data modeling tools like ER Studio, EA Sparx, and Collibra
- Experience in creating and governing data models, including financial and reference models
- Ability to coordinate and resolve model discrepancies across platforms with a focus on integration

Relevant Experience: 6-9 years

Technical Requirements:
- Experience in relational/normalized data modelling
- Strong business analysis skills
- Proficiency in data analysis
- Ability to develop conceptual, logical, and physical data models to support business requirements
- Experience in developing enterprise data models for OLTP and Analytics/AI/ML environments
- Familiarity with data modelling tools such as ER Studio, EA Sparx, and Collibra
- Knowledge of metadata modelling, management, and tooling
- Implementation experience with physical models for interfaces, data stores, and cloud-based solutions
- Participation in governing data models
- Familiarity with finance and reference data models such as BIAN, ISO 20022, FIBO, and FSLDM

Functional Requirements:
- Analyzing data requirements
- Creating/modifying entities and attributes in the BDM
- Identifying reference domains
- Peer reviewing BDM changes made by other data modelers
- Reviewing physical data models to ensure alignment with logical definitions
- Uploading extension attributes into Collibra
- Coordinating modelling efforts across the Platform
- Resolving any overlap, duplication, or contradiction in the Platform BDM
- Adopting and enforcing standards and guidelines for the BDM
- Preparing the BDM for integration with the GDM
- Attending cross-Platform sessions to coordinate at an enterprise level
Posted 2 weeks ago
5.0 - 10.0 years
20 - 30 Lacs
Hyderabad
Remote
Hiring for a top MNC for a Data Modeler position (long-term contract - 2+ years)

The Data Modeler designs and implements data models for Microsoft Fabric and Power BI, supporting the migration from Oracle/Informatica. This offshore role ensures optimized data structures for performance and reporting needs. The successful candidate will bring expertise in data modeling and a collaborative approach.

Responsibilities:
- Develop conceptual, logical, and physical data models for Microsoft Fabric and Power BI solutions.
- Implement data models for relational, dimensional, and data lake environments on target platforms.
- Collaborate with the Offshore Data Engineer and Onsite Data Modernization Architect to ensure model alignment.
- Define and govern data modeling standards, tools, and best practices.
- Optimize data structures for query performance and scalability.
- Provide updates on modeling progress and dependencies to the Offshore Project Manager.

Skills:
- Bachelor's or master's degree in computer science, data science, or a related field.
- 5+ years of data modeling experience with relational and NoSQL platforms.
- Proficiency with modeling tools (e.g., Erwin, ER/Studio) and SQL.
- Experience with Microsoft Fabric, data lakes, and BI data structures.
- Strong analytical and communication skills for team collaboration.
- Attention to detail with a focus on performance and consistency.
- Team management, communication, and presentation skills.
Posted 2 weeks ago
8.0 - 10.0 years
12 - 15 Lacs
Bengaluru
Work from Office
What you'll do:
- Work closely with product management, internal stakeholders, and customers to identify, validate, and document new system requests; oversee proper implementation by providing acceptance criteria; and act as a liaison between business users and developers.
- Be an integral part of our dynamic agile R&D team, become an expert in our innovative product, and contribute to the product's vision.
- Perform enterprise-level and project-level data modelling, including model management, consolidation, and integration.
- Understand business requirements and translate them into Conceptual (CDM), Logical (LDM), and Physical (PDM) data models using industry standards.
- Manage data models for multiple projects, making sure the models in all projects are synchronized and adhere to the Enterprise Architecture with proper change management.
- Establish and manage standards for naming and abbreviation conventions, data definitions, ownership, documentation, procedures, and techniques.
- Adopt, support, and participate in the implementation of the Enterprise Data Management Strategy.
- Create P&C, Life & Health insurance/BFSI-specific target data models, metadata layers, and data marts.
- Apply Medallion (Lakehouse) Architecture.
- Collaborate with the application team to implement data flows and samples, and develop conceptual/logical data models.
- Ensure reusability of the model and approach across different business requirements.
- Support data-specific system integration and data migration.
- Good to have: experience in modelling MongoDB schemas.

What to have for this position - must-have skills:
- Minimum 5+ years as a data modeler involved in mid- to large-scale system development projects, with experience in data analysis, data modelling, and data mart design; overall experience of 8+ years.
- Experience in data analysis/profiling and reverse engineering of data.
- Experience working on a data migration project is a plus.
- Prior experience in the BFSI domain (insurance would be a plus).
- Experience in ER Studio, Toad Data Modeler, or an equivalent tool.
- Strong in data warehousing concepts.
- Strong database development skills, including complex SQL queries and stored procedures.
- Strong in Medallion (Lakehouse) Architecture.
- Good verbal and written communication skills in English.
- Ability to work with minimal guidance or supervision in a time-critical environment.
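The Medallion (Lakehouse) Architecture this listing emphasizes layers data as bronze (raw), silver (cleaned/validated), and gold (business aggregates). A minimal plain-Python sketch of the idea follows; the record shapes and field names are invented for illustration.

```python
# Medallion layering sketch: raw "bronze" records are cleaned into
# "silver", then aggregated into a "gold" reporting view.
# Field names (policy_id, premium, lob) are illustrative only.

bronze = [  # raw ingested rows, possibly dirty
    {"policy_id": "P1", "premium": "1200", "lob": "Life"},
    {"policy_id": "P2", "premium": "bad-value", "lob": "Health"},
    {"policy_id": "P3", "premium": "800", "lob": "Life"},
]

def to_silver(rows):
    """Validate and type-cast; drop rows that fail quality checks."""
    out = []
    for r in rows:
        try:
            out.append({"policy_id": r["policy_id"],
                        "premium": float(r["premium"]),
                        "lob": r["lob"]})
        except (ValueError, KeyError):
            pass  # a real pipeline would quarantine, not silently drop
    return out

def to_gold(rows):
    """Aggregate silver rows into a business-facing summary by line of business."""
    totals = {}
    for r in rows:
        totals[r["lob"]] = totals.get(r["lob"], 0.0) + r["premium"]
    return totals

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # {'Life': 2000.0} - the bad Health row was dropped at silver
```

In Databricks these layers would typically be Delta tables; the point here is only the progressive refinement contract between them.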
Posted 2 weeks ago
8.0 - 13.0 years
20 - 35 Lacs
Pune, Chennai
Work from Office
Azure Data Modeler - the candidate will have a strong background in data modeling and database design, and experience with Azure data services. The role requires excellent analytical skills and the ability to translate business requirements into effective data models.
Posted 2 weeks ago
2.0 - 6.0 years
10 - 15 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
- 2+ years of experience in data modelling: designing, implementing, and maintaining data models to support data quality, performance, and scalability.
- Proven experience as a Data Modeller, working with data analysts, data architects, and business stakeholders to ensure data models are aligned with business requirements.
- Expertise in data modelling tools (ER/Studio, Erwin), IDM/ARDM models, and CPG/Manufacturing/Sales/Finance/Supplier/Customer domains.
- Experience with at least one MPP database technology such as Databricks Lakehouse, Redshift, Synapse, Teradata, or Snowflake.
- Experience with version control systems like GitHub and deployment & CI tools.
- Experience with metadata management, data lineage, and data glossaries is a plus.
- Working knowledge of agile development, including DevOps and DataOps concepts.
- Working knowledge of SAP data models, particularly in HANA and S/4HANA, and retail data like IRI and Nielsen.

Must Have: Data Modeling, Data Modeling Tool experience, SQL
Nice to Have: SAP HANA, Data Warehouse, Databricks, CPG

Immediate joiners only. Please apply only if you meet the given requirements.
Posted 3 weeks ago
5.0 - 10.0 years
30 - 35 Lacs
Hyderabad
Work from Office
Job Title: Data Modeler
Experience: 5+ years
Location: Hyderabad (WFO)

Roles and Responsibilities:
- Experience in data modelling: designing, implementing, and maintaining data models to support data quality, performance, and scalability.
- Proven experience as a Data Modeler, working with data analysts, data architects, and business stakeholders to ensure data models are aligned with business requirements.
- Expertise in Azure, Databricks, data warehousing, and ERwin; a supply chain background is required.
- Strong knowledge of data modelling principles and techniques (e.g., ERD, UML).
- Proficiency with data modelling tools (e.g., ER/Studio, Erwin, IBM Data Architect).
- Experience with relational databases (e.g., SQL Server, Oracle, MySQL) and NoSQL databases (e.g., MongoDB, Cassandra).
- Solid understanding of data warehousing, ETL processes, and data integration.
- Able to create and maintain Source-to-Target Mapping (STTM) documents, bus matrix documents, etc.
- Real-time experience working in OLTP & OLAP database modelling.

Additional:
- Familiarity with big data technologies (e.g., Hadoop, Spark) is a plus.
- Excellent analytical, problem-solving, and communication skills.
- Ability to work effectively in a collaborative, fast-paced environment.
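A Source-to-Target Mapping (STTM) document, as required above, lists each source column, its target column, and the transformation rule between them. It can be expressed and even executed as data; the column names and rules below are hypothetical, purely for illustration.

```python
# STTM sketch: the mapping "document" as a list of
# (source_column, target_column, transform) entries, applied to a row.
# All names and rules here are invented for illustration.

sttm = [
    ("cust_nm", "customer_name", str.strip),   # trim whitespace
    ("ord_amt", "order_amount", float),        # cast text to number
    ("cntry_cd", "country_code", str.upper),   # normalize casing
]

def apply_mapping(source_row, sttm):
    """Transform one source row into the target shape per the STTM."""
    return {tgt: fn(source_row[src]) for src, tgt, fn in sttm}

row = {"cust_nm": "  Acme ", "ord_amt": "199.99", "cntry_cd": "in"}
target = apply_mapping(row, sttm)
print(target)
# {'customer_name': 'Acme', 'order_amount': 199.99, 'country_code': 'IN'}
```

In practice the STTM lives in a spreadsheet or modeling tool and also records ETL attributes such as partition keys; treating it as structured data keeps the mapping testable.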
Posted 3 weeks ago
8.0 - 13.0 years
10 - 12 Lacs
Kolkata, Hyderabad, Pune
Work from Office
Expert in data modeling with strong MS SQL skills and hands-on experience in ER Studio. Excellent communication skills to work with US clients and translate business needs into data models. Mail: kowsalya.k@srsinfoway.com
Posted 3 weeks ago
5.0 - 10.0 years
12 - 22 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
Job Title:
=======
Senior MS BI Developer

Onsite Location:
=============
Dubai (UAE), Doha (Qatar), Riyadh (Saudi Arabia)

Onsite Monthly Salary:
==============
10k AED - 15k AED, full tax-free salary, depending on experience. A Gulf work permit will be sponsored by our client.

Project Duration:
=============
2 years, extendable

Desired Experience Level:
===================
5 - 10 years

Qualification:
==========
B.Tech / M.Tech / MCA / M.Sc or equivalent

Experience Needed:
===============
- Overall: 5 or more years of total IT experience
- Solid 3+ years of experience as an MS BI Developer with the Microsoft stack / MS DWH Engineer

Job Responsibilities:
================
- Design and develop DWH data flows
- Build SCD-1 / SCD-2 / SCD-3 dimensions
- Build cubes and maintain SSAS / DWH data
- Design the Microsoft DWH and its ETL packages
- Code T-SQL
- Create orchestrations and design batch job / orchestration runs
- Familiarity with data models
- Develop MDM (Master Data Management)

Experience:
================
- Experience as a DWH developer with Microsoft DWH data flows and cubes
- Exposure to and experience with Azure services, including Azure Data Factory
- Sound knowledge of BI practices and visualization tools such as Power BI / SSRS / QlikView
- Collecting/gathering data from multiple source systems
- Creating automated data pipelines
- Configuring Azure resources and services

Skills:
================
- Microsoft SSIS / SSAS / SSRS
- Informatica
- Azure Data Factory
- Spark
- SQL

Nice to Have:
==========
- Any onsite experience is an added advantage, but not mandatory
- Microsoft certifications are an added advantage

Business Vertical:
==============
- Banking / Investment Banking
- Capital Markets
- Securities / Stock Market Trading
- Bonds / Forex Trading
- Credit Risk
- Payment Cards Industry (VISA / MasterCard / Amex)

Job Code:
======
MSBI_DEVP_0525

No. of positions:
============
05

Email:
=====
spectrumconsulting1977@gmail.com

If you are interested, please email your CV as an ATTACHMENT with the job reference code [ MSBI_DEVP_0525 ] as the subject.
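The SCD-1/2/3 dimensions this listing mentions differ in how they record attribute changes: Type 1 overwrites, Type 2 versions rows, Type 3 keeps a "previous value" column. Type 2 is the most involved; here is a minimal sketch of its close-and-insert logic in plain Python rather than the T-SQL/SSIS the role would actually use. Field names are illustrative.

```python
from datetime import date

# SCD Type 2 sketch: when a tracked attribute changes, close the current
# row (set valid_to, clear is_current) and insert a new versioned row,
# instead of updating in place. Names (cust_id, city) are illustrative.

dim = [  # current state of the dimension table
    {"cust_id": 1, "city": "Pune", "valid_from": date(2023, 1, 1),
     "valid_to": None, "is_current": True},
]

def apply_scd2(dim, cust_id, new_city, as_of):
    """Apply an SCD Type 2 change for one customer key."""
    for row in dim:
        if row["cust_id"] == cust_id and row["is_current"]:
            if row["city"] == new_city:
                return dim               # no change, nothing to do
            row["valid_to"] = as_of      # close the old version
            row["is_current"] = False
    dim.append({"cust_id": cust_id, "city": new_city,
                "valid_from": as_of, "valid_to": None, "is_current": True})
    return dim

apply_scd2(dim, 1, "Mumbai", date(2024, 6, 1))
current = [r for r in dim if r["is_current"]]
print(len(dim), current[0]["city"])  # 2 Mumbai
```

In a warehouse this would typically be a T-SQL MERGE or an SSIS Slowly Changing Dimension transform; the versioning contract is the same.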
Posted 3 weeks ago
9 - 11 years
37 - 40 Lacs
Ahmedabad, Bengaluru, Mumbai (All Areas)
Work from Office
Dear Candidate,

We are hiring a Data Engineer to build scalable data pipelines and infrastructure to power analytics and machine learning. Ideal for those passionate about data integrity, automation, and performance.

Key Responsibilities:
- Design ETL/ELT pipelines using tools like Airflow or dbt
- Build data lakes and warehouses (BigQuery, Redshift, Snowflake)
- Automate data quality checks and monitoring
- Collaborate with analysts, data scientists, and backend teams
- Optimize data flows for performance and cost

Required Skills & Qualifications:
- Proficiency in SQL, Python, and distributed systems (e.g., Spark)
- Experience with cloud data platforms (AWS, GCP, or Azure)
- Strong understanding of data modeling and warehousing principles
- Bonus: experience with Kafka, Parquet/Avro, or real-time streaming

Soft Skills:
- Strong troubleshooting and problem-solving skills
- Ability to work independently and in a team
- Excellent communication and documentation skills

Note: If interested, please share your updated resume and preferred time for a discussion. If shortlisted, our HR team will contact you.

Kandi Srinivasa
Delivery Manager
Integra Technologies
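"Automate data quality checks and monitoring," listed above, usually means declaring rules once and reporting which rows fail each rule before a load. A minimal sketch of that pattern follows; the rule names and columns are invented for illustration.

```python
# Automated data-quality check sketch of the kind an ETL step might run
# before loading. Rules and column names are illustrative only.

rows = [
    {"order_id": 1, "amount": 120.5, "country": "IN"},
    {"order_id": 2, "amount": -5.0, "country": "IN"},
    {"order_id": None, "amount": 40.0, "country": "US"},
]

checks = {
    "order_id_not_null": lambda r: r["order_id"] is not None,
    "amount_non_negative": lambda r: r["amount"] >= 0,
}

def run_checks(rows, checks):
    """Return {check_name: [failing row indexes]} for reporting/alerting."""
    failures = {name: [] for name in checks}
    for i, r in enumerate(rows):
        for name, rule in checks.items():
            if not rule(r):
                failures[name].append(i)
    return failures

report = run_checks(rows, checks)
print(report)  # {'order_id_not_null': [2], 'amount_non_negative': [1]}
```

In an Airflow or dbt pipeline the equivalent would be a test task gating the load; the declarative rules-plus-report shape is the common core.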
Posted 4 weeks ago
5 - 10 years
15 - 30 Lacs
Hyderabad, Pune, Delhi / NCR
Work from Office
About the Company:
Headquartered in California, U.S.A., GSPANN provides consulting and IT services to global clients. We help clients transform how they deliver business value by helping them optimize their IT capabilities, practices, and operations, with our experience in retail, high technology, and manufacturing. With five global delivery centers and 1900+ employees, we provide the intimacy of a boutique consultancy with the capabilities of a large IT services firm.

Role: Data Modeller
Experience: 5+ years
Skill Set: Data Modelling and SQL
Location: Pune, Hyderabad, Gurgaon

Position in brief:
We are looking for a candidate who is more technical, with some functional knowledge:
- At least 5 years of hands-on data modeling (conceptual, logical, and physical), data profiling, and data analysis skills.
- SQL: basic to intermediate level; being good at writing complex SQL queries is an added advantage.
- ETL: should understand how the ETL process works and be able to provide ETL attributes and partition-related info as part of the data mapping document.
- Any tool experience is okay: ER Studio, ERwin, Sybase PowerDesigner.

Detailed Job Description:
We are looking for a passionate Data Analyst/Data Modeler to build, optimize, and maintain conceptual, logical, and physical database models. The candidate will turn data into information, information into insight, and insight into business decisions.

Responsibilities:
- Gather requirements from the business team and translate them into technical requirements.
- Drive projects and provide guidance wherever needed.
- Work with business and application/solution teams to implement data strategies, build data flows, and develop conceptual/logical/physical data models.
- Work both independently and collaboratively.
- Work with management to prioritize business and information needs.

Requirements:
- Bachelor's or master's degree in computer/data science, or related technical experience.
- 5+ years of hands-on relational, dimensional, and/or analytic experience (using RDBMS, dimensional, and NoSQL/big data platform technologies, and ETL and data ingestion protocols).
- Proven working experience as a data analyst/data modeler or in a similar role.
- Technical expertise in designing data models, database design, and data analysis.
- Prior experience with the migration of data from legacy systems to new solutions.
- Good knowledge of metadata management, data modelling, and related tools (Erwin, ER Studio, or others) is required.
- Experience gathering and analysing system/business requirements and providing mapping documents for technical teams.
- Strong analytical skills, with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy.
- Hands-on experience with SQL.
- Problem-solving attitude.
Posted 1 month ago
5 - 7 years
0 - 0 Lacs
Mumbai
Work from Office
Role - Full Stack Developer

Position Summary:
We are seeking a Full Stack Developer to work within the development team to design and develop a key Private Equity application. This technical, hands-on position requires proven development skills, excellent communication skills, problem-solving ability, critical thinking, and a commitment to quality deliverables.

Responsibilities:
- Develop high-quality and scalable solutions using the Python or Java platform
- Partner with the Technology Business Analyst and Project Manager to develop requirements for custom solutions
- Produce appropriate design artifacts
- Build custom solutions based on agreed-upon requirements, designs, and architectures
- Participate in performance testing

Experience Required:

Technical:
- 8+ years' experience in developing object-oriented, user-facing software, with deep experience in Python or Java (1.8+) and component-based UI frameworks (AngularJS, ExtJS, or React)
- Experience developing finance models (e.g., forecasting models) a plus
- Experience using test-driven development methodologies
- Strong SQL (required); ER modelling skills and messaging queues a plus
- AKS (Azure deployment with GitHub) knowledge is mandatory
- Proven skills in designing and building reliable and scalable production services using service-oriented architecture (a plus)
- Experience in building features that are simple, easy to comprehend, performant, and reliable, improving user experience (a plus)

Soft:
- Excellent analytical skills and detail-oriented
- Excellent communication skills, with the ability to work with business users as well as other members of the technical staff
- Must be self-directed and motivated

Required Skills: Java/Python, AngularJS/ExtJS/React, SQL, AKS knowledge
Posted 2 months ago
5 - 10 years
20 - 25 Lacs
Pune, Hyderabad
Hybrid
Job Title: Databricks Data Modeler
Location: Pune / Hyderabad

Job Summary:
We are looking for a Data Modeler to design and optimize data models supporting automotive industry analytics and reporting. The ideal candidate will work with SAP ECC as a primary data source, leveraging Databricks and Azure Cloud to design scalable and efficient data architectures. This role involves developing logical and physical data models, ensuring data consistency, and collaborating with data engineers, business analysts, and domain experts to enable high-quality analytics solutions.

Key Responsibilities:
- Data Modeling & Architecture: Design and maintain conceptual, logical, and physical data models for structured and unstructured data.
- SAP ECC Data Integration: Define data structures for extracting, transforming, and integrating SAP ECC data into Azure Databricks.
- Automotive Domain Modeling: Develop and optimize industry-specific data models covering customer, vehicle, material, and location data.
- Databricks & Delta Lake Optimization: Design efficient data models for Delta Lake storage and Databricks processing.
- Performance Tuning: Optimize data structures, indexing, and partitioning strategies for performance and scalability.
- Metadata & Data Governance: Implement data standards, data lineage tracking, and governance frameworks to maintain data integrity and compliance.
- Collaboration: Work closely with business stakeholders, data engineers, and data analysts to align models with business needs.
- Documentation: Create and maintain data dictionaries, entity-relationship diagrams (ERDs), and transformation logic documentation.

Skills & Qualifications:
- Data Modeling Expertise: Strong experience in dimensional modeling, 3NF, and hybrid modeling approaches.
- Automotive Industry Knowledge: Understanding of customer, vehicle, material, and dealership data models.
- SAP ECC Data Structures: Hands-on experience with SAP ECC tables, business objects, and extraction processes.
- Azure & Databricks Proficiency: Experience working with Azure Data Lake, Databricks, and Delta Lake for large-scale data processing.
- SQL & Database Management: Strong skills in SQL, T-SQL, or PL/SQL, with a focus on query optimization and indexing.
- ETL & Data Integration: Experience collaborating with data engineering teams on data transformation and ingestion processes.
- Data Governance & Quality: Understanding of data governance principles, lineage tracking, and master data management (MDM).
- Strong Documentation Skills: Ability to create ER diagrams, data dictionaries, and transformation rules.

Preferred Qualifications:
- Experience with data modeling tools such as Erwin, Lucidchart, or dbt.
- Knowledge of Databricks Unity Catalog and Azure Synapse Analytics.
- Familiarity with Kafka/Event Hubs for real-time data streaming.
- Exposure to Power BI/Tableau for data visualization and reporting.
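The partitioning strategies mentioned above pay off through partition pruning: a query filtered on the partition key reads only the matching partition. This plain-Python sketch mimics the idea (Delta Lake implements it with partitioned file layouts); the partition key and row shape are invented for illustration.

```python
from collections import defaultdict

# Partition-pruning sketch: rows are stored by partition key (here, year),
# so a query filtered on that key scans only one partition instead of
# every row. Names (year, amount) are illustrative only.

storage = defaultdict(list)  # partition value -> rows in that partition

def write(row):
    storage[row["year"]].append(row)

for r in [{"year": 2023, "amount": 10},
          {"year": 2024, "amount": 20},
          {"year": 2024, "amount": 30}]:
    write(r)

def read_where_year(year):
    scanned = storage[year]  # only the matching partition is scanned
    return sum(r["amount"] for r in scanned), len(scanned)

total, rows_scanned = read_where_year(2024)
print(total, rows_scanned)  # 50 2 - the 2023 row was never touched
```

Choosing the partition key is the modeling decision: it should match the dominant filter predicate, or pruning never triggers.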
Posted 2 months ago
6 - 11 years
25 - 35 Lacs
Pune, Delhi NCR, India
Hybrid
Exp - 6 to 10 Yrs
Role - Data Modeller
Position - Permanent
Job Locations - Delhi/NCR (Remote), Pune (Remote), Chennai (Hybrid), Hyderabad (Hybrid), Bangalore (Hybrid)

Experience & Skills:
- 6+ years of experience with strong data modeling and data warehousing skills
- Able to suggest modeling approaches for a given problem
- Significant experience in one or more RDBMS (Oracle, DB2, and SQL Server)
- Real-time experience working with OLAP & OLTP database models (dimensional models)
- Comprehensive understanding of star schema, snowflake schema, and Data Vault modelling, as well as any ETL tool, data governance, and data quality
- An eye for analyzing data and comfort with following agile methodology
- A good understanding of any of the cloud services (Azure, AWS & GCP) is preferred
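Data Vault modelling, named in the skills above, splits the model into hubs (business keys only), links (relationships between hubs), and satellites (descriptive attributes with load metadata), so history is insert-only. A minimal sketch of a hub with one satellite follows; the hash-key convention and all names are illustrative, not a standard.

```python
import hashlib

# Data Vault sketch: a hub holds business keys (insert-only), and a
# satellite holds descriptive attributes per hub key with load metadata.
# The MD5 hash-key derivation is one common convention, shown for
# illustration; entity and field names are invented.

def hash_key(*parts):
    """Derive a deterministic surrogate key from business-key parts."""
    return hashlib.md5("|".join(parts).encode()).hexdigest()

hub_customer = {}   # hash_key -> business key (one row per key, ever)
sat_customer = []   # attribute versions, appended on every load

def load_customer(business_key, attrs, load_date):
    hk = hash_key(business_key)
    hub_customer.setdefault(hk, business_key)  # hub is insert-only
    sat_customer.append({"hub_key": hk, "load_date": load_date, **attrs})
    return hk

load_customer("CUST-001", {"name": "Acme", "segment": "SMB"}, "2024-01-01")
load_customer("CUST-001", {"name": "Acme", "segment": "Enterprise"}, "2024-06-01")

# One hub row, two satellite versions: full history preserved by design.
print(len(hub_customer), len(sat_customer))  # 1 2
```

Links (relating, say, customer and order hubs) follow the same insert-only pattern, which is why Data Vault loads parallelize well.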
Posted 2 months ago
10 - 18 years
30 - 45 Lacs
Pune, Delhi NCR, India
Hybrid
Exp - 10 to 18 Years
Role - Data Modeling Architect / Senior Architect / Principal Architect
Position - Permanent
Locations - Hyderabad (Hybrid), Bangalore (Hybrid), Chennai (Hybrid), Pune (Remote till office opens), Delhi NCR (Remote till office opens)

JD:
- 10+ years of experience in data warehousing & data modeling - dimensional/relational/physical/logical
- Decent SQL knowledge
- Able to suggest modeling approaches & solutions for a given problem
- Significant experience in one or more RDBMS (Oracle, DB2, and SQL Server)
- Real-time experience working with OLAP & OLTP database models (dimensional models)
- Comprehensive understanding of star schema, snowflake schema, and Data Vault modelling, as well as any ETL tool, data governance, and data quality
- An eye for analyzing data and comfort with following agile methodology
- A good understanding of any of the cloud services (Azure, AWS & GCP) is preferred
- Enthusiasm to coach team members, collaborate with various stakeholders across the organization, and take complete ownership of deliverables
- Good experience in stakeholder management
- Decent communication skills and experience in leading a team
Posted 2 months ago
10 - 15 years
12 - 17 Lacs
Mumbai
Work from Office
About the company: With over 2.5 crore customers, over 5,000 distribution points and nearly 2,000 branches, IndusInd Bank is a universal bank with a widespread banking footprint across the country. IndusInd offers a wide array of products and services for individuals and corporates including microfinance, personal loans, personal and commercial vehicles loans, credit cards, SME loans. Over the years, IndusInd has grown ceaselessly and dynamically, driven by zeal to offer our customers banking services at par with the highest quality standards in the industry. IndusInd is a pioneer in digital first solutions to bring together the power of next-gen digital product stack, customer excellence and trust of an established bank. Job Purpose: To work on implementing data modeling solutions To design data flow and structure to reduce data redundancy and improving data movement amongst systems defining a data lineage To work in the Azure Data Warehouse To work with large data volume of data integration Experience With overall experience between 10 to 15 years, applicant must have minimum 8 to 11 years of hard core professional experience in data modeling for large Data Warehouse with multiple Sources. Technical Skills Expertise in core skill of data modeling principles/methods including conceptual, logical & physical Data Models Ability to utilize BI tools like Power BI, Tableau, etc to represent insights Experience in translating/mapping relational data models into XML and Schemas Expert knowledge of metadata management, relational & data modeling tools like ER Studio, Erwin or others. Hands-on experience in relational, dimensional and/or analytical experience (using RDBMS, dimensional, NoSQL, ETL and data ingestion protocols). Very strong in SQL queries Expertise in performance tuning of SQL queries. Ability to analyse source system and create Source to Target mapping. Ability to understand the business use case and create data models or joined data in Datawarehouse. 
- Preferred: experience in the banking domain and in building data models/marts for various banking functions
Good to have:
- Azure PowerShell or Python scripting for data transformation in ADF
- SSIS, SSAS, and BI tools such as Power BI
- Azure PaaS components such as Azure Data Factory, Azure Databricks, Azure Data Lake, Azure Synapse (DWH), PolyBase, ExpressRoute tunneling, etc.
- API integration
Responsibilities:
- Understand the existing data model, data warehouse design, and functional domain subject areas, documenting both the as-is and the proposed architecture
- Understand the existing ETL processes and their sources, analyzing and documenting the best approach to designing the logical data model where required
- Work with the development team to implement the proposed data model as a physical data model and build data flows
- Work with the development team to optimize the database structure, applying best-practice optimization methods
- Analyze, document, and implement reuse of data models for new initiatives
- Interact with stakeholders, users, and other IT teams to understand the ecosystem and analyze it for solutions
- Work on user requirements and create queries for consumption views for users from the existing DW data
- Train and lead a small team of data engineers
Qualifications:
- Bachelor's in Computer Science or equivalent
- Certification in data modeling and data analysis
- Good to have: Azure Fundamentals and Azure Engineer certifications (AZ-900 or DP-200/201)
Behavioral Competencies:
- Excellent problem-solving and time management skills
- Strong analytical thinking
- Excellent communication skills; process-oriented with a flexible execution mindset
- Strategic thinking with a research and development mindset
- Clear and demonstrative communication
- Efficiently identifies and solves issues
- Identifies, tracks, and escalates risks in a timely manner
Selection Process: Interested candidates are required to apply through the listing on Jigya; only applications received through Jigya will be evaluated further. Shortlisted candidates may need to appear in an online assessment and/or a technical screening interview administered by Jigya on behalf of IndusInd Bank. Candidates selected after the screening rounds will be processed further by IndusInd Bank.
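The posting above asks for queries that build consumption views for users over existing DW data. As a minimal sketch of that idea (using SQLite in Python, with hypothetical table and column names rather than the bank's actual schema), a consumption view hides the joins and aggregation from end users:

```python
import sqlite3

# Minimal sketch of a "consumption view" over warehouse tables.
# All table and column names are hypothetical illustrations.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.executescript("""
CREATE TABLE dim_branch (branch_id INTEGER PRIMARY KEY, city TEXT);
CREATE TABLE fact_txn (txn_id INTEGER PRIMARY KEY,
                       branch_id INTEGER REFERENCES dim_branch(branch_id),
                       amount REAL);

INSERT INTO dim_branch VALUES (1, 'Pune'), (2, 'Hyderabad');
INSERT INTO fact_txn VALUES (10, 1, 500.0), (11, 1, 250.0), (12, 2, 100.0);

-- The view encapsulates the fact-to-dimension join and the aggregation,
-- so users query one simple object instead of the underlying tables.
CREATE VIEW v_branch_volume AS
SELECT b.city, SUM(f.amount) AS total_amount, COUNT(*) AS txn_count
FROM fact_txn f JOIN dim_branch b ON f.branch_id = b.branch_id
GROUP BY b.city;
""")

rows = list(cur.execute("SELECT * FROM v_branch_volume ORDER BY city"))
print(rows)
```

Users then query `v_branch_volume` directly, which also gives the modeler a stable contract to tune underneath.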
Posted 2 months ago
7 - 12 years
30 - 45 Lacs
Pune, Bengaluru, Gurgaon
Work from Office
Responsibilities:
- Design conceptual, logical, and physical data models for a unified data platform
- Define data product structures and ensure alignment with business needs
- Collaborate with data engineers and architects to optimize database performance
- Work on data governance, metadata management, and data lineage
- Support testing and validation of data models with stakeholders
Required Skills:
- Expertise in data modeling tools (Erwin, ER/Studio, or equivalent)
- Strong understanding of relational and NoSQL databases
- Experience with Azure-based data solutions and ADLS Gen2
- Knowledge of healthcare data and interoperability standards
Posted 3 months ago
6 - 10 years
15 - 30 Lacs
Pune, Bengaluru, Gurgaon
Hybrid
Role: Data Modeler. Location: Bangalore, Pune, Gurugram. Work Mode: Hybrid. Please find the JD below.
Job Description: Data Modeler
- 6+ years of experience, with at least 3 years as a data modeler, Data Vault modeler, data architect, or in similar roles
- Analyze and translate business needs into long-term, optimized data models
- Design, develop, and maintain comprehensive conceptual, logical, and physical data models
- Develop best practices for data standards and data architecture
- Proficiency in data modeling tools such as Erwin, ER/Studio, or similar
- Strong understanding of data warehousing and Data Vault concepts
- Proficient in data analysis and profiling on database systems (e.g., SQL, Databricks)
- Good knowledge of data governance and data quality best practices
- Experience with cloud platforms (such as AWS, Azure, Google Cloud) and NoSQL databases (like MongoDB) will be an added advantage
- Excellent communication and interpersonal skills
- No constraint on working from the ODC if there is a client ask; flexible on work timings based on engagement needs
Mandatory: OLTP/OLAP concepts, relational and dimensional modeling, Star/Snowflake schema knowledge, Data Vault, strong knowledge of writing SQL queries and entity-relationship concepts. Must have experience with an RDBMS such as Oracle or MS SQL Server, and with data modeling (conceptual, logical, physical). Strong data analysis skills.
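Several of the mandatory skills above (Data Vault, SQL, entity-relationship concepts) can be illustrated compactly. The sketch below, using SQLite in Python with hypothetical table and column names, shows the basic hub-plus-satellite pattern of Data Vault modeling, where the business key lives in the hub and attribute history is versioned by load date in the satellite:

```python
import sqlite3
import hashlib

# Illustrative Data Vault sketch (hub + satellite); names are
# hypothetical examples, not any particular client's model.
def hk(business_key: str) -> str:
    """Hash key derived from the business key, a common Data Vault practice."""
    return hashlib.md5(business_key.encode()).hexdigest()

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE hub_customer (
    customer_hk TEXT PRIMARY KEY,      -- hash of the business key
    customer_bk TEXT NOT NULL UNIQUE,  -- business key from the source
    load_dts    TEXT NOT NULL,
    record_src  TEXT NOT NULL
);
CREATE TABLE sat_customer (
    customer_hk TEXT REFERENCES hub_customer(customer_hk),
    load_dts    TEXT NOT NULL,
    name        TEXT,
    city        TEXT,
    PRIMARY KEY (customer_hk, load_dts)  -- history kept by load date
);
""")

key = hk("CUST-001")
conn.execute("INSERT INTO hub_customer VALUES (?, 'CUST-001', '2024-01-01', 'CRM')", (key,))
# Two satellite rows = two historical versions of the same customer.
conn.execute("INSERT INTO sat_customer VALUES (?, '2024-01-01', 'Asha', 'Pune')", (key,))
conn.execute("INSERT INTO sat_customer VALUES (?, '2024-06-01', 'Asha', 'Bengaluru')", (key,))

# Current view of a customer: the satellite row with the latest load date.
row = conn.execute("""
SELECT h.customer_bk, s.city
FROM hub_customer h
JOIN sat_customer s ON s.customer_hk = h.customer_hk
WHERE s.load_dts = (SELECT MAX(load_dts) FROM sat_customer
                    WHERE customer_hk = h.customer_hk)
""").fetchone()
print(row)  # ('CUST-001', 'Bengaluru')
```

The point of the split is that new attribute versions only insert satellite rows; the hub, and any links referencing it, never change.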
Posted 3 months ago
7 - 12 years
35 - 60 Lacs
Pune, Delhi NCR, Hyderabad
Hybrid
Data Modeller
Permanent | Chennai / Bangalore / Pune / Hyderabad / Delhi NCR
Preferred candidate profile:
- 6 to 15+ years of experience in data modeling
- Significant experience in one or more RDBMSs (Oracle, DB2, and SQL Server)
- Real-time experience working with OLAP and OLTP database models (dimensional models)
- Comprehensive understanding of Star schema, Snowflake schema, and Data Vault modeling, plus any ETL tool, data governance, and data quality
- An eye for analyzing data and comfort with agile methodology
- A solid understanding of any of the cloud services (Azure, AWS, or GCP) is preferred
Required Skills: Strong experience in SQL (tables, attributes, joins, queries), strong DWH skills, relational/dimensional/ER modeling, Data Vault modeling, OLAP/OLTP, facts and dimensions, normalization (3NF), data marts, schemas, keys, dimension types, M:M relationships, and forward/reverse engineering.
Good to Have: Data Vault, tool experience (Erwin, ER Studio, Visio, PowerDesigner), and cloud.
If interested, kindly share your updated profile along with the details below.
Full Name:
Contact #:
Email ID:
Total experience:
Key skills:
Current Location:
Relocation:
Current Employer:
CTC (Including Variable):
ECTC:
Notice Period:
Holding any offer:
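Among the skills this role lists, M:M relationships and 3NF normalization go together: a many-to-many relationship is resolved through a bridge (associative) table keyed on both sides. A minimal sketch, using SQLite in Python with hypothetical table names:

```python
import sqlite3

# Resolving a many-to-many (M:M) relationship with a bridge table,
# as in 3NF modeling; table names are illustrative only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE customer (customer_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE account  (account_id  INTEGER PRIMARY KEY, kind TEXT);
-- Bridge table: one row per (customer, account) pair, so joint
-- accounts and multi-account customers both fit without redundancy.
CREATE TABLE customer_account (
    customer_id INTEGER REFERENCES customer(customer_id),
    account_id  INTEGER REFERENCES account(account_id),
    PRIMARY KEY (customer_id, account_id)
);

INSERT INTO customer VALUES (1, 'Asha'), (2, 'Ravi');
INSERT INTO account  VALUES (100, 'savings'), (200, 'joint');
INSERT INTO customer_account VALUES (1, 100), (1, 200), (2, 200);
""")

# Traversing the M:M: each customer paired with each of their accounts.
rows = conn.execute("""
SELECT c.name, a.kind
FROM customer c
JOIN customer_account ca ON ca.customer_id = c.customer_id
JOIN account a ON a.account_id = ca.account_id
ORDER BY c.name, a.kind
""").fetchall()
print(rows)
```

The same pattern appears in dimensional models as a bridge table between a fact and a multi-valued dimension.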
Posted 3 months ago
8 - 13 years
27 - 42 Lacs
Chennai, Pune, Kolkata
Hybrid
Job Description: The role requires you to design and implement data modeling solutions using relational, dimensional, and NoSQL databases. You will work closely with data architects to design bespoke databases using a mixture of conceptual, physical, and logical data models.
Job title: Data Modeler
Hybrid role from location: Bangalore, Chennai, Gurgaon, Pune, Kolkata
Interviews: 3 rounds of 30-45 minute video-based Teams interviews
Employment Type: Permanent, full time with Tredence
Total Experience: 9-13 years
Required Skills: Data Modeling, Dimensional Modeling, Erwin, Data Management, RDBMS, SQL/NoSQL, ETL
What we look for:
- BE/B.Tech or equivalent
- The data modeler designs, implements, and documents data architecture and data modeling solutions using relational, dimensional, and NoSQL databases. These solutions support enterprise information management, business intelligence, machine learning, data science, and other business interests.
- 9-13 years of hands-on relational, dimensional, and/or analytic experience (using RDBMS, dimensional, and NoSQL data platform technologies, and ETL and data ingestion protocols)
- Experience with data warehouse, data lake, and enterprise big data platforms in multi-data-center contexts required
- Good knowledge of metadata management, data modeling, and related tools (Erwin, ER Studio, or others) required
- Experience in team management, communication, and presentation
- Responsibility for the development of conceptual, logical, and physical data models, and the implementation of RDBMS, operational data store (ODS), data marts, and data lakes on target platforms (SQL/NoSQL)
- Oversee and govern the expansion of the existing data architecture and the optimization of data query performance via best practices
- The candidate must be able to work independently and collaboratively
Responsibilities:
- Implement business and IT data requirements through new data strategies and designs across all data platforms (relational, dimensional, and NoSQL) and data tools (reporting, visualization, analytics, and machine learning)
- Work with business and application/solution teams to implement data strategies, build data flows, and develop conceptual/logical/physical data models
- Define and govern data modeling and design standards, tools, best practices, and related development for enterprise data models
- Identify the architecture, infrastructure, and interfaces to data sources, tools supporting automated data loads, security concerns, analytic models, and data visualization
- Hands-on modeling, design, configuration, installation, performance tuning, and sandbox POCs
- Work proactively and independently to address project requirements and articulate issues/challenges to reduce project delivery risks
Posted 3 months ago
5 - 6 years
10 - 14 Lacs
Hyderabad
Hybrid
Responsibilities:
- Develop conceptual, logical, and physical data models for Microsoft Fabric and Power BI solutions
- Implement data models for relational, dimensional, and data lake environments on target platforms
- Collaborate with the Offshore Data Engineer and Onsite Data Modernization Architect to ensure model alignment
- Define and govern data modeling standards, tools, and best practices
- Optimize data structures for query performance and scalability
- Provide updates on modeling progress and dependencies to the Offshore Project Manager
Skills:
- 5+ years of data modeling experience with relational and NoSQL platforms
- Proficiency with modeling tools (e.g., Erwin, ER/Studio) and SQL
- Experience with Microsoft Fabric, data lakes, and BI data structures
- Strong analytical and communication skills for team collaboration
- Attention to detail with a focus on performance and consistency
Posted 3 months ago