Jobs
Interviews

50 ER Studio Jobs

Set up a Job Alert
JobPe aggregates results for easy application access, but you actually apply on the job portal directly.

2.0 - 10.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Principal Data Engineer at Brillio, you will play a key role in leveraging your expertise in Data Modeling, particularly with tools like ER Studio and ER Win. Brillio, known for its digital technology services and partnership with Fortune 1000 companies, is committed to transforming disruptions into competitive advantages through innovative digital solutions.

With over 10 years of IT experience and at least 2 years of hands-on experience in Snowflake, you will be responsible for building and maintaining data models that support the organization's Data Lake/ODS, ETL processes, and data warehousing needs. Your ability to collaborate closely with clients to deliver physical and logical model solutions will be critical to the success of various projects.

In this role, you will demonstrate advanced expertise in Data Modeling concepts, with a focus on modeling in large volume-based environments. Your experience with tools like ER Studio and your overall understanding of database technologies, data warehouses, and analytics will be essential in designing and implementing effective data models. Additionally, your strong skills in Entity Relationship Modeling, knowledge of database design and administration, and proficiency in SQL query development will enable you to contribute to the design and optimization of data structures, including Star Schema design.

Your leadership abilities and excellent communication skills will be instrumental in leading teams and ensuring the successful implementation of data modeling solutions. While experience with AWS ecosystems is a plus, your dedication to staying at the forefront of technological advancements and your passion for delivering exceptional client experiences will make you a valuable addition to Brillio's team of "Brillians." Join us in our mission to create innovative digital solutions and make a difference in the world of technology.

Posted 2 days ago

Apply

7.0 - 11.0 years

0 Lacs

Karnataka

On-site

Your Responsibilities
- Implement business and IT data requirements through new data strategies and designs across all data platforms (relational, dimensional, and NoSQL).
- Collaborate with solution teams and Data Architects to implement data strategies, build data flows, and develop logical/physical data models.
- Work with Data Architects to define and govern data modeling and design standards, tools, best practices, and related development for enterprise data models.
- Engage in hands-on modeling, design, configuration, installation, performance tuning, and sandbox POC.
- Proactively and independently address project requirements and articulate issues/challenges to reduce project delivery risks.

Your Profile
- Bachelor's degree in computer/data science, or related technical experience.
- 7+ years of hands-on relational, dimensional, and/or analytic experience utilizing RDBMS, dimensional, and NoSQL data platform technologies, and ETL and data ingestion protocols.
- Demonstrated experience with data warehouse, Data Lake, and enterprise big data platforms in multi-data-center contexts.
- Proficiency in metadata management, data modeling, and related tools (e.g., Erwin, ER Studio).
- Experience with Azure/Azure Databricks services (Azure Data Factory, Azure Data Lake Storage, Azure Synapse & Azure Databricks) is preferred, and working experience on SAP Datasphere is a plus.
- Experience in team management, communication, and presentation.
- Understanding of agile delivery methodology and experience working in a scrum environment.
- Ability to translate business needs into data vault and dimensional data models supporting long-term solutions.
- Collaborate with the Application Development team to implement data strategies, and create logical and physical data models using best practices to ensure high data quality and reduced redundancy.
- Optimize and update logical and physical data models to support new and existing projects.
- Maintain logical and physical data models along with corresponding metadata.
- Develop best practices for standard naming conventions and coding practices to ensure data model consistency.
- Recommend opportunities for data model reuse in new environments.
- Perform reverse engineering of physical data models from databases and SQL scripts.
- Evaluate data models and physical databases for variances and discrepancies.
- Validate business data objects for accuracy and completeness.
- Analyze data-related system integration challenges and propose appropriate solutions.
- Develop data models according to company standards.
- Guide System Analysts, Engineers, Programmers, and others on project limitations and capabilities, performance requirements, and interfaces.
- Review modifications to existing data models to improve efficiency and performance.
- Examine new application designs and recommend corrections as needed.

#IncludingYou
Diversity, equity, inclusion, and belonging are cornerstones of ADM's efforts to continue innovating, driving growth, and delivering outstanding performance. ADM is committed to attracting and retaining a diverse workforce and creating welcoming, inclusive work environments that enable every ADM colleague to feel comfortable, make meaningful contributions, and grow their career. ADM values the unique backgrounds and experiences that each person brings to the organization, understanding that diversity of perspectives makes us stronger together. For more information regarding ADM's efforts to advance Diversity, Equity, Inclusion & Belonging, please visit the website: Diversity, Equity and Inclusion | ADM.

About ADM
At ADM, the power of nature is unlocked to provide access to nutrition worldwide. With industry-advancing innovations, a comprehensive portfolio of ingredients and solutions catering to diverse tastes, and a commitment to sustainability, ADM offers customers an edge in addressing nutritional challenges. As a global leader in human and animal nutrition and the premier agricultural origination and processing company worldwide, ADM's capabilities in insights, facilities, and logistical expertise are unparalleled. From ideation to solution, ADM enriches the quality of life globally. Learn more at www.adm.com.

Posted 2 days ago

Apply

2.0 - 10.0 years

0 Lacs

Hyderabad, Telangana

On-site

As a Principal Data Engineer at Brillio, you will play a crucial role in leveraging your expertise in data modeling, particularly with tools like ER Studio and ER Win. Brillio specializes in transforming disruptive technologies into a competitive advantage for Fortune 1000 companies through innovative digital adoption, and prides itself on being a rapidly growing digital technology service provider that excels in integrating cutting-edge digital skills with a client-centric approach. As a preferred employer, Brillio attracts top talent by offering opportunities to work on exclusive digital projects and engage with groundbreaking technologies.

To excel in this role, you should have over 10 years of IT experience, with a minimum of 2 years dedicated to Snowflake and hands-on modeling experience. Your proficiency in Data Lake/ODS, ETL concepts, and data warehousing principles will be critical in delivering effective solutions to clients. Collaboration with clients to drive both physical and logical model solutions will be a key aspect of your responsibilities.

Your technical skills should encompass advanced data modeling concepts, experience in modeling large volumes of data, and proficiency in tools like ER Studio. A solid grasp of database concepts, including data warehouses, reporting, and analytics, will be essential. Familiarity with platforms like SQLDBM and expertise in entity relationship modeling will further strengthen your profile.

Moreover, your communication skills should be excellent, enabling you to lead teams effectively and facilitate seamless collaboration with clients. While exposure to AWS ecosystems is a plus, your ability to design and administer databases, develop SQL queries for analysis, and implement data modeling for star schema designs will be instrumental in your success as a Principal Data Engineer at Brillio.

Posted 2 days ago

Apply

8.0 - 12.0 years

0 Lacs

Pune, Maharashtra

On-site

The ideal candidate for this position should have 8-12 years of experience and possess a strong understanding of, and hands-on experience with, Microsoft Fabric. You will be responsible for designing and implementing end-to-end data solutions on Microsoft Azure, including data lakes, data warehouses, and ETL/ELT processes. Your role will involve developing scalable and efficient data architectures to support large-scale data processing and analytics workloads, while ensuring high performance, security, and compliance within Azure data solutions.

You should have knowledge of techniques such as lakehouse and warehouse architectures, along with experience implementing them. Additionally, you will be required to evaluate and select appropriate Azure services like Azure SQL Database, Azure Synapse Analytics, Azure Data Lake Storage, Azure Databricks, Unity Catalog, and Azure Data Factory. Deep knowledge and hands-on experience with these Azure Data Services are essential.

Collaborating closely with business and technical teams to understand and translate data needs into robust and scalable data architecture solutions will be part of your responsibilities. You should also have experience in data governance, data privacy, and compliance requirements, along with excellent communication and interpersonal skills for effective collaboration with cross-functional teams.

In this role, you will provide expertise and leadership to the development team implementing data engineering solutions. Working with Data Scientists, Analysts, and other stakeholders to ensure data architectures align with business goals and data analysis requirements is crucial, as is optimizing cloud-based data infrastructure for performance, cost-effectiveness, and scalability.

Experience in programming languages like SQL, Python, and Scala is required. Hands-on experience with MS SQL Server, Oracle, or similar RDBMS platforms is preferred, and familiarity with Azure DevOps and CI/CD pipeline development is beneficial. An in-depth understanding of database structure principles and distributed data processing of big data batch or streaming pipelines is essential, as is knowledge of data visualization tools such as Power BI and Tableau, along with data modeling and strong analytics skills. The candidate should be able to convert OLTP data structures into Star Schema, and ideally have dbt experience along with data modeling experience. A problem-solving attitude, self-motivation, attention to detail, and effective task prioritization are essential qualities for this role.

At Hitachi, attitude and aptitude are highly valued, as collaboration is key. While not all skills are required, experience with Azure SQL Data Warehouse, Azure Data Factory, Azure Data Lake, Azure Analysis Services, Databricks/Spark, Python or Scala, data modeling, Power BI, and database migration is desirable. Designing conceptual, logical, and physical data models using tools like ER Studio and Erwin is a plus.
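The "convert OLTP data structures into Star Schema" requirement above can be sketched with a minimal example. All table and column names here are hypothetical illustrations, not taken from any posting: a normalized transactional structure is reshaped into one fact table surrounded by dimension tables, which is the layout these roles ask candidates to design.

```python
import sqlite3

# Minimal star-schema sketch: dimensions hold descriptive attributes,
# the fact table holds measures plus foreign keys to each dimension.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.executescript("""
CREATE TABLE dim_customer (customer_key INTEGER PRIMARY KEY, name TEXT, city TEXT);
CREATE TABLE dim_date     (date_key INTEGER PRIMARY KEY, full_date TEXT, year INTEGER);
CREATE TABLE fact_sales (
    sale_id      INTEGER PRIMARY KEY,
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    date_key     INTEGER REFERENCES dim_date(date_key),
    amount       REAL
);
INSERT INTO dim_customer VALUES (1, 'Asha', 'Pune');
INSERT INTO dim_date VALUES (20240101, '2024-01-01', 2024);
INSERT INTO fact_sales VALUES (100, 1, 20240101, 2499.0);
""")

# A typical star-schema query: join the fact to its dimensions, then aggregate.
cur.execute("""
SELECT d.year, c.city, SUM(f.amount)
FROM fact_sales f
JOIN dim_customer c ON f.customer_key = c.customer_key
JOIN dim_date d     ON f.date_key = d.date_key
GROUP BY d.year, c.city
""")
print(cur.fetchall())  # → [(2024, 'Pune', 2499.0)]
```

The design choice this illustrates: analytic queries touch one wide fact table and a few small dimensions, rather than the many joins an OLTP schema would require.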

Posted 2 days ago

Apply

8.0 - 14.0 years

0 Lacs

Navi Mumbai, Maharashtra

On-site

As a Data Modeller at ReBIT, you will be responsible for technology delivery by collaborating with business stakeholders, RBI departments, and application/solution teams to implement data strategies, build data flows, develop conceptual/logical/physical data models, handle data migration, and generate business reports. You will play a crucial role in identifying the architecture, infrastructure, interfaces to data sources, tools supporting automated data loads, security concerns, analytic models, and data visualization.

The ideal candidate should possess 8-14 years of experience in the IT industry with hands-on experience in relational, dimensional, and/or analytic data using RDBMS, dimensional, and NoSQL data platform technologies, and ETL and data ingestion protocols. Experience in data technologies such as SQL, PL/SQL, Oracle Exadata, MongoDB, Cassandra, and Hadoop is required. Additionally, expertise in designing enterprise-grade application data models/structures, particularly in the BFSI domain, is essential.

You should have a good understanding of metadata management, data modeling, and related tools like Oracle SQL Developer Data Modeler, Erwin, or ER Studio. Your role will involve working on modeling, design, configuration, installation, and performance tuning to ensure the successful delivery of applications in the BFSI domain. Furthermore, you will be responsible for building best-in-class, performance-optimized relational and non-relational database structures/models, and creating ER diagrams, data flow diagrams, and dimensional diagrams for relational systems and data warehouses.

In this role, you will need to work proactively and independently to address project requirements and effectively communicate issues/challenges to reduce project delivery risks. You will be a key player in driving the data modeling process, adhering to design standards, tools, best practices, and related development for enterprise data models.

If you are a data modeling professional with a passion for delivering innovative solutions in a collaborative environment, this role at ReBIT in Navi Mumbai offers an exciting opportunity to contribute to the BFSI domain while honing your skills in data modeling and technology delivery.

Posted 3 days ago

Apply

7.0 - 11.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

You should have a Bachelor's or Master's degree in computer/data science or a related field, or equivalent technical experience, along with a minimum of 7 years of hands-on experience in relational, dimensional, and/or analytic data modeling. Your expertise should include a strong command of SQL and practical experience working with databases such as Oracle, PostgreSQL, Snowflake, and Teradata.

Your responsibilities will involve hands-on activities like modeling, design, configuration, installation, performance tuning, and sandbox Proof of Concept (POC) work. Proficiency in metadata management, data modeling, and related tools such as Erwin or ER Studio is essential. You should be experienced in data modeling, ER diagramming, and designing enterprise software for OLTP (relational) and analytical systems.

It is crucial that you possess a solid understanding of data modeling principles, standard methodologies, semantic data modeling concepts, and multi-fact models. You must be capable of defining data modeling standards and guidelines, and of assisting teams in implementing complex data-driven solutions at large scale globally. Your experience should also include supporting history handling, time series data warehousing, and data transformations through data modeling activities.

Additionally, you should be able to quickly comprehend technological and business concepts and key domain entities, and to communicate effectively with engineers, architects, and product management teams. Your role will involve assessing the accuracy, completeness, and consistency of data models while maintaining the relevant documentation. Experience with data cataloging tools like Alation and Collibra to drive data lineage is preferred, and a strong understanding of data governance processes and metadata repositories is expected.

You should be comfortable working in a fast-paced environment with short release cycles and an iterative development methodology, handling multiple projects simultaneously with minimal specifications. Knowledge of Python and experience with Informatica would be advantageous. Excellent communication and documentation skills will be essential in this role.
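The "history handling" and time-series warehousing experience asked for above usually involves techniques like Type-2 slowly changing dimensions (SCD2). A minimal, hypothetical sketch (table and field names are illustrative only): instead of overwriting an attribute, the current row is closed and a new versioned row is added, so facts can join to the value that was current when they occurred.

```python
from datetime import date

# Hypothetical customer dimension with SCD2 versioning columns.
dim_customer = [
    {"customer_id": 1, "city": "Pune", "valid_from": date(2023, 1, 1),
     "valid_to": None, "is_current": True},
]

def scd2_update(rows, customer_id, new_city, change_date):
    """Close the current version of the row, then append the new version."""
    for row in rows:
        if row["customer_id"] == customer_id and row["is_current"]:
            row["valid_to"] = change_date   # end-date the old version
            row["is_current"] = False
    rows.append({"customer_id": customer_id, "city": new_city,
                 "valid_from": change_date, "valid_to": None,
                 "is_current": True})

scd2_update(dim_customer, 1, "Chennai", date(2024, 6, 1))
# The dimension now holds both versions: the Pune row closed on 2024-06-01
# and a current Chennai row, preserving history rather than overwriting it.
```

In a real warehouse the same pattern is implemented with surrogate keys and merge/upsert logic, but the versioning idea is identical.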

Posted 3 days ago

Apply

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

Kenvue is a company dedicated to the power of everyday care, rooted in a rich heritage and scientific expertise. With iconic brands like NEUTROGENA, AVEENO, TYLENOL, LISTERINE, JOHNSON'S, and BAND-AID, Kenvue is committed to delivering the best products to customers globally. As a Kenvuer, you will be part of a diverse team of 22,000 individuals focused on insights, innovation, and making a positive impact on millions of lives daily.

As a Senior Data Modeler at Kenvue Data Platforms, based in Bengaluru, you will collaborate with various teams, including business partners, Product Owners, Data Strategy, Data Platform, Data Science, and Machine Learning (MLOps), to drive innovative data products for end users. Your role involves developing solution architectures, defining data models, and ensuring that acquisition, ingestion processes, and reporting requirements are met efficiently.

Key Responsibilities:
- Provide expertise in data architecture and modeling to build next-generation product capabilities that drive business growth.
- Collaborate with Business Analytics leaders to translate business needs into optimal architecture designs.
- Design scalable and reusable data models adhering to FAIR principles for different functional areas.
- Work closely with data engineers, solution architects, and stakeholders to optimize data models.
- Create and maintain Metadata Rules, Data Dictionaries, and lineage details for data models.

Qualifications:
- Undergraduate degree in Technology, Computer Science, or related fields; advanced degree preferred.
- Strong interpersonal and communication skills to effectively collaborate with various stakeholders.
- 3+ years of experience in data architecture & modeling in Consumer/Healthcare Goods companies.
- 5+ years of progressive experience in Data & Analytics initiatives.
- Hands-on experience in Cloud Architecture (Azure, GCP, AWS) and cloud-based databases.
- Expertise in SQL, Erwin / ER Studio, data modeling techniques, and methodologies.
- Familiarity with NoSQL and graph databases, and data catalogs.
- Experience in Agile methodology (Scrum/Kanban) within a DevSecOps model.
- Proven track record of contributing to high-profile projects with changing requirements.

Join Kenvue in shaping the future and making a difference in the world of data and analytics. Proud to be an equal opportunity employer, Kenvue values diversity and inclusion in its workforce.

Location: Bangalore, India
Job Function: Digital Product Development

Posted 4 days ago

Apply

3.0 - 8.0 years

9 - 13 Lacs

Bengaluru

Work from Office

Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must have skills: Data Modeling Techniques and Methodologies
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Modeler, you will be responsible for understanding business requirements and data mappings, creating and maintaining data models through different stages using data modeling tools, and handing over the physical design/DDL scripts to the data engineers for implementation of the data models. Your role involves creating and maintaining data models while ensuring the performance and quality of deliverables.

Experience:
- Overall IT experience: 7+ years
- Data Modeling experience: 3+ years

Key Responsibilities:
- Drive discussions with client teams to understand business requirements and develop data models that fit the requirements.
- Drive discovery activities and design workshops with the client, and support design discussions.
- Create data modeling deliverables and get sign-off.
- Develop the solution blueprint and scoping, and do estimation for the delivery project.

Technical Experience:
Must Have Skills:
- 7+ years overall IT experience with 3+ years in Data Modeling.
- Data modeling experience in Dimensional Modeling, 3-NF modeling, and/or NoSQL DB modeling.
- Should have experience on at least one Cloud DB design engagement.
- Conversant with Modern Data Platforms.
- Work experience on data transformation and analytic projects; understanding of DWH.
- Instrumental in DB design through all stages of Data Modeling.
- Experience in at least one leading data modeling tool, e.g., Erwin, ER Studio, or equivalent.

Good to Have Skills:
- Any of these add-on skills: Data Vault Modeling, Graph Database Modeling, RDF, Document DB Modeling, Ontology, Semantic Data Modeling.
- Preferred understanding of the Data Analytics on Cloud landscape and Data Lake design knowledge.
- Cloud Data Engineering, Cloud Data Integration.
- Must be familiar with Data Architecture Principles.

Professional Experience:
- Strong requirement analysis and technical solutioning skills in Data and Analytics.
- Excellent writing, communication, and presentation skills.
- Eagerness to learn new skills and develop oneself on an ongoing basis.
- Good client-facing and interpersonal skills.

Educational Qualification:
- B.E. or B.Tech. required (15 years full time education)

Posted 4 days ago

Apply

5.0 - 10.0 years

19 - 20 Lacs

Bengaluru

Remote

Hi candidates, we have job openings in one of our MNC companies. Interested candidates can apply here and share details to chandrakala.c@i-q.co. Note: notice period of 0-15 days, or currently serving notice, only.

Role & responsibilities
We are looking for Data Managers.
Work Exp: Min 5 yrs (mandatory)
Location: Remote (India)

JD: The data modeler designs, implements, and documents data architecture and data modeling solutions, which include the use of relational, dimensional, and NoSQL databases. These solutions support enterprise information management, business intelligence, machine learning, data science, and other business interests.

The successful candidate will:
- Be responsible for the development of the conceptual, logical, and physical data models, and the implementation of RDBMS, operational data store (ODS), data marts, and data lakes on target platforms (SQL/NoSQL).
- Oversee and govern the expansion of existing data architecture and the optimization of data query performance via best practices.

The candidate must be able to work independently and collaboratively.

Responsibilities
- Implement business and IT data requirements through new data strategies and designs across all data platforms (relational, dimensional, and NoSQL) and data tools (reporting, visualization, analytics, and machine learning).
- Work with business and application/solution teams to implement data strategies, build data flows, and develop conceptual/logical/physical data models.
- Define and govern data modeling and design standards, tools, best practices, and related development for enterprise data models.
- Identify the architecture, infrastructure, and interfaces to data sources, tools supporting automated data loads, security concerns, analytic models, and data visualization.
- Hands-on modeling, design, configuration, installation, performance tuning, and sandbox POC.
- Work proactively and independently to address project requirements and articulate issues/challenges to reduce project delivery risks.

Skills
- Bachelor's or master's degree in computer/data science, or related technical experience.
- 5+ years of hands-on relational, dimensional, and/or analytic experience (using RDBMS, dimensional, NoSQL data platform technologies, and ETL and data ingestion protocols).
- Experience with data warehouse, data lake, and enterprise big data platforms in multi-datacenter contexts required.
- Good knowledge of metadata management, data modeling, and related tools (Erwin, ER Studio, or others) required.
- Experience in team management, communication, and presentation.

Preferred candidate profile

Posted 1 week ago

Apply

5.0 - 8.0 years

10 - 20 Lacs

Hyderabad, Pune, Chennai

Hybrid

Role & responsibilities
We are looking for a skilled Data Modeller with 5 to 8 years of hands-on experience in designing and maintaining robust data models for enterprise data solutions. The ideal candidate has a strong foundation in dimensional, relational, and semantic data modelling and is ready to expand into data engineering technologies and practices. This is a unique opportunity to influence enterprise-wide data architecture while growing your career in modern data engineering.

Required Skills & Experience:
- 5 to 8 years of experience in data modelling with tools such as Erwin, ER/Studio, dbt, PowerDesigner, or equivalent.
- Strong understanding of relational databases, star/snowflake schemas, normalization, and denormalization.
- Experience working with SQL, stored procedures, and performance tuning of data queries.
- Exposure to data warehousing concepts and BI tools (e.g., Tableau, Power BI, Looker).
- Familiarity with data governance, metadata management, and data cataloging tools.
- Excellent communication and documentation skills.
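The normalization/denormalization understanding asked for above can be illustrated with a small, hypothetical sketch (all names are illustrative): a normalized source stores each customer attribute exactly once and references it by key, while a denormalized reporting view copies those attributes onto every row, trading redundancy for simpler, join-free queries.

```python
# Normalized source: each customer stored once, referenced by key (3NF-style).
customers = {1: {"name": "Asha", "city": "Pune"}}
orders = [
    {"order_id": 10, "customer_id": 1, "amount": 1200.0},
    {"order_id": 11, "customer_id": 1, "amount": 300.0},
]

def denormalize(orders, customers):
    """Build a wide, denormalized row set: customer attributes are copied
    onto every order row, so downstream queries need no join."""
    return [
        {**order, **customers[order["customer_id"]]}
        for order in orders
    ]

wide = denormalize(orders, customers)
print(wide[0])  # each row now carries name and city alongside the order fields
```

The tradeoff shown: updating a customer's city in the normalized source is one write, but in the denormalized view it must be propagated to every copied row, which is why warehouses typically denormalize read-heavy structures only.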

Posted 1 week ago

Apply

8.0 - 13.0 years

27 - 42 Lacs

Kolkata, Pune, Chennai

Hybrid

Job Description
This role requires you to design and implement data modeling solutions using relational, dimensional, and NoSQL databases. You will work closely with data architects to design bespoke databases using a mixture of conceptual, physical, and logical data models.

Job title: Data Modeler
Hybrid role from location: Bangalore, Chennai, Gurgaon, Pune, Kolkata
Interviews: 3 rounds of 30~45 minute video-based Teams interviews
Employment Type: Permanent Full Time with Tredence
Total Experience: 9~13 years
Required Skills: Data Modeling, Dimensional Modeling, Erwin, Data Management, RDBMS, SQL/NoSQL, ETL

What we look for:
- BE/B.Tech or equivalent.
- The data modeler designs, implements, and documents data architecture and data modeling solutions, which include the use of relational, dimensional, and NoSQL databases. These solutions support enterprise information management, business intelligence, machine learning, data science, and other business interests.
- 9~13 years of hands-on relational, dimensional, and/or analytic experience (using RDBMS, dimensional, NoSQL data platform technologies, and ETL and data ingestion protocols).
- Experience with data warehouse, data lake, and enterprise big data platforms in multi-data-center contexts required.
- Good knowledge of metadata management, data modeling, and related tools (Erwin, ER Studio, or others) required.
- Experience in team management, communication, and presentation.
- Be responsible for the development of the conceptual, logical, and physical data models, and the implementation of RDBMS, operational data store (ODS), data marts, and data lakes on target platforms (SQL/NoSQL).
- Oversee and govern the expansion of existing data architecture and the optimization of data query performance via best practices. The candidate must be able to work independently and collaboratively.

Responsibilities
- Implement business and IT data requirements through new data strategies and designs across all data platforms (relational, dimensional, and NoSQL) and data tools (reporting, visualization, analytics, and machine learning).
- Work with business and application/solution teams to implement data strategies, build data flows, and develop conceptual/logical/physical data models.
- Define and govern data modeling and design standards, tools, best practices, and related development for enterprise data models.
- Identify the architecture, infrastructure, and interfaces to data sources, tools supporting automated data loads, security concerns, analytic models, and data visualization.
- Hands-on modeling, design, configuration, installation, performance tuning, and sandbox POC.
- Work proactively and independently to address project requirements and articulate issues/challenges to reduce project delivery risks.

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

Pune, Maharashtra

On-site

Choosing Capgemini means choosing a company where you will be empowered to shape your career in the way you'd like, where you'll be supported and inspired by a collaborative community of colleagues around the world, and where you'll be able to reimagine what's possible. Join us and help the world's leading organizations unlock the value of technology and build a more sustainable, more inclusive world.

Your Profile
- Good knowledge and expertise in data structures and algorithms, calculus, linear algebra, machine learning, and modeling.
- Experience in data warehousing concepts, including Star schema, snowflake, or data vault, for data marts or data warehousing.
- Experience using data modeling software like Erwin, ER Studio, or MySQL Workbench to produce logical and physical data models.
- Knowledge of enterprise databases such as DB2, Oracle, PostgreSQL, MySQL, or SQL Server.
- Hands-on knowledge and experience with tools and techniques for analysis, data manipulation, and presentation (e.g., PL/SQL, PySpark, Hive, Impala, and other scripting tools).
- Experience with the Software Development Lifecycle using the Agile methodology; knowledge of agile methods (SAFe, Scrum, Kanban) and tools (Jira or Confluence).
- Expertise in conceptual modeling; ability to see the big picture and envision possible solutions.
- Experience working in a challenging, fast-paced environment.
- Excellent communication & stakeholder management skills.

You can shape your career with us. We offer a range of career paths and internal opportunities within the Capgemini group. You will also get personalized career guidance from our leaders. You will get comprehensive wellness benefits including health checks, telemedicine, insurance with top-ups, elder care, partner coverage, and new parent support via flexible work. You will have the opportunity to learn on one of the industry's largest digital learning platforms, with access to 250,000+ courses and numerous certifications.

We're committed to ensuring that people of all backgrounds feel encouraged and have a sense of belonging at Capgemini. You are valued for who you are, and you can bring your original self to work. Every Monday, kick off the week with a musical performance by our in-house band, The Rubber Band. Also get to participate in internal sports events, yoga challenges, or marathons.

Capgemini is a global business and technology transformation partner, helping organizations to accelerate their dual transition to a digital and sustainable world, while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong over 55-year heritage, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fueled by its market-leading capabilities in AI, generative AI, cloud, and data, combined with its deep industry expertise and partner ecosystem.

Posted 1 week ago

Apply

7.0 - 12.0 years

12 - 22 Lacs

Chennai

Remote

About Company: Papigen is a fast-growing global technology services company, delivering innovative digital solutions through deep industry experience and cutting-edge expertise. We specialize in technology transformation, enterprise modernization, and dynamic areas like Cloud, Big Data, Java, React, DevOps, and more. Our client-centric approach combines consulting, engineering, and data science to help businesses evolve and scale efficiently.

Job Description: We are looking for an experienced Senior Data Modeler to join our agile team and support enterprise-level data initiatives. The ideal candidate will have a strong background in cloud-based data modeling, preferably within the Azure ecosystem, and be able to design and implement robust data models that support scalable and efficient data pipelines.

Responsibilities:
- Design and implement conceptual, logical, and physical data models based on business needs.
- Work on Azure cloud technologies, including Azure Data Lake, Azure Data Factory, and Dremio for data virtualization.
- Create and maintain Low-Level Design (LLD) documents, Unit Test Plans, and related documentation.
- Collaborate with data engineers, developers, and analysts to ensure accurate and scalable data modeling.
- Optimize data-related processes and adhere to coding and modeling standards.
- Conduct integration testing and support bug fixing throughout the SDLC.
- Participate in SCRUM ceremonies and work closely with onshore and offshore teams.
- Manage timelines and deliverables, and communicate blockers or tradeoffs proactively.
- Assist with documentation required for OIS clearance and compliance audits.

Required Skills & Qualifications:
- Bachelor's degree in Computer Science or a related field.
- 8+ years of professional experience in data modeling (logical & physical).
- Strong expertise in SQL and experience with relational databases and data warehouses.
- Hands-on experience with data modeling tools like Erwin or equivalent.
- 5+ years working with Azure Data Lake, Azure Data Factory, and Dremio.
- Solid understanding of data structures, indexing, and optimization techniques.
- Performance tuning skills for models and queries in large datasets.
- Strong communication skills, both verbal and written.
- Highly organized, collaborative, and proactive team player.

Benefits & Perks:
- Opportunity to work with leading global clients
- Flexible work arrangements with remote options
- Exposure to modern technology stacks and tools
- Supportive and collaborative team environment
- Continuous learning and career development opportunities

Posted 1 week ago


3.0 - 6.0 years

5 - 9 Lacs

Pune

Work from Office

Your Role
As a Data Modeler, you will play a critical role in designing and implementing robust data models that support enterprise data warehousing and analytics initiatives. You will:
- Apply your strong foundation in data structures, algorithms, calculus, linear algebra, and machine learning to design scalable and efficient data models.
- Leverage your expertise in data warehousing concepts such as Star Schema, Snowflake Schema, and Data Vault to architect and optimize data marts and enterprise data warehouses.
- Utilize industry-standard data modeling tools like Erwin, ER/Studio, and MySQL Workbench to create and maintain logical and physical data models.
- Thrive in a fast-paced, dynamic environment, collaborating with cross-functional teams to deliver high-quality data solutions under tight deadlines.
- Demonstrate strong conceptual modeling skills, with the ability to see the big picture and design solutions that align with business goals.
- Exhibit excellent communication and stakeholder management skills, effectively translating complex technical concepts into clear, actionable insights for both technical and non-technical audiences.

Your Profile
- Good knowledge of data structures, algorithms, calculus, linear algebra, machine learning, and modeling.
- Experience with data warehousing concepts including Star Schema, Snowflake Schema, or Data Vault for data marts or data warehousing.
- Experience using data modeling software like Erwin, ER/Studio, or MySQL Workbench to produce logical and physical data models.
- Experience working in a challenging, fast-paced environment.
- Expertise in conceptual modeling; ability to see the big picture and envision possible solutions.
- Excellent communication and stakeholder management skills.

What you'll love about working here
You can shape your career with us. We offer a range of career paths and internal opportunities within the Capgemini group, along with personalized career guidance from our leaders. You will get comprehensive wellness benefits including health checks, telemedicine, insurance with top-ups, elder care, partner coverage, and new-parent support via flexible work. You will have the opportunity to learn on one of the industry's largest digital learning platforms, with access to 250,000+ courses and numerous certifications.

About Capgemini
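Several listings on this page call out Star Schema design as a core skill. As a purely illustrative sketch (all table and column names here are invented for the example, not taken from any listing), a minimal star schema pairs one fact table of additive measures with the dimension tables it references:

```python
import sqlite3

# Minimal illustrative star schema: one fact table referencing two dimensions.
# All names are hypothetical examples, not from any job listing above.
DDL = """
CREATE TABLE dim_customer (
    customer_key  INTEGER PRIMARY KEY,   -- surrogate key
    customer_name TEXT NOT NULL
);
CREATE TABLE dim_date (
    date_key  INTEGER PRIMARY KEY,       -- e.g. 20240131
    full_date TEXT NOT NULL
);
CREATE TABLE fact_sales (
    customer_key INTEGER NOT NULL REFERENCES dim_customer(customer_key),
    date_key     INTEGER NOT NULL REFERENCES dim_date(date_key),
    amount       REAL NOT NULL           -- additive measure
);
"""

conn = sqlite3.connect(":memory:")
conn.executescript(DDL)
conn.execute("INSERT INTO dim_customer VALUES (1, 'Acme')")
conn.execute("INSERT INTO dim_date VALUES (20240131, '2024-01-31')")
conn.execute("INSERT INTO fact_sales VALUES (1, 20240131, 99.5)")

# A typical star-schema query: join the fact to a dimension and aggregate.
row = conn.execute("""
    SELECT c.customer_name, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_customer c ON c.customer_key = f.customer_key
    GROUP BY c.customer_name
""").fetchone()
print(row)  # ('Acme', 99.5)
```

The shape is the point: queries fan out from the central fact table to descriptive dimensions, which is what makes the schema friendly to BI tools and aggregation.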

Posted 1 week ago


8.0 - 13.0 years

11 - 21 Lacs

Hyderabad, Pune, Chennai

Hybrid

Role & Responsibilities
Job Summary: We are looking for a highly skilled Senior Data Modeller with a strong foundation in data modeling concepts who is eager to expand into data engineering. This role is ideal for someone who has deep experience designing conceptual, logical, and physical data models and is looking to evolve into a more hybrid role with modern data engineering capabilities.
- 8-15 years of experience in data modelling with a strong understanding of relational and dimensional models.
- Experience with modeling tools (e.g., Erwin, PowerDesigner, dbt, SQLDBM, or similar).
- Proficiency in SQL and strong analytical thinking.
- Familiarity with metadata management, data catalogs, and lineage-tracking tools.
- Strong communication and stakeholder management skills.

Posted 1 week ago


10.0 - 15.0 years

15 - 20 Lacs

Hyderabad

Remote

Should possess a deep understanding of data architecture, database design, and conceptual, logical, and physical data models. Expertise in ER Studio is required; you will be responsible for developing and maintaining data models to support business requirements.
Required Candidate profile: A data modeler with over 5 years of experience in data modeling and a minimum of 3 years of proficiency using ER Studio. Strong analytical, problem-solving, and communication skills are a must.

Posted 2 weeks ago


8.0 - 11.0 years

16 - 20 Lacs

Hyderabad

Remote

US Shift (Night Shift). 8+ years in data modeling and 3+ years in ER Studio (ERwin not preferred); strong in relational and dimensional modeling and normalization. HR & EPM experience is a plus. Skilled in metadata, data dictionaries, documentation, and communication.

Posted 2 weeks ago


5.0 - 10.0 years

10 - 15 Lacs

Bengaluru

Remote

The client requires 5+ years of experience in data modeling and a minimum of 3 years of proficiency using ER Studio. The ideal candidate will possess a deep understanding of data architecture, database design, and conceptual, logical, and physical data models.

Posted 2 weeks ago


7.0 - 12.0 years

20 - 35 Lacs

Hyderabad

Hybrid

Work Mode: Hybrid (3 days WFO & 2 days WFH)

Role & responsibilities
- Proficiency in data modeling tools such as ER/Studio, ERwin, or similar.
- Deep understanding of relational database design, normalization/denormalization, and data warehousing principles.
- Experience with SQL and working knowledge of database platforms like Oracle, SQL Server, PostgreSQL, or Snowflake.
- Strong knowledge of metadata management, data lineage, and data governance practices.
- Understanding of data integration, ETL processes, and data quality frameworks.
- Ability to interpret and translate complex business requirements into scalable data models.
- Excellent communication and documentation skills to collaborate with cross-functional teams.

Preferred candidate profile
Candidates with a notice period of 15 days or less, or immediate joiners, are preferred.

Posted 2 weeks ago


8.0 - 13.0 years

15 - 30 Lacs

Gurugram

Remote

Job description
Data Modeler - AI/ML Enablement
Remote | Contract/Freelancer | Duration: 1 to 2 Months
Start: Immediate | Experience: 7+ Years

We're looking for experienced Data Modelers with a strong background in one or more of these industries only: Telecom, Banking/Finance, Media, or Government.

Key Responsibilities:
- Design conceptual, logical, and physical data models
- Collaborate with AI/ML teams to structure data for model training
- Build ontologies, taxonomies, and data schemas
- Ensure compliance with industry-specific data regulations

Must-Have Skills & Experience:
- 7+ years of hands-on experience in data modeling: conceptual, logical, and physical models.
- Proficiency in data modeling tools like Erwin, ER/Studio, or PowerDesigner.
- Strong understanding of data domains like customer, transaction, network, media, or case data.
- Familiarity with AI/ML pipelines and an understanding of how structured data supports model training.
- Knowledge of data governance, quality, and compliance standards (e.g., GDPR, PCI-DSS).
- Ability to work independently and deliver models quickly in a short-term contract environment.

Posted 2 weeks ago


4.0 - 9.0 years

10 - 20 Lacs

Pune

Hybrid

Hi, Greetings! This is regarding a job opportunity for the position of Data Modeller with a US-based MNC in the Healthcare domain. This opportunity is on the direct payroll of the US-based MNC.

Job Location: Pune, Mundhwa
Mode of work: Hybrid (3 days work from office)
Shift timings: 1pm to 10pm

About the Company: The global MNC is a mission-driven startup transforming the healthcare payer industry. Its secure, cloud-enabled platform empowers health insurers to unlock siloed data, improve patient outcomes, and reduce healthcare costs. With deep expertise in cloud-enabled technologies and knowledge of the healthcare industry, the company has built an innovative data integration and management platform that gives healthcare payers access to data that has historically been siloed and inaccessible. As a result, these payers can ingest and manage all the information they need to transform their business, supporting their analytical, operational, and financial needs through the platform. Since its founding in 2017, the company has built a highly successful SaaS business, raising more than $80 million from leading VC firms with profound expertise in the healthcare and technology industries. We are solving massive, complex problems in an industry ready for disruption. We're building powerful momentum and would love for you to be a part of it!

Interview process: 5 rounds of interview
- 4 rounds of technical interview
- 1 round of HR or fitment discussion

Job Description: Data Modeller
About the Role: We're seeking a Data Modeler to join our global data modeling team. You'll play a key role in translating business requirements into conceptual and logical data models that support both operational and analytical use cases. This is a high-impact opportunity to work with cutting-edge technologies and contribute to the evolution of healthcare data platforms.

What You'll Do
- Design and build conceptual and logical data models aligned with enterprise architecture and healthcare standards.
- Perform data profiling and apply data integrity principles using SQL.
- Collaborate with cross-functional teams to ensure models meet client and business needs.
- Use tools like Erwin, ER/Studio, DBT, or similar for enterprise data modeling.
- Maintain metadata, business glossaries, and data dictionaries.
- Support client implementation teams with data model expertise.

What We're Looking For
- 2+ years of experience in data modeling and cloud-based data engineering.
- Proficiency in enterprise data modeling tools (Erwin, ER/Studio, DBSchema).
- Experience with Databricks, Snowflake, and data lakehouse architectures.
- Strong SQL skills and familiarity with schema evolution and data versioning.
- Deep understanding of healthcare data domains (Claims, Enrollment, Provider, FHIR, HL7, etc.).
- Excellent collaboration and communication skills.

In case you have a query, please feel free to contact me via the email, WhatsApp, or phone number mentioned below.

Thanks & Regards,
Priyanka Das
Email: priyanka.das@dctinc.com
Contact Number: 74399 37568
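The listing above asks for data profiling with SQL. A minimal sketch of what that looks like in practice (the table and column names here are hypothetical, not from the listing): for each column, count rows, nulls, and distinct values to spot completeness and cardinality problems early.

```python
import sqlite3

# Toy table standing in for a healthcare-style dataset; names are invented.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE member (member_id INTEGER, plan_code TEXT)")
conn.executemany("INSERT INTO member VALUES (?, ?)",
                 [(1, 'A'), (2, 'A'), (3, None), (3, 'B')])

def profile(conn, table, column):
    """Basic profile for one column: row count, null count, distinct count."""
    return conn.execute(f"""
        SELECT COUNT(*),
               SUM(CASE WHEN {column} IS NULL THEN 1 ELSE 0 END),
               COUNT(DISTINCT {column})
        FROM {table}
    """).fetchone()

print(profile(conn, "member", "plan_code"))  # (4, 1, 2)
```

Note that `COUNT(DISTINCT col)` ignores NULLs, which is why the null count is reported separately; a real profiling pass would run this per column and compare the results against the model's NOT NULL and key constraints.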

Posted 2 weeks ago


6.0 - 10.0 years

15 - 30 Lacs

Pune

Remote

JOB DESCRIPTION - Modelling Engineers

Key Responsibilities

Complete Data Modelling Tasks:
- Initiate and manage Gap Analysis and Source-to-Target Mapping exercises.
- Gain a comprehensive understanding of the EA extract.
- Map the SAP source used in EA extracts to the AWS Transform Zone, AWS Conform Zone, and AWS Enrich Zone.
- Develop a matrix view of all Excel/Tableau reports to identify any missing fields or tables from SAP in the Transform Zone.
- Engage with SMEs to finalize the Data Model (DM).
- Obtain email confirmation and approval for the finalized DM.
- Perform data modelling using ER Studio and STTM.
- Generate DDL scripts for data engineers to facilitate implementation.

Complete Data Engineering Tasks:
- Set up infrastructure for pipelines; this includes Glue jobs, crawlers, scheduling, step functions, etc.
- Build, deploy, test, and run pipelines on demand in lower environments.
- Verify data integrity: no duplicates, all columns present in the final table, etc.
- Write unit tests for methods used in the pipeline, using standard testing tools.
- Apply code formatting and linting.
- Collaborate with other Modelling Engineers to align on the correct approach.
- Update existing pipelines for CZ tables (SDLF and OF) where necessary with new columns if they are required for EZ tables.
- Raise DDP requests to register databases and tables, and to load data into the raw zone.
- Create comprehensive documentation; ensure each task is accompanied by detailed notes specific to its functional area for clear tracking and reference.
- Analyse and manage bugs and change requests raised by business/SMEs.
- Collaborate with Data Analysts and Virtual Engineers (VE) to refine and enhance semantic modelling in Power BI.
- Plan out work using Microsoft Azure DevOps (ADO), ensuring dependencies, status, and effort are correctly reflected.

Required Skills and Experience:
- Proven experience in data modelling and data pipeline development.
- Proficiency with tools like ER Studio, STTM, AWS Glue, Redshift & Athena, and Power BI.
- Strong SQL and experience with generating DDL scripts.
- Experience working in SAP data environments.
- Experience in any of these domain areas is highly desirable: Logistics, Supply Planning, Exports, and IFOT.
- Familiarity with cloud platforms, particularly AWS.
- Hands-on experience with DevOps and Agile methodologies (e.g., Azure ADO).
- Strong communication and documentation skills.
- Ability to work collaboratively with cross-functional teams.
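One of the recurring tasks above is generating DDL scripts from a finalized data model. As a hedged illustration of the mechanical shape of that task (this is not ER Studio's actual export format, and the entity and column names are invented), a model captured as plain data can be turned into CREATE TABLE statements programmatically:

```python
# Hypothetical, minimal model-to-DDL generator. Real tools such as ER Studio
# emit far richer DDL (keys, indexes, comments); this only sketches the idea.
MODEL = {
    "conform_zone.shipment": [
        ("shipment_id", "BIGINT", "NOT NULL"),
        ("origin_plant", "VARCHAR(10)", "NOT NULL"),
        ("dispatch_ts", "TIMESTAMP", ""),
    ],
}

def to_ddl(model):
    """Render each table in the model dict as a CREATE TABLE statement."""
    statements = []
    for table, columns in model.items():
        cols = ",\n".join(
            f"    {name} {dtype} {constraint}".rstrip()
            for name, dtype, constraint in columns
        )
        statements.append(f"CREATE TABLE {table} (\n{cols}\n);")
    return "\n\n".join(statements)

ddl = to_ddl(MODEL)
print(ddl)
```

Keeping the model as data and the DDL as generated output is what makes the handoff to data engineers repeatable: the approved model changes, the script is regenerated, and nothing is hand-edited.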

Posted 2 weeks ago


7.0 - 12.0 years

19 - 25 Lacs

Bengaluru

Remote

Hi Candidates, we have job openings in one of our MNC companies. Interested candidates can apply here and share details to chandrakala.c@i-q.co
Note: Notice period of 0-15 days, or currently serving notice, only.

Role & responsibilities

Complete Data Modelling Tasks:
- Initiate and manage Gap Analysis and Source-to-Target Mapping exercises.
- Gain a comprehensive understanding of the EA extract.
- Map the SAP source used in EA extracts to the AWS Transform Zone, AWS Conform Zone, and AWS Enrich Zone.
- Develop a matrix view of all Excel/Tableau reports to identify any missing fields or tables from SAP in the Transform Zone.
- Engage with SMEs to finalize the Data Model (DM).
- Obtain email confirmation and approval for the finalized DM.
- Perform data modelling using ER Studio and STTM.
- Generate DDL scripts for data engineers to facilitate implementation.

Complete Data Engineering Tasks:
- Set up infrastructure for pipelines; this includes Glue jobs, crawlers, scheduling, step functions, etc.
- Build, deploy, test, and run pipelines on demand in lower environments.
- Verify data integrity: no duplicates, all columns present in the final table, etc.
- Write unit tests for methods used in the pipeline, using standard testing tools.
- Apply code formatting and linting.
- Collaborate with other Modelling Engineers to align on the correct approach.
- Update existing pipelines for CZ tables (SDLF and OF) where necessary with new columns if they are required for EZ tables.
- Raise DDP requests to register databases and tables, and to load data into the raw zone.
- Create comprehensive documentation; ensure each task is accompanied by detailed notes specific to its functional area for clear tracking and reference.
- Analyse and manage bugs and change requests raised by business/SMEs.
- Collaborate with Data Analysts and Virtual Engineers (VE) to refine and enhance semantic modelling in Power BI.
- Plan out work using Microsoft Azure DevOps (ADO), ensuring dependencies, status, and effort are correctly reflected.

Required Skills and Experience:
- Proven experience in data modelling and data pipeline development.
- Proficiency with tools like ER Studio, STTM, AWS Glue, Redshift & Athena, and Power BI.
- Strong SQL and experience with generating DDL scripts.
- Experience working in SAP data environments.
- Experience in any of these domain areas is highly desirable: Logistics, Supply Planning, Exports, and IFOT.
- Familiarity with cloud platforms, particularly AWS.
- Hands-on experience with DevOps and Agile methodologies (e.g., Azure ADO).
- Strong communication and documentation skills.
- Ability to work collaboratively with cross-functional teams.

Posted 2 weeks ago


3.0 - 7.0 years

0 Lacs

noida, uttar pradesh

On-site

NTT DATA is looking for a Data Modeller to join their team in Noida, Uttar Pradesh (IN-UP), India. As a Data Modeller, you will be responsible for creating and updating data models, defining information requirements for small to medium-size projects, and creating ETL specifications and source-to-target mappings based on project requirements. You will also generate the data definition language (DDL) used to create database schemas and tables, create optimal database views aligned with business and technical needs, and work with assigned technical teams to ensure correct deployment of the DDL.

In this role, you will be expected to synchronize models to ensure that database structures match the models, conduct business and data analysis, and work independently on projects with guidance from the Project Leader. The ideal candidate should have previous experience working with business and technical teams, experience compiling business definitions for enterprise data model attributes, and 3-5 years of experience in a high-tech environment in technical or business application analysis, or an equivalent combination of experience and education.

Required skills for this position include excellent written and verbal communication skills, physical data modeling, business and data analysis, technical specification development, data mapping, experience using PowerDesigner or other data modeling tools such as Erwin, advanced SQL expertise, and familiarity with the Linux operating system.

If you are a passionate individual who wants to be part of an inclusive, adaptable, and forward-thinking organization, and you possess the required skills and qualifications, apply now to be a part of NTT DATA's global team of experts dedicated to helping clients innovate, optimize, and transform for long-term success. Visit us at us.nttdata.com for more information about our services and global presence.

Posted 2 weeks ago


7.0 - 12.0 years

25 - 40 Lacs

Noida, Hyderabad

Work from Office

Senior Data Modeller

About the Role: We are seeking an experienced Senior Data Modeller to join our team. In this role, you will be responsible for the design and standardization of enterprise-wide data models across multiple domains such as Customer, Product, Billing, and Network. The ideal candidate will work closely with cross-functional teams to translate business needs into scalable and governed data structures. You will work closely with customers and technology partners to deliver data solutions that address complex business requirements, including customer experience management, network optimization, revenue assurance, and digital transformation initiatives.

Responsibilities:
- Design logical and physical data models aligned with enterprise and industry standards
- Create and maintain data models for Customer, Product, Usage, and Service domains
- Translate business requirements into normalized and analytical schemas (Star/Snowflake)
- Define and maintain entity relationships, hierarchy levels (Customer - Account - MSISDN), and attribute lineage
- Standardize attribute definitions across systems and simplify legacy structures
- Collaborate with engineering teams to implement models in cloud data platforms (e.g., Databricks)
- Collaborate with domain stewards to simplify and standardize legacy data structures
- Work with governance teams to tag attributes for privacy, compliance, and data quality
- Document metadata and lineage, and maintain version control of data models
- Support analytics, reporting, and machine learning teams by enabling standardized data access
- Design solutions leveraging Microsoft Azure and Databricks for telecom data processing and analytics

Qualifications:
- Bachelor's or master's degree in Computer Science, Data Science, or a related technical field
- 7+ years of experience in data modelling roles
- Hands-on experience building data models and platforms
- Strong experience with data modeling tools (Erwin, Azure Analysis Services, SSAS, dbt, Informatica)
- Hands-on experience with modern cloud data platforms (Databricks, Azure Synapse, Snowflake)
- Deep understanding of data warehousing concepts and normalized/denormalized models
- Expertise in SQL, data profiling, schema design, and metadata documentation
- Familiarity with domain-driven design, data mesh, and modular architecture
- Experience in large-scale transformation or modernization programs
- Knowledge of regulatory frameworks such as GDPR and data privacy-by-design
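The Customer - Account - MSISDN hierarchy mentioned in the listing above is a common telecom modeling pattern: each level holds a foreign key to its parent, so usage can be rolled up from subscriber numbers to accounts to customers. A minimal sketch (the schema and sample values are invented for illustration):

```python
import sqlite3

# Illustrative three-level telecom hierarchy: Customer -> Account -> MSISDN.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE customer (customer_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE account  (account_id  INTEGER PRIMARY KEY,
                       customer_id INTEGER REFERENCES customer(customer_id));
CREATE TABLE msisdn   (msisdn      TEXT PRIMARY KEY,
                       account_id  INTEGER REFERENCES account(account_id));
""")
conn.execute("INSERT INTO customer VALUES (1, 'Example Enterprise Client')")
conn.execute("INSERT INTO account VALUES (10, 1)")
conn.executemany("INSERT INTO msisdn VALUES (?, ?)",
                 [("9198xxxx001", 10), ("9198xxxx002", 10)])

# Roll up from the MSISDN level to the customer level through the hierarchy.
count = conn.execute("""
    SELECT COUNT(m.msisdn)
    FROM customer c
    JOIN account a ON a.customer_id = c.customer_id
    JOIN msisdn  m ON m.account_id  = a.account_id
    WHERE c.customer_id = 1
""").fetchone()[0]
print(count)  # 2
```

Modeling each level as its own entity, rather than flattening the hierarchy into one table, is what keeps attribute lineage and roll-up semantics unambiguous across domains.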

Posted 2 weeks ago
