
403 Erwin Jobs

Set up a job alert
JobPe aggregates results for easy access; you apply directly on the employer's job portal.

3.0 - 5.0 years

16 - 20 Lacs

Bengaluru

Work from Office

An experienced consulting professional with an understanding of solutions, industry best practices, and multiple business processes or technology designs within a product/technology family. Operates independently to provide quality work products to an engagement, performing varied and complex duties and tasks that require independent judgment in order to implement Oracle products and technology to meet customer needs, and applying Oracle methodology, company procedures, and leading practices.

Responsibilities:
- Demonstrates expertise to deliver functional and technical solutions on moderately complex customer engagements.
- May act as the team lead on projects.
- Effectively consults with management of customer organizations.
- Participates in business development activities.
- Develops and configures detailed solutions for moderately complex projects.

Posted 18 hours ago

Apply

6.0 - 10.0 years

0 Lacs

Karnataka

On-site

As a Collibra Specialist at NTT DATA, your main responsibility will be to produce data mapping documents, import various glossaries and CDEs into Collibra, establish lineage from Glossary to CDM to LDM, and configure lineage visualizations, glossary workflows, and governance processes in Collibra.

Key Responsibilities:
- Produce data mapping documents including Glossary, CDM, LDM
- Import Business Glossary, sub-domain glossaries, and CDEs into Collibra
- Import mapping documents into Collibra and establish lineage across Glossary, CDM, and LDM
- Configure lineage visualizations, glossary workflows, and governance processes in Collibra

Qualifications Required:
- Minimum 5-7 years of experience in data governance/metadata management
- At least 3 years of hands-on experience with Collibra implementation (glossary, lineage, workflows, metadata ingestion)
- Proficiency in metadata ingestion and mapping automation
- Ability to script/transform mapping templates into Collibra-ingestable formats
- Knowledge of Erwin/Foundry integration with Collibra
- Strong analytical and problem-solving skills to support lineage accuracy

Please note that you will be required to be available up to 1:30am IST for shift timings. NTT DATA is a $30 billion trusted global innovator of business and technology services, serving 75% of the Fortune Global 100. As a Collibra Specialist, you will be part of a diverse team of experts across more than 50 countries, working to help clients innovate, optimize, and transform for long-term success. NTT DATA is committed to investing in research and development to support organizations and society in confidently moving into the digital future. Visit us at us.nttdata.com.
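The "script/transform mapping templates into Collibra-ingestable formats" requirement above can be made concrete. Below is a minimal, hypothetical Python sketch: it flattens a toy term-to-attribute mapping template into generic asset and relation rows of the kind a bulk-import job might consume. The template columns and output field names are invented for illustration and are not Collibra's actual import schema.

```python
import csv
import io

# Hypothetical source mapping template: business term -> CDM entity -> LDM attribute.
MAPPING_TEMPLATE = """term,cdm_entity,ldm_attribute
Customer Name,Party,PARTY.FULL_NAME
Account Balance,Account,ACCOUNT.BAL_AMT
"""

def to_import_rows(template_csv):
    """Flatten each mapping line into asset rows plus a relation row."""
    rows = []
    for rec in csv.DictReader(io.StringIO(template_csv)):
        rows.append({"asset_name": rec["term"], "asset_type": "Business Term"})
        rows.append({"asset_name": rec["cdm_entity"], "asset_type": "Data Entity"})
        rows.append({
            "asset_name": rec["term"],
            "asset_type": "relation",
            "relation": "is represented by",
            "target": rec["ldm_attribute"],
        })
    return rows

rows = to_import_rows(MAPPING_TEMPLATE)  # 3 output rows per mapping line
```

In practice the output would be written back out as CSV or Excel in whatever layout the target Collibra workflow expects.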

Posted 3 days ago

Apply

5.0 - 10.0 years

9 - 13 Lacs

Hyderabad

Work from Office

Overview
As an Analyst, Data Modeler, your focus will be to partner with D&A Data Foundation team members to create data models for global projects. This includes independently analysing project data needs, identifying data storage and integration needs/issues, and driving opportunities for data model reuse while satisfying project requirements. The role will advocate Enterprise Architecture, Data Design, and D&A standards and best practices. You will perform all aspects of data modeling, working closely with the Data Governance, Data Engineering, and Data Architecture teams. As a member of the data modeling team, you will create data models for very large and complex data applications in public cloud environments, directly impacting the design, architecture, and implementation of PepsiCo's flagship data products around topics like revenue management, supply chain, manufacturing, and logistics. The primary responsibilities of this role are to work with data product owners, data management owners, and data engineering teams to create physical and logical data models with an extensible philosophy to support future, unknown use cases with minimal rework. You'll be working in a hybrid environment with in-house, on-premises data sources as well as cloud and remote systems. You will establish data design patterns that drive flexible, scalable, and efficient data models to maximize value and reuse.

Responsibilities:
- Complete conceptual, logical, and physical data models for any supported platform, including SQL Data Warehouse, EMR, Spark, Databricks, Snowflake, Azure Synapse, or other cloud data warehousing technologies.
- Govern data design/modeling documentation of metadata (business definitions of entities and attributes) and construct database objects for baseline and investment-funded projects, as assigned.
- Provide and/or support data analysis, requirements gathering, solution development, and design reviews for enhancements to, or new, applications/reporting.
- Support assigned project contractors (both on- and off-shore), orienting new contractors to standards, best practices, and tools.
- Contribute to project cost estimates, working with senior members of the team to evaluate the size and complexity of changes or new development.
- Ensure physical and logical data models are designed with an extensible philosophy to support future, unknown use cases with minimal rework.
- Develop a deep understanding of the business domain and enterprise technology inventory to craft a solution roadmap that achieves business objectives and maximizes reuse.
- Partner with IT, data engineering, and other teams to ensure the enterprise data model incorporates the key dimensions needed for proper management: business and financial policies, security, local-market regulatory rules, and consumer privacy-by-design principles (PII management), all linked across fundamental identity foundations.
- Drive collaborative reviews of design, code, data, and security feature implementation performed by data engineers to drive data product development.
- Assist with data planning, sourcing, collection, profiling, and transformation.
- Create source-to-target mappings for ETL and BI developers.
- Show expertise for data at all levels: low-latency, relational, and unstructured data stores; analytical stores and data lakes; data streaming (consumption/production); data in transit.
- Develop reusable data models based on cloud-centric, code-first approaches to data management and cleansing.
- Partner with the Data Governance team to standardize the classification of unstructured data into standard structures for data discovery and action by business customers and stakeholders.
- Support data lineage and mapping of source system data to canonical data stores for research, analysis, and productization.

Qualifications:
- Bachelor's degree required in Computer Science, Data Management/Analytics/Science, Information Systems, Software Engineering, or a related technology discipline.
- 5+ years of overall technology experience, including at least 2+ years of data modeling and systems architecture.
- Around 2+ years of experience with data lake infrastructure, data warehousing, and data analytics tools.
- 2+ years of experience developing enterprise data models.
- Experience building solutions in the retail or supply chain space.
- Expertise in data modeling tools (ER/Studio, Erwin, IDM/ARDM models).
- Experience with integration of multi-cloud services (Azure) with on-premises technologies.
- Experience with data profiling and data quality tools like Apache Griffin, Deequ, and Great Expectations.
- Experience building/operating highly available, distributed systems for extraction, ingestion, and processing of large data sets.
- Experience with at least one MPP database technology such as Redshift, Synapse, Teradata, or Snowflake.
- Experience with version control systems like GitHub and deployment & CI tools.
- Experience with Azure Data Factory, Databricks, and Azure Machine Learning is a plus.
- Experience with metadata management, data lineage, and data glossaries is a plus.
- Working knowledge of agile development, including DevOps and DataOps concepts.
- Familiarity with business intelligence tools (such as Power BI).
- Excellent verbal and written communication and collaboration skills.
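Data profiling of the kind the qualifications mention (Apache Griffin, Deequ, Great Expectations) boils down to computing completeness and cardinality statistics per attribute. A minimal plain-Python sketch, with an invented toy dataset standing in for a real source extract:

```python
from collections import Counter

# Toy dataset standing in for a profiled source extract.
records = [
    {"customer_id": "C1", "country": "IN", "revenue": 120.0},
    {"customer_id": "C2", "country": "IN", "revenue": None},
    {"customer_id": "C3", "country": None, "revenue": 75.5},
    {"customer_id": "C3", "country": "US", "revenue": 75.5},  # duplicate key
]

def profile_column(rows, col):
    """Basic completeness and cardinality stats for one attribute."""
    values = [r[col] for r in rows]
    non_null = [v for v in values if v is not None]
    return {
        "count": len(values),
        "null_ratio": round(1 - len(non_null) / len(values), 2),
        "distinct": len(set(non_null)),
        "top_value": Counter(non_null).most_common(1),
    }

profile = {c: profile_column(records, c)
           for c in ("customer_id", "country", "revenue")}
```

The dedicated tools add declarative rule definitions, scheduling, and reporting on top of exactly these kinds of measurements.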

Posted 3 days ago

Apply

6.0 - 11.0 years

25 - 40 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Role & Responsibilities:
- 10+ years of experience in the data space.
- Decent SQL knowledge; able to suggest modeling approaches for a given problem.
- Significant experience in one or more RDBMSs (Oracle, DB2, SQL Server).
- Real-time experience working with OLAP and OLTP database models (dimensional models).
- Comprehensive understanding of star schema, snowflake schema, and Data Vault modelling, as well as any ETL tool, data governance, and data quality.
- An eye for analyzing data, and comfortable following agile methodology.
- An understanding of any of the cloud services is preferred (Azure, AWS & GCP).
- Enthusiasm to coach team members, collaborate with various stakeholders across the organization, and take complete ownership of deliverables.
- Experience contributing to proposals and RFPs.
- Good experience in stakeholder management.
- Decent communication skills and experience leading a team.

Posted 3 days ago

Apply

3.0 - 7.0 years

8 - 12 Lacs

Bengaluru

Work from Office

- Experience as a Data Modeler or Database Designer, with specific experience in the life insurance industry.
- Proficiency in Erwin Data Modeler for designing conceptual, logical, and physical data models.
- Strong SQL skills and experience working with relational databases (e.g., SQL Server, Oracle, DB2).
- Solid understanding of data modeling techniques, principles, and best practices.
- Knowledge of life insurance business processes, products, and regulatory requirements is highly desirable.

Posted 3 days ago

Apply

5.0 - 9.0 years

0 Lacs

Maharashtra

On-site

Role Overview:
As a Technical Lead Data Modeler at CitiusTech, you will be part of an Agile team responsible for designing and building healthcare applications, implementing new features, and ensuring adherence to the best coding development standards. Your role will involve analyzing business requirements, designing data models, collaborating with stakeholders, and supporting data governance initiatives.

Key Responsibilities:
- Analyze business requirements to understand data needs and relationships, and design conceptual data models representing high-level business entities and relationships.
- Develop logical data models with detailed attributes, keys, and relationships, and create physical data models optimized for specific database or data lake platforms.
- Define and document data definitions, standards, and naming conventions, collaborating with data architects, engineers, and analysts to align models with technical and business needs.
- Normalize or denormalize data structures based on performance and use-case requirements.
- Map source systems to target models for ETL/ELT development and maintain data dictionaries and metadata repositories.
- Ensure data models support data quality, integrity, and consistency, reviewing and validating models with stakeholders and subject matter experts.

Qualifications Required:
- Experience: 5-7 years
- Location: Pune, Chennai, Bangalore, Mumbai
- Educational Qualifications: Engineering Degree - BE/ME/BTech/MTech/BSc/MSc. Technical certification in multiple technologies is desirable.

Additional Company Details:
CitiusTech is a global IT services, consulting, and business solutions enterprise focused 100% on the healthcare and life sciences industry. With over 8,500 healthcare technology professionals worldwide, CitiusTech enables enterprises to build efficient, effective, and equitable human-first ecosystems.
The company is committed to shaping healthcare possibilities and making a positive impact on human lives through innovation, collaboration, and continuous learning. CitiusTech offers a diverse and inclusive work culture that focuses on continuous learning, work-life balance, and creating a fun and transparent environment for its employees. Rated as a Great Place to Work, CitiusTech is dedicated to humanizing healthcare and driving positive change in the industry. For more information about CitiusTech, visit their careers page at https://www.citiustech.com/careers.

Posted 4 days ago

Apply

8.0 - 10.0 years

7 - 11 Lacs

Bengaluru

Work from Office

Role Purpose
The purpose of this role is to provide solutions and bridge the gap between technology and business know-how to deliver any client solution.

Do
1. Bridging the gap between project and support teams through techno-functional expertise:
- For a new business implementation project, drive the end-to-end process from business requirement management to integration & configuration and production deployment.
- Check the feasibility of new change requirements and provide an optimal solution to the client with clear timelines.
- Provide techno-functional solution support for all new business implementations while building the entire system from scratch.
- Support the solutioning team from architectural design through coding, testing, and implementation.
- Understand the functional design as well as the technical design and architecture to be implemented on the ERP system.
- Customize, extend, modify, localize, or integrate with the existing product by virtue of coding, testing & production.
- Implement the business processes and requirements on the underlying ERP technology to translate them into ERP solutions.
- Write code as per developmental standards and decide upon the implementation methodology.
- Provide product support and maintenance to clients for a specific ERP solution and resolve day-to-day queries/technical problems that may arise.
- Create and deploy automation tools/solutions to ensure process optimization and increased efficiency.
- Sync between the technical and functional requirements of the project and provide solutioning/advice to the client or internal teams accordingly.
- Support the on-site manager with the necessary details with respect to any change, and provide off-site support.
2. Skill upgradation and competency building:
- Clear Wipro exams and internal certifications from time to time to upgrade skills.
- Attend trainings and seminars to sharpen knowledge in the functional/technical domain.
- Write papers, articles, and case studies and publish them on the intranet.

Mandatory Skills: Erwin.
Experience: 8-10 Years.

Posted 4 days ago

Apply

4.0 - 9.0 years

14 - 18 Lacs

Gurugram

Work from Office

Role Purpose
The purpose of the role is to design and architect VLSI and hardware-based products and enable delivery teams to provide exceptional client engagement and satisfaction.

Title: Data Science Architect
Location: Gurgaon / Noida

Skills:
- Machine Learning, Data Science, Model Customization [4+ years], including experience performing the above on cloud services, e.g., AWS SageMaker and other tools.
- AI / Gen AI skills [1 or 2 years]: MCP, RAG pipelines, A2A, agentic/AI agent frameworks (AutoGen, LangGraph, LangChain, codeless workflow builders, etc.).

Role & Responsibilities:
- Build working POCs and prototypes rapidly.
- Build/integrate AI-driven solutions to solve the identified opportunities and challenges.
- Lead cross-functional teams in identifying and prioritizing key business areas in which AI solutions can yield benefits.
- Present proposals to executives and business leaders on a broad range of technology, strategy, standards, and governance for AI.
- Work on functional design, process design (flow mapping), prototyping, testing, and defining the support model in collaboration with engineering and business leaders.
- Articulate and document the solution architecture and lessons learned for each exploration and accelerated incubation.

Relevant IT Experience: 10+ years of relevant IT experience in the given technology.
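As a rough illustration of the RAG-pipeline idea listed in the skills (retrieve relevant context before generating an answer), here is a toy retriever using bag-of-words cosine similarity. Real pipelines use embedding models and vector stores; the documents and query below are invented for illustration.

```python
import math
from collections import Counter

# Toy corpus standing in for a document store.
DOCS = [
    "Snowflake is a cloud data warehouse",
    "Erwin is a data modeling tool",
    "SageMaker trains machine learning models on AWS",
]

def vectorize(text):
    """Bag-of-words term counts (a crude stand-in for an embedding)."""
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

def retrieve(query, docs, k=1):
    """Return the k documents most similar to the query (the 'R' in RAG)."""
    q = vectorize(query)
    ranked = sorted(docs, key=lambda d: cosine(q, vectorize(d)), reverse=True)
    return ranked[:k]

context = retrieve("which tool is used for data modeling", DOCS)
```

In a full pipeline, `context` would then be prepended to the prompt sent to the generative model.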

Posted 4 days ago

Apply

8.0 - 13.0 years

2 - 2 Lacs

Hyderabad

Work from Office

SUMMARY
Must have: Experience with AWS, SQL, and Data Modeling tools
Good to have: Exposure to data warehousing concepts and tools (e.g., Redshift, Snowflake)
Minimum Experience: 5+ years

Requirements
Position Title: Data Modeler
No. of Roles: 1
Position Brief Summary: The Data Modeler in this role will collaborate with IDP's Data Engineering and Governance team to design and implement long-term data solutions by analyzing business needs, evaluating and optimizing data systems, creating conceptual and physical data models, establishing best practices, ensuring system compatibility, and working closely with cross-functional teams including product owners, data scientists, engineers, analysts, and architects, while also maintaining technical documentation, SOPs, and training materials.

Posted 4 days ago

Apply

7.0 - 10.0 years

5 - 8 Lacs

Mumbai

Remote

Employment Type: Contract (Remote).
Experience Required: 7+ Years.

Job Summary:
We are looking for a highly skilled Data Engineer / Data Modeler with strong experience in Snowflake, DBT, and GCP to support our data infrastructure and modeling initiatives. The ideal candidate should possess excellent SQL skills, hands-on experience with Erwin Data Modeler, and a strong background in modern data architectures and data modeling techniques.

Key Responsibilities:
- Design and implement scalable data models using Snowflake and Erwin Data Modeler.
- Create, maintain, and enhance data pipelines using DBT and GCP (BigQuery, Cloud Storage, Dataflow).
- Perform reverse engineering on existing systems (e.g., Sailfish/DDMS) using DBeaver or similar tools to understand and rebuild data models.
- Develop efficient SQL queries and stored procedures for data transformation, quality, and validation.
- Collaborate with business analysts and stakeholders to gather data requirements and convert them into physical and logical models.
- Ensure performance tuning, security, and optimization of the Snowflake data warehouse.
- Document metadata, data lineage, and business logic behind data structures and flows.
- Participate in code reviews, enforce coding standards, and provide best practices for data modeling and governance.

Must-Have Skills:
- Snowflake architecture, schema design, and data warehouse experience.
- DBT (Data Build Tool) for data transformation and pipeline development.
- Strong expertise in SQL (query optimization, complex joins, window functions, etc.).
- Hands-on experience with Erwin Data Modeler (logical and physical modeling).
- Experience with GCP (BigQuery, Cloud Composer, Cloud Storage).
- Experience in reverse engineering legacy systems like Sailfish or DDMS using DBeaver.

Good To Have:
- Experience with CI/CD tools and DevOps for data environments.
- Familiarity with data governance, security, and privacy practices.
- Exposure to Agile methodologies and working in distributed teams.
- Knowledge of Python for data engineering tasks and orchestration scripts.

Soft Skills:
- Excellent problem-solving and analytical skills.
- Strong communication and stakeholder management.
- Self-driven with the ability to work independently in a remote setup.
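The "window functions" bullet above usually covers patterns like latest-record selection, a staple of DBT staging models when a source delivers full change history. A minimal sketch, using SQLite in place of Snowflake and invented table and column names:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE stg_orders (order_id INTEGER, status TEXT, loaded_at TEXT);
INSERT INTO stg_orders VALUES
    (1, 'placed',  '2024-01-01'),
    (1, 'shipped', '2024-01-03'),
    (2, 'placed',  '2024-01-02');
""")

# ROW_NUMBER() keeps only the most recent record per order: a common
# deduplication pattern in staging/transformation layers.
latest = cur.execute("""
    SELECT order_id, status FROM (
        SELECT *, ROW_NUMBER() OVER (
            PARTITION BY order_id ORDER BY loaded_at DESC
        ) AS rn
        FROM stg_orders
    ) WHERE rn = 1
    ORDER BY order_id
""").fetchall()
```

In DBT the same SELECT would live in a model file, with the table name supplied by `ref()`/`source()` macros instead of being hard-coded.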

Posted 4 days ago

Apply

7.0 - 10.0 years

5 - 8 Lacs

Noida

Work from Office

Employment Type: Contract (Remote).

Job Summary:
We are looking for a highly skilled Data Engineer / Data Modeler with strong experience in Snowflake, DBT, and GCP to support our data infrastructure and modeling initiatives. The ideal candidate should possess excellent SQL skills, hands-on experience with Erwin Data Modeler, and a strong background in modern data architectures and data modeling techniques.

Key Responsibilities:
- Design and implement scalable data models using Snowflake and Erwin Data Modeler.
- Create, maintain, and enhance data pipelines using DBT and GCP (BigQuery, Cloud Storage, Dataflow).
- Perform reverse engineering on existing systems (e.g., Sailfish/DDMS) using DBeaver or similar tools to understand and rebuild data models.
- Develop efficient SQL queries and stored procedures for data transformation, quality, and validation.
- Collaborate with business analysts and stakeholders to gather data requirements and convert them into physical and logical models.
- Ensure performance tuning, security, and optimization of the Snowflake data warehouse.
- Document metadata, data lineage, and business logic behind data structures and flows.
- Participate in code reviews, enforce coding standards, and provide best practices for data modeling and governance.

Must-Have Skills:
- Snowflake architecture, schema design, and data warehouse experience.
- DBT (Data Build Tool) for data transformation and pipeline development.
- Strong expertise in SQL (query optimization, complex joins, window functions, etc.).
- Hands-on experience with Erwin Data Modeler (logical and physical modeling).
- Experience with GCP (BigQuery, Cloud Composer, Cloud Storage).
- Experience in reverse engineering legacy systems like Sailfish or DDMS using DBeaver.

Good To Have:
- Experience with CI/CD tools and DevOps for data environments.
- Familiarity with data governance, security, and privacy practices.
- Exposure to Agile methodologies and working in distributed teams.
- Knowledge of Python for data engineering tasks and orchestration scripts.

Soft Skills:
- Excellent problem-solving and analytical skills.
- Strong communication and stakeholder management.
- Self-driven with the ability to work independently in a remote setup.

Posted 4 days ago

Apply

7.0 - 10.0 years

5 - 8 Lacs

Bengaluru

Remote

Employment Type: Contract (Remote).

Job Summary:
We are looking for a highly skilled Data Engineer / Data Modeler with strong experience in Snowflake, DBT, and GCP to support our data infrastructure and modeling initiatives. The ideal candidate should possess excellent SQL skills, hands-on experience with Erwin Data Modeler, and a strong background in modern data architectures and data modeling techniques.

Key Responsibilities:
- Design and implement scalable data models using Snowflake and Erwin Data Modeler.
- Create, maintain, and enhance data pipelines using DBT and GCP (BigQuery, Cloud Storage, Dataflow).
- Perform reverse engineering on existing systems (e.g., Sailfish/DDMS) using DBeaver or similar tools to understand and rebuild data models.
- Develop efficient SQL queries and stored procedures for data transformation, quality, and validation.
- Collaborate with business analysts and stakeholders to gather data requirements and convert them into physical and logical models.
- Ensure performance tuning, security, and optimization of the Snowflake data warehouse.
- Document metadata, data lineage, and business logic behind data structures and flows.
- Participate in code reviews, enforce coding standards, and provide best practices for data modeling and governance.

Must-Have Skills:
- Snowflake architecture, schema design, and data warehouse experience.
- DBT (Data Build Tool) for data transformation and pipeline development.
- Strong expertise in SQL (query optimization, complex joins, window functions, etc.).
- Hands-on experience with Erwin Data Modeler (logical and physical modeling).
- Experience with GCP (BigQuery, Cloud Composer, Cloud Storage).
- Experience in reverse engineering legacy systems like Sailfish or DDMS using DBeaver.

Good To Have:
- Experience with CI/CD tools and DevOps for data environments.
- Familiarity with data governance, security, and privacy practices.
- Exposure to Agile methodologies and working in distributed teams.
- Knowledge of Python for data engineering tasks and orchestration scripts.

Soft Skills:
- Excellent problem-solving and analytical skills.
- Strong communication and stakeholder management.
- Self-driven with the ability to work independently in a remote setup.

Posted 4 days ago

Apply

7.0 - 10.0 years

5 - 8 Lacs

Hyderabad

Remote

Job Summary:
We are looking for a highly skilled Data Engineer / Data Modeler with strong experience in Snowflake, DBT, and GCP to support our data infrastructure and modeling initiatives. The ideal candidate should possess excellent SQL skills, hands-on experience with Erwin Data Modeler, and a strong background in modern data architectures and data modeling techniques.

Key Responsibilities:
- Design and implement scalable data models using Snowflake and Erwin Data Modeler.
- Create, maintain, and enhance data pipelines using DBT and GCP (BigQuery, Cloud Storage, Dataflow).
- Perform reverse engineering on existing systems (e.g., Sailfish/DDMS) using DBeaver or similar tools to understand and rebuild data models.
- Develop efficient SQL queries and stored procedures for data transformation, quality, and validation.
- Collaborate with business analysts and stakeholders to gather data requirements and convert them into physical and logical models.
- Ensure performance tuning, security, and optimization of the Snowflake data warehouse.
- Document metadata, data lineage, and business logic behind data structures and flows.
- Participate in code reviews, enforce coding standards, and provide best practices for data modeling and governance.

Must-Have Skills:
- Snowflake architecture, schema design, and data warehouse experience.
- DBT (Data Build Tool) for data transformation and pipeline development.
- Strong expertise in SQL (query optimization, complex joins, window functions, etc.).
- Hands-on experience with Erwin Data Modeler (logical and physical modeling).
- Experience with GCP (BigQuery, Cloud Composer, Cloud Storage).
- Experience in reverse engineering legacy systems like Sailfish or DDMS using DBeaver.

Good To Have:
- Experience with CI/CD tools and DevOps for data environments.
- Familiarity with data governance, security, and privacy practices.
- Exposure to Agile methodologies and working in distributed teams.
- Knowledge of Python for data engineering tasks and orchestration scripts.

Soft Skills:
- Excellent problem-solving and analytical skills.
- Strong communication and stakeholder management.
- Self-driven with the ability to work independently in a remote setup.

Posted 4 days ago

Apply

11.0 - 20.0 years

25 - 40 Lacs

Pune, Chennai, Bengaluru

Work from Office

Responsibilities:
- Design and develop conceptual, logical, and physical data models for enterprise and application-level databases.
- Translate business requirements into well-structured data models that support analytics, reporting, and operational systems.
- Define and maintain data standards, naming conventions, and metadata for consistency across systems.
- Collaborate with data architects, engineers, and analysts to implement models in databases and data warehouses/lakes.
- Analyze existing data systems and provide recommendations for optimization, refactoring, and improvements.
- Create entity relationship diagrams (ERDs) and data flow diagrams to document data structures and relationships.
- Support data governance initiatives including data lineage, quality, and cataloging.
- Review and validate data models with business and technical stakeholders.
- Provide guidance on normalization, denormalization, and performance tuning of database designs.
- Ensure models comply with organizational data policies, security, and regulatory requirements.
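The conceptual-to-logical-to-physical progression described above can be made concrete with a toy translation from a logical entity definition into physical DDL. The dictionary format below is invented for illustration; tools like Erwin automate this step with far more fidelity (data types per target platform, indexes, foreign keys, and so on).

```python
# Toy logical model: entity name -> ordered attribute specs (name, type, constraint).
LOGICAL_MODEL = {
    "Customer": [
        ("customer_id", "INTEGER", "PK"),
        ("full_name", "TEXT", None),
        ("segment", "TEXT", None),
    ],
}

def to_ddl(model):
    """Render each logical entity as a physical CREATE TABLE statement."""
    stmts = []
    for entity, attrs in model.items():
        cols = []
        for name, dtype, constraint in attrs:
            suffix = " PRIMARY KEY" if constraint == "PK" else ""
            cols.append(f"    {name} {dtype}{suffix}")
        stmts.append(f"CREATE TABLE {entity.lower()} (\n" + ",\n".join(cols) + "\n);")
    return stmts

ddl = to_ddl(LOGICAL_MODEL)
```

The conceptual model would sit one level above this (entities and relationships only, no attributes or types), and a real physical model would additionally carry platform-specific storage and performance decisions.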

Posted 4 days ago

Apply

7.0 - 12.0 years

0 - 0 Lacs

Bengaluru

Work from Office

Experience: 5-12 yrs
Location: Bengaluru only
Notice period: immediate to 30 days only
PF is mandatory from all companies you have worked with.

Mandatory Skills: Dimensional Data Modelling

JD:
- Perform IT analysis, investigative data analysis, system design, and data modelling to support development teams.
- Take a lead or facilitating role to drive functional and non-functional requirements and analysis work involving multiple stakeholders, in order to design sustainable and value-adding solutions.
- Ensure alignment and transparency of requirements by acting as a bridge between IT and business teams.
- Build a contextual understanding of business processes on one hand and functional/technical know-how of our solutions on the other.
- Build an understanding of the data model and its consumption by development teams, and of whether the data fulfils business requirements in our IIS area.

To succeed in this role, we believe that you:
- Have an analytical way of thinking.
- Are very good at interpersonal communication and have a good command of the English language.
- Have a passion for data analysis, problem solving, and requirements elicitation.
- Have a strong business and technical background.
- Are familiar with agile ways of working and willing to work in cross-border, cross-cultural virtual teams.
- Are proactive, structured, detail-oriented, and quality-driven.
- Are able and eager to develop in your role, as assignments can vary over time.
- Are cooperative as a team player as well as able to work independently.

Posted 4 days ago

Apply

5.0 - 9.0 years

0 Lacs

Tamil Nadu

On-site

Role Overview:
As a Data Analyst at Standard Chartered, you will be responsible for managing all aspects of new work/project assessments from inception to delivery. Your main tasks will include building close working relationships with the chief data office to design and implement specific data requirements based on the bank's data standards. Additionally, you will collaborate with project teams, partners, and third-party vendors.

Key Responsibilities:
- Work closely with business users to understand their data analysis needs and requirements. Build specifications that define the business value for data analysis, the approach for extracting data from golden sources, and business rules/logic.
- Align with the bank's Data Quality Management Framework to ensure data quality controls, governance, and compliance are in line with the bank's data strategy and architecture.
- Develop use cases for data analytics applications to meet various business needs. Build data models/scenarios to showcase potential insights to the business.
- Partner with system and data owners to document data standards for individual critical data attributes.
- Perform data quality controls assessment, identify gaps, and follow up with data owners for remediation.
- Conduct standard/advanced profiling of data attributes to assess data quality.
- Perform gap analysis against the established DQMF framework & guidelines to evaluate levels of adherence.
- Support the transition of data quality and data governance capabilities into Business as Usual (BAU).
- Develop a standard set of analytic tools to enable businesses to perform data analytics.
- Provide data readiness reviews before implementation and manage any arising issues related to data visualization. This role requires strong data storytelling skills using data visualization tools like MicroStrategy, Tableau, etc.

Qualifications Required:
- Strong query language skills including SQL, Hive, HBase, ETL (Dataiku).
- Proficiency in Business Intelligence tools and Decision Support Systems.
- Solid data analysis skills using Hive, Spark, Python, R, MicroStrategy, and Tableau.
- Experience in working with key stakeholders within the business.
- Proven problem-solving skills and experience in Data Management and Data Quality Management techniques.
- Stakeholder management and analysis abilities.
- Presentation skills for data storytelling using visualizations.
- Soft skills including communication, negotiation, relationship building, and influencing.

About Standard Chartered:
Standard Chartered is an international bank known for its positive impact on clients, communities, and employees for over 170 years. The bank values diversity, challenges the status quo, and strives for continuous improvement. If you are seeking a purpose-driven career in a bank that makes a difference, Standard Chartered welcomes you to join their team. The bank's core purpose is to drive commerce and prosperity through unique diversity, advocating for inclusion and embracing differences across teams and geographies.

What We Offer:
- Core bank funding for retirement savings, medical and life insurance.
- Flexible working options and patterns.
- Proactive well-being support and continuous learning opportunities.
- Inclusive and values-driven organizational culture celebrating diversity and respecting individual potential.

Posted 5 days ago

Apply

6.0 - 9.0 years

12 - 16 Lacs

bengaluru

Work from Office

About The Role
Project Role: Architecture Assessment Lead
Project Role Description: Leads the execution of architecture assessments for all relevant aspects (e.g., infrastructure, platform, application, data, and process). Defines the assessment scope and gains client agreement. Leads the discovery assessment and provides recommendations to address weaknesses and opportunities.
Must-have skills: DevOps
Good-to-have skills: NA
Minimum 7.5 years of experience is required
Educational Qualification: 15 years of full-time education

Summary: As an Architecture Assessment Lead, you will be responsible for leading the execution of architecture assessments across various domains, including infrastructure, platform, application, data, and process. Your typical day will involve defining the assessment scope, engaging with clients to gain agreement, conducting discovery assessments, and providing insightful recommendations to address identified weaknesses and opportunities within the architecture. You will collaborate with diverse teams to ensure a comprehensive evaluation and foster a culture of continuous improvement in architectural practices.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate workshops and discussions to gather insights and feedback from stakeholders.
- Develop and maintain documentation related to assessment findings and recommendations.

Professional & Technical Skills:
- Must-have skills: Proficiency in DevOps.
- Strong understanding of cloud computing platforms and services.
- Experience with continuous integration and continuous deployment practices.
- Familiarity with containerization technologies such as Docker and Kubernetes.
- Knowledge of infrastructure-as-code tools like Terraform or Ansible.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in DevOps.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Posted 5 days ago

Apply

7.0 - 12.0 years

0 - 2 Lacs

bengaluru

Remote

Job Role: Senior Data Engineer - IRIS
Duration: 6-month contract
Timing: Night shift (6pm to 3am)

Key Skills Required:
- Programming: Python, SQL, Spark
- Cloud Platforms: Azure, Snowflake
- Data Tools: DBT, Erwin Data Modeler, Apache Airflow, API integrations, ADF
- Governance: Data masking, metadata management, SOX compliance
- Soft Skills: Communication, problem-solving, stakeholder engagement

As an IRIS Data Engineer, you will work with Data Scientists and Data Architects to translate prototypes into scalable solutions.

Key Responsibilities:
1. Data Pipeline Design & Development
Data Engineers are responsible for designing and building robust, scalable, high-quality data pipelines that support analytics and reporting needs. This includes:
- Integration of structured and unstructured data from various sources into data lakes and warehouses.
- Building and maintaining scalable ETL/ELT pipelines for batch and streaming data using Azure Data Factory, Databricks, Snowflake, Azure SQL Server, and Control-M.
- Collaborating with data scientists, analysts, and platform engineers to enable analytics and ML use cases.
- Designing, developing, and optimising DBT models to support scalable data transformations.
2. Cloud Platform Engineering
Operationalize data solutions on cloud platforms, integrating services such as Azure, Snowflake, and third-party technologies. Manage environments, performance tuning, and configuration for cloud-native data solutions.
3. Data Modeling & Architecture
Apply dimensional modeling, star schemas, and data warehousing techniques to support business intelligence and machine learning workflows. Collaborate with solution architects and analysts to ensure models meet business needs.
4. Data Governance & Security
Ensure data integrity, privacy, and compliance through governance practices and secure schema design. Implement data masking, access controls, and metadata management for sensitive datasets.
5. Collaboration & Agile Delivery
Work closely with cross-functional teams, including product owners, architects, and business stakeholders, to translate requirements into technical solutions. Participate in Agile ceremonies, sprint planning, and DevOps practices for continuous integration and deployment.

Technical Skills:
- 7+ years of data engineering or design experience designing, developing, and deploying scalable enterprise data analytics solutions from source system through ingestion and reporting.
- 5+ years of experience in the ML lifecycle using Azure Kubernetes Service, Azure Container Instances, Azure Data Factory, Azure Monitor, and Azure Databricks, building datasets, ML pipelines, experiments, logging, and monitoring (including drift detection, model adaptation, and data collection).
- 5+ years of experience in data engineering using Snowflake.
- Experience in designing, developing, and scaling complex data & feature pipelines feeding ML models and evaluating their performance.
- Experience in building and managing streaming and batch inferencing.
- Proficiency in SQL and at least one other programming language (e.g., R, Python, C++, Minitab, SAS, Matlab, VBA; knowledge of optimization engines such as CPLEX or Gurobi is a plus).
- Strong experience with cloud platforms (AWS, Azure, etc.) and containerization technologies (Docker, Kubernetes).
- Experience with CI/CD tools such as GitHub Actions, GitLab, Jenkins, or similar.
- Familiarity with security best practices in DevOps and MLOps.
- Experience in developing and maintaining APIs (e.g., REST).
- Agile/Scrum operating experience using Azure DevOps.
- Experience with the Microsoft cloud ML stack: Azure Databricks, Data Factory, Synapse, among others.

Professional Skills:
- Strong analytical and problem-solving skills and a passion for product development.
- Strong understanding of Agile methodologies and openness to working in agile environments with multiple stakeholders.
- Professional attitude and service orientation; team player.
- Ability to translate business needs into potential analytics solutions.
- Strong work ethic: ability to work at an abstract level and gain consensus.
- Ability to build a sense of trust and rapport to create a comfortable and effective workplace.
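The data-masking responsibility this role lists can be illustrated with a minimal sketch. The field names and per-field rules below are hypothetical examples, not requirements from the posting:

```python
# Minimal data-masking sketch: redact, partially mask, or tokenize
# sensitive fields before exposing records to downstream consumers.
import hashlib

def mask_record(record, rules):
    """Apply per-field masking rules; unlisted fields pass through unchanged."""
    masked = {}
    for field, value in record.items():
        rule = rules.get(field)
        if rule == "hash":        # irreversible token, still usable for joins
            masked[field] = hashlib.sha256(str(value).encode()).hexdigest()[:12]
        elif rule == "partial":   # keep only the last 4 characters
            masked[field] = "*" * max(len(str(value)) - 4, 0) + str(value)[-4:]
        elif rule == "redact":    # drop the value entirely
            masked[field] = "[REDACTED]"
        else:
            masked[field] = value
    return masked

# Hypothetical sensitive record and masking policy.
rules = {"ssn": "redact", "card_number": "partial", "email": "hash"}
row = {"customer": "C001", "ssn": "123-45-6789",
       "card_number": "4111111111111111", "email": "a@example.com"}
safe = mask_record(row, rules)
```

In a real pipeline the same policy would typically live in metadata (e.g., column tags in Snowflake or Databricks) rather than a hard-coded dict, so governance teams can change it without touching pipeline code.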

Posted 5 days ago

Apply

12.0 - 18.0 years

0 Lacs

pune, maharashtra

On-site

As a Data Services & Integration Manager, you will be responsible for overseeing the technical delivery of data and relevant infrastructure, with a strong understanding of systems such as SAP, iBPM, Oracle, and Informatica. Your expertise in technology, frameworks, and accelerators such as ERwin, Sparx, Zachman, and industry data models will be essential. You should possess knowledge of Azure or AWS, and experience in catalogue & metadata management, data ownership, stewardship, and governance. Your main responsibilities will include developing a corporate data model to ensure data is treated as a reusable asset, driving consistency in data model, ownership, definition, and structure, and ensuring data connectivity across all layers. You will actively engage with data change projects to create a corporate view of data and collaborate with senior stakeholders to understand business requirements and drive execution. To qualify for this role, you should have 12 to 18 years of experience and either a Bachelor's Degree in Accounting, Finance, or Business, or a relevant data modelling certification. Additionally, possessing an architecture certification such as TOGAF would be beneficial. If you are passionate about data integration and have a strong background in data services, this role in Pune offers you the opportunity to make a significant impact. For more information on this exciting opportunity, please contact 85916 09735 or email priyanshu@mmcindia.biz.

Posted 6 days ago

Apply

10.0 - 15.0 years

30 - 45 Lacs

bengaluru

Hybrid

Kenvue is currently recruiting for: Data Modeler/Staff Data Engineer
This position reports into a Sr Manager / Director and is based in Bengaluru.

What You Will Do
The Data Modeler works closely with the Product Management group, internal stakeholders, and the Engineering group:
- Data Modelling: Creating data architectures and logical/physical data models for databases and data warehouses.
- Forge trusted partnerships and be an integral part of the Data Engineering, Data Architecture & Platforms, and Data Science teams to adopt, scale, and build data products.
- Data Warehouse Development & Administration: Designing, developing, and implementing a data warehouse to integrate data from various sources/systems within the organization.
- Developing strategies for data acquisition, archive recovery, and database implementation.
- Work closely with Business Analytics leaders to understand the needs of the business, clearly articulating the story of value created through data & technology.
- Drive prioritization and implementation of the most appropriate combination of data engineering methodologies & frameworks to ensure optimal scalability, flexibility, and performance of platforms, products & solutions.
- Be an expert and a thought leader on how data and technology come together to empower the business.
- Conducting requirements gathering and analysis to understand the domain of the software problem and/or functionality, the interfaces between hardware and software, and the overall software characteristics.

Key Responsibilities
- Perform enterprise-level and project-level data modelling, including model management, consolidation, and integration.
- Understand business requirements and translate them into Conceptual (CDM), Logical (LDM), and Physical (PDM) data models using industry standards.
- Manage data models for multiple projects, ensuring the models in all projects are synchronized and adhere to Enterprise Architecture with proper change management.
- Establish and manage standards for naming and abbreviation conventions, data definitions, ownership, documentation, procedures, and techniques.
- Adopt, support, and participate in the implementation of the Enterprise Data Management Strategy.
- Experience in Medallion (Lakehouse) Architecture.
- Ensure reusability of models and approaches across different business requirements.
- Support data-specific system integration and data migration.

What We Are Looking For
Required Qualifications
- Typically requires a minimum of 12 years of related experience with a Bachelor's degree in Computer or Software Engineering, or 10 years and a Master's degree in Computer or Software Engineering.
- Demonstrated strength in examining issues, driving resolution, and influencing effectively across the organization. Ability to challenge the status quo in technology & architecture.
- Superb interpersonal & communication skills (oral and written), including the ability to explain digital concepts and technologies to business leaders, as well as business concepts to technologists.
- 3+ years leading data engineering teams in Consumer/Healthcare Goods companies and an excellent understanding of business domains within the industry.
- 5-8 years of progressive experience with Data Modeling.
- Minimum of 3 years of hands-on experience in Cloud Architecture (Azure, GCP, AWS) & cloud-based databases (Synapse, Databricks, Snowflake, Redshift) and various data integration techniques (API, stream, file) using DBT, SQL/PySpark, and Python.
- Experience in ER/Studio or Erwin Data Modeler, and Power BI, Tableau, MicroStrategy, or an equivalent tool.
- Strong grounding in data warehousing concepts.
- Strong database development skills, such as complex SQL queries and stored procedures.

Desired Qualifications
- Conversant with Data & Analytics product management, Azure, SQL, and Data Catalogues.
- Experience with unstructured data processing and real-time data pipelines.
- Preferably from the Retail or CPG industry.

What's In It For You
- Competitive Benefit Package
- Paid Company Holidays, Paid Vacation, Volunteer Time, Summer Fridays & More!
- Learning & Development Opportunities
- Employee Resource Groups

Kenvue is proud to be an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, colour, religion, sex, sexual orientation, gender identity, age, national origin, or protected veteran status and will not be discriminated against on the basis of disability.
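The Medallion (Lakehouse) architecture mentioned above layers data as raw (bronze), cleaned (silver), and business-ready (gold). A minimal in-memory sketch of that flow, with hypothetical record shapes (real lakehouses persist each layer as tables, e.g. in Delta format):

```python
# Minimal Medallion-style flow: bronze (raw) -> silver (cleaned/typed)
# -> gold (business-level aggregate).
def to_silver(bronze_rows):
    """Clean raw rows: drop records missing keys, normalize types."""
    silver = []
    for r in bronze_rows:
        if not r.get("order_id") or r.get("amount") is None:
            continue  # quarantine incomplete records instead of loading them
        silver.append({
            "order_id": str(r["order_id"]).strip(),
            "region": str(r.get("region", "UNKNOWN")).upper(),
            "amount": float(r["amount"]),
        })
    return silver

def to_gold(silver_rows):
    """Aggregate cleaned rows into a reporting-ready view."""
    totals = {}
    for r in silver_rows:
        totals[r["region"]] = totals.get(r["region"], 0.0) + r["amount"]
    return totals

# Hypothetical raw landing data: mixed types, inconsistent casing, a bad row.
bronze = [
    {"order_id": " O1 ", "region": "south", "amount": "10.5"},
    {"order_id": "O2", "region": "South", "amount": 4.5},
    {"order_id": None, "region": "north", "amount": 99},  # dropped in silver
]
gold = to_gold(to_silver(bronze))
```

The point of the layering is that each stage has one job: bronze preserves the source faithfully, silver enforces types and keys, and gold serves the business question.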

Posted 6 days ago

Apply

5.0 - 9.0 years

20 - 25 Lacs

noida, chennai, bengaluru

Hybrid

Mandatory Skills and Qualifications
- Proven experience as a Data Modeler or in a similar role.
- Strong understanding of data modeling principles and methodologies.
- Proficiency in SQL and experience with relational databases (e.g., Oracle, SQL Server, MySQL).
- Familiarity with data modeling tools (e.g., ER/Studio, ERwin, PowerDesigner).
- Experience with NoSQL databases (e.g., MongoDB, Cassandra) and their data modeling techniques.
- Excellent analytical and problem-solving skills.
- Strong communication and collaboration skills.

Posted 6 days ago

Apply

3.0 - 6.0 years

7 - 11 Lacs

noida

Work from Office

Data Modeling - Azure, SAP
- Expertise in data modelling tools (ER/Studio, Erwin) and IDM/ARDM models across CPG, Manufacturing, Sales, Finance, Supplier, and Customer domains.
- Experience with at least one MPP database technology such as Databricks Lakehouse, Redshift, Synapse, Teradata, or Snowflake.
- Experience with version control systems like GitHub and deployment & CI tools.
- Experience with metadata management, data lineage, and data glossaries is a plus.
- Working knowledge of agile development, including DevOps and DataOps concepts.
- Working knowledge of SAP data models, particularly in the context of HANA and S/4HANA, and retail data such as IRI and Nielsen.

Posted 6 days ago

Apply

4.0 - 7.0 years

3 - 8 Lacs

bengaluru

Work from Office

Job Title: Techno-Functional Consultant / Data Modeler - BI
Location: Titan Corporate Limited
Experience: 4+ years

Job Description
We are looking for a Techno-Functional Consultant / Data Modeler - BI to design and build scalable data models, manage enterprise-wide data layers, and deliver high-performance data marts. The role involves working with large-scale business data, cross-functional teams, and modern BI/ETL tools to enable data-driven decision-making across Titan businesses.

Key Responsibilities
- Architect layered data models (LSA / Medallion) ensuring scalability, quality & performance.
- Design, optimize, and maintain data dictionaries, glossaries, and ERDs.
- Translate business requirements into technical specifications, KPIs & metrics.
- Lead and manage data projects in an Agile framework.
- Collaborate with BI consultants, analysts, IT, and governance teams.
- Define testing strategy; ensure data quality & integrity.
- Enable self-service BI and reduce dashboard/report development timelines.

Required Skills & Experience
- 4+ years of experience in Data Modeling, BI & Data Warehousing.
- Strong knowledge of SQL (advanced queries, stored procedures, tuning) and RDBMS.
- Hands-on experience with ETL tools (SAP Data Services / IBM DataStage / AWS Glue).
- Expertise in ER modeling (Conceptual, Logical, Physical) using tools like ERwin/ER Studio.
- Experience with reporting tools: Tableau / Power BI / AWS QuickSight.
- Domain expertise in Retail / Manufacturing / HR / Finance.
- Exposure to Salesforce Customer Data Cloud (preferred).
- Good to have: Predictive / Statistical Data Modelling.

Education
Post-Graduation / Engineering

Interested candidates, kindly share your details at amruthaj@titan.co.in

Posted 6 days ago

Apply

6.0 - 11.0 years

20 - 35 Lacs

hyderabad, chennai, bengaluru

Hybrid

Are you ready to make a difference in the Data space? Looking for immediate joiners - only candidates available to join in September 2025 are eligible to apply.

Job Title: Data Modeller & Architect
Location: Bengaluru, Chennai, Hyderabad

What do we expect?
- 6-12 years of experience in Data Modelling.
- Decent SQL knowledge; able to suggest modeling approaches for a given problem.
- Significant experience in one or more RDBMS (Oracle, DB2, SQL Server).
- Real-time experience working with OLAP & OLTP database models (dimensional models).
- Comprehensive understanding of Star schema, Snowflake schema, and Data Vault modelling, as well as any ETL tool, Data Governance, and Data Quality.
- An eye for analyzing data and comfort with agile methodology.
- A good understanding of any of the cloud services (Azure, AWS & GCP) is preferred.
- Enthusiasm to coach team members, collaborate with stakeholders across the organization, and take complete ownership of deliverables.
- Experience contributing to proposals and RFPs.
- Good experience in stakeholder management.
- Decent communication skills and experience leading a team.

Contact Amirtha (HR - Aram Hiring) - WhatsApp your resume to 8122080023 / amirtha@aramhiring.com

Who is our client: Our client is a global leader in AI and analytics, helping Fortune 1000 companies solve their toughest challenges. They offer full-stack AI and analytics services & solutions to empower businesses to achieve real outcomes and value at scale. They are on a mission to push the boundaries of what AI and analytics can do to help enterprises navigate uncertainty and move forward decisively. Their purpose is to provide certainty to shape a better tomorrow. Our client operates with 4,000+ technologists and consultants based in the US, Canada, the UK, India, Singapore, and Australia, working closely with clients across CPG, Retail, Insurance, BFS, Manufacturing, Life Sciences, and Healthcare. Many of their team leaders rank in Top 10 and 40 Under 40 lists, exemplifying their dedication to innovation and excellence. They are Great Place to Work-Certified (2022-24) and recognized by analyst firms such as Forrester, Gartner, HFS, Everest, ISG, and others. Our client has been ranked among the Best and Fastest Growing analytics firms by Inc., Financial Times, Economic Times, and Analytics India Magazine.

Curious about the role? What would your typical day look like?
As an Engineer and Architect, you will work to solve some of the most complex and captivating data management problems, enabling the client to operate as a data-driven organization, and seamlessly switch between the roles of individual contributor, team member, and Data Modeling Architect as demanded by each project to define, design, and deliver actionable insights.

On a typical day, you might:
- Engage with clients and understand the business requirements to translate them into data models.
- Analyze customer problems, propose solutions from a data-structure perspective, and estimate and deliver the proposed solutions.
- Create and maintain Logical Data Models (LDM) and Physical Data Models (PDM) by applying best practices to provide business insights.
- Use a data modelling tool to create appropriate data models.
- Create and maintain Source-to-Target data mapping documents that include documentation of all entities, attributes, data relationships, primary and foreign key structures, allowed values, codes, business rules, glossary terms, etc.
- Gather and publish data dictionaries.
- Ideate, design, and guide teams in building automations and accelerators.
- Maintain data models, as well as capture data models from existing databases and record descriptive information.
- Contribute to building data warehouses & data marts (on the cloud) while performing data profiling and quality analysis.
- Use version control to maintain versions of data models.
- Collaborate with Data Engineers to design and develop data extraction and integration code modules.
- Partner with data engineers & testing practitioners to strategize ingestion logic, consumption patterns & testing.
- Ideate on the design & development of a next-gen data platform by collaborating with cross-functional stakeholders.
- Work with the client to define, establish, and implement the right modelling approach as per the requirements.
- Help define standards and best practices.
- Monitor project progress to keep leadership teams informed on milestones, impediments, etc.
- Coach team members and review code artifacts.
- Contribute to proposals and RFPs.

External Skills And Expertise
You are important to us; let's stay connected! Every individual comes with a different set of skills and qualities, so even if you don't tick all the boxes for the role today, we urge you to apply, as there might be a suitable/unique role for you tomorrow. We are an equal-opportunity employer. Our diverse and inclusive culture and values guide us to listen, trust, respect, and encourage people to grow the way they desire.
Note: The designation will be commensurate with expertise and experience. Compensation packages are among the best in the industry.
Kindly share your resume at amirtha@aramhiring.com / 8122080023
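A Source-to-Target mapping document like the one described above can also be represented and validated programmatically. The sketch below uses hypothetical source systems, tables, and rules (none of these names come from the posting) to show the shape of such a document and a basic consistency check:

```python
# Minimal Source-to-Target mapping sketch: each entry records where an
# attribute comes from, where it lands, and the transformation rule.
mapping = [
    {"source": "crm.cust.CUST_NM", "target": "dw.dim_customer.customer_name",
     "rule": "trim + title-case", "pk": False},
    {"source": "crm.cust.CUST_ID", "target": "dw.dim_customer.customer_id",
     "rule": "direct move", "pk": True},
    {"source": "erp.ord.ORD_AMT", "target": "dw.fact_orders.order_amount",
     "rule": "cast to DECIMAL(18,2)", "pk": False},
]

def validate_mapping(entries):
    """Basic checks: no duplicate targets, every entry has a rule,
    and flag target tables that have no primary-key mapping."""
    targets = [e["target"] for e in entries]
    assert len(targets) == len(set(targets)), "duplicate target column"
    assert all(e["rule"] for e in entries), "missing transformation rule"
    tables = {t.rsplit(".", 1)[0] for t in targets}
    pk_tables = {e["target"].rsplit(".", 1)[0] for e in entries if e["pk"]}
    return {"tables": sorted(tables),
            "tables_missing_pk": sorted(tables - pk_tables)}

summary = validate_mapping(mapping)
```

Keeping the mapping in a machine-readable form like this is what makes the automation and accelerator work the posting mentions possible: the same document can drive ETL code generation, lineage publication, and data-dictionary exports.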

Posted 6 days ago

Apply

12.0 - 16.0 years

35 - 45 Lacs

hyderabad, chennai, bengaluru

Hybrid

Role - Data Architect - Data Modeling
Exp - 12-16 Yrs
Locs - Chennai, Hyderabad, Bengaluru, Delhi, Pune
Position - Permanent FTE
Client - Data Analytics Global Leader

Must-have skills:
- Strong SQL
- Strong Data Warehousing skills
- ER/Relational/Dimensional Data Modeling
- Data Vault Modeling
- OLAP, OLTP
- Schemas & Data Marts

Good-to-have skills:
- ERwin / ER Studio
- Cloud Platforms (AWS or Azure)
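The star schema listed above separates descriptive dimensions from measurable facts, with the fact table holding foreign keys into each dimension. A minimal sketch using SQLite follows; the table and column names are illustrative only:

```python
# Minimal star schema sketch: one fact table keyed to two dimension
# tables, queried with a typical dimensional rollup.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, category TEXT);
CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, year INTEGER);
CREATE TABLE fact_sales  (
    product_key INTEGER REFERENCES dim_product(product_key),
    date_key    INTEGER REFERENCES dim_date(date_key),
    amount      REAL
);
INSERT INTO dim_product VALUES (1, 'Watches'), (2, 'Eyewear');
INSERT INTO dim_date VALUES (20240101, 2024), (20250101, 2025);
INSERT INTO fact_sales VALUES (1, 20240101, 100.0), (1, 20250101, 150.0),
                              (2, 20250101, 50.0);
""")

# Dimensional query: sales by category for one year.
cur.execute("""
    SELECT p.category, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_product p ON p.product_key = f.product_key
    JOIN dim_date d    ON d.date_key    = f.date_key
    WHERE d.year = 2025
    GROUP BY p.category
    ORDER BY p.category
""")
rows = cur.fetchall()
```

A snowflake schema would further normalize the dimensions (e.g., category into its own table), while Data Vault would instead model hubs, links, and satellites; the fact/dimension separation shown here is the common starting point.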

Posted 6 days ago

Apply