
233 Erwin Jobs - Page 4

Set up a Job Alert
JobPe aggregates results for easy access, but you apply directly on the original job portal.

10.0 - 15.0 years

7 - 12 Lacs

Bengaluru

Work from Office

As a Senior z/OS System Programmer / Lead Development Engineer, you will develop automation solutions to provision and manage infrastructure across your organization. As a developer, you will leverage Terraform and cloud capabilities to drive infrastructure-as-code for the IBM z/OS platform. You will work closely with frontend engineers as part of a full-stack team; collaborate with Product, Design, and other cross-functional partners to deliver high-quality solutions; maintain high standards of software quality within the team by establishing good practices and habits; and focus on growing capabilities to support and enhance the experience of the offering.

Required education: Bachelor's degree.

Required technical and professional expertise:
* 10+ years of software development experience with z/OS or z/OS subsystems.
* 8+ years of professional experience developing with Golang, Python, and Ruby.
* Hands-on z/OS system programming or administration experience.
* Experience with key Terraform features such as infrastructure as code, change automation, and auto scaling.
* Experience working with a cloud provider such as AWS, Azure, or GCP, with a focus on scalability, resilience, and security.
* Cloud-native mindset and solid understanding of DevOps principles in a cloud environment.
* Familiarity with cloud monitoring tools to implement robust observability practices that prioritize metrics, logging, and tracing for high reliability and performance.
* Extensive experience with cloud computing platforms (AWS, Azure, GCP) and infrastructure as code (Terraform).
* Strong interest in customer-focused work, with experience collaborating with Design and Product Management functions to deliver impactful solutions.
* Demonstrated ability to tackle complex technical challenges and deliver innovative solutions.
* Excellent communication and collaboration skills, with a focus on customer satisfaction and team success.
* Strong analytical, debugging, and problem-solving skills to analyze issues and defects reported by customer-facing and test teams.
* Proficiency with source control management tools (e.g., GitHub) and Agile lifecycle management tools.
* Soft skills: strong communication, collaboration, self-organization, self-study, and the ability to accept and respond constructively to critical feedback.

Preferred technical and professional experience

Posted 3 weeks ago

Apply

6.0 - 11.0 years

15 - 30 Lacs

Hyderabad, Pune, Delhi / NCR

Work from Office

About the company: Headquartered in California, U.S.A., GSPANN provides consulting and IT services to global clients. We help clients transform how they deliver business value by optimizing their IT capabilities, practices, and operations, drawing on our experience in retail, high technology, and manufacturing. With five global delivery centers and 1,900+ employees, we provide the intimacy of a boutique consultancy with the capabilities of a large IT services firm.

Role: Data Modeler
Experience: 7+ years
Skill set: Data modeling and SQL
Location: Pune, Hyderabad, Gurgaon

Position in brief: This is primarily a technical role with some functional knowledge expected:
* At least 5 years of hands-on data modeling (conceptual, logical, and physical), data profiling, and data analysis skills.
* SQL: basic to intermediate level; the ability to write complex SQL queries is an added advantage.
* ETL: should understand how the ETL process works and be able to provide ETL attributes and partition-related information as part of the data mapping document.
* Experience with any modeling tool is acceptable: ER/Studio, ERwin, Sybase PowerDesigner.

Detailed job description: We are looking for a passionate Data Analyst / Data Modeler to build, optimize, and maintain conceptual, logical, and physical database models. The candidate will turn data into information, information into insight, and insight into business decisions.

Responsibilities:
* Gather requirements from the business team and translate them into technical requirements.
* Drive projects and provide guidance wherever needed.
* Work with business and application/solution teams to implement data strategies, build data flows, and develop conceptual, logical, and physical data models.
* Work both independently and collaboratively.
* Work with management to prioritize business and information needs.

Requirements:
* Bachelor's or master's degree in computer/data science, or related technical experience.
* 5+ years of hands-on relational, dimensional, and/or analytic experience (using RDBMS, dimensional, and NoSQL/big-data platform technologies, and ETL and data ingestion protocols).
* Proven working experience as a data analyst/data modeler or in a similar role.
* Technical expertise in designing data models, database design, and data analysis.
* Prior experience with the migration of data from legacy systems to new solutions.
* Good knowledge of metadata management, data modeling, and related tools (ERwin, ER/Studio, or others) is required.
* Experience gathering and analyzing system/business requirements and providing mapping documents for technical teams.
* Strong analytical skills, with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy.
* Hands-on experience with SQL.
* Problem-solving attitude.
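The progression this listing describes, from a logical model (entities and relationships) to a physical model (tables, keys, constraints), can be sketched in plain SQL DDL. A minimal illustration using Python's built-in sqlite3; the entity names (`customer`, `orders`) and columns are invented for the example, not taken from the posting.

```python
import sqlite3

# Hypothetical logical relationship "Customer places Order", realized as a
# physical model: surrogate keys, a foreign key, and integrity constraints.
ddl = """
CREATE TABLE customer (
    customer_id INTEGER PRIMARY KEY,   -- surrogate key
    full_name   TEXT NOT NULL,
    email       TEXT UNIQUE
);
CREATE TABLE orders (
    order_id    INTEGER PRIMARY KEY,
    customer_id INTEGER NOT NULL REFERENCES customer(customer_id),
    order_date  TEXT NOT NULL,
    amount      REAL NOT NULL CHECK (amount >= 0)
);
"""

conn = sqlite3.connect(":memory:")
conn.executescript(ddl)

# A data mapping document of the kind mentioned above would record, per
# column, its source field, data type, and any partition-related attributes.
columns = [row[1] for row in conn.execute("PRAGMA table_info(orders)")]
print(columns)  # ['order_id', 'customer_id', 'order_date', 'amount']
```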

Posted 3 weeks ago

Apply

6.0 - 8.0 years

25 - 30 Lacs

Bengaluru

Work from Office

Role & responsibilities

Mandatory skills: Python, AWS, data modeling, SQL; DevOps is good to have (not mandatory). Please avoid candidates who qualified from universities in Hyderabad and Telangana; Hyderabad and Telangana candidates may be considered only if they are working in tier-one companies.

Job description: The ideal candidate will have 6 to 8 years of experience in data modeling and architecture, with deep expertise in Python, the AWS cloud stack, data warehousing, and enterprise data modeling tools. This individual will be responsible for designing and creating enterprise-grade data models and driving the implementation of a Layered Scalable Architecture or Medallion Architecture to support robust, scalable, and high-quality data marts across multiple business units. The role involves managing complex datasets from systems such as PoS, ERP, CRM, and external sources while optimizing performance and cost. You will also provide strategic leadership on data modeling standards, governance, and best practices, ensuring the foundation for analytics and reporting is solid and future-ready.

Key responsibilities:
* Design and deliver conceptual, logical, and physical data models using tools like ERwin.
* Implement a Layered Scalable Architecture / Medallion Architecture for building scalable, standardized data marts.
* Optimize performance and cost of AWS-based data infrastructure (Redshift, S3, Glue, Lambda, etc.).
* Collaborate with cross-functional teams (IT, business, analysts) to gather data requirements and ensure model alignment with KPIs and business logic.
* Develop and optimize SQL code, materialized views, and stored procedures in AWS Redshift.
* Ensure data governance, lineage, and quality mechanisms are established across systems.
* Lead and mentor technical teams in an Agile project delivery model.
* Manage data layer creation and documentation: data dictionary, ER diagrams, purpose mapping.
* Identify data gaps and availability issues with respect to source systems.

Required skills & qualifications:
* Bachelor's or master's degree in Computer Science, IT, or a related field (B.E./B.Tech/M.E./M.Tech/MCA).
* Minimum 8 years of experience in data modeling and architecture.
* Proficiency with data modeling tools such as ERwin, with strong knowledge of forward and reverse engineering.
* Deep expertise in SQL (including advanced SQL, stored procedures, and performance tuning).
* Strong experience in Python, data warehousing, RDBMS, and ETL tools such as AWS Glue, IBM DataStage, or SAP Data Services.
* Hands-on experience with AWS services: Redshift, S3, Glue, RDS, Lambda, Bedrock, and Q.
* Good understanding of reporting tools such as Tableau, Power BI, or AWS QuickSight.
* Exposure to DevOps/CI-CD pipelines, AI/ML, generative AI, NLP, and polyglot programming is a plus.
* Familiarity with data governance tools (e.g., ORION/EIIG).
* Domain knowledge in retail, manufacturing, HR, or finance preferred.
* Excellent written and verbal communication skills.

Certifications (preferred):
* AWS certification (e.g., AWS Certified Solutions Architect or Data Analytics Specialty).
* Data governance or data modeling certifications (e.g., CDMP, Databricks, or TOGAF).

Mandatory skills: Python, AWS, technical architecture, AI/ML, SQL, data warehousing, data modeling.

Preferred candidate profile: Share resumes at Sakunthalaa@valorcrest.in
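The Medallion (bronze/silver/gold) layering named above can be sketched in a tool-agnostic way: raw data lands untouched in bronze, is cleansed and conformed in silver, and is aggregated into business-ready marts in gold. A minimal pure-Python stand-in for what would run on Redshift/Glue; the record fields and business units are invented for illustration.

```python
# Bronze: raw ingested records, kept as-is (note the duplicate and the null).
bronze = [
    {"order_id": 1, "unit": "retail", "amount": 120.0},
    {"order_id": 1, "unit": "retail", "amount": 120.0},   # duplicate
    {"order_id": 2, "unit": "online", "amount": None},    # bad record
    {"order_id": 3, "unit": "online", "amount": 75.5},
]

# Silver: cleansed and conformed -- deduplicate on the key, drop bad records.
seen, silver = set(), []
for rec in bronze:
    if rec["amount"] is not None and rec["order_id"] not in seen:
        seen.add(rec["order_id"])
        silver.append(rec)

# Gold: business-level aggregate, the shape a data mart would expose.
gold = {}
for rec in silver:
    gold[rec["unit"]] = gold.get(rec["unit"], 0.0) + rec["amount"]

print(gold)  # {'retail': 120.0, 'online': 75.5}
```

Each layer is persisted separately in a real implementation, so downstream marts can be rebuilt from silver without re-ingesting raw data.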

Posted 3 weeks ago

Apply

8.0 - 12.0 years

37 - 45 Lacs

Noida, Hyderabad

Work from Office

Position summary: MetLife established a Global Capability Center (MGCC) in India to scale and mature Data & Analytics and technology capabilities in a cost-effective manner and make MetLife future-ready. The center is integral to Global Technology and Operations, with a focus on protecting and building MetLife IP, promoting reusability, and driving experimentation and innovation. The Data & Analytics team in India mirrors the global D&A team, with an objective to drive business value through trusted data, scaled capabilities, and actionable insights. The operating model consists of business-aligned data officers (US, Japan, LatAm, and corporate functions) enabled by enterprise COEs: data engineering, data governance, and data science.

Role value proposition: The business analyst / data modeler has an important role in the data and analytics (D&A) organization. The role ensures data is structured, organized, and represented effectively, aligned to the needs of the organization, and helps design logical and physical models, including implementation of robust data models that accurately capture, store, and manage data end to end.

Job responsibilities:
* Perform data modeling (logical, physical) using the CA Erwin Data Modeler tool.
* Gather, understand, and analyze business requirements accurately.
* Analyze data using SQL.
* Partner with other teams to understand data needs and translate them into effective data models.
* Collaborate with stakeholders to provide domain-based solutions.
* Apply industry data modeling standards, best practices, and emerging technologies in data modeling.
* Hands-on experience with API development and integration (REST, SOAP, JSON, XML).

Education, technical skills & other critical requirements:
1. 8-10 years of overall experience, with a minimum of 5+ years in data modeling, business and functional requirements gathering, designing data strategies and data flows, data/ETL/BI architecture, and delivering logical/physical data models in collaboration with business/application teams and BSAs.
2. 3+ years of hands-on experience in CA Erwin.
3. 5+ years of experience in SQL, data modeling, and data analysis.
4. Ability to communicate effectively.
5. Strong SQL skills.
6. Hands-on experience with the Erwin tool.
7. Familiarity with Agile best practices.
8. Strong collaboration and facilitation skills.

Posted 3 weeks ago

Apply

15.0 - 20.0 years

10 - 14 Lacs

Gurugram

Work from Office

Project role: Application Lead
Project role description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-have skills: Data Analytics
Good-to-have skills: Microsoft SQL Server, Python (programming language), AWS Redshift
Minimum 5 years of experience is required.
Educational qualification: 15 years of full-time education

Summary: As a Senior Analyst, Data Engineering, you will be part of the Data and Analytics team, responsible for developing and delivering high-quality data assets and managing data domains for Personal Banking customers and colleagues. You will bring expertise in data handling, curation, and conformity, and support the design and development of data solutions that drive business value. You will work in an agile environment to build scalable and reliable data pipelines and platforms within a complex enterprise.

Roles & responsibilities:
* Hands-on development experience in data warehousing and/or software development.
* Utilize tools and best practices to build, verify, and deploy data solutions efficiently.
* Perform data integration and sourcing activities across various platforms.
* Develop data assets to support optimized analysis for customer and regulatory outcomes.
* Provide ongoing support for data platforms, including problem and incident management.
* Collaborate in Agile software development environments using tools like GitHub, Confluence, and Rally.
* Support continuous improvement and innovation in data engineering practices.

Professional & technical skills:
* Must have: experience with cloud technologies, especially AWS (S3, Redshift, Airflow).
* Proficiency in DevOps and DataOps tools such as Jenkins, Git, and Erwin.
* Advanced skills in SQL and Python.
* Working knowledge of UNIX, Spark, and Databricks.

Additional information:
Position: Senior Analyst, Data Engineering
Reports to: Manager, Data Engineering
Division: Personal Bank
Group: 3
Industry/domain skills: Experience in Retail Banking, Business Banking, or Wealth Management preferred
Qualification: 15 years of full-time education

Posted 3 weeks ago

Apply

15.0 - 20.0 years

10 - 14 Lacs

Gurugram

Work from Office

Project role: Application Lead
Project role description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-have skills: Data Engineering
Good-to-have skills: Microsoft SQL Server, Python (programming language), Snowflake Data Warehouse
Minimum 5 years of experience is required.
Educational qualification: 15 years of full-time education

Summary: As a Senior Analyst, Data Engineering, you will be part of the Data and Analytics team, responsible for developing and delivering high-quality data assets and managing data domains for Personal Banking customers and colleagues. You will bring expertise in data handling, curation, and conformity, and support the design and development of data solutions that drive business value. You will work in an agile environment to build scalable and reliable data pipelines and platforms within a complex enterprise.

Roles & responsibilities:
* Hands-on development experience in data warehousing and/or software development.
* Utilize tools and best practices to build, verify, and deploy data solutions efficiently.
* Perform data integration and sourcing activities across various platforms.
* Develop data assets to support optimized analysis for customer and regulatory outcomes.
* Provide ongoing support for data platforms, including problem and incident management.
* Collaborate in Agile software development environments using tools like GitHub, Confluence, and Rally.
* Support continuous improvement and innovation in data engineering practices.

Professional & technical skills:
* Must have: experience with cloud technologies, especially AWS (S3, Redshift, Airflow).
* Proficiency in DevOps and DataOps tools such as Jenkins, Git, and Erwin.
* Advanced skills in SQL and Python.
* Working knowledge of UNIX, Spark, and Databricks.

Additional information:
Position: Senior Analyst, Data Engineering
Reports to: Manager, Data Engineering
Division: Personal Bank
Group: 3
Industry/domain skills: Experience in Retail Banking, Business Banking, or Wealth Management preferred
Qualification: 15 years of full-time education

Posted 3 weeks ago

Apply

15.0 - 20.0 years

10 - 14 Lacs

Gurugram

Work from Office

Project role: Application Lead
Project role description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-have skills: Data Visualization
Good-to-have skills: Microsoft SQL Server, SAS BI, Microsoft Power BI
Minimum 3 years of experience is required.
Educational qualification: 15 years of full-time education

Summary: As a Senior Analyst, Data Engineering, you will be part of the Data and Analytics team, responsible for developing and delivering high-quality data assets and managing data domains for Personal Banking customers and colleagues. You will bring expertise in data handling, curation, and conformity, and support the design and development of data solutions that drive business value. You will work in an agile environment to build scalable and reliable data pipelines and platforms within a complex enterprise.

Roles & responsibilities:
* Hands-on development experience in data warehousing and/or software development.
* Utilize tools and best practices to build, verify, and deploy data solutions efficiently.
* Perform data integration and sourcing activities across various platforms.
* Develop data assets to support optimized analysis for customer and regulatory outcomes.
* Provide ongoing support for data platforms, including problem and incident management.
* Collaborate in Agile software development environments using tools like GitHub, Confluence, and Rally.
* Support continuous improvement and innovation in data engineering practices.

Professional & technical skills:
* Must have: experience with cloud technologies, especially AWS (S3, Redshift, Airflow).
* Proficiency in DevOps and DataOps tools such as Jenkins, Git, and Erwin.
* Advanced skills in SQL and Python.
* Working knowledge of UNIX, Spark, and Databricks.

Additional information:
Position: Senior Analyst, Data Engineering
Reports to: Manager, Data Engineering
Division: Personal Bank
Group: 3
Industry/domain skills: Experience in Retail Banking, Business Banking, or Wealth Management preferred
Qualification: 15 years of full-time education

Posted 3 weeks ago

Apply

15.0 - 20.0 years

10 - 14 Lacs

Gurugram

Work from Office

Project role: Application Lead
Project role description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-have skills: Data Analytics
Good-to-have skills: Microsoft SQL Server, Python (programming language), AWS Redshift
Minimum 5 years of experience is required.
Educational qualification: 15 years of full-time education

Summary: The purpose of the Data Engineering function within the Data and Analytics team is to develop and deliver great data assets and data domain management for our Personal Banking customers and colleagues, seamlessly and reliably every time. As a Senior Data Engineer, you will bring expertise in data handling, curation, and conformity to the team; support the design and development of solutions that assist analysis of data to drive tangible business benefit; and assist colleagues in developing solutions that enable the capture and curation of data for analysis, analytical, and/or reporting purposes. The Senior Data Engineer must have experience working as part of an agile team to develop a solution in a complex enterprise.

Roles & responsibilities:
* Hands-on development experience in data warehousing and/or software development.
* Experience using tools and practices to build, verify, and deploy solutions in the most efficient ways.
* Experience in data integration and data sourcing activities.
* Experience developing data assets to support optimized analysis for customer and regulatory outcomes.
* Provide ongoing support for platforms as required, e.g., problem and incident management.
* Experience in Agile software development, including GitHub, Confluence, and Rally.

Professional & technical skills:
* Experience with cloud technologies, especially AWS (S3, Redshift, Airflow), and DevOps and DataOps tools (Jenkins, Git, Erwin).
* Advanced SQL and Python user.
* Knowledge of UNIX, Spark, and Databricks.

Additional information:
Position: Senior Analyst, Data Engineering
Reports to: Manager, Data Engineering
Division: Personal Bank
Group: 3
Industry/domain skills: Some expertise in Retail Banking, Business Banking, and/or Wealth Management preferred
Qualification: 15 years of full-time education

Posted 3 weeks ago

Apply

10.0 - 15.0 years

15 - 30 Lacs

Hyderabad

Work from Office

Job summary: We are seeking an experienced and detail-oriented Lead Data Modeler to design, implement, and manage robust data models that support our enterprise data warehouse and analytics ecosystem. The ideal candidate will have deep expertise in data analysis, dimensional modeling, and database design, with strong hands-on experience using tools like ERwin to create scalable, optimized, and maintainable data structures.

Key responsibilities:
* Analyze business requirements and translate them into logical and physical data models.
* Design and develop enterprise-level conceptual, logical, and physical data models for OLTP and OLAP systems.
* Build and maintain dimensional models (star/snowflake schemas) to support business intelligence and reporting needs.
* Lead the data modeling and data architecture efforts for large-scale data warehouse and data integration projects.
* Define data standards, naming conventions, metadata, and data lineage documentation.
* Work closely with business analysts, data engineers, and application developers to ensure alignment of data structures with business needs.
* Collaborate with DBAs and ETL developers to implement models in physical databases.
* Use ERwin (or similar data modeling tools) to create and manage models and reverse-engineer existing structures.
* Ensure data models are optimized for performance, scalability, and data integrity.
* Participate in data governance initiatives and data quality improvement projects.

Required skills & qualifications:
* 8-12 years of experience in data modeling, data architecture, and database design.
* Strong experience with dimensional modeling, data warehouse architecture, and data mart development.
* Strong proficiency in ERwin Data Modeler (or equivalent tools such as PowerDesigner).
* Solid understanding of relational databases (e.g., SQL Server, Oracle, PostgreSQL) and data warehouse platforms.
* Strong SQL skills and the ability to analyze and understand complex data sets.
* Experience with data integration, ETL, and data governance principles.
* Ability to manage multiple priorities in a fast-paced environment and work collaboratively across teams.
* Strong communication and documentation skills.

Preferred qualifications:
* Experience with cloud-based data platforms (e.g., Azure Synapse, Snowflake, AWS Redshift).
* Familiarity with Data Vault modeling or NoSQL modeling is a plus.
* Experience in Agile/Scrum environments and with version control tools (e.g., Git).
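The star schema mentioned above centers a fact table on foreign keys into surrounding dimension tables. A minimal sketch using Python's built-in sqlite3; the table names (`fact_sales`, `dim_product`, `dim_date`) and sample rows are invented for illustration.

```python
import sqlite3

# Physical star schema: one fact table referencing two dimensions.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, iso_date TEXT);
CREATE TABLE fact_sales (
    product_key INTEGER REFERENCES dim_product(product_key),
    date_key    INTEGER REFERENCES dim_date(date_key),
    qty         INTEGER,
    revenue     REAL
);
INSERT INTO dim_product VALUES (1, 'widget'), (2, 'gadget');
INSERT INTO dim_date    VALUES (20240101, '2024-01-01');
INSERT INTO fact_sales  VALUES (1, 20240101, 3, 30.0), (2, 20240101, 1, 45.0);
""")

# A typical BI query: join the fact to a dimension and aggregate.
rows = conn.execute("""
    SELECT p.name, SUM(f.revenue)
    FROM fact_sales f JOIN dim_product p USING (product_key)
    GROUP BY p.name ORDER BY p.name
""").fetchall()
print(rows)  # [('gadget', 45.0), ('widget', 30.0)]
```

A snowflake schema differs only in that the dimensions themselves are normalized into further tables.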

Posted 3 weeks ago

Apply

5.0 - 9.0 years

20 - 30 Lacs

Pune, Bengaluru

Hybrid

Job role & responsibilities:
* Understand operational needs by collaborating with specialized teams and supporting key business operations. This involves supporting architecture design and improvements, ensuring data integrity, building data models, and designing and implementing agile, scalable, and cost-efficient solutions.
* Lead a team of developers; run sprint planning and execution to ensure timely deliveries.

Technical skills, qualification, and experience required:
* Proficient in data modeling, with 5-10 years of experience.
* Experience with data modeling tools (ERwin) and building ER diagrams.
* Hands-on experience with the ERwin / Visio tools.
* Hands-on expertise in entity-relationship, dimensional, and NoSQL modeling.
* Familiarity with manipulating datasets using Python.
* Exposure to Azure cloud services (Azure Data Factory, Azure DevOps, and Databricks).
* Exposure to UML tools like ERwin/Visio.
* Familiarity with tools such as Azure DevOps, Jira, and GitHub.
* Analytical approaches using IE or other common notations.
* Strong hands-on experience in SQL scripting.
* Bachelor's/Master's degree in Computer Science or a related field.
* Experience leading agile scrum, sprint planning, and review sessions.
* Good communication and interpersonal skills; able to coordinate between business stakeholders and engineers.
* Strong results orientation and time management.
* True team player who is comfortable working in a global team.
* Ability to establish relationships with stakeholders quickly in order to collaborate on use cases.
* Autonomy, curiosity, and innovation capability.
* Comfortable working in a multidisciplinary team within a fast-paced environment.
* Immediate joiners will be preferred; outstation candidates will not be considered.

Posted 4 weeks ago

Apply

0.0 years

4 - 9 Lacs

Bengaluru

Work from Office

Grade-specific Ab Initio skills: graph development, Ab Initio standard environment parameters, GDE (PDL, MFS concepts), EME basics, SDLC, data analysis.

Database: SQL proficient; expert with DB load/unload utilities; relevant experience in Oracle, DB2, Teradata (preferred).

UNIX: shell scripting (must); Unix utilities such as sed, awk, perl, python; scheduling knowledge (Control-M, Autosys, Maestro, TWS, ESP).

Project profile: at least 2-3 source systems, multiple targets, simple business transformations with daily and monthly schedules. Expected to produce LLDs, work with testers, work with the PMO, develop graphs and schedules, and provide third-level support.

* Should have hands-on development experience with various Ab Initio components such as Rollup, Scan, Join, Partition by Key, Partition by Round-robin, Gather, Merge, Interleave, Lookup, etc.
* Experience in finance and ideally capital markets products.
* Requires experience in the development and support of complex frameworks to handle multiple data ingestion patterns, e.g., messaging files and hierarchical polymorphic XML structures; conformance of data to a canonical model; curation and distribution of data; QA resource.
* Data modeling experience creating CDMs, LDMs, and PDMs using tools like ERwin, PowerDesigner, or MagicDraw.
* Detailed knowledge of the capital markets, including derivatives products (IRS, CDS, options), structured products, and fixed-income products.
* Knowledge of Jenkins and CI/CD concepts.
* Knowledge of scheduling tools like Autosys and Control Center.
* Demonstrated understanding of how Ab Initio applications and systems interact with the underlying hardware ecosystem.
* Experience working in an agile project development lifecycle.
* Strong in-depth knowledge of databases and database concepts; DB2 knowledge is a plus.

Primary skills: Ab Initio graphs.

Works in the area of software engineering, which encompasses the development, maintenance, and optimization of software solutions/applications:
1. Applies scientific methods to analyze and solve software engineering problems.
2. Is responsible for the development and application of software engineering practice and knowledge in research, design, development, and maintenance.
3. The work requires the exercise of original thought and judgement and the ability to supervise the technical and administrative work of other software engineers.
4. Builds skills and expertise in the software engineering discipline to reach the standard skill expectations for the applicable role, as defined in Professional Communities.
5. Collaborates and acts as a team player with other software engineers and stakeholders.

Skills (competencies): verbal communication, Ab Initio, SQL, Teradata, shell script, Jenkins, database platform, Oracle.
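Two of the Ab Initio components named above have straightforward data-processing analogues: Rollup aggregates records per key, and Scan emits a running (cumulative) value per record. A rough Python analogy (not Ab Initio code); the sample account records are invented.

```python
from itertools import accumulate, groupby

# Input records sorted by key, as a partition-by-key step would arrange them.
records = [("acct_a", 10), ("acct_a", 5), ("acct_b", 7)]

# Rollup analogue: one output record per key, carrying an aggregate.
rollup = {key: sum(v for _, v in grp)
          for key, grp in groupby(records, key=lambda r: r[0])}

# Scan analogue: one output per input record, with the cumulative total so far.
scan = list(accumulate(v for _, v in records))

print(rollup)  # {'acct_a': 15, 'acct_b': 7}
print(scan)    # [10, 15, 22]
```

Note that `groupby` only groups adjacent equal keys, which is why the input must already be partitioned/sorted by key, mirroring the Partition-by-Key step that typically precedes a Rollup.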

Posted 1 month ago

Apply

5.0 - 9.0 years

6 - 10 Lacs

Pune

Work from Office

Ab Initio skills: graph development, Ab Initio standard environment parameters, GDE (PDL, MFS concepts), EME basics, SDLC, data analysis.

Database: SQL proficient; expert with DB load/unload utilities; relevant experience in Oracle, DB2, Teradata (preferred).

UNIX: shell scripting (must); Unix utilities such as sed, awk, perl, python; scheduling knowledge (Control-M, Autosys, Maestro, TWS, ESP).

Project profile: at least 2-3 source systems, multiple targets, simple business transformations with daily and monthly schedules. Expected to produce LLDs, work with testers, work with the PMO, develop graphs and schedules, and provide third-level support.

* Should have hands-on development experience with various Ab Initio components such as Rollup, Scan, Join, Partition by Key, Partition by Round-robin, Gather, Merge, Interleave, Lookup, etc.
* Experience in finance and ideally capital markets products.
* Requires experience in the development and support of complex frameworks to handle multiple data ingestion patterns, e.g., messaging files and hierarchical polymorphic XML structures; conformance of data to a canonical model; curation and distribution of data; QA resource.
* Data modeling experience creating CDMs, LDMs, and PDMs using tools like ERwin, PowerDesigner, or MagicDraw.
* Detailed knowledge of the capital markets, including derivatives products (IRS, CDS, options), structured products, and fixed-income products.
* Knowledge of Jenkins and CI/CD concepts.
* Knowledge of scheduling tools like Autosys and Control Center.
* Demonstrated understanding of how Ab Initio applications and systems interact with the underlying hardware ecosystem.
* Experience working in an agile project development lifecycle.
* Strong in-depth knowledge of databases and database concepts; DB2 knowledge is a plus.

Works in the area of software engineering, which encompasses the development, maintenance, and optimization of software solutions/applications:
1. Applies scientific methods to analyze and solve software engineering problems.
2. Is responsible for the development and application of software engineering practice and knowledge in research, design, development, and maintenance.
3. The work requires the exercise of original thought and judgement and the ability to supervise the technical and administrative work of other software engineers.
4. Builds skills and expertise in the software engineering discipline to reach the standard skill expectations for the applicable role, as defined in Professional Communities.
5. Collaborates and acts as a team player with other software engineers and stakeholders.

Grade-specific primary skills: Ab Initio graphs.
Skills (competencies): verbal communication.

Posted 1 month ago

Apply

6.0 - 11.0 years

15 - 20 Lacs

Pune, Bengaluru

Hybrid

Knowledge/experience:
* Proven knowledge of physical and logical data modeling in a data warehouse environment, including the successful creation of conformed dimensional models from a range of legacy source systems alongside modern SaaS/cloud business applications.
* Experience in a similar role within insurance (ideally health insurance) or a similarly complex and regulated industry, able to demonstrate a sound working business knowledge of its operation.
* Experienced at capturing technical and business metadata, including being able to elicit and create sound definitions for entities and attributes.
* Practiced and able to query data from source or raw data and reverse-engineer an underlying data model and data definitions.
* Experienced in writing scripts for data transformation using SQL, DDL, DML, and PySpark.
* Good knowledge of and exposure to software development lifecycles and good engineering practices.
* Can demonstrate a good working knowledge of data modeling patterns and when to use them.

Technical skills:
* Entity-relationship, dimensional, and NoSQL modeling as appropriate to data warehousing, business intelligence, and analytical approaches, using IE or other common notations.
* SQL, DDL, DML, and PySpark scripting.
* ERwin and Visio data modeling/UML tools.
* Ideally, Azure Data Factory, Azure DevOps, and Databricks.

Posted 1 month ago

Apply

8.0 - 13.0 years

17 - 30 Lacs

Hyderabad

Hybrid

SUMMARY OF RESPONSIBILITIES: Here at MetLife Australia, we are seeking an experienced Data Modeler / Solution Designer to join our team and help build our data platform solution, known as the Data Exchange (DAX). This role will mainly help design and build data models and oversee the implementation of data pipelines on our Snowflake data platform (DAX). You will work with Snowflake, Erwin, and Matillion to develop scalable and efficient data pipelines that support our business needs. A strong understanding of data modeling principles is essential to build robust data pipelines based on our data architecture. Additionally, familiarity with DevOps models, CI/CD workflows, and best practices is important.

KEY RESPONSIBILITIES

Data solution design. Core deliverables:
* Develop and deliver conceptual, logical, and physical data models.
* Develop data mapping and transformation-rules documents.
* Develop business glossaries (metadata) and relevant data model documentation (data dictionaries) for data and analytics solutions.

Responsibilities:
* Work with business SMEs and data stewards to align data models to business processes.
* Ensure data architecture and data management principles are followed during development of new data models.
* Build data modeling standards and naming conventions, and follow them during implementation.
* Work with data stewards and data management specialists to ensure metadata and data quality rules are implemented accurately.
* Integrate data from various source systems (e.g., databases, APIs, flat files) into Snowflake.
* Review and support the whole SDLC through go-live and post-production issues, including reviewing technical mapping documents, test cases, test plans, and execution, and scheduling seamless deployment to go live.

Technical troubleshooting and optimisation:
* Conduct workshops with business SMEs and data stewards to understand the business processes.
* Perform data profiling and analysis to establish a strong hold on the data.
* Conduct root cause analysis and recommend long-term fixes for recurring issues.
* Conduct impact assessments for any upstream/downstream changes.

DevOps and operations:
* Collaborate with data engineers and the tech BA, ensuring proper version control (Git, Bitbucket) and deployment best practices for data models.
* Ensure compliance with data governance and security.

QUALIFICATIONS
* Bachelor's degree in computer science or equivalent, with demonstrated experience in delivering data models and designs for data and analytics projects.
* SnowPro Core certification is highly desired.

EXPERIENCE AND SKILLS
* 8+ years of experience as a Data Engineer / Data Modeler, preferably in a cloud-based environment.
* Strong experience with data modeling, data warehousing, and data integration.
* Deep understanding of Snowflake, including performance optimization and best practices.
* Strong SQL skills (mandatory).
* Solid understanding of data modeling concepts to build effective pipelines.
* Familiarity with DevOps workflows and working in environments with CI/CD, version control (Git, Bitbucket), and automated deployments.
* Strong problem-solving skills and the ability to troubleshoot pipeline issues effectively.

KNOWLEDGE
* Knowledge of data management/governance principles and their importance in dealing with data risk (for MetLife and its relationship with regulators).
* Life insurance or banking experience, including knowledge of financial/actuarial valuation methods and processes (preferred).

Posted 1 month ago

Apply

5.0 - 10.0 years

20 - 25 Lacs

Bengaluru

Work from Office

About the job

We are seeking a highly skilled and motivated AI/ML Lead with expertise in generative AI and LLMs to join our team. As a generative AI and LLM expert, you will play a crucial role in developing and implementing cutting-edge generative models and algorithms to solve complex problems and generate high-quality outputs. You will collaborate with a multidisciplinary team of researchers, engineers, and data scientists to explore innovative applications of generative AI across various domains.

Responsibilities:
- Research and Development: Stay up to date with the latest advancements in generative AI, including LLMs, GPTs, GANs (Generative Adversarial Networks), VAEs (Variational Autoencoders), and other related techniques. Conduct research to identify and develop novel generative models and algorithms.
- Model Development: Design, develop, and optimize generative models to generate realistic and diverse outputs. Implement and fine-tune state-of-the-art generative AI architectures to achieve desired performance metrics.
- Data Processing and Preparation: Collect, preprocess, and curate large-scale datasets suitable for training generative models. Apply data augmentation techniques and explore strategies to handle complex data types and distributions.
- Training and Evaluation: Train generative models using appropriate deep learning frameworks and libraries. Evaluate model performance using quantitative and qualitative metrics. Iterate on and improve models based on feedback and analysis of results.
- Collaboration: Collaborate with cross-functional teams, including researchers, engineers, and data scientists, to understand project requirements, define objectives, and identify opportunities to leverage generative AI techniques. Provide technical guidance and support to team members.
- Innovation and Problem Solving: Identify and tackle challenges related to generative AI, such as mode collapse, training instability, and generating diverse and high-quality outputs. Propose innovative solutions and approaches to address these challenges.
- Documentation and Communication: Document research findings, methodologies, and model architectures. Prepare technical reports, papers, and presentations to communicate results and insights to both technical and non-technical stakeholders.

Requirements:
- Education: A Master's or Ph.D. degree in Computer Science, Artificial Intelligence, or a related field. A strong background in deep learning, generative models, and computer vision is preferred.
- Experience: Proven experience designing and implementing generative models using deep learning frameworks (e.g., TensorFlow, PyTorch). Demonstrated expertise working with GPTs, GANs, VAEs, or other generative AI techniques. Experience with large-scale dataset handling and training deep neural networks is highly desirable.
- Technical Skills: Proficiency in programming languages such as Python, and familiarity with relevant libraries and tools. Strong mathematical and statistical skills, including linear algebra and probability theory. Experience with cloud computing platforms and GPU acceleration is a plus.
- Research and Publication: Track record of research contributions in generative AI, demonstrated through publications in top-tier conferences or journals. Active participation in the AI research community, such as attending conferences or workshops, is highly valued.
- Analytical and Problem-Solving Abilities: Strong analytical thinking and problem-solving skills to tackle complex challenges in generative AI. Ability to think creatively and propose innovative solutions. Attention to detail and the ability to analyze and interpret experimental results.
- Collaboration and Communication: Excellent teamwork and communication skills to collaborate effectively with cross-functional teams. Ability to explain complex technical concepts to both technical and non-technical stakeholders. Strong written and verbal communication skills.
- Adaptability and Learning: Enthusiasm for staying updated with the latest advancements in AI and generative models. Willingness to learn new techniques and adapt to evolving technologies and methodologies.

Posted 1 month ago

Apply

3.0 - 7.0 years

25 - 27 Lacs

Noida, Hyderabad

Work from Office

Position Summary

MetLife established a Global Capability Center (MGCC) in India to scale and mature data & analytics and technology capabilities in a cost-effective manner and make MetLife future-ready. The center is integral to Global Technology and Operations, with a focus on protecting and building MetLife IP, promoting reusability, and driving experimentation and innovation. The Data & Analytics team in India mirrors the global D&A team, with the objective of driving business value through trusted data, scaled capabilities, and actionable insights. The operating model consists of business-aligned data officers (US, Japan, LatAm, and corporate functions) enabled by enterprise COEs: data engineering, data governance, and data science.

Role Value Proposition

The Business Analyst Data Modeler is an important role in the Data and Analytics (D&A) organization. The role ensures data is structured, organized, and represented effectively, aligned to the needs of the organization. The role helps design logical and physical models, including implementation of robust data models that accurately capture, store, and manage data end to end.

Job Responsibilities
- Perform data modeling activities (logical, physical) using the data modeling tool CA Erwin Data Modeler
- Gather, understand, and analyze business requirements accurately
- Analyze data using SQL
- Partner with other teams to understand data needs and translate them into effective data models
- Collaborate with stakeholders to provide domain-based solutions
- Implement industry data modeling standards, best practices, and emerging technologies in data modeling
- Hands-on experience with API development and integration (REST, SOAP, JSON, XML)

Education, Technical Skills & Other Critical Requirements

Education: Bachelor's degree in computer science, engineering, or a related technical or business discipline

Experience (in years):
- 3-5 years of hands-on experience in CA Erwin
- 3-5 years of experience in SQL, data modeling, and data analysis
- Understanding of API design and ingestion
- Ability to communicate effectively, both orally and in writing, with various levels of management, including translating complex ideas and data into actionable steps for business units

Technical Skills:
- Strong SQL skills
- Hands-on experience with the Erwin tool
- Familiarity with Agile best practices
- Strong collaboration and facilitation skills

Other Preferred Skills:
- Familiarity with Azure cloud
- Experience in a client-facing role/environment

Posted 1 month ago

Apply

5.0 - 9.0 years

12 - 16 Lacs

Mumbai, Delhi / NCR, Bengaluru

Work from Office

Key Responsibilities:
- Develop APIs and microservices using Spring Boot.
- Implement integrations using Apigee for API management.
- Work with Pivotal Cloud Foundry (PCF) and manage deployments.
- Leverage both AWS and Azure for cloud integration tasks.
- Create and manage data models using tools like Erwin, Visio, or Lucidchart.

Required Skills:
- 5+ years of experience in integration development.
- Proficiency in Spring Boot and Apigee.
- Expertise in Pivotal Cloud Foundry (PCF).
- Strong knowledge of AWS and Azure.
- Experience with data modeling tools (Erwin, Visio, Lucidchart).

Location: Chennai, Hyderabad, Kolkata, Pune, Ahmedabad, Remote

Posted 1 month ago

Apply

4.0 - 9.0 years

6 - 10 Lacs

Pune

Work from Office

As a Data Engineer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include:
- Implementing and validating predictive models, as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques
- Designing and implementing various enterprise search applications such as Elasticsearch and Splunk for client requirements
- Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours
- Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modeling results

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- 4+ years of experience in data modeling and data architecture
- Proficiency in data modeling tools (Erwin, IBM InfoSphere Data Architect) and database management systems
- Familiarity with different data models, such as relational, dimensional, and NoSQL databases
- Understanding of business processes and how data supports business decision making
- Strong understanding of database design principles, data warehousing concepts, and data governance practices

Preferred technical and professional experience:
- Excellent analytical and problem-solving skills with a keen attention to detail
- Ability to work collaboratively in a team environment and manage multiple projects simultaneously
- Knowledge of programming languages such as SQL

Posted 1 month ago

Apply

4.0 - 9.0 years

6 - 10 Lacs

Hyderabad

Work from Office

As a Data Engineer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include:
- Implementing and validating predictive models, as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques
- Designing and implementing various enterprise search applications such as Elasticsearch and Splunk for client requirements
- Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours
- Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modeling results

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- 4+ years of experience in data modeling and data architecture
- Proficiency in data modeling tools (Erwin, IBM InfoSphere Data Architect) and database management systems
- Familiarity with different data models, such as relational, dimensional, and NoSQL databases
- Understanding of business processes and how data supports business decision making
- Strong understanding of database design principles, data warehousing concepts, and data governance practices

Preferred technical and professional experience:
- Excellent analytical and problem-solving skills with a keen attention to detail
- Ability to work collaboratively in a team environment and manage multiple projects simultaneously
- Knowledge of programming languages such as SQL

Posted 1 month ago

Apply

10.0 - 15.0 years

12 - 17 Lacs

Bengaluru

Work from Office

As a Senior Backend/Lead Development Engineer, you will be involved in developing automation solutions to provision and manage infrastructure across your organization. As a developer, you will leverage the capabilities of Terraform and cloud offerings to drive infrastructure-as-code capabilities for the IBM z/OS platform. You will work closely with frontend engineers as part of a full-stack team; collaborate with Product, Design, and other cross-functional partners to deliver high-quality solutions; maintain high standards of software quality within the team by establishing good practices and habits; and focus on growing capabilities to support and enhance the experience of the offering.

Required education: Bachelor's Degree

Required technical and professional expertise:
* 10+ years of software development experience with z/OS or z/OS subsystems
* 8+ years of professional experience developing with Golang, Python, and Ruby
* Hands-on z/OS system programming or administration experience
* Experience with key Terraform features such as infrastructure as code, change automation, and auto scaling
* Experience working with cloud providers such as AWS, Azure, or GCP, with a focus on scalability, resilience, and security
* Cloud-native mindset and solid understanding of DevOps principles in a cloud environment
* Familiarity with cloud monitoring tools to implement robust observability practices that prioritize metrics, logging, and tracing for high reliability and performance
* Extensive experience with cloud computing platforms (AWS, Azure, GCP) and infrastructure as code (Terraform)
* Strong interest in customer-focused work, with experience collaborating with Design and Product Management functions to deliver impactful solutions
* Demonstrated ability to tackle complex technical challenges and deliver innovative solutions
* Excellent communication and collaboration skills, with a focus on customer satisfaction and team success
* Strong analytical, debugging, and problem-solving skills to analyse issues and defects reported by customer-facing and test teams
* Proficiency with source control management tools (e.g., GitHub) and Agile lifecycle management tools
* Soft skills: strong communication, collaboration, self-organization, self-study, and the ability to accept and respond constructively to critical feedback

Posted 1 month ago

Apply

15.0 - 20.0 years

5 - 9 Lacs

Mumbai

Work from Office

Project Role: Data Modeler
Project Role Description: Work with key business representatives, data owners, end users, application designers, and data architects to model current and new data.
Must-have skills: Data Modeling Techniques and Methodologies; Snowflake Data Vault 2.0 modeling
Good-to-have skills: NA
Minimum experience required: 7.5 years
Educational Qualification: 15 years of full-time education

Summary: As a Data Modeler, you will engage with key business representatives, data owners, end users, application designers, and data architects to model both current and new data. Your typical day will involve collaborating with various stakeholders to understand their data needs, analyzing existing data structures, and designing effective data models that support business objectives. You will also be responsible for ensuring that the data models are aligned with best practices and industry standards, facilitating seamless data integration and accessibility across the organization. This role requires a proactive approach to problem-solving and a commitment to delivering high-quality data solutions that enhance decision-making processes.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate workshops and meetings to gather requirements and feedback from stakeholders.
- Develop and maintain comprehensive documentation of data models and design processes.

Professional & Technical Skills:
- Must-have: Proficiency in data modeling techniques and methodologies.
- Good-to-have: Experience with data governance frameworks.
- Strong understanding of relational and non-relational database systems.
- Familiarity with data warehousing concepts and ETL processes.
- Experience using data modeling tools such as Erwin or IBM InfoSphere Data Architect.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in data modeling techniques and methodologies.
- This position is based in Pune.
- 15 years of full-time education is required.

Posted 1 month ago

Apply

6.0 - 10.0 years

12 - 16 Lacs

Bengaluru

Work from Office

Roles and Responsibilities
- Design and implement data models, data flows, and data pipelines to support business intelligence and analytics.
- Develop and maintain large-scale data warehouses and data lakes using technologies such as Hadoop, Spark, and NoSQL databases.
- Collaborate with cross-functional teams to identify business requirements and develop solutions that meet those needs.
- Ensure data quality, integrity, and security by implementing data validation, testing, and monitoring processes.
- Stay up-to-date with industry trends and emerging technologies to continuously improve the organization's data architecture capabilities.
- Provide technical leadership and guidance on data architecture best practices to junior team members.

Job Requirements
- Strong understanding of data modeling, data warehousing, and ETL processes.
- Experience with big data technologies such as Hadoop, Spark, and NoSQL databases.
- Excellent problem-solving skills and the ability to analyze complex business problems and develop creative solutions.
- Strong communication and collaboration skills to work effectively with stakeholders at all levels.
- Ability to design and implement scalable, secure, and efficient data architectures.
- Experience working in an agile environment with continuous integration and delivery.

Posted 1 month ago

Apply

6.0 - 9.0 years

4 - 8 Lacs

Telangana

Work from Office

- Bachelor's or Master's degree in Computer Science, Information Systems, Finance, or a related field.
- Minimum of 5 years of experience in data modeling, with a strong preference for candidates with capital markets experience.
- Expert knowledge of financial products, the trade lifecycle, and market data.
- Demonstrated experience modeling trade data using industry-standard protocols and schemas such as FpML (Financial Products Markup Language), ISDA CDM (International Swaps and Derivatives Association Common Domain Model), FIX (Financial Information eXchange), SWIFT, or ISO 20022.
- Proficiency in data modeling tools (e.g., ERwin, PowerDesigner, IBM Data Architect) and familiarity with database technologies (SQL, NoSQL).
- Experience with data warehousing, ETL processes, and big data platforms.
- Excellent analytical, problem-solving, and organizational skills.
- Effective communication skills, with the ability to interact with a variety of stakeholders.
- Understanding of financial regulations (e.g., GDPR, MiFID II, Dodd-Frank) and their impact on data management.
- Ability to work independently as well as collaboratively in a team environment.
- Relevant professional certifications (e.g., CFA, FRM) are considered an asset.

Posted 1 month ago

Apply

4.0 - 6.0 years

6 - 8 Lacs

Hyderabad

Work from Office

Design and implement data architectures and models, focusing on data warehouses and Snowflake-based environments. Ensure that data is structured for efficient querying and analysis, aligning with business goals and performance requirements.

Posted 1 month ago

Apply

5.0 - 8.0 years

16 - 30 Lacs

Kolkata

Hybrid

Data Modeler Hybrid Data Environments Job Summary: We are in search of an experienced Data Modeler who possesses a deep understanding of traditional data stores such as SQL Server and Oracle DB, as well as proficiency in Azure/Databricks cloud environments. The ideal candidate will be adept at comprehending business processes and deriving methods to define analytical data models that support enterprise-level analytics, insights generation, and operational reporting. Key Responsibilities: - Collaborate with business analysts and stakeholders to understand business processes and requirements, translating them into data modeling solutions. - Design and develop logical and physical data models that effectively capture the granularity of data necessary for analytical and reporting purposes. - Migrate and optimize existing data models from traditional on-premises data stores to Azure/Databricks cloud environments, ensuring scalability and performance. - Establish data modeling standards and best practices to maintain the integrity and consistency of the data architecture. - Work closely with data engineers and BI developers to ensure that the data models support the needs of analytical and operational reporting. - Conduct data profiling and analysis to understand data sources, relationships, and quality, informing the data modeling process. - Continuously evaluate and refine data models to accommodate evolving business needs and to leverage new data modeling techniques and cloud capabilities. - Document data models, including entity-relationship diagrams, data dictionaries, and metadata, to provide clear guidance for development and maintenance. - Provide expertise in data modeling and data architecture to support the development of data governance policies and procedures. Qualifications: - Bachelor's degree in Computer Science, Information Systems, Data Science, or a related field. 
- Minimum of 5 years of experience in data modeling, with a strong background in both traditional RDBMS and modern cloud-based data platforms. - Proficiency in SQL and experience with data modelling tools (e.g., ER/Studio, ERwin, PowerDesigner). - Familiarity with Azure cloud services, Databricks, and other big data technologies. - Understanding of data warehousing concepts, including dimensional modeling, star schemas, and snowflake schemas. - Ability to translate complex business requirements into effective data models that support analytical and reporting functions. - Strong analytical skills and attention to detail. - Excellent communication and collaboration abilities, with the capacity to engage with both technical and non-technical stakeholders.

Posted 1 month ago

Apply