6.0 - 10.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
As a Data Modeler at our company, you will play a crucial role in designing conceptual, logical, and physical models for Azure Databricks and Azure Data Lake to support structured, semi-structured, and unstructured data. Your responsibilities will include:
- Utilizing your 6+ years of experience in data modeling, with a preference for insurance industry datasets such as policies, claims, customer, or actuarial data.
- Demonstrating advanced skills in data modeling tools such as Erwin, ER/Studio, PowerDesigner, or Microsoft Visio, and version control using GitHub.
- Applying a deep understanding of relational, dimensional, and data lake modeling techniques optimized for Databricks/Spark-based processing.
- Modeling and documenting metadata, reference data, and master data with Informatica to support robust data governance and quality.
- Using strong SQL and Spark skills for data profiling, validation, and prototyping in Databricks environments.
- Ensuring compliance with regulatory requirements for insurance data, such as IFRS 17 and Solvency II.

About the company: Virtusa values teamwork, quality of life, and professional development. With a global team of 27,000 professionals, we are dedicated to providing exciting projects, opportunities, and cutting-edge technologies to support your career growth. At Virtusa, we foster a collaborative team environment that encourages new ideas and excellence. Join us to unleash your potential and contribute to our dynamic work culture.
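The profiling work this posting describes (SQL/Spark checks in Databricks) usually boils down to per-column statistics such as null and distinct counts. A minimal pure-Python sketch of those checks; the `profile` helper and the sample insurance rows are illustrative, not from the posting:

```python
# Hypothetical per-column profiling: null count and distinct count,
# the kind of validation a data modeler runs before finalizing a model.

def profile(rows, columns):
    """Return {column: {"nulls": n, "distinct": m}} for a list of dicts."""
    stats = {}
    for col in columns:
        values = [r.get(col) for r in rows]
        non_null = [v for v in values if v is not None]
        stats[col] = {
            "nulls": len(values) - len(non_null),
            "distinct": len(set(non_null)),
        }
    return stats

# Made-up policy records standing in for an insurance dataset.
policies = [
    {"policy_id": "P1", "premium": 1200.0, "region": "south"},
    {"policy_id": "P2", "premium": None,   "region": "south"},
    {"policy_id": "P3", "premium": 950.0,  "region": "north"},
]
print(profile(policies, ["policy_id", "premium", "region"]))
```

In Spark the same checks would be expressed as aggregations over a DataFrame; the logic per column is identical.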
Posted 16 hours ago
10.0 - 14.0 years
0 Lacs
Karnataka
On-site
We are seeking an experienced Business/Data Modeler to create scalable and efficient data models supporting customer support data and reporting. You will collaborate with business stakeholders, analysts, and data engineering teams to transform operational needs into robust data structures for reporting, analytics, and machine learning applications.

Your responsibilities will include analyzing customer support processes to develop logical and physical data models. You will work closely with business teams to gather requirements; define KPIs, metrics, and reporting needs; and design normalized and dimensional data models using various modeling techniques. Additionally, you will develop data model domains, subject areas, and standards for Customer Support based on business and company requirements, ensure optimal design for Agentic AI analytics, and document source-to-target mappings and data lineage.

To qualify for this role, you should have 10+ years of experience in data modeling, preferably in customer support, contact center, or operational analytics domains. Strong knowledge of relational and dimensional modeling principles and industry modeling standards, and experience with Data Mesh and Data Fabric concepts, are required. Proficiency in SQL, familiarity with data platforms like Hive and Hudi, and experience with data modeling tools are essential. An understanding of support-related KPIs, data governance principles, and the ability to collaborate effectively with cross-functional teams are also necessary for success in this role.

This position was posted by Shakti Mishra from Nineleaps.
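The dimensional models and KPIs this posting asks for typically take the shape of a fact table joined to dimensions, with reporting queries aggregating over them. A minimal sketch using sqlite3; the table names, columns, and KPI are illustrative assumptions, not the employer's actual schema:

```python
# Hypothetical star schema for customer-support reporting:
# a ticket fact table joined to an agent dimension, then one KPI query.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dim_agent (
    agent_key INTEGER PRIMARY KEY,
    agent_name TEXT,
    team TEXT
);
CREATE TABLE fact_ticket (
    ticket_id INTEGER PRIMARY KEY,
    agent_key INTEGER REFERENCES dim_agent(agent_key),
    resolution_minutes INTEGER
);
INSERT INTO dim_agent VALUES (1, 'Asha', 'billing'), (2, 'Ravi', 'tech');
INSERT INTO fact_ticket VALUES (10, 1, 30), (11, 1, 50), (12, 2, 20);
""")

# KPI: average resolution time per team, a typical support metric.
rows = con.execute("""
    SELECT d.team, AVG(f.resolution_minutes)
    FROM fact_ticket f JOIN dim_agent d USING (agent_key)
    GROUP BY d.team ORDER BY d.team
""").fetchall()
print(rows)  # [('billing', 40.0), ('tech', 20.0)]
```

A production model would add date, channel, and customer dimensions, but the fact-to-dimension join pattern is the same.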
Posted 5 days ago
3.0 - 7.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
You will be responsible for designing and implementing various data models, including OLAP, OLTP, dimensional, relational, and data vault models. Your role will involve creating robust data architectures to support business intelligence, data warehousing, and database management solutions.

As a Data Modeler, you should have strong experience in data modeling and a solid understanding of OLAP, OLTP, dimensional, relational, and data vault models. Proficiency in data modeling tools such as Erwin, ER/Studio, Toad, SQL DBM, and Oracle DBM is essential for this role. Familiarity with an SDE (Software Development Environment) is preferred.

If you have a passion for data modeling and possess the required expertise in designing and implementing these data models, we encourage you to apply for this position.
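Of the model styles listed above, data vault is the least widely known: it splits an entity into a hub (stable business key), satellites (descriptive attributes tracked over time), and links (relationships). A minimal hub-plus-satellite sketch in sqlite3; all table and column names are hypothetical:

```python
# Hypothetical data vault fragment: hub_customer holds the business key,
# sat_customer holds attributes versioned by load timestamp.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE hub_customer (
    customer_hk TEXT PRIMARY KEY,   -- surrogate/hash key
    customer_bk TEXT NOT NULL,      -- business key
    load_ts TEXT NOT NULL
);
CREATE TABLE sat_customer (
    customer_hk TEXT REFERENCES hub_customer(customer_hk),
    load_ts TEXT NOT NULL,
    name TEXT,
    city TEXT,
    PRIMARY KEY (customer_hk, load_ts)  -- history kept per load
);
INSERT INTO hub_customer VALUES ('hk1', 'CUST-001', '2024-01-01');
INSERT INTO sat_customer VALUES
    ('hk1', '2024-01-01', 'Meena', 'Chennai'),
    ('hk1', '2024-06-01', 'Meena', 'Bengaluru');  -- attribute change
""")

# Current view of the customer: latest satellite row per hub key.
row = con.execute("""
    SELECT name, city FROM sat_customer
    WHERE customer_hk = 'hk1' ORDER BY load_ts DESC LIMIT 1
""").fetchone()
print(row)  # ('Meena', 'Bengaluru')
```

Because satellites are insert-only, history is preserved by design, which is the main reason data vault is used for auditable warehouses.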
Posted 2 weeks ago
10.0 - 12.0 years
5 - 7 Lacs
Delhi, India
On-site
Key Responsibilities:
- Data Architecture: Lead the design, development, and implementation of comprehensive enterprise data architectures, primarily leveraging Azure and Snowflake platforms.
- Data Transformation & ETL: Oversee and guide complex data transformation and ETL processes for large and diverse datasets, ensuring data integrity, quality, and performance.
- Customer-Centric Data Design: Specialize in designing and optimizing customer-centric datasets from various sources, including CRM, Call Center, Marketing, Offline, and Point of Sale systems.
- Data Modeling: Drive the creation and maintenance of advanced data models, including Relational, Dimensional, Columnar, and Big Data models, to support analytical and operational needs.
- Query Optimization: Develop, optimize, and troubleshoot complex SQL and NoSQL queries to ensure efficient data retrieval and manipulation.
- Data Warehouse Management: Apply advanced data warehousing concepts to build and manage high-performing, scalable data warehouse solutions.
- Tool Evaluation & Implementation: Evaluate, recommend, and implement industry-leading ETL tools such as Informatica and Unifi, ensuring best practices are followed.
- Business Requirements & Analysis: Lead efforts in business requirements definition and management, structured analysis, process design, and use case documentation to translate business needs into technical specifications.
- Reporting & Analytics Support: Collaborate with reporting teams, providing architectural guidance and support for reporting technologies like Tableau and Power BI.
- Software Development Practices: Apply professional software development principles and best practices to data solution delivery.
- Stakeholder Collaboration: Interface effectively with sales teams and directly engage with customers to understand their data challenges and lead them to successful outcomes.
- Project Management & Multi-tasking: Demonstrate exceptional organizational skills, with the ability to manage and prioritize multiple simultaneous customer projects effectively.
- Strategic Thinking & Leadership: Act as a self-managed, proactive, and customer-focused leader, driving innovation and continuous improvement in data architecture.

Position Requirements:
- Strong experience with data transformation & ETL on large datasets.
- Experience with designing customer-centric datasets (e.g., CRM, Call Center, Marketing, Offline, Point of Sale).
- 5+ years of experience in Data Modeling (e.g., Relational, Dimensional, Columnar, Big Data).
- 5+ years of experience with complex SQL or NoSQL queries.
- Extensive experience in advanced Data Warehouse concepts.
- Proven experience with industry ETL tools (e.g., Informatica, Unifi).
- Solid experience in Business Requirements definition, structured analysis, process design, and use case documentation.
- Experience with Reporting Technologies (e.g., Tableau, Power BI).
- Demonstrated experience in professional software development.
- Exceptional organizational skills with the ability to manage multiple simultaneous customer projects.
- Strong verbal & written communication skills to interface with sales teams and lead customers to successful outcomes.
- Must be self-managed, proactive, and customer-focused.

Technical Skills:
- Cloud Platforms: Microsoft Azure
- Data Warehousing: Snowflake
- ETL Methodologies: Extensive experience in ETL processes and tools
- Data Transformation: Large-scale data transformation
- Data Modeling: Relational, Dimensional, Columnar, Big Data
- Query Languages: Complex SQL, NoSQL
- ETL Tools: Informatica, Unifi (or similar enterprise-grade tools)
- Reporting & BI: Tableau, Power BI
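The "complex SQL" this posting calls for over customer-centric data often means window functions, e.g. finding each customer's most recent interaction per channel. A sketch using sqlite3; the `interaction` table and its rows are made-up stand-ins for CRM/Call Center/Point of Sale data:

```python
# Hypothetical customer-centric query: ROW_NUMBER() partitioned by
# customer and channel picks the latest interaction of each kind.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE interaction (
    customer_id TEXT, channel TEXT, event_ts TEXT, amount REAL
);
INSERT INTO interaction VALUES
    ('C1', 'pos',  '2024-03-01', 25.0),
    ('C1', 'pos',  '2024-04-01', 40.0),
    ('C1', 'call', '2024-02-15', 0.0),
    ('C2', 'pos',  '2024-01-10', 15.0);
""")

rows = con.execute("""
    SELECT customer_id, channel, event_ts FROM (
        SELECT *, ROW_NUMBER() OVER (
            PARTITION BY customer_id, channel ORDER BY event_ts DESC
        ) AS rn
        FROM interaction
    ) WHERE rn = 1
    ORDER BY customer_id, channel
""").fetchall()
print(rows)
```

The same pattern scales to Snowflake or any warehouse with standard window-function support; only the connection layer changes.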
Posted 1 month ago
10.0 - 12.0 years
5 - 7 Lacs
Pune, Maharashtra, India
On-site
Key Responsibilities:
- Data Architecture: Lead the design, development, and implementation of comprehensive enterprise data architectures, primarily leveraging Azure and Snowflake platforms.
- Data Transformation & ETL: Oversee and guide complex data transformation and ETL processes for large and diverse datasets, ensuring data integrity, quality, and performance.
- Customer-Centric Data Design: Specialize in designing and optimizing customer-centric datasets from various sources, including CRM, Call Center, Marketing, Offline, and Point of Sale systems.
- Data Modeling: Drive the creation and maintenance of advanced data models, including Relational, Dimensional, Columnar, and Big Data models, to support analytical and operational needs.
- Query Optimization: Develop, optimize, and troubleshoot complex SQL and NoSQL queries to ensure efficient data retrieval and manipulation.
- Data Warehouse Management: Apply advanced data warehousing concepts to build and manage high-performing, scalable data warehouse solutions.
- Tool Evaluation & Implementation: Evaluate, recommend, and implement industry-leading ETL tools such as Informatica and Unifi, ensuring best practices are followed.
- Business Requirements & Analysis: Lead efforts in business requirements definition and management, structured analysis, process design, and use case documentation to translate business needs into technical specifications.
- Reporting & Analytics Support: Collaborate with reporting teams, providing architectural guidance and support for reporting technologies like Tableau and Power BI.
- Software Development Practices: Apply professional software development principles and best practices to data solution delivery.
- Stakeholder Collaboration: Interface effectively with sales teams and directly engage with customers to understand their data challenges and lead them to successful outcomes.
- Project Management & Multi-tasking: Demonstrate exceptional organizational skills, with the ability to manage and prioritize multiple simultaneous customer projects effectively.
- Strategic Thinking & Leadership: Act as a self-managed, proactive, and customer-focused leader, driving innovation and continuous improvement in data architecture.

Position Requirements:
- Strong experience with data transformation & ETL on large datasets.
- Experience with designing customer-centric datasets (e.g., CRM, Call Center, Marketing, Offline, Point of Sale).
- 5+ years of experience in Data Modeling (e.g., Relational, Dimensional, Columnar, Big Data).
- 5+ years of experience with complex SQL or NoSQL queries.
- Extensive experience in advanced Data Warehouse concepts.
- Proven experience with industry ETL tools (e.g., Informatica, Unifi).
- Solid experience in Business Requirements definition, structured analysis, process design, and use case documentation.
- Experience with Reporting Technologies (e.g., Tableau, Power BI).
- Demonstrated experience in professional software development.
- Exceptional organizational skills with the ability to manage multiple simultaneous customer projects.
- Strong verbal & written communication skills to interface with sales teams and lead customers to successful outcomes.
- Must be self-managed, proactive, and customer-focused.

Technical Skills:
- Cloud Platforms: Microsoft Azure
- Data Warehousing: Snowflake
- ETL Methodologies: Extensive experience in ETL processes and tools
- Data Transformation: Large-scale data transformation
- Data Modeling: Relational, Dimensional, Columnar, Big Data
- Query Languages: Complex SQL, NoSQL
- ETL Tools: Informatica, Unifi (or similar enterprise-grade tools)
- Reporting & BI: Tableau, Power BI
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
Karnataka
On-site
Job Description: As a Data Modeler at PwC, you will play a crucial role in analyzing business needs, developing long-term data models, and ensuring the efficiency and consistency of data systems. Your expertise in data modeling, metadata management, and data system optimization will contribute to enhancing the overall performance of our data infrastructure.

Key responsibilities include:
- Analyzing and translating business needs into comprehensive data models.
- Evaluating existing data systems and recommending improvements for optimization.
- Defining rules to efficiently translate and transform data across various data models.
- Collaborating with the development team to create conceptual data models and data flows.
- Developing best practices for data coding to maintain consistency within the system.
- Reviewing modifications of existing systems for cross-compatibility and efficiency.
- Implementing data strategies and developing physical data models to meet business requirements.
- Utilizing canonical data modeling techniques to enhance the efficiency of data systems.
- Evaluating implemented data systems for variances, discrepancies, and optimal performance.
- Troubleshooting and optimizing data systems to ensure seamless operation.

Key expertise required:
- Strong proficiency in relational and dimensional modeling (OLTP, OLAP).
- Experience with data modeling tools such as Erwin, ER/Studio, Visio, and PowerDesigner.
- Proficiency in SQL and database management systems (Oracle, SQL Server, MySQL, PostgreSQL).
- Knowledge of NoSQL databases (MongoDB, Cassandra) and their data structures.
- Familiarity with data warehouses and BI tools (Snowflake, Redshift, BigQuery, Tableau, Power BI).
- Experience with ETL processes, data integration, and data governance frameworks.
- Excellent analytical, problem-solving, and communication skills.

Qualifications:
- Bachelor's degree in Engineering or a related field.
- 3 to 5 years of experience in data modeling or a related field.
- 4+ years of hands-on experience with dimensional and relational data modeling.
- Expert knowledge of metadata management and related tools.
- Proficiency with data modeling tools such as Erwin, PowerDesigner, or Lucid.
- Knowledge of transactional databases and data warehouses.

Preferred Skills:
- Experience in cloud-based data solutions (AWS, Azure, GCP).
- Knowledge of big data technologies (Hadoop, Spark, Kafka).
- Understanding of graph databases and real-time data processing.
- Certifications in data management, modeling, or cloud data engineering.
- Excellent communication and presentation skills.
- Strong interpersonal skills to collaborate effectively with various teams.
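Canonical data modeling, mentioned in the responsibilities above, means mapping each source system's record shape onto one shared schema so every downstream consumer sees a single form. A minimal sketch; both source layouts and all field names are hypothetical:

```python
# Hypothetical canonical mapping: two differently shaped source records
# (a CRM export and a point-of-sale export) converge on one schema.

def from_crm(rec):
    """Map a CRM-shaped record onto the canonical customer schema."""
    return {"customer_id": rec["id"], "full_name": rec["name"], "source": "crm"}

def from_pos(rec):
    """Map a POS-shaped record onto the same canonical schema."""
    return {"customer_id": rec["cust_no"],
            "full_name": f'{rec["first"]} {rec["last"]}',
            "source": "pos"}

canonical = [from_crm({"id": "C1", "name": "Asha Rao"}),
             from_pos({"cust_no": "C1", "first": "Asha", "last": "Rao"})]
print(canonical)
```

The payoff is that matching, deduplication, and reporting logic is written once against the canonical shape rather than per source system.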
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
Karnataka
On-site
At PwC, the infrastructure team focuses on designing and implementing secure IT systems that support business operations. The primary goal is to ensure the smooth functioning of networks, servers, and data centers to enhance performance and minimize downtime. In the infrastructure engineering role at PwC, you will be tasked with creating robust and scalable technology infrastructure solutions for clients, which involves working on network architecture, server management, and cloud computing.

As a Data Modeler, we are seeking candidates with a solid background in data modeling, metadata management, and data system optimization. Your responsibilities will include analyzing business requirements, developing long-term data models, and ensuring the efficiency and consistency of our data systems.

Key areas of expertise for this role include:
- Analyzing and translating business needs into long-term data model solutions.
- Evaluating existing data systems and suggesting enhancements.
- Defining rules for translating and transforming data across various models.
- Collaborating with the development team to create conceptual data models and data flows.
- Establishing best practices for data coding to maintain consistency within the system.
- Reviewing modifications of existing systems for cross-compatibility.
- Implementing data strategies and developing physical data models.
- Updating and optimizing local and metadata models.
- Utilizing canonical data modeling techniques to improve data system efficiency.
- Evaluating implemented data systems for variances, discrepancies, and efficiency.
- Troubleshooting and optimizing data systems for optimal performance.
- Demonstrating strong expertise in relational and dimensional modeling (OLTP, OLAP).
- Using data modeling tools like Erwin, ER/Studio, Visio, and PowerDesigner effectively.
- Proficiency in SQL and database management systems (Oracle, SQL Server, MySQL, PostgreSQL).
- Understanding of NoSQL databases (MongoDB, Cassandra) and their data structures.
- Experience with data warehouses and BI tools (Snowflake, Redshift, BigQuery, Tableau, Power BI).
- Familiarity with ETL processes, data integration, and data governance frameworks.
- Strong analytical, problem-solving, and communication skills.

Qualifications for this position include:
- A Bachelor's degree in Engineering or a related field.
- 5 to 9 years of experience in data modeling or a related field.
- 4+ years of hands-on experience with dimensional and relational data modeling.
- Expert knowledge of metadata management and related tools.
- Proficiency with data modeling tools such as Erwin, PowerDesigner, or Lucid.
- Knowledge of transactional databases and data warehouses.

Preferred Skills:
- Experience in cloud-based data solutions (AWS, Azure, GCP).
- Knowledge of big data technologies (Hadoop, Spark, Kafka).
- Understanding of graph databases and real-time data processing.
- Certifications in data management, modeling, or cloud data engineering.
- Excellent communication and presentation skills.
- Strong interpersonal skills to collaborate effectively with various teams.
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
Karnataka
On-site
At PwC, our infrastructure team is dedicated to designing and implementing secure and robust IT systems that facilitate business operations. We focus on ensuring the smooth functioning of networks, servers, and data centers to enhance performance and reduce downtime. As part of the infrastructure engineering team at PwC, your role will involve creating and implementing scalable technology infrastructure solutions for our clients, encompassing tasks such as network architecture, server management, and cloud computing.

We are currently seeking a Data Modeler with a solid background in data modeling, metadata management, and optimizing data systems. In this role, you will be responsible for analyzing business requirements, developing long-term data models, and maintaining the efficiency and consistency of our data systems.

Key Responsibilities:
- Analyze business needs and translate them into long-term data model solutions.
- Evaluate existing data systems and suggest enhancements.
- Define rules for data translation and transformation across different models.
- Collaborate with the development team to design conceptual data models and data flows.
- Establish best practices for data coding to ensure system consistency.
- Review modifications to existing systems to ensure cross-compatibility.
- Implement data strategies and create physical data models.
- Update and optimize local and metadata models.
- Utilize canonical data modeling techniques to improve system efficiency.
- Evaluate implemented data systems for discrepancies, variances, and efficiency.
- Troubleshoot and optimize data systems to achieve optimal performance.

Key Requirements:
- Proficiency in relational and dimensional modeling (OLTP, OLAP).
- Experience with data modeling tools such as Erwin, ER/Studio, Visio, and PowerDesigner.
- Strong skills in SQL and database management systems like Oracle, SQL Server, MySQL, and PostgreSQL.
- Familiarity with NoSQL databases such as MongoDB and Cassandra, including their data structures.
- Hands-on experience with data warehouses and BI tools like Snowflake, Redshift, BigQuery, Tableau, and Power BI.
- Knowledge of ETL processes, data integration, and data governance frameworks.
- Excellent analytical, problem-solving, and communication skills.

Qualifications:
- Bachelor's degree in Engineering or a related field.
- 5 to 9 years of experience in data modeling or related areas.
- 4+ years of practical experience in dimensional and relational data modeling.
- Expertise in metadata management and relevant tools.
- Proficiency in data modeling tools like Erwin, PowerDesigner, or Lucid.
- Understanding of transactional databases and data warehouses.

Preferred Skills:
- Experience in cloud-based data solutions such as AWS, Azure, and GCP.
- Knowledge of big data technologies like Hadoop, Spark, and Kafka.
- Understanding of graph databases and real-time data processing.
- Certifications in data management, modeling, or cloud data engineering.
- Strong communication and presentation skills.
- Excellent interpersonal skills to collaborate effectively with diverse teams.
Posted 2 months ago