5.0 - 10.0 years
7 - 16 Lacs
Bengaluru
Remote
Job Description: More than 5 years of practical experience with Snowflake, Power BI, WhereScape RED + 3D, and Data Vault 2.0. Very strong communication skills and practical experience in requirements engineering. Location: Hyderabad (preferred) or any other location in India. Work Result: running system. Skill Area: Data Management & Analytics. Technology: Snowflake. Proficiency (Technology): Expert.
Posted 2 weeks ago
12.0 - 20.0 years
40 - 75 Lacs
Pune, Bengaluru, Delhi / NCR
Hybrid
Architect and lead end-to-end data migration strategies for large-scale PTC Windchill implementations and upgrades. Define and document migration frameworks, methodologies, and best practices. Design data mapping models for CAD.
Posted 3 weeks ago
5.0 - 10.0 years
12 - 20 Lacs
Chennai, Bengaluru
Hybrid
Role & responsibilities - Task Description: More than 3 years of practical experience with Python, Azure Data Factory, Databricks, WhereScape 3D, WhereScape RED, Data Vault 2.0, Snowflake, and SQL. Very strong communication skills and practical experience in requirements engineering. Location: Hyderabad (preferred) or any other location in India. Work Result: running system. Skill Area: Data Management & Analytics. Technology: Snowflake. Proficiency (Technology): Expert.
Posted 3 weeks ago
8.0 - 12.0 years
12 - 22 Lacs
Chennai, Bengaluru
Hybrid
Role & responsibilities - Task Description: More than 5 years of practical experience with Snowflake, Power BI, WhereScape RED + 3D, and Data Vault 2.0. Very strong communication skills and practical experience in requirements engineering. Location: Hyderabad (preferred) or any other location in India. Work Result: running system. Skill Area: Data Management & Analytics. Technology: Snowflake. Proficiency (Technology): Expert.
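For readers scanning these Data Vault 2.0 postings, a minimal sketch of the core structures involved may help: a hub for a business entity and a satellite for its descriptive attributes, written in Snowflake SQL. The customer entity, column names, and hashing choices below are illustrative assumptions, not details taken from any posting.

```sql
-- Hypothetical Data Vault 2.0 hub and satellite for a "customer" entity
-- (Snowflake syntax). Hash keys are MD5 digests of the business key,
-- a common DV 2.0 convention.
CREATE TABLE hub_customer (
    hub_customer_hk  BINARY(16)    NOT NULL,  -- MD5 of the business key
    customer_bk      VARCHAR(50)   NOT NULL,  -- business key from the source
    load_dts         TIMESTAMP_NTZ NOT NULL,  -- load timestamp
    record_source    VARCHAR(100)  NOT NULL,  -- originating system
    CONSTRAINT pk_hub_customer PRIMARY KEY (hub_customer_hk)
);

CREATE TABLE sat_customer_details (
    hub_customer_hk  BINARY(16)    NOT NULL,  -- parent hub key
    load_dts         TIMESTAMP_NTZ NOT NULL,
    hash_diff        BINARY(16)    NOT NULL,  -- change-detection hash
    customer_name    VARCHAR(200),
    customer_segment VARCHAR(50),
    record_source    VARCHAR(100)  NOT NULL,
    CONSTRAINT pk_sat_customer PRIMARY KEY (hub_customer_hk, load_dts)
);

-- Loading the hub is insert-only: only previously unseen business keys land.
INSERT INTO hub_customer
SELECT MD5_BINARY(s.customer_id), s.customer_id, CURRENT_TIMESTAMP(), 'CRM'
FROM stg_customers s                           -- hypothetical staging table
WHERE MD5_BINARY(s.customer_id) NOT IN
      (SELECT hub_customer_hk FROM hub_customer);
```

The insert-only, hash-keyed pattern is what makes Data Vault loads parallelizable and auditable, and it is what metadata-driven tools such as WhereScape RED generate under the hood.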
Posted 3 weeks ago
12.0 - 15.0 years
15 - 19 Lacs
Noida, Hyderabad
Work from Office
About the Role: We are looking for a Principal Data Engineer to lead the design and delivery of scalable data solutions using Azure Data Factory and Azure Databricks. This is a consulting-focused role that requires strong technical expertise, stakeholder engagement, and architectural thinking. You will work closely with business, functional, and technical teams to define data strategies, design robust pipelines, and ensure smooth delivery in an Agile environment.

Responsibilities:
- Collaborate with business and technology stakeholders to gather and understand data needs
- Translate functional requirements into scalable and maintainable data architecture
- Design and implement robust data pipelines
- Lead data modeling, transformation, and performance optimization efforts
- Ensure data quality, validation, and consistency
- Participate in Agile ceremonies including sprint planning and backlog grooming
- Support CI/CD automation for data pipelines and integration workflows
- Mentor junior engineers and promote best practices in data engineering

Must Have:
- 12+ years of IT experience, with at least 5 years in data architecture roles on modern, metadata-driven, cloud-based technologies, bringing a software engineering mindset
- Strong analytical and problem-solving skills; ability to identify data patterns and perform root cause analysis to resolve production issues
- Excellent communication skills, with experience leading client-facing discussions
- Strong hands-on experience with Azure Data Factory and Databricks, leveraging custom solutioning and design beyond drag-and-drop capabilities for big data workloads
- Demonstrated proficiency in SQL, Python, and Spark
- Experience with CI/CD pipelines, version control, and DevOps tools
- Experience applying dimensional and Data Vault methodologies
- Background working with Agile methodologies and sprint-based delivery
- Ability to produce clear and comprehensive technical documentation

Nice to Have:
- Experience with Azure Synapse and Power BI
- Experience with Microsoft Purview and/or Unity Catalog
- Understanding of Data Lakehouse and Data Mesh concepts
- Familiarity with enterprise data governance and quality frameworks
- Manufacturing experience within the operations domain
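As one concrete illustration of the dimensional methodology this role asks for, here is a hedged sketch of a Type 2 slowly changing dimension load written as Databricks SQL against Delta tables. The table and column names are invented for the example, and real implementations vary.

```sql
-- Hypothetical SCD Type 2 load for a customer dimension on Delta Lake.
-- Step 1: close out the current row for customers whose attributes changed.
MERGE INTO dim_customer AS d
USING stg_customer AS s
  ON d.customer_id = s.customer_id AND d.is_current = true
WHEN MATCHED AND d.customer_segment <> s.customer_segment THEN
  UPDATE SET is_current = false, valid_to = current_timestamp();

-- Step 2: insert a fresh current row for every changed or brand-new customer
-- (changed rows lost their current version in step 1, so the join misses them).
INSERT INTO dim_customer (customer_id, customer_name, customer_segment,
                          valid_from, valid_to, is_current)
SELECT s.customer_id, s.customer_name, s.customer_segment,
       current_timestamp(), NULL, true
FROM stg_customer s
LEFT JOIN dim_customer d
  ON d.customer_id = s.customer_id AND d.is_current = true
WHERE d.customer_id IS NULL;
```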
Posted 3 weeks ago
3.0 - 6.0 years
5 - 9 Lacs
Pune
Work from Office
Choosing Capgemini means choosing a company where you will be empowered to shape your career in the way you'd like, where you'll be supported and inspired by a collaborative community of colleagues around the world, and where you'll be able to reimagine what's possible. Join us and help the world's leading organizations unlock the value of technology and build a more sustainable, more inclusive world.

Your Role: As a Data Modeler, you will play a critical role in designing and implementing robust data models that support enterprise data warehousing and analytics initiatives. You will:
- Apply your strong foundation in data structures, algorithms, calculus, linear algebra, and machine learning to design scalable and efficient data models.
- Leverage your expertise in data warehousing concepts such as Star Schema, Snowflake Schema, and Data Vault to architect and optimize data marts and enterprise data warehouses.
- Utilize industry-standard data modeling tools like Erwin, ER/Studio, and MySQL Workbench to create and maintain logical and physical data models.
- Thrive in a fast-paced, dynamic environment, collaborating with cross-functional teams to deliver high-quality data solutions under tight deadlines.
- Demonstrate strong conceptual modeling skills, with the ability to see the big picture and design solutions that align with business goals.
- Exhibit excellent communication and stakeholder management skills, effectively translating complex technical concepts into clear, actionable insights for both technical and non-technical audiences.

Your Profile:
- Good knowledge of and expertise in data structures, algorithms, calculus, linear algebra, machine learning, and modeling.
- Experience in data warehousing concepts including Star schema, Snowflake schema, or Data Vault for data marts or data warehousing.
- Experience using data modeling software like Erwin, ER/Studio, or MySQL Workbench to produce logical and physical data models.
- Expertise in conceptual modeling; ability to see the big picture and envision possible solutions.
- Experience working in a challenging, fast-paced environment.
- Excellent communication and stakeholder management skills.

What you'll love about working here: You can shape your career with us. We offer a range of career paths and internal opportunities within the Capgemini group. You will also get personalized career guidance from our leaders. You will get comprehensive wellness benefits including health checks, telemedicine, insurance with top-ups, elder care, partner coverage, or new parent support via flexible work. You will have the opportunity to learn on one of the industry's largest digital learning platforms, with access to 250,000+ courses and numerous certifications.
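To make the star-schema concepts in this posting concrete, here is a minimal, hypothetical dimensional model in standard SQL: one fact table at order-line grain with two dimensions. All names and the grain are assumptions for illustration.

```sql
-- Hypothetical star schema: dimensions carry descriptive attributes,
-- the fact table carries measures plus surrogate keys into each dimension.
CREATE TABLE dim_date (
    date_key    INTEGER PRIMARY KEY,   -- e.g. 20240131
    full_date   DATE NOT NULL,
    month_name  VARCHAR(20),
    year_number INTEGER
);

CREATE TABLE dim_product (
    product_key  INTEGER PRIMARY KEY,  -- surrogate key
    product_code VARCHAR(50) NOT NULL, -- natural key from the source system
    product_name VARCHAR(200),
    category     VARCHAR(100)
);

CREATE TABLE fact_sales (
    date_key    INTEGER NOT NULL REFERENCES dim_date (date_key),
    product_key INTEGER NOT NULL REFERENCES dim_product (product_key),
    order_id    VARCHAR(50) NOT NULL,  -- degenerate dimension
    quantity    INTEGER,
    net_amount  DECIMAL(18, 2)
);
```

A snowflake schema would further normalize the dimensions (for example, splitting category out of dim_product), while a Data Vault model would decompose the same domain into hubs, links, and satellites.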
Posted 3 weeks ago
4.0 - 6.0 years
6 - 10 Lacs
Gurugram
Work from Office
Role Description: As a Senior Software Engineer - BI & Visualization - Power BI at Incedo, you will be responsible for designing and developing business intelligence (BI) dashboards and visualizations to support business decision-making. You will work with business analysts and data architects to understand business requirements and translate them into technical solutions. You will be skilled in BI tools such as Tableau or Power BI and have experience in database management systems such as Oracle or SQL Server. You will be responsible for ensuring that BI dashboards are accurate, performant, and aligned with business needs.

Roles & Responsibilities:
- Designing and developing business intelligence (BI) and visualization solutions using tools like Power BI
- Creating and maintaining data pipelines and ETL processes
- Collaborating with other teams to ensure the consistency and integrity of data
- Providing guidance and mentorship to junior software engineers
- Troubleshooting and resolving BI and visualization platform issues

Technical Skills / Skills Requirements:
- Proficiency in data visualization tools such as Power BI
- Knowledge of database technologies such as SQL Server, Oracle, or MySQL
- Understanding of data modeling and data warehouse concepts such as star schema, snowflake schema, or data vault
- Familiarity with ETL tools and techniques such as Talend, Informatica, or SSIS
- Excellent communication skills, with the ability to convey complex technical information to non-technical stakeholders in a clear and concise manner
- Understanding of, and alignment with, the company's long-term vision
- Leadership, guidance, and support to team members, ensuring the successful completion of tasks and promoting a positive work environment that fosters collaboration and productivity, taking responsibility for the whole team

Qualifications:
- 4-6 years of work experience in a relevant field
- B.Tech/B.E/M.Tech or MCA degree from a reputed university; a computer science background is preferred
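For context on the dashboard work this role describes, the query below shows the typical shape of SQL behind a BI visual: a fact table joined to its dimensions and aggregated. It is a hedged sketch reusing the hypothetical star-schema names from the model above, not an Incedo artifact.

```sql
-- Hypothetical monthly-revenue-by-category query of the kind a Power BI
-- dataset or a reporting view might expose to a dashboard.
SELECT d.year_number,
       d.month_name,
       p.category,
       SUM(f.net_amount) AS revenue
FROM fact_sales f
JOIN dim_date    d ON d.date_key    = f.date_key
JOIN dim_product p ON p.product_key = f.product_key
GROUP BY d.year_number, d.month_name, p.category
ORDER BY d.year_number, p.category;
```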
Posted 3 weeks ago
3.0 - 5.0 years
5 - 10 Lacs
Gurugram
Work from Office
Role Description: As a Software Engineer - BI & Visualization - Power BI at Incedo, you will be responsible for designing and developing business intelligence (BI) dashboards and visualizations to support business decision-making. You will work with business analysts and data architects to understand business requirements and translate them into technical solutions. You will be skilled in BI tools such as Tableau or Power BI and have experience in database management systems such as Oracle or SQL Server. You will be responsible for ensuring that BI dashboards are accurate, performant, and aligned with business needs.

Roles & Responsibilities:
- Designing and developing business intelligence (BI) and visualization solutions using tools like Power BI
- Creating and maintaining data pipelines and ETL processes
- Collaborating with other teams to ensure the consistency and integrity of data
- Troubleshooting and resolving BI and visualization platform issues

Technical Skills / Skills Requirements:
- Proficiency in data visualization tools such as Power BI
- Knowledge of database technologies such as SQL Server, Oracle, or MySQL
- Understanding of data modeling and data warehouse concepts such as star schema, snowflake schema, or data vault
- Familiarity with ETL tools and techniques such as Talend, Informatica, or SSIS
- Excellent communication skills, with the ability to convey complex technical information to non-technical stakeholders in a clear and concise manner
- Understanding of, and alignment with, the company's long-term vision

Qualifications:
- 3-5 years of work experience in a relevant field
- B.Tech/B.E/M.Tech or MCA degree from a reputed university; a computer science background is preferred
Posted 3 weeks ago
6.0 - 10.0 years
22 - 30 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
Curious about the role? What would your typical day look like? As a Senior Data Engineer, you will work to solve some of the organizational data management problems that would enable them as a data-driven organization, seamlessly switching between the roles of Individual Contributor, team member, and Data Modeling lead as demanded by each project to define, design, and deliver actionable insights. On a typical day, you might:
• Engage the clients and understand the business requirements to translate those into data models.
• Create and maintain a Logical Data Model (LDM) and Physical Data Model (PDM) by applying best practices to provide business insights.
• Contribute to Data Modeling accelerators.
• Create and maintain the Source to Target Data Mapping document that includes documentation of all entities, attributes, data relationships, primary and foreign key structures, allowed values, codes, business rules, glossary terms, etc.
• Gather and publish Data Dictionaries.
• Maintain data models, capture data models from existing databases, and record descriptive information.
• Use a data modeling tool to create appropriate data models.
• Contribute to building data warehouses and data marts (on Cloud) while performing data profiling and quality analysis.
• Use version control to maintain versions of data models.
• Collaborate with Data Engineers to design and develop data extraction and integration code modules.
• Partner with the data engineers to strategize ingestion logic and consumption patterns.

Job Requirement - Expertise and Qualifications. What do we expect?
• 6+ years of experience in the data space.
• Decent SQL skills.
• Significant experience in one or more RDBMS (Oracle, DB2, and SQL Server).
• Real-time experience working with OLAP and OLTP database models (dimensional models).
• Good understanding of Star schema, Snowflake schema, and Data Vault modeling, as well as any ETL tool, Data Governance, and Data Quality.
• An eye for analyzing data and comfort with following agile methodology.
• An adept understanding of any of the cloud services (Azure, AWS, and GCP) is preferred.

You are important to us; let's stay connected! Every individual comes with a different set of skills and qualities, so even if you don't tick all the boxes for the role today, we urge you to apply, as there might be a suitable/unique role for you tomorrow. We are an equal-opportunity employer. Our diverse and inclusive culture and values guide us to listen, trust, respect, and encourage people to grow the way they desire.

Note: The designation will be commensurate with expertise and experience. Compensation packages are among the best in the industry.
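Since this posting distinguishes Star schema, Snowflake schema, and Data Vault modeling, a small hedged sketch of the snowflake variant may be useful: the product dimension of a star schema normalized so that category lives in its own outrigger table. Names are illustrative assumptions.

```sql
-- Hypothetical snowflake schema fragment: category is split out of the
-- product dimension instead of being denormalized into it as a star
-- schema would do, trading simpler updates for an extra join at query time.
CREATE TABLE dim_category (
    category_key  INTEGER PRIMARY KEY,
    category_name VARCHAR(100) NOT NULL
);

CREATE TABLE dim_product (
    product_key  INTEGER PRIMARY KEY,
    product_code VARCHAR(50) NOT NULL,
    product_name VARCHAR(200),
    category_key INTEGER REFERENCES dim_category (category_key)
);
```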
Posted 3 weeks ago
10.0 - 14.0 years
30 - 37 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
Curious about the role? What would your typical day look like? As an Architect, you will work to solve some of the most complex and captivating data management problems that would enable them as a data-driven organization, seamlessly switching between the roles of Individual Contributor, team member, and Data Modeling Architect as demanded by each project to define, design, and deliver actionable insights. On a typical day, you might:
• Engage the clients and understand the business requirements to translate those into data models.
• Analyze customer problems, propose solutions from a data structural perspective, and estimate and deliver proposed solutions.
• Create and maintain a Logical Data Model (LDM) and Physical Data Model (PDM) by applying best practices to provide business insights.
• Use a data modeling tool to create appropriate data models.
• Create and maintain the Source to Target Data Mapping document that includes documentation of all entities, attributes, data relationships, primary and foreign key structures, allowed values, codes, business rules, glossary terms, etc.
• Gather and publish Data Dictionaries.
• Ideate, design, and guide the teams in building automations and accelerators.
• Maintain data models, capture data models from existing databases, and record descriptive information.
• Contribute to building data warehouses and data marts (on Cloud) while performing data profiling and quality analysis.
• Use version control to maintain versions of data models.
• Collaborate with Data Engineers to design and develop data extraction and integration code modules.
• Partner with the data engineers and testing practitioners to strategize ingestion logic, consumption patterns, and testing.
• Ideate on designing and developing the next-gen data platform by collaborating with cross-functional stakeholders.
• Work with the client to define, establish, and implement the right modeling approach as per the requirement.
• Help define the standards and best practices.
• Monitor project progress to keep the leadership teams informed on milestones, impediments, etc.
• Coach team members and review code artifacts.
• Contribute to proposals and RFPs.

Job Requirement. What do we expect?
• 10+ years of experience in the data space.
• Decent SQL knowledge.
• Ability to suggest modeling approaches for a given problem.
• Significant experience in one or more RDBMS (Oracle, DB2, and SQL Server).
• Real-time experience working with OLAP and OLTP database models (dimensional models).
• Comprehensive understanding of Star schema, Snowflake schema, and Data Vault modeling, as well as any ETL tool, Data Governance, and Data Quality.
• An eye for analyzing data and comfort with following agile methodology.
• An adept understanding of any of the cloud services (Azure, AWS, and GCP) is preferred.
• Enthusiasm to coach team members, collaborate with various stakeholders across the organization, and take complete ownership of deliverables.
• Experience in contributing to proposals and RFPs.
• Good experience in stakeholder management.
• Decent communication skills and experience in leading a team.

You are important to us; let's stay connected! Every individual comes with a different set of skills and qualities, so even if you don't tick all the boxes for the role today, we urge you to apply, as there might be a suitable/unique role for you tomorrow. We are an equal-opportunity employer. Our diverse and inclusive culture and values guide us to listen, trust, respect, and encourage people to grow the way they desire.
Note: The designation will be commensurate with expertise and experience. Compensation packages are among the best in the industry.
Posted 3 weeks ago
5.0 - 9.0 years
0 Lacs
Maharashtra
On-site
You will be a highly skilled Senior Business Development professional joining our team at VRetail, with extensive experience in SAAS, IT Sales, and B2B environments. Your role will be crucial in driving our sales strategy and enhancing our data management capabilities within the enterprise sector.

Your key responsibilities will include developing and implementing sales strategies for Data Vault solutions tailored to enterprise clients. You will collaborate with cross-functional teams to ensure alignment with business objectives and client needs. Analyzing market trends and customer feedback to identify opportunities for product improvement and innovation will also be part of your role. Building and maintaining strong relationships with key stakeholders, including clients, partners, and internal teams, will be essential. You will provide expert guidance on Data Vault methodologies and best practices to clients and internal teams, as well as conduct presentations and demonstrations of Data Vault solutions to potential clients. Mentoring and training junior sales staff on effective sales techniques and product knowledge will also be a key responsibility.

To qualify for this role, you should have a Bachelor's degree in Business, Information Technology, or a related field, along with a minimum of 5 years of experience in sales, specifically in the SAAS, IT Sales, and B2B sectors. A strong understanding of Data Vault concepts and methodologies is required, along with a proven track record of meeting or exceeding sales targets in a competitive environment. Excellent communication and interpersonal skills are a must, with the ability to build rapport with clients and team members. Strong analytical skills and the ability to interpret complex data to inform sales strategies are also essential.

Preferred skills for this role include experience with CRM software and sales analytics tools, familiarity with data management and analytics solutions, and the ability to work independently and as part of a team in a fast-paced environment.
Posted 1 month ago
7.0 - 11.0 years
0 Lacs
Karnataka
On-site
Your Responsibilities:
- Implement business and IT data requirements through new data strategies and designs across all data platforms (relational, dimensional, and NoSQL).
- Collaborate with solution teams and Data Architects to implement data strategies, build data flows, and develop logical/physical data models.
- Work with Data Architects to define and govern data modeling and design standards, tools, best practices, and related development for enterprise data models.
- Engage in hands-on modeling, design, configuration, installation, performance tuning, and sandbox POCs.
- Proactively and independently address project requirements and articulate issues/challenges to reduce project delivery risks.

Your Profile:
- Bachelor's degree in computer/data science or related technical experience.
- 7+ years of hands-on relational, dimensional, and/or analytic experience utilizing RDBMS, dimensional, and NoSQL data platform technologies, as well as ETL and data ingestion protocols.
- Demonstrated experience with data warehouses, data lakes, and enterprise big data platforms in multi-data-center contexts.
- Proficiency in metadata management, data modeling, and related tools (e.g., Erwin, ER/Studio).
- Experience with services in Azure/Azure Databricks (Azure Data Factory, Azure Data Lake Storage, Azure Synapse, and Azure Databricks) is preferred, and experience working on SAP Datasphere is a plus.
- Experience in team management, communication, and presentation.
- Understanding of agile delivery methodology and experience working in a scrum environment.
- Ability to translate business needs into data vault and dimensional data models supporting long-term solutions.
- Collaborate with the Application Development team to implement data strategies, and create logical and physical data models using best practices to ensure high data quality and reduced redundancy.
- Optimize and update logical and physical data models to support new and existing projects.
- Maintain logical and physical data models along with corresponding metadata.
- Develop best practices for standard naming conventions and coding practices to ensure data model consistency.
- Recommend opportunities for data model reuse in new environments.
- Perform reverse engineering of physical data models from databases and SQL scripts.
- Evaluate data models and physical databases for variances and discrepancies.
- Validate business data objects for accuracy and completeness.
- Analyze data-related system integration challenges and propose appropriate solutions.
- Develop data models according to company standards.
- Guide System Analysts, Engineers, Programmers, and others on project limitations and capabilities, performance requirements, and interfaces.
- Review modifications to existing data models to improve efficiency and performance.
- Examine new application designs and recommend corrections as needed.

#IncludingYou: Diversity, equity, inclusion, and belonging are cornerstones of ADM's efforts to continue innovating, driving growth, and delivering outstanding performance. ADM is committed to attracting and retaining a diverse workforce and creating welcoming, inclusive work environments that enable every ADM colleague to feel comfortable, make meaningful contributions, and grow their career. ADM values the unique backgrounds and experiences that each person brings to the organization, understanding that diversity of perspectives makes us stronger together. For more information regarding ADM's efforts to advance Diversity, Equity, Inclusion & Belonging, please visit our website: Diversity, Equity and Inclusion | ADM.
About ADM: At ADM, the power of nature is unlocked to provide access to nutrition worldwide. With industry-advancing innovations, a comprehensive portfolio of ingredients and solutions catering to diverse tastes, and a commitment to sustainability, ADM offers customers an edge in addressing nutritional challenges. As a global leader in human and animal nutrition and the premier agricultural origination and processing company worldwide, ADM's capabilities in insights, facilities, and logistical expertise are unparalleled. From ideation to solution, ADM enriches the quality of life globally. Learn more at www.adm.com.
Posted 1 month ago
10.0 - 14.0 years
0 Lacs
Dehradun, Uttarakhand
On-site
You should have familiarity with modern storage formats like Parquet and ORC. Your responsibilities will include designing and developing conceptual, logical, and physical data models to support enterprise data initiatives. You will build, maintain, and optimize data models within Databricks Unity Catalog, developing efficient data structures using Delta Lake to optimize performance, scalability, and reusability. Collaboration with data engineers, architects, analysts, and stakeholders is essential to ensure data model alignment with ingestion pipelines and business goals. You will translate business and reporting requirements into a robust data architecture using best practices in data warehousing and Lakehouse design. Additionally, maintaining comprehensive metadata artifacts such as data dictionaries, data lineage, and modeling documentation is crucial. Enforcing and supporting data governance, data quality, and security protocols across data ecosystems will be part of your role, and you will continuously evaluate and improve modeling processes.

The ideal candidate will have 10+ years of hands-on experience in data modeling in Big Data environments. Expertise in OLTP, OLAP, dimensional modeling, and enterprise data warehouse practices is required. Proficiency in modeling methodologies including Kimball, Inmon, and Data Vault is expected. Hands-on experience with modeling tools like ER/Studio, ERwin, PowerDesigner, SQLDBM, dbt, or Lucidchart is preferred. Proven experience in Databricks with Unity Catalog and Delta Lake is necessary, along with a strong command of SQL and Apache Spark for querying and transformation. Experience with the Azure Data Platform, including Azure Data Factory, Azure Data Lake Storage, Azure Synapse Analytics, and Azure SQL Database, is beneficial. Exposure to Azure Purview or similar data cataloging tools is a plus. Strong communication and documentation skills are required, with the ability to work in cross-functional agile environments.

Qualifications for this role include a Bachelor's or Master's degree in Computer Science, Information Systems, Data Engineering, or a related field. Certifications such as Microsoft DP-203: Data Engineering on Microsoft Azure are desirable. Experience working in agile/scrum environments and exposure to enterprise data security and regulatory compliance frameworks (e.g., GDPR, HIPAA) are also advantageous.
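As a hedged illustration of the Databricks Unity Catalog and Delta Lake modeling this role centers on, the sketch below defines a Delta table under Unity Catalog's three-level catalog.schema.table namespace, with column comments doubling as lightweight data-dictionary metadata. The catalog, schema, and column names are invented.

```sql
-- Hypothetical Delta table registered in Unity Catalog (Databricks SQL).
CREATE TABLE main.sales.fct_orders (
    order_id    STRING NOT NULL COMMENT 'Natural key from the order system',
    customer_id STRING          COMMENT 'FK to main.sales.dim_customer',
    order_ts    TIMESTAMP       COMMENT 'Order creation timestamp (UTC)',
    net_amount  DECIMAL(18, 2)  COMMENT 'Order value net of tax',
    -- generated column used for partition pruning
    order_date  DATE GENERATED ALWAYS AS (CAST(order_ts AS DATE))
)
USING DELTA
PARTITIONED BY (order_date)
COMMENT 'Hypothetical order fact table, one row per order';
```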
Posted 1 month ago
5.0 - 10.0 years
5 - 9 Lacs
Hyderabad
Work from Office
About the Role: In this opportunity as Data Engineer, you will:
- Develop/enhance data warehousing functionality, including the use and management of the Snowflake data warehouse and the surrounding entitlements, pipelines, and monitoring, in partnership with Data Analysts and Architects and with guidance from the lead Data Engineer
- Innovate with new approaches to meeting data management requirements
- Effectively communicate and liaise with other data management teams embedded across the organization and with data consumers in data science and business analytics teams
- Analyze existing data pipelines and assist in enhancing and re-engineering the pipelines as per business requirements

A Bachelor's degree or equivalent is required; a Computer Science or related technical degree is preferred.

About You: You're a fit for the role if your background includes:
- Mandatory skills: Data Warehousing, data models, data processing [good to have], SQL, Power BI / Tableau, Snowflake [good to have], Python
- 3.5+ years of relevant experience in the implementation of data warehouses and the management of data technologies for large-scale organizations
- Experience in building and maintaining optimized and highly available data pipelines that facilitate deeper analysis and reporting
- Experience analyzing data pipelines
- Knowledge of Data Warehousing, including data models and data processing
- Broad understanding of the technologies used to build and operate data and analytic systems
- Excellent critical thinking, communication, presentation, documentation, troubleshooting, and collaborative problem-solving skills
- Beginner to intermediate knowledge of AWS, Snowflake, and Python
- Hands-on experience with programming and scripting languages
- Knowledge of and hands-on experience with Data Vault 2.0 is a plus
- Experience in and comfort with some of the following: writing performant, well-tuned SQL; data integration tools like DBT, Informatica, etc.; intermediate programming in Python/PySpark/Java/JavaScript; AWS services and management, including Serverless, Container, Queueing, and Monitoring services; consuming and building APIs

#LI-SM1

What's in it For You:
- Hybrid Work Model: We've adopted a flexible hybrid working environment (2-3 days a week in the office depending on the role) for our office-based roles, while delivering a seamless experience that is digitally and physically connected.
- Flexibility & Work-Life Balance: Flex My Way is a set of supportive workplace policies designed to help manage personal and professional responsibilities, whether caring for family, giving back to the community, or finding time to refresh and reset. This builds upon our flexible work arrangements, including work from anywhere for up to 8 weeks per year, empowering employees to achieve a better work-life balance.
- Career Development and Growth: By fostering a culture of continuous learning and skill development, we prepare our talent to tackle tomorrow's challenges and deliver real-world solutions. Our Grow My Way programming and skills-first approach ensures you have the tools and knowledge to grow, lead, and thrive in an AI-enabled future.
- Industry-Competitive Benefits: We offer comprehensive benefit plans including flexible vacation, two company-wide Mental Health Days off, access to the Headspace app, retirement savings, tuition reimbursement, employee incentive programs, and resources for mental, physical, and financial wellbeing.
- Culture: Globally recognized, award-winning reputation for inclusion and belonging, flexibility, work-life balance, and more. We live by our values: Obsess over our Customers, Compete to Win, Challenge (Y)our Thinking, Act Fast / Learn Fast, and Stronger Together.
- Social Impact: Make an impact in your community with our Social Impact Institute. We offer employees two paid volunteer days off annually and opportunities to get involved with pro-bono consulting projects and Environmental, Social, and Governance (ESG) initiatives.
- Making a Real-World Impact: We are one of the few companies globally that helps its customers pursue justice, truth, and transparency. Together, with the professionals and institutions we serve, we help uphold the rule of law, turn the wheels of commerce, catch bad actors, report the facts, and provide trusted, unbiased information to people all over the world.

Thomson Reuters informs the way forward by bringing together the trusted content and technology that people and organizations need to make the right decisions. We serve professionals across legal, tax, accounting, compliance, government, and media. Our products combine highly specialized software and insights to empower professionals with the data, intelligence, and solutions needed to make informed decisions, and to help institutions in their pursuit of justice, truth, and transparency. Reuters, part of Thomson Reuters, is a world-leading provider of trusted journalism and news. We are powered by the talents of 26,000 employees across more than 70 countries, where everyone has a chance to contribute and grow professionally in flexible work environments. At a time when objectivity, accuracy, fairness, and transparency are under attack, we consider it our duty to pursue them. Sound exciting? Join us and help shape the industries that move society forward.

As a global business, we rely on the unique backgrounds, perspectives, and experiences of all employees to deliver on our business goals. To ensure we can do that, we seek talented, qualified employees in all our operations around the world regardless of race, color, sex/gender, including pregnancy, gender identity and expression, national origin, religion, sexual orientation, disability, age, marital status, citizen status, veteran status, or any other protected classification under applicable law. Thomson Reuters is proud to be an Equal Employment Opportunity Employer providing a drug-free workplace. We also make reasonable accommodations for qualified individuals with disabilities and for sincerely held religious beliefs in accordance with applicable law. More information on requesting an accommodation here. Learn more about how to protect yourself from fraudulent job postings here. More information about Thomson Reuters can be found on thomsonreuters.com.
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
Delhi
On-site
You are a skilled and experienced Business Analyst with strong knowledge and expertise in Property and Casualty (P&C) insurance. You possess a comprehensive understanding of P&C insurance principles, processes, and systems.

Your main responsibilities will include collaborating with stakeholders to gather and analyze business requirements relevant to P&C insurance operations. You will conduct insightful research into P&C insurance products, markets, and competitors to identify enhancements and opportunities. Leveraging your extensive P&C insurance knowledge, you will design and develop impactful solutions for business challenges. You will partner with cross-functional teams to ensure the seamless implementation of developed business solutions. Additionally, you will provide in-depth expertise for the development and upgrading of P&C insurance systems and applications. Your role will also involve performing meticulous data analysis and validation to maintain system data accuracy, as well as creating and maintaining comprehensive documentation such as functional specifications, business process flows, and user manuals.

To excel in this role, you are expected to have a deep understanding of P&C insurance principles, products, and methodologies. You should demonstrate proven expertise in PL/SQL, with the ability to craft intricate queries for database data manipulation and analysis. Familiarity with dimensional data marts and their applications in data warehousing settings is essential, as is proficiency in business intelligence tools for developing reports and dashboards specific to P&C insurance analytics. Your exceptional analytical skills will enable you to translate complex requirements into tangible functional specifications. Strong communication proficiency is necessary for efficient collaboration with both technical and non-technical parties. Your rigorous attention to detail emphasizes data integrity and precision. You should have the capacity to independently handle tasks within dynamic team settings, adhering to multiple priorities and deadlines. A bachelor's degree in Business Administration, Computer Science, or related disciplines is preferred, and notable experience within the P&C insurance industry will be advantageous.

Your skills should include data validation, business intelligence tools, insurance knowledge, proficiency in the policy and claim lifecycle, data analysis, analytical skills, data modeling, PL/SQL expertise, BRD creation, communication skills, source-to-target mapping, data profiling, SQL knowledge, attention to detail, documentation abilities, analytics, Property and Casualty (P&C) insurance understanding, data vault knowledge, and reporting and dashboard development proficiency.
Posted 1 month ago
6.0 - 10.0 years
0 Lacs
Jaipur, Rajasthan
On-site
You are a Sr. Data Engineer with a strong background in building ELT pipelines and expertise in modern data engineering practices. You are experienced with Databricks and DBT, proficient in SQL and Python, and have a solid understanding of data warehousing methodologies such as Kimball or Data Vault. You are comfortable working with DevOps tools, particularly within AWS, Databricks, and GitLab. Your role involves collaborating with cross-functional teams to design, develop, and maintain scalable data infrastructure and pipelines using Databricks and DBT.

Your responsibilities include designing, building, and maintaining scalable ELT pipelines for processing and transforming large datasets efficiently in Databricks. You will implement Kimball data warehousing methodologies or other multi-dimensional modeling approaches using DBT. Leveraging AWS, Databricks, and GitLab, you will implement CI/CD practices for data engineering workflows. Additionally, you will optimize SQL queries and database performance, monitor and fine-tune data pipelines and queries, and ensure compliance with data security, privacy, and governance standards.

Key qualifications for this role include 6+ years of data engineering experience, hands-on experience with Databricks and DBT, proficiency in SQL and Python, experience with Kimball data warehousing or Data Vault methodologies, familiarity with DevOps tools and practices, strong problem-solving skills, and the ability to work in a fast-paced, agile environment. Preferred qualifications include experience with Apache Spark for large-scale data processing, familiarity with CI/CD pipelines for data engineering workflows, understanding of orchestration tools like Apache Airflow, and certifications in AWS, Databricks, or DBT.

In return, you will receive benefits such as medical insurance for employees, spouse, and children, accidental life insurance, provident fund, paid vacation time, paid holidays, employee referral bonuses, reimbursement for high-speed internet at home, a one-month free stay for employees moving from other cities, tax-free benefits, and other bonuses as determined by management.
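Given the Databricks-plus-DBT stack this role describes, a minimal sketch of a Kimball-style incremental dbt model may help; the model name, columns, and source reference are assumptions for illustration, not the employer's codebase.

```sql
-- models/marts/fct_orders.sql (hypothetical dbt model)
-- dbt compiles the Jinja below: ref() resolves dependencies, and the
-- incremental block restricts each run to rows newer than those loaded.
{{ config(materialized='incremental', unique_key='order_id') }}

SELECT
    o.order_id,
    o.customer_id,
    o.ordered_at,
    o.amount
FROM {{ ref('stg_orders') }} AS o
{% if is_incremental() %}
  WHERE o.ordered_at > (SELECT MAX(ordered_at) FROM {{ this }})
{% endif %}
```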
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
Haryana
On-site
You are an experienced Data Modeller with specialized knowledge in designing and implementing data models for modern data platforms, specifically within the healthcare domain. Your expertise includes a deep understanding of data modeling techniques and healthcare data structures, and experience with Databricks Lakehouse architecture. Your role involves translating complex business requirements into efficient and scalable data models that support analytics and reporting needs.

You will be responsible for designing and implementing logical and physical data models for Databricks Lakehouse implementations. Collaboration with business stakeholders, data architects, and data engineers is crucial to create data models that facilitate the migration from legacy systems to the Databricks platform while ensuring data integrity, performance, and compliance with healthcare industry standards. Key responsibilities include creating and maintaining data dictionaries, entity relationship diagrams, and model documentation. You will develop dimensional models, data vault models, and other relevant modeling approaches. Additionally, supporting the migration of data models, ensuring alignment with the overall data architecture, and implementing data modeling best practices are essential aspects of your role.

Your qualifications include extensive experience in data modeling for analytics and reporting systems, and strong knowledge of dimensional modeling, data vault, and other methodologies. Proficiency in the Databricks platform, Delta Lake architecture, healthcare data modeling, and industry standards is required. You should have experience migrating data models from legacy systems, strong SQL skills, and an understanding of data governance principles. Technical skills you must possess include expertise in data modeling methodologies, the Databricks platform, SQL, data definition languages, data warehousing concepts, ETL/ELT processes, performance tuning, metadata management, data cataloging, cloud platforms, big data technologies, and healthcare industry knowledge. Your knowledge should encompass healthcare data structures, terminology, coding systems, data standards, analytics use cases, regulatory requirements, clinical and operational data modeling challenges, and population health and value-based care data needs.

Your educational background should include a Bachelor's degree in Computer Science, Information Systems, or a related field, with an advanced degree being preferred. Professional certifications in data modeling or related areas would be advantageous for this role.
Posted 1 month ago
3.0 - 7.0 years
12 - 15 Lacs
Pune
Work from Office
About the Role: Job Title: Database Engineer - CF, AS. Location: Pune, India. Corporate Title: AS.

Role Description: DWS Technology is a global team of technology specialists, spread across multiple trading hubs and tech centres. We have a strong focus on promoting technical excellence - our engineers work at the forefront of financial services innovation using cutting-edge technologies. Our India location is our most recent addition to our global network of tech centres and is growing strongly. We are committed to building a diverse workforce and to creating excellent opportunities for talented engineers and technologists. Our tech teams and business units use agile ways of working to create #GlobalHausbank solutions from our home market.

DWS Corporate Function Technology: The DWS Corporate Function Technology team covers technology for corporate functions like finance, risk, ALM, AFC, etc. This position is specifically for the SIMS application, a financial data warehouse that provides KPIs for management, quarterly, and annual reporting, among other things. The application consists of an Oracle database and a Java web front-end. As a Database Engineer, you will be responsible for maintaining, enhancing, and optimizing the application in collaboration with the engineering team and business.

What we'll offer you:
- 100% reimbursement under the childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Accident and term life insurance

Your key responsibilities: You will support the application team by maintaining, enhancing, and optimizing an Oracle-based financial data warehouse application (SIMS) as well as a web application with an MSSQL database backend (FBRAE), with a focus on delivering robust solutions to meet business needs. Your responsibilities include:
- Collaborating with business stakeholders to design and implement new features, primarily through database development (PL/SQL, T-SQL)
- Ensuring application stability by analyzing and resolving data-related inquiries from the business, performing performance tuning, and optimizing processes
- Maintaining and enhancing reporting data marts built on a Data Vault architecture
- Supporting the team in migrating the application's front-end to a modern ReactJS/Spring Boot technology stack, leveraging a Microservices-oriented architecture hosted on the Google Cloud Platform

Your skills and experience:
- Master's degree (or equivalent) in Computer Science, Business Information Technology, or a related field
- Demonstrated expertise in Oracle PL/SQL or MS T-SQL development, with significant professional experience working on relational databases - this is a critical requirement for the role
- Strong analytical and problem-solving skills
- Familiarity with an ETL tool (e.g., Informatica) and/or a reporting tool (e.g., Cognos) is desirable
- Experience in one or more of the following areas is advantageous: batch programming, Java/JavaScript programming (including ReactJS), or Microservices architecture
- Fluency in written and spoken English

About us and our teams: Please visit our company website for further information: https://www.db.com/company/company.htm. We at DWS are committed to creating a diverse and inclusive workplace, one that embraces dialogue and diverse views, and treats everyone fairly to drive a high-performance culture. The value we create for our clients and investors is based on our ability to bring together various perspectives from all over the world and from different backgrounds.
It is our experience that teams perform better and deliver improved outcomes when they are able to incorporate a wide range of perspectives. We call this #ConnectingTheDots.
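To ground the PL/SQL side of this role, here is a minimal, hypothetical stored procedure of the data-mart-loading kind the posting describes; the object names and logic are assumptions for illustration and are unrelated to the actual SIMS codebase.

```sql
-- Hypothetical Oracle PL/SQL procedure that reloads one reporting-mart
-- slice idempotently: delete the as-of date, then re-aggregate from staging.
CREATE OR REPLACE PROCEDURE load_kpi_mart (p_as_of_date IN DATE) AS
BEGIN
    DELETE FROM kpi_mart WHERE as_of_date = p_as_of_date;

    INSERT INTO kpi_mart (as_of_date, business_unit, kpi_value)
    SELECT p_as_of_date, s.business_unit, SUM(s.amount)
    FROM   stg_positions s
    WHERE  s.as_of_date = p_as_of_date
    GROUP  BY s.business_unit;

    COMMIT;
EXCEPTION
    WHEN OTHERS THEN
        ROLLBACK;
        RAISE;  -- surface the failure to the calling scheduler
END load_kpi_mart;
/
```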
Posted 1 month ago
3.0 - 8.0 years
9 - 13 Lacs
Bengaluru
Work from Office
Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must have skills: Data Modeling Techniques and Methodologies
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Modeler, you will be responsible for understanding business requirements and data mappings, creating and maintaining data models through different stages using data modeling tools, and handing over the physical design/DDL scripts to the data engineers for implementation of the data models. Your role involves creating and maintaining data models, ensuring performance and quality of deliverables.

Experience:
- Overall IT experience (no. of years): 7+
- Data Modeling experience: 3+

Key Responsibilities:
- Drive discussions with client teams to understand business requirements and develop data models that fit the requirements
- Drive discovery activities and design workshops with the client and support design discussions
- Create Data Modeling deliverables and get sign-off
- Develop the solution blueprint and scoping, and do estimation for the delivery project

Technical Experience:
Must Have Skills:
- 7+ years overall IT experience with 3+ years in Data Modeling
- Data modeling experience in Dimensional Modeling/3-NF modeling/NoSQL DB modeling
- Experience on at least one cloud DB design engagement
- Conversant with modern data platforms
- Work experience on data transformation and analytics projects, with an understanding of DWH
- Instrumental in DB design through all stages of Data Modeling
- Experience in at least one leading data modeling tool, e.g., Erwin, ER/Studio, or equivalent

Good to Have Skills:
- Any of these add-on skills: Data Vault Modeling, Graph Database Modeling, RDF, Document DB Modeling, Ontology, Semantic Data Modeling
- Understanding of the Data Analytics on Cloud landscape and Data Lake design knowledge preferred
- Cloud Data Engineering, Cloud Data Integration
- Familiarity with Data Architecture principles

Professional Experience:
- Strong requirement analysis and technical solutioning skills in Data and Analytics
- Excellent writing, communication, and presentation skills
- Eagerness to learn new skills and develop oneself on an ongoing basis
- Good client-facing and interpersonal skills

Educational Qualification: B.E. or B.Tech. is a must (15 years full time education)
Posted 1 month ago
3.0 - 8.0 years
9 - 13 Lacs
Bengaluru
Work from Office
Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must have skills: Data Modeling Techniques and Methodologies
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while also engaging in discussions to refine and enhance the overall data architecture. You will be involved in various stages of the data platform lifecycle, ensuring that all components work harmoniously to support the organization's data needs and objectives.

Experience:
- Overall IT experience (no. of years): 7+
- Data Modeling experience: 3+
- Data Vault Modeling experience: 2+

Key Responsibilities:
- Drive discussions with client deal teams to understand business requirements and how the data model fits into implementation and solutioning
- Develop the solution blueprint, scoping, and estimation in delivery projects and solutioning
- Drive discovery activities and design workshops with the client, and lead strategic road mapping and operating model design discussions
- Design and develop Data Vault 2.0-compliant models, including Hubs, Links, and Satellites
- Design and develop the Raw Data Vault and Business Data Vault
- Translate business requirements into conceptual, logical, and physical data models
- Work with source system analysts to understand data structures and lineage
- Ensure conformance to data modeling standards and best practices
- Collaborate with ETL/ELT developers to implement data models in a modern data warehouse environment (e.g., Snowflake, Databricks, Redshift, BigQuery)
- Document models, data definitions, and metadata

Technical Experience:
Must Have Skills:
- 7+ years overall IT experience, 3+ years in Data Modeling, and 2+ years in Data Vault Modeling
- Design and development of the Raw Data Vault and Business Data Vault
- Strong understanding of Data Vault 2.0 methodology, including business keys, record tracking, and historical tracking
- Data modeling experience in Dimensional Modeling/3-NF modeling
- Hands-on experience with any data modeling tools (e.g., ER/Studio, ERwin, or similar)
- Solid understanding of ETL/ELT processes, data integration, and warehousing concepts
- Experience with any modern cloud data platform (e.g., Snowflake, Databricks, Azure Synapse, AWS Redshift, or Google BigQuery)
- Excellent SQL skills
Good to Have Skills:
- Any one of these add-on skills: Graph Database Modeling, RDF, Document DB Modeling, Ontology, Semantic Data Modeling
- Hands-on experience in any Data Vault automation tool (e.g., VaultSpeed, WhereScape, biGENIUS-X, dbt, or similar)
- Understanding of the Data Analytics on Cloud landscape and Data Lake design knowledge preferred
- Cloud Data Engineering, Cloud Data Integration

Professional Experience:
- Strong requirement analysis and technical solutioning skills in Data and Analytics
- Excellent writing, communication, and presentation skills
- Eagerness to learn new skills and develop oneself on an ongoing basis
- Good client-facing and interpersonal skills

Educational Qualification: B.E. or B.Tech. is a must (15 years full time education)
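Alongside the hubs and satellites sketched earlier in this feed, the third core Data Vault 2.0 structure this posting names is the link. Below is a hedged, generic-SQL sketch of a link relating a customer hub to an order hub; all names are illustrative.

```sql
-- Hypothetical Data Vault 2.0 link: a unit-of-work table recording the
-- relationship between two hubs. Its hash key is derived from the
-- combination of the connected business keys.
CREATE TABLE lnk_customer_order (
    lnk_customer_order_hk BINARY(16)   NOT NULL, -- hash of both business keys
    hub_customer_hk       BINARY(16)   NOT NULL, -- references hub_customer
    hub_order_hk          BINARY(16)   NOT NULL, -- references hub_order
    load_dts              TIMESTAMP    NOT NULL,
    record_source         VARCHAR(100) NOT NULL,
    CONSTRAINT pk_lnk_customer_order PRIMARY KEY (lnk_customer_order_hk)
);
```

In the Raw Vault, links hold relationships exactly as the sources delivered them; Business Vault objects (computed satellites, bridges, PIT tables) layer derived rules on top.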
Posted 1 month ago
6.0 - 11.0 years
7 - 11 Lacs
Bengaluru
Work from Office
Capco, a Wipro company, is a global technology and management consulting firm. It was awarded Consultancy of the Year in the British Bank Awards and has been ranked among the Top 100 Best Companies for Women in India 2022 by Avtar & Seramount. With our presence across 32 cities across the globe, we support 100+ clients across banking, financial, and energy sectors. We are recognized for our deep transformation execution and delivery.

WHY JOIN CAPCO? You will work on engaging projects with the largest international and local banks, insurance companies, payment service providers, and other key players in the industry - projects that will transform the financial services industry.

MAKE AN IMPACT: Innovative thinking, delivery excellence, and thought leadership to help our clients transform their business. Together with our clients and industry partners, we deliver disruptive work that is changing energy and financial services.

#BEYOURSELFATWORK: Capco has a tolerant, open culture that values diversity, inclusivity, and creativity.

CAREER ADVANCEMENT: With no forced hierarchy at Capco, everyone has the opportunity to grow as we grow, taking their career into their own hands.

DIVERSITY & INCLUSION: We believe that diversity of people and perspective gives us a competitive advantage.

Location: Bangalore

Skills and Qualifications:
- At least 6+ years' relevant experience would generally be expected to find the skills required for this role
- 6+ years as a practitioner in data engineering or a related field
- Strong programming skills in Python, with experience in data manipulation and analysis libraries (e.g., Pandas, NumPy, Dask)
- Proficiency in SQL and experience with relational databases (e.g., Sybase, DB2, Snowflake, PostgreSQL, SQL Server)
- Experience with data warehousing concepts and technologies (e.g., dimensional modeling, star schema, data vault modeling, Kimball methodology, Inmon methodology, data lake design)
- Familiarity with ETL/ELT processes and tools (e.g., Informatica PowerCenter, IBM DataStage, Ab Initio) and open-source frameworks for data transformation (e.g., Apache Spark, Apache Airflow)
- Experience with message queues and streaming platforms (e.g., Kafka, RabbitMQ)
- Experience with version control systems (e.g., Git)
- Experience using Jupyter notebooks for data exploration, analysis, and visualization
- Excellent communication and collaboration skills
- Ability to work independently and as part of a geographically distributed team

Nice to have:
- Understanding of cloud-based application development and DevOps
- Understanding of business intelligence tools - Tableau, Power BI
- Understanding of the trade lifecycle / financial markets

If you are keen to join us, you will be part of an organization that values your contributions, recognizes your potential, and provides ample opportunities for growth. For more information, visit www.capco.com. Follow us on Twitter, Facebook, LinkedIn, and YouTube.
Posted 1 month ago
3.0 - 5.0 years
5 - 8 Lacs
Pune
Work from Office
Educational Requirements: Bachelor of Engineering, Bachelor of Technology
Service Line: Enterprise Package Application Services

Responsibilities: A day in the life of an Infoscion. As part of the Infosys consulting team, your primary role would be to get to the heart of customer issues, diagnose problem areas, design innovative solutions, and facilitate deployment resulting in client delight. You will develop a proposal by owning parts of the proposal document and by giving inputs in solution design based on areas of expertise. You will plan the activities of configuration, configure the product as per the design, conduct conference room pilots, and assist in resolving any queries related to requirements and solution design. You will conduct solution/product demonstrations and POC/Proof of Technology workshops, and prepare effort estimates that suit the customer's budgetary requirements and are in line with the organization's financial guidelines. You will actively lead small projects and contribute to unit-level and organizational initiatives, with the objective of providing high-quality, value-adding solutions to customers. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Additional Responsibilities:
- Ability to develop value-creating strategies and models that enable clients to innovate, drive growth, and increase their business profitability
- Good knowledge of software configuration management systems
- Awareness of the latest technologies and industry trends
- Logical thinking and problem-solving skills, along with an ability to collaborate
- Understanding of the financial processes for various types of projects and the various pricing models available
- Ability to assess current processes, identify improvement areas, and suggest technology solutions
- Knowledge of one or two industry domains
- Client interfacing skills
- Project and team management

Location of posting: Infosys Ltd. is committed to ensuring you have the best experience throughout your journey with us. We currently have open positions in a number of locations across India - Bangalore, Pune, Hyderabad, Chennai, Chandigarh, Mysore, Kolkata, Trivandrum, Indore, Nagpur, Mangalore, Noida, Bhubaneswar, Coimbatore, Jaipur, Hubli, Vizag. While we work in accordance with business requirements, we shall strive to offer you the location of your choice, where possible.

Technical and Professional Requirements: As a Snowflake Data Vault developer, the individual is responsible for designing, implementing, and managing Data Vault 2.0 models on the Snowflake platform. The candidate should have at least one end-to-end Data Vault implementation experience. Below are the detailed skill requirements:
- Designing and building flexible and highly scalable Data Vault 1.0 and 2.0 models
- Suggesting optimization techniques in existing Data Vault models using ghost entries, bridge and PIT tables, reference tables, satellite splits/merges, identification of the correct business key, etc.
- Designing and administering repeating design patterns for quick turnaround
- Engaging and collaborating with customers effectively to understand Data Vault use cases and brief the technical team with technical specifications
- Working knowledge of Snowflake is desirable
- Working knowledge of DBT is desirable

Preferred Skills: Technology - Data on Cloud - DataStore - Snowflake
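Since this posting calls out PIT (point-in-time) tables as an optimization technique, a hedged sketch may help: a PIT table records, per hub key and snapshot date, which satellite row was current, so consumers can use cheap equality joins instead of windowed scans over satellite history. The names below reuse the hypothetical hub/satellite examples from earlier in this feed.

```sql
-- Hypothetical PIT table for a customer hub (Snowflake-style SQL).
CREATE TABLE pit_customer (
    hub_customer_hk  BINARY(16) NOT NULL,
    snapshot_dts     TIMESTAMP  NOT NULL,
    sat_details_ldts TIMESTAMP  NOT NULL,  -- load_dts of the then-current
                                           -- sat_customer_details row
    PRIMARY KEY (hub_customer_hk, snapshot_dts)
);

-- Consumption becomes a simple equality join:
SELECT h.customer_bk, s.customer_segment
FROM   pit_customer p
JOIN   hub_customer h ON h.hub_customer_hk = p.hub_customer_hk
JOIN   sat_customer_details s
       ON  s.hub_customer_hk = p.hub_customer_hk
       AND s.load_dts        = p.sat_details_ldts
WHERE  p.snapshot_dts = TIMESTAMP '2024-01-31 00:00:00';
```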
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
pune, maharashtra
On-site
Choosing Capgemini means choosing a company where you will be empowered to shape your career in the way you'd like, where you'll be supported and inspired by a collaborative community of colleagues around the world, and where you'll be able to reimagine what's possible. Join us and help the world's leading organizations unlock the value of technology and build a more sustainable, more inclusive world.

Good knowledge and expertise in data structures, algorithms, calculus, linear algebra, machine learning, and modeling.
Experience with data warehousing concepts, including star schema, snowflake schema, and data vault, for data marts or data warehousing (a star schema sketch follows this posting).
Experience using data modeling software such as Erwin, ER/Studio, or MySQL Workbench to produce logical and physical data models.
Knowledge of enterprise databases such as DB2, Oracle, PostgreSQL, MySQL, or SQL Server.
Hands-on knowledge of and experience with tools and techniques for analysis, data manipulation, and presentation (e.g., PL/SQL, PySpark, Hive, Impala, and other scripting tools).
Experience with the Software Development Lifecycle using the Agile methodology.
Knowledge of agile methods (SAFe, Scrum, Kanban) and tools (Jira or Confluence).
Expertise in conceptual modeling; ability to see the big picture and envision possible solutions.
Experience working in a challenging, fast-paced environment.
Excellent communication and stakeholder management skills.

You can shape your career with us. We offer a range of career paths and internal opportunities within the Capgemini group. You will also get personalized career guidance from our leaders. You will get comprehensive wellness benefits including health checks, telemedicine, insurance with top-ups, elder care, partner coverage, and new parent support via flexible work. You will have the opportunity to learn on one of the industry's largest digital learning platforms, with access to 250,000+ courses and numerous certifications.

We're committed to ensuring that people of all backgrounds feel encouraged and have a sense of belonging at Capgemini. You are valued for who you are, and you can bring your original self to work. Every Monday, kick off the week with a musical performance by our in-house band - The Rubber Band. Also get to participate in internal sports events, yoga challenges, or marathons.

Capgemini is a global business and technology transformation partner, helping organizations to accelerate their dual transition to a digital and sustainable world, while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong heritage of over 55 years, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fueled by its market-leading capabilities in AI, generative AI, cloud, and data, combined with its deep industry expertise and partner ecosystem.
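As a reference point for the star-schema concept listed above, here is a minimal, hedged sketch in generic SQL. The fact and dimension tables (fact_sales, dim_date, dim_product) are hypothetical illustrations, not part of the role.

```sql
-- Minimal star schema sketch (hypothetical names): one central fact table
-- referencing denormalized dimension tables by surrogate keys.
CREATE TABLE dim_date (
    date_key     INTEGER PRIMARY KEY,   -- e.g. 20240115
    full_date    DATE NOT NULL,
    month_name   VARCHAR(20),
    year_number  INTEGER
);

CREATE TABLE dim_product (
    product_key  INTEGER PRIMARY KEY,   -- surrogate key
    product_name VARCHAR(200),
    category     VARCHAR(100)
);

CREATE TABLE fact_sales (
    date_key     INTEGER NOT NULL REFERENCES dim_date (date_key),
    product_key  INTEGER NOT NULL REFERENCES dim_product (product_key),
    quantity     INTEGER,
    sales_amount DECIMAL(18, 2)
);

-- Typical analytic query: aggregate the fact joined to its dimensions.
SELECT d.year_number, p.category, SUM(f.sales_amount) AS revenue
FROM fact_sales f
JOIN dim_date d    ON f.date_key = d.date_key
JOIN dim_product p ON f.product_key = p.product_key
GROUP BY d.year_number, p.category;
```

The defining design choice is visible here: dimensions are denormalized and the fact table carries only keys and measures, so analytic queries stay one join away from any descriptive attribute.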
Posted 1 month ago
3.0 - 7.0 years
11 - 15 Lacs
Gurugram
Work from Office
Overview
We are seeking an experienced Data Modeller with expertise in designing and implementing data models for modern data platforms. This role requires deep knowledge of data modeling techniques, healthcare data structures, and experience with the Databricks Lakehouse architecture. The ideal candidate will have a proven track record of translating complex business requirements into efficient, scalable data models that support analytics and reporting needs.

About the Role
As a Data Modeller, you will be responsible for designing and implementing data models for our Databricks-based Modern Data Platform. You will work closely with business stakeholders, data architects, and data engineers to create logical and physical data models that support the migration from legacy systems to the Databricks Lakehouse architecture, ensuring data integrity, performance, and compliance with healthcare industry standards.

Key Responsibilities
Design and implement logical and physical data models for Databricks Lakehouse implementations (a minimal Delta Lake sketch follows this posting)
Translate business requirements into efficient, scalable data models
Create and maintain data dictionaries, entity relationship diagrams, and model documentation
Develop dimensional models, data vault models, and other modeling approaches as appropriate
Support the migration of data models from legacy systems to the Databricks platform
Collaborate with data architects to ensure alignment with the overall data architecture
Work with data engineers to implement and optimize data models
Ensure data models comply with healthcare industry regulations and standards
Implement data modeling best practices and standards
Provide guidance on data modeling approaches and techniques
Participate in data governance initiatives and data quality assessments
Stay current with evolving data modeling techniques and industry trends

Qualifications
Extensive experience in data modeling for analytics and reporting systems
Strong knowledge of dimensional modeling, data vault, and other modeling methodologies
Experience with the Databricks platform and Delta Lake architecture
Expertise in healthcare data modeling and industry standards
Experience migrating data models from legacy systems to modern platforms
Strong SQL skills and experience with data definition languages
Understanding of data governance principles and practices
Experience with data modeling tools and technologies
Knowledge of performance optimization techniques for data models
Bachelor's degree in Computer Science, Information Systems, or a related field; advanced degree preferred
Professional certifications in data modeling or related areas

Technical Skills
Data modeling methodologies (dimensional, data vault, etc.)
Databricks platform and Delta Lake
SQL and data definition languages
Data modeling tools (erwin, ER/Studio, etc.)
Data warehousing concepts and principles
ETL/ELT processes and data integration
Performance tuning for data models
Metadata management and data cataloging
Cloud platforms (AWS, Azure, GCP)
Big data technologies and distributed computing

Healthcare Industry Knowledge
Healthcare data structures and relationships
Healthcare terminology and coding systems (ICD, CPT, SNOMED, etc.)
Healthcare data standards (HL7, FHIR, etc.)
Healthcare analytics use cases and requirements
Optionally: healthcare regulatory requirements (HIPAA, HITECH, etc.)
Clinical and operational data modeling challenges
Population health and value-based care data needs

Personal Attributes
Strong analytical and problem-solving skills
Excellent attention to detail and data quality focus
Ability to translate complex business requirements into technical solutions
Effective communication skills with both technical and non-technical stakeholders
Collaborative approach to working with cross-functional teams
Self-motivated with the ability to work independently
Continuous learner who stays current with industry trends

What We Offer
Opportunity to design data models for cutting-edge healthcare analytics
Collaborative and innovative work environment
Competitive compensation package
Professional development opportunities
Work with leading technologies in the data space

This position requires a unique combination of data modeling expertise, technical knowledge, and healthcare industry understanding. The ideal candidate will have demonstrated success in designing efficient, scalable data models and a passion for creating data structures that enable powerful analytics and insights.
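Since the posting centers on dimensional and data vault models landing on Delta Lake, here is a minimal, hedged Databricks SQL sketch of what one such table might look like. The names (dim_patient, fact_encounter, mrn) are hypothetical healthcare-flavored illustrations, not the employer's actual model.

```sql
-- Minimal Delta Lake sketch on Databricks (hypothetical names): an SCD
-- Type 2 patient dimension whose validity window supports point-in-time
-- reporting, plus a companion encounter fact table.
CREATE TABLE IF NOT EXISTS dim_patient (
    patient_key    BIGINT,     -- surrogate key
    mrn            STRING,     -- medical record number (business key)
    birth_date     DATE,
    primary_payer  STRING,
    effective_from TIMESTAMP,  -- SCD Type 2 validity window start
    effective_to   TIMESTAMP,  -- validity window end (open if current)
    is_current     BOOLEAN
) USING DELTA;

CREATE TABLE IF NOT EXISTS fact_encounter (
    encounter_id   STRING,
    patient_key    BIGINT,     -- joins to dim_patient
    admit_date     DATE,
    total_charges  DECIMAL(18, 2)
) USING DELTA
PARTITIONED BY (admit_date);
```

The effective_from/effective_to window is the standard SCD Type 2 device for the point-in-time reporting that healthcare analytics typically requires.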
Posted 1 month ago
3.0 - 6.0 years
5 - 9 Lacs
Pune
Work from Office
Your Role
As a Data Modeler, you will play a critical role in designing and implementing robust data models that support enterprise data warehousing and analytics initiatives. You will:
Apply your strong foundation in data structures, algorithms, calculus, linear algebra, and machine learning to design scalable and efficient data models.
Leverage your expertise in data warehousing concepts such as star schema, snowflake schema, and data vault to architect and optimize data marts and enterprise data warehouses.
Utilize industry-standard data modeling tools like Erwin, ER/Studio, and MySQL Workbench to create and maintain logical and physical data models.
Thrive in a fast-paced, dynamic environment, collaborating with cross-functional teams to deliver high-quality data solutions under tight deadlines.
Demonstrate strong conceptual modeling skills, with the ability to see the big picture and design solutions that align with business goals.
Exhibit excellent communication and stakeholder management skills, effectively translating complex technical concepts into clear, actionable insights for both technical and non-technical audiences.

Your Profile
Good knowledge and expertise in data structures, algorithms, calculus, linear algebra, machine learning, and modeling.
Experience with data warehousing concepts, including star schema, snowflake schema, and data vault, for data marts or data warehousing.
Experience using data modeling software like Erwin, ER/Studio, and MySQL Workbench to produce logical and physical data models.
Expertise in conceptual modeling; ability to see the big picture and envision possible solutions.
Experience working in a challenging, fast-paced environment.
Excellent communication and stakeholder management skills.

What you'll love about working here
You can shape your career with us. We offer a range of career paths and internal opportunities within the Capgemini group. You will also get personalized career guidance from our leaders. You will get comprehensive wellness benefits including health checks, telemedicine, insurance with top-ups, elder care, partner coverage, and new parent support via flexible work. You will have the opportunity to learn on one of the industry's largest digital learning platforms, with access to 250,000+ courses and numerous certifications.
Posted 1 month ago