Get alerts for new jobs matching your selected skills, preferred locations, and experience range. Manage Job Alerts
5.0 - 9.0 years
5 - 10 Lacs
Pune
Work from Office
Your Role Data Modeler Good knowledge and expertise on data structures and algorithms and calculus, linear algebra, machine learning and modeling. Experience in data warehousing concepts including Star schema, snowflake or data vault for data mart or data warehousing Experience using data modeling software like Erwin, ER studio, MySQL Workbench to produce logical and physical data models. Knowledge of enterprise databases such as DB2/Oracle/PostgreSQL/MYSQL/SQL Server. Hands on knowledge and experience with tools and techniques for analysis, data manipulation and presentation (e.g. PL/SQL, PySprak, Hive, Impala and other scripting tools) Experience with Software Development Lifecycle using the Agile methodology. Knowledge of agile methods (SAFe, Scrum, Kanban) and tools (Jira or Confluence) Expertise in conceptual modelling; ability to see the big picture and envision possible solutions Experience in working in a challenging, fast-paced environment Excellent communication & stakeholder management skills. Your Profile Experience in data warehousing concepts including Star schema, snowflake or data vault for data mart or data warehousing Experience using data modeling software like Erwin, ER studio, MySQL Workbench to produce logical and physical data models. Experience in working in a challenging, fast-paced environment Excellent communication & stakeholder management skills. What youll love about working here Choosing Capgemini means having the opportunity to make a difference, whether for the worlds leading businesses or for society. It means getting the support you need to shape your career in the way that works for you. It means when the future doesnt look as bright as youd like, you have the opportunity to make change: to rewrite it. When you join Capgemini, you dont just start a new job. You become part of something bigger. 
A diverse collective of free-thinkers, entrepreneurs and experts, all working together to unleash human energy through technology, for an inclusive and sustainable future. At Capgemini, people are at the heart of everything we do! You can exponentially grow your career by being part of innovative projects and taking advantage of our extensive Learning & Development programs. With us, you will experience an inclusive, safe, healthy, and flexible work environment to bring out the best in you! You also get a chance to make positive social change and build a better world by taking an active role in our Corporate Social Responsibility and Sustainability initiatives. And whilst you make a difference, you will also have a lot of fun. About Capgemini
Posted 1 month ago
4.0 - 9.0 years
10 - 20 Lacs
Pune
Work from Office
Job Title: Data Modelling Manager
Required Experience: 4-7 years
Job Location: Pune (hybrid: 3 days office, 2 days home)
Schedule: 1 PM - 10 PM, Mon-Fri
Note: Looking for immediate joiners or candidates who can join within 20 days.

About Us:
Insights is a mission-driven, start-up technology company focused on transforming the healthcare payer industry, ultimately creating a more personalized patient experience, improving health outcomes, and lowering the overall cost of healthcare. Insights provides a flexible, efficient, and secure platform that organizes and exchanges healthcare data from various sources and formats, allowing our customers to uncover differentiated insights that address their clients' needs. Our employees know that they play an active role in keeping our customers' data safe and are responsible for ensuring that our comprehensive policies and practices are met. With our deep expertise in cloud-enabled technologies and knowledge of the healthcare industry, we have built an innovative data integration and management platform that gives healthcare payers access to data that has historically been siloed and inaccessible. Through our platform, these health insurance payers can ingest and manage all the data they need to transform their business by supporting their analytical, operational, and financial needs. Since our founding in 2017, we have built a highly successful SaaS business, raising more than $81 million from leading VC firms with deep expertise in the healthcare and technology industries. We are solving problems of massive scale and complexity in an industry that is ripe for disruption. We're growing quickly and would love for you to be a part of it!

About the Role:
As our lead healthcare data expert, you will guide product managers, engineers, and operations staff on all aspects of healthcare data.
Your deep experience designing, implementing, and documenting data modeling solutions will enable the team to nimbly build comprehensive products that optimally structure and organize data to meet our customers' analytical and operational use cases. Your ability to identify key business needs, define and govern data modeling design standards and best practices, and champion the usage of data with our customers will be key to the success of our highly differentiated, automated, and scalable data distribution capability (API Catalog, Data Marts, etc.), which serves diverse analytical and operational data consumption needs while pushing the envelope on a cloud-first approach to healthcare data integration. All of this, coupled with your passion for improving outcomes for all healthcare stakeholders, will drive our data strategy. This is a great opportunity to blaze your own trail in a start-up organization, working in a cross-functional, talented team to move the needle on some of the most pressing challenges in healthcare.

You should expect to:
- Identify and define data requirements for the data distribution/consumption use cases required by enterprise healthcare customers, including:
  - Knowledge of data modeling, data integrity principles, and SQL.
  - Attributes, metrics, and lookup/reference data required for Data Marts (e.g. Risk Adjustment), and be able to explain to engineering:
    - Data orientation/organization of the data required to build an ETL pipeline.
    - Core vs. ancillary attributes/metrics in our lossless canonical models.
    - Expected cardinality vs. outlier ranges for attributes/metrics.
  - Attributes and calculations required to define healthcare industry metrics/KPIs/KQIs.
  - Typical queries users would ask of the data: facts, dimensions, JOINs.
  - Data delivery patterns using complex events or business rules.
  - Data enrichment (standardization, transformations, algorithms, grouping).
  - Guaranteed data quality criteria required to meet specific SLAs, including data federation with business-specific, governed data sources within an enterprise.
- Work with Engineering to manage, groom, and prioritize the product feature backlog, including creating epics and acceptance criteria.
- Help define data distribution implementation practices.
- Partner with health plan service providers to define and drive adoption of data distribution APIs/SDKs.
- Work with Client Management on product support during client engagements.

Terrific if you have experience:
- In a Product Role with:
  - Analyzing, visioning, and road-mapping.
  - Influencing and evangelizing key stakeholders (customers, partners, thought leaders, senior management).
  - Agile: story writing, grooming, prioritizing, planning, showcasing, and delivering MVPs.
- In a Technical Role with:
  - Healthcare IT.
  - Data analysis, modeling, and administration in cloud technologies.
  - APIs/SDKs, business rules processing, data federation, and data distribution technologies.
  - Working with Engineering teams in highly technical environments.
- Bachelor's degree in computer science, information systems, analytics, or a related field, and/or equivalent experience.
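The "facts, dimensions, JOINs" query pattern named above can be sketched with Python's built-in sqlite3 module. This is an illustration only, not part of the posting: the schema, table names, and values are invented.

```python
import sqlite3

# A minimal star schema: one fact table joined to two dimension tables.
# All names and values are invented for this sketch.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, full_date TEXT);
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE fact_sales (
        date_key    INTEGER REFERENCES dim_date(date_key),
        product_key INTEGER REFERENCES dim_product(product_key),
        amount      REAL
    );
    INSERT INTO dim_date    VALUES (20240101, '2024-01-01');
    INSERT INTO dim_product VALUES (1, 'Widget');
    INSERT INTO fact_sales  VALUES (20240101, 1, 10.0), (20240101, 1, 5.0);
""")

# The classic dimensional query shape: JOIN the fact to a dimension, aggregate.
row = conn.execute("""
    SELECT p.name, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_product p ON p.product_key = f.product_key
    GROUP BY p.name
""").fetchone()
print(row)  # ('Widget', 15.0)
```

The fact table holds narrow numeric measures keyed to the dimensions; typical user questions become a JOIN from the fact to one or more dimensions plus an aggregate.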
Posted 1 month ago
8.0 - 12.0 years
20 - 30 Lacs
Hyderabad, Pune
Work from Office
Dear Candidate,

One of our leading IT MNC clients is looking for a Data Analyst & Data Modeler on an immediate basis.

Role: Senior Data Expert (Data Analyst & Data Modeler)
Experience: 8-12 years
Location: Hyderabad / Pune
Mode of Work: WFO
Skills: Data Modeling, SQL, MS Office tools, GCP BigQuery, Erwin, Visual Paradigm (preferable).

In this role, you will:
- Support the delivery of complex transformation program development within the data portfolio.
- Work within agile multi-skilled teams to create world-class products that serve our customers' needs.
- Perform elicitation and analysis of business change, functional, and non-functional requirements across a range of stakeholders; work with cross-asset IT teams to interface those requirements; and finally deliver a working reconciliation solution.
- Own and produce the analysis and modeling artefacts that enable development teams to build working products.
- Understand the "user" journey end to end, which goes beyond the "system".
- Provide advanced business knowledge and technical support for requirements development.
- Create and enhance logical and physical data models, adhering to the agreed standards, to fulfil both business and technical requirements and expectations.
- Create and maintain the data models.
- Undertake metadata analysis, including but not limited to naming of logical entities and attributes and physical tables and columns, definitions, and appropriate data types and lengths.

To be successful in this role, you should meet the following requirements:
- Strong data analysis and/or data modeling experience of 8-12 years.
- Strong financial domain and data analysis skills, covering activities such as requirement gathering, elicitation, gap analysis, data analysis, effort estimation, and reviews, plus the ability to translate high-level functional data or business requirements into a technical solution, database design, and data mapping.
- An individual contributor with a good understanding of the SDLC & Agile methodologies.
- A team player with a self-starter approach and a sense of ownership; a problem solver with a solution-seeking approach and the ability to work in a fast-paced, continuously changing environment.
- Excellent communication & stakeholder management skills; capable of building rapport and relationships.
- Act as a liaison between business and technical teams to bridge any gaps and assist both to successfully deliver projects.

Other Skills & Tools:
- SQL, MS Office tools, GCP BigQuery, Erwin, Visual Paradigm (preferable).
- Comprehensive understanding of data modeling (conceptual, logical, and physical); create and deliver high-quality data models, following agreed data governance and standards.
- Maintain quality metadata and data-related artefacts that are accurate, complete, consistent, unambiguous, reliable, accessible, traceable, and valid.

If interested, send CVs to anji.g@kanarystaffing.com
Posted 1 month ago
4.0 - 9.0 years
20 - 35 Lacs
Hyderabad, Gurugram
Hybrid
Role & responsibilities

Job Summary
We are looking for experienced Data Modelers to support large-scale data engineering and analytics initiatives. The role involves developing logical and physical data models, working closely with business and engineering teams to define data requirements, and ensuring alignment with enterprise standards.
• Independently complete conceptual, logical, and physical data models for any supported platform, including SQL Data Warehouse, Spark, Databricks Delta Lakehouse, or other cloud data warehousing technologies.
• Govern data design/modelling documentation of metadata (business definitions of entities and attributes) and construction of database objects, for baseline and investment-funded projects, as assigned.
• Develop a deep understanding of business domains such as Customer, Sales, Finance, and Supplier, and of the enterprise technology inventory, to craft a solution roadmap that achieves business objectives and maximizes reuse.
• Drive collaborative reviews of data model design, code, data, and security features to drive data product development.
• Show expertise for data at all levels: low-latency, relational, and unstructured data stores; analytical stores and data lakes; the SAP data model.
• Develop reusable data models based on cloud-centric, code-first approaches to data management and data mapping.
• Partner with the data stewards team to enable data discovery and action by business customers and stakeholders.
• Provide and/or support data analysis, requirements gathering, solution development, and design reviews for enhancements to, or new, applications/reporting.
• Assist with data planning, sourcing, collection, profiling, and transformation.
• Support data lineage and mapping of source system data to canonical data stores.
• Create Source to Target Mappings (STTM) for ETL and BI developers.
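A Source to Target Mapping like the STTM deliverable named in the last bullet can be sketched as a simple table-driven transform. The field names and rules below are invented for illustration and are not taken from the posting.

```python
# Hypothetical STTM: (source_field, target_field, transform) rows, the way a
# modeler might hand column-level mapping rules to ETL developers.
sttm = [
    ("cust_nm", "customer_name", str.strip),
    ("ord_amt", "order_amount",  float),
    ("ord_dt",  "order_date",    lambda v: v.replace("/", "-")),
]

def apply_sttm(source_row: dict) -> dict:
    """Apply each mapping rule to build the target-side record."""
    return {tgt: fn(source_row[src]) for src, tgt, fn in sttm}

raw = {"cust_nm": "  Acme Corp ", "ord_amt": "42.50", "ord_dt": "2024/01/31"}
print(apply_sttm(raw))
# {'customer_name': 'Acme Corp', 'order_amount': 42.5, 'order_date': '2024-01-31'}
```

In practice the same mapping table doubles as documentation for BI developers, since each row records the source column, target column, and transformation rule in one place.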
Preferred candidate profile
• Expertise in data modelling tools (ER/Studio, Erwin, IDM/ARDM models) and in CPG / Manufacturing / Sales / Finance / Supplier / Customer domains.
• Experience with at least one MPP database technology such as Databricks Lakehouse, Redshift, Synapse, Teradata, or Snowflake.
• Experience with version control systems like GitHub and with deployment & CI tools.
• Experience with metadata management, data lineage, and data glossaries is a plus.
• Working knowledge of agile development, including DevOps and DataOps concepts.
• Working knowledge of SAP data models, particularly in the context of HANA and S/4HANA, and of retail data like IRI and Nielsen Retail.
Posted 1 month ago
16.0 - 22.0 years
40 - 55 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
Role & responsibilities
• Minimum 15 years of experience.
• Deep understanding of data architecture principles, data modelling, data integration, data governance, and data management technologies.
• Experience with data strategies and developing logical and physical data models on RDBMS, NoSQL, and cloud-native databases.
• Decent experience in one or more RDBMS systems (such as Oracle, DB2, SQL Server).
• Good understanding of relational, dimensional, and Data Vault modelling.
• Experience implementing 2 or more data models in a database with data security and access controls.
• Good experience in OLTP and OLAP systems.
• Excellent data analysis skills with demonstrable knowledge of standard datasets and sources.
• Good experience with one or more cloud DWs (e.g. Snowflake, Redshift, Synapse).
• Experience on one or more cloud platforms (e.g. AWS, Azure, GCP).
• Understanding of DevOps processes.
• Hands-on experience in one or more data modelling tools.
• Good understanding of one or more ETL tools and data ingestion frameworks.
• Understanding of data quality and data governance.
• Good understanding of NoSQL databases and modeling techniques.
• Good understanding of one or more business domains.
• Understanding of the Big Data ecosystem.
• Understanding of industry data models.
• Hands-on experience in Python.
• Experience leading large and complex teams.
• Good understanding of agile methodology.
• Extensive expertise in leading data transformation initiatives, driving cultural change, and promoting a data-driven mindset across the organization.
• Excellent communication skills.
• Understand business requirements and translate them into conceptual, logical, and physical data models.
• Work as a principal advisor on data architecture across various data requirements: aggregation, data lake data models, data warehouse, etc.
• Lead cross-functional teams, define data strategies, and leverage the latest technologies in data handling.
• Define and govern data architecture principles, standards, and best practices to ensure consistency, scalability, and security of data assets across projects.
• Suggest the best modelling approach to the client based on their requirements and target architecture.
• Analyze and understand the datasets and guide the team in creating Source to Target Mappings and data dictionaries, capturing all relevant details.
• Profile the datasets to generate relevant insights.
• Optimize the data models and work with the data engineers to define the ingestion logic, ingestion frequency, and data consumption patterns.
• Establish data governance practices, including data quality, metadata management, and data lineage, to ensure data accuracy, reliability, and compliance.
• Drive automation in modeling activities.
• Collaborate with business stakeholders, data owners, business analysts, and architects to design and develop a next-generation data platform.
• Closely monitor project progress and provide regular updates to leadership on milestones, impediments, etc.
• Guide and mentor team members, and review artifacts.
• Contribute to the overall data strategy and roadmaps.
• Propose and execute technical assessments and proofs of concept to promote innovation in the data space.
Posted 1 month ago
9.0 - 14.0 years
22 - 35 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
Job Description
- Design and develop conceptual, logical, and physical data models for enterprise and application-level databases.
- Translate business requirements into well-structured data models that support analytics, reporting, and operational systems.
- Define and maintain data standards, naming conventions, and metadata for consistency across systems.
- Collaborate with data architects, engineers, and analysts to implement models into databases and data warehouses/lakes.
- Analyze existing data systems and provide recommendations for optimization, refactoring, and improvements.
- Create entity relationship diagrams (ERDs) and data flow diagrams to document data structures and relationships.
- Support data governance initiatives including data lineage, quality, and cataloging.
- Review and validate data models with business and technical stakeholders.
- Provide guidance on normalization, denormalization, and performance tuning of database designs.
- Ensure models comply with organizational data policies, security, and regulatory requirements.

Confirmation Questions:
1. This is a permanent employment opportunity with Pyramid IT. Are you fine with this? (Y/N)
3. This position may require working in shifts. Are you fine with this? (Y/N)
4. Current CTC:
5. Expected CTC:
6. Total Experience | Relevant Experience:
7. Official Notice Period:
8. If selected, how soon can you join (in days)?
9. Do you have a Notice Period Buyout option? (Y/N)
10. Preferred day & time for a telephonic/F2F interview on weekdays:
11. Are you currently holding any other offer? (Y/N)
12. Reason for job change:
13. PAN Card Number:
14. Aadhar Number:
15. Alternate Contact Number:

Your prompt response along with your updated profile will be highly appreciated. Looking forward to your confirmation.

Thanks & Regards
Tagore Laya
Sr. Executive - Resourcing
M: 7842100374
E: Tagore.Laya@pyramidconsultinginc.com
www.pyramidci.com
Posted 1 month ago
4.0 - 9.0 years
6 - 11 Lacs
Kochi
Work from Office
As a Data Engineer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include:
- Implementing and validating predictive models, as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques.
- Designing and implementing enterprise search applications such as Elasticsearch and Splunk for client requirements.
- Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours.
- Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- 4+ years of experience in data modelling and data architecture.
- Proficiency in data modelling tools (Erwin, IBM InfoSphere Data Architect) and database management systems.
- Familiarity with different data models: relational, dimensional, and NoSQL databases.
- Understanding of business processes and how data supports business decision making.
- Strong understanding of database design principles, data warehousing concepts, and data governance practices.

Preferred technical and professional experience:
- Excellent analytical and problem-solving skills with keen attention to detail.
- Ability to work collaboratively in a team environment and manage multiple projects simultaneously.
- Knowledge of programming languages such as SQL.
Posted 1 month ago
7.0 - 10.0 years
8 - 14 Lacs
Kolkata
Work from Office
Job Title: Data Engineer / Data Modeler
Location: Remote (India)
Employment Type: Contract (Remote)
Experience Required: 7+ Years

Job Summary:
We are looking for a highly skilled Data Engineer / Data Modeler with strong experience in Snowflake, DBT, and GCP to support our data infrastructure and modeling initiatives. The ideal candidate should possess excellent SQL skills, hands-on experience with Erwin Data Modeler, and a strong background in modern data architectures and data modeling techniques.

Key Responsibilities:
- Design and implement scalable data models using Snowflake and Erwin Data Modeler.
- Create, maintain, and enhance data pipelines using DBT and GCP (BigQuery, Cloud Storage, Dataflow).
- Perform reverse engineering on existing systems (e.g., Sailfish/DDMS) using DBeaver or similar tools to understand and rebuild data models.
- Develop efficient SQL queries and stored procedures for data transformation, quality, and validation.
- Collaborate with business analysts and stakeholders to gather data requirements and convert them into physical and logical models.
- Ensure performance tuning, security, and optimization of the Snowflake data warehouse.
- Document metadata, data lineage, and business logic behind data structures and flows.
- Participate in code reviews, enforce coding standards, and provide best practices for data modeling and governance.

Must-Have Skills:
- Snowflake architecture, schema design, and data warehouse experience.
- DBT (Data Build Tool) for data transformation and pipeline development.
- Strong expertise in SQL (query optimization, complex joins, window functions, etc.).
- Hands-on experience with Erwin Data Modeler (logical and physical modeling).
- Experience with GCP (BigQuery, Cloud Composer, Cloud Storage).
- Experience in reverse engineering legacy systems like Sailfish or DDMS using DBeaver.

Good To Have:
- Experience with CI/CD tools and DevOps for data environments.
- Familiarity with data governance, security, and privacy practices.
- Exposure to Agile methodologies and working in distributed teams.
- Knowledge of Python for data engineering tasks and orchestration scripts.

Soft Skills:
- Excellent problem-solving and analytical skills.
- Strong communication and stakeholder management.
- Self-driven with the ability to work independently in a remote setup.
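The "window functions" item under Must-Have Skills can be illustrated with Python's bundled sqlite3 module (SQLite 3.25 and later supports window functions). The table and rows below are invented for the sketch.

```python
import sqlite3

# Invented sample data: a running total per customer, the kind of query the
# "window functions" skill refers to. Requires SQLite >= 3.25 (Python 3.8+).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (customer TEXT, order_day INTEGER, amount REAL);
    INSERT INTO orders VALUES ('A', 1, 10.0), ('A', 2, 5.0), ('B', 1, 7.0);
""")

# SUM() OVER a partition gives a cumulative total without collapsing rows,
# unlike GROUP BY.
rows = conn.execute("""
    SELECT customer, order_day,
           SUM(amount) OVER (PARTITION BY customer ORDER BY order_day)
    FROM orders
    ORDER BY customer, order_day
""").fetchall()
print(rows)  # [('A', 1, 10.0), ('A', 2, 15.0), ('B', 1, 7.0)]
```

The same pattern transfers directly to Snowflake or BigQuery SQL, where window functions are a staple of analytical queries.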
Posted 1 month ago
7.0 - 10.0 years
8 - 14 Lacs
Noida
Work from Office
Job Title: Data Engineer / Data Modeler
Location: Remote (India)
Employment Type: Contract (Remote)
Experience Required: 7+ Years

Job Summary:
We are looking for a highly skilled Data Engineer / Data Modeler with strong experience in Snowflake, DBT, and GCP to support our data infrastructure and modeling initiatives. The ideal candidate should possess excellent SQL skills, hands-on experience with Erwin Data Modeler, and a strong background in modern data architectures and data modeling techniques.

Key Responsibilities:
- Design and implement scalable data models using Snowflake and Erwin Data Modeler.
- Create, maintain, and enhance data pipelines using DBT and GCP (BigQuery, Cloud Storage, Dataflow).
- Perform reverse engineering on existing systems (e.g., Sailfish/DDMS) using DBeaver or similar tools to understand and rebuild data models.
- Develop efficient SQL queries and stored procedures for data transformation, quality, and validation.
- Collaborate with business analysts and stakeholders to gather data requirements and convert them into physical and logical models.
- Ensure performance tuning, security, and optimization of the Snowflake data warehouse.
- Document metadata, data lineage, and business logic behind data structures and flows.
- Participate in code reviews, enforce coding standards, and provide best practices for data modeling and governance.

Must-Have Skills:
- Snowflake architecture, schema design, and data warehouse experience.
- DBT (Data Build Tool) for data transformation and pipeline development.
- Strong expertise in SQL (query optimization, complex joins, window functions, etc.).
- Hands-on experience with Erwin Data Modeler (logical and physical modeling).
- Experience with GCP (BigQuery, Cloud Composer, Cloud Storage).
- Experience in reverse engineering legacy systems like Sailfish or DDMS using DBeaver.

Good To Have:
- Experience with CI/CD tools and DevOps for data environments.
- Familiarity with data governance, security, and privacy practices.
- Exposure to Agile methodologies and working in distributed teams.
- Knowledge of Python for data engineering tasks and orchestration scripts.

Soft Skills:
- Excellent problem-solving and analytical skills.
- Strong communication and stakeholder management.
- Self-driven with the ability to work independently in a remote setup.
Posted 1 month ago
7.0 - 10.0 years
8 - 14 Lacs
Surat
Work from Office
Job Title: Data Engineer / Data Modeler
Location: Remote (India)
Employment Type: Contract (Remote)
Experience Required: 7+ Years

Job Summary:
We are looking for a highly skilled Data Engineer / Data Modeler with strong experience in Snowflake, DBT, and GCP to support our data infrastructure and modeling initiatives. The ideal candidate should possess excellent SQL skills, hands-on experience with Erwin Data Modeler, and a strong background in modern data architectures and data modeling techniques.

Key Responsibilities:
- Design and implement scalable data models using Snowflake and Erwin Data Modeler.
- Create, maintain, and enhance data pipelines using DBT and GCP (BigQuery, Cloud Storage, Dataflow).
- Perform reverse engineering on existing systems (e.g., Sailfish/DDMS) using DBeaver or similar tools to understand and rebuild data models.
- Develop efficient SQL queries and stored procedures for data transformation, quality, and validation.
- Collaborate with business analysts and stakeholders to gather data requirements and convert them into physical and logical models.
- Ensure performance tuning, security, and optimization of the Snowflake data warehouse.
- Document metadata, data lineage, and business logic behind data structures and flows.
- Participate in code reviews, enforce coding standards, and provide best practices for data modeling and governance.

Must-Have Skills:
- Snowflake architecture, schema design, and data warehouse experience.
- DBT (Data Build Tool) for data transformation and pipeline development.
- Strong expertise in SQL (query optimization, complex joins, window functions, etc.).
- Hands-on experience with Erwin Data Modeler (logical and physical modeling).
- Experience with GCP (BigQuery, Cloud Composer, Cloud Storage).
- Experience in reverse engineering legacy systems like Sailfish or DDMS using DBeaver.

Good To Have:
- Experience with CI/CD tools and DevOps for data environments.
- Familiarity with data governance, security, and privacy practices.
- Exposure to Agile methodologies and working in distributed teams.
- Knowledge of Python for data engineering tasks and orchestration scripts.

Soft Skills:
- Excellent problem-solving and analytical skills.
- Strong communication and stakeholder management.
- Self-driven with the ability to work independently in a remote setup.
Posted 1 month ago
4.0 - 9.0 years
20 - 30 Lacs
Pune, Bengaluru
Hybrid
Job role & responsibilities:
- Understand operational needs by collaborating with specialized teams.
- Support key business operations: supporting architecture design and improvements, ensuring data integrity, building data models, and designing and implementing agile, scalable, and cost-efficient solutions.
- Lead a team of developers; run sprint planning and execution to ensure timely deliveries.

Technical skills, qualification, and experience required:
- Proficient in data modelling, with 4-10 years of experience.
- Experience with data modeling tools (Erwin); building ER diagrams.
- Hands-on experience with the Erwin / Visio tools.
- Hands-on expertise in entity-relationship, dimensional, and NoSQL modelling.
- Familiarity with manipulating datasets using Python.
- Exposure to Azure cloud services (Azure Data Factory, Azure DevOps, and Databricks).
- Exposure to UML tools like Erwin/Visio.
- Familiarity with tools such as Azure DevOps, Jira, and GitHub.
- Analytical approaches using IE or other common notations.
- Strong hands-on experience in SQL scripting.
- Bachelor's/Master's degree in Computer Science or a related field.
- Experience leading agile scrum, sprint planning, and review sessions.
- Good communication and interpersonal skills; able to coordinate between business stakeholders & engineers.
- Strong results-orientation and time management.
- A true team player, comfortable working in a global team.
- Ability to establish relationships with stakeholders quickly in order to collaborate on use cases.
- Autonomy, curiosity, and innovation capability.
- Comfortable working in a multidisciplinary team within a fast-paced environment.

* Immediate joiners will be preferred.
Posted 1 month ago
7.0 - 10.0 years
8 - 14 Lacs
Kanpur
Remote
Employment Type : Contract (Remote). Job Summary : We are looking for a highly skilled Data Engineer / Data Modeler with strong experience in Snowflake, DBT, and GCP to support our data infrastructure and modeling initiatives. The ideal candidate should possess excellent SQL skills, hands-on experience with Erwin Data Modeler, and a strong background in modern data architectures and data modeling techniques. Key Responsibilities : - Design and implement scalable data models using Snowflake and Erwin Data Modeler. - Create, maintain, and enhance data pipelines using DBT and GCP (BigQuery, Cloud Storage, Dataflow). - Perform reverse engineering on existing systems (e.g., Sailfish/DDMS) using DBeaver or similar tools to understand and rebuild data models. - Develop efficient SQL queries and stored procedures for data transformation, quality, and validation. - Collaborate with business analysts and stakeholders to gather data requirements and convert them into physical and logical models. - Ensure performance tuning, security, and optimization of the Snowflake data warehouse. - Document metadata, data lineage, and business logic behind data structures and flows. - Participate in code reviews, enforce coding standards, and provide best practices for data modeling and governance. Must-Have Skills : - Snowflake architecture, schema design, and data warehouse experience. - DBT (Data Build Tool) for data transformation and pipeline development. - Strong expertise in SQL (query optimization, complex joins, window functions, etc.) - Hands-on experience with Erwin Data Modeler (logical and physical modeling). - Experience with GCP (BigQuery, Cloud Composer, Cloud Storage). - Experience in reverse engineering legacy systems like Sailfish or DDMS using DBeaver. Good To Have : - Experience with CI/CD tools and DevOps for data environments. - Familiarity with data governance, security, and privacy practices. - Exposure to Agile methodologies and working in distributed teams. 
- Knowledge of Python for data engineering tasks and orchestration scripts. Soft Skills : - Excellent problem-solving and analytical skills. - Strong communication and stakeholder management. - Self-driven with the ability to work independently in a remote setup.
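The Must-Have Skills above call out SQL window functions alongside query optimization and complex joins. As a minimal, hypothetical illustration (the `orders` table and its data are invented here, not taken from the posting), a per-customer running total can be computed with `SUM(...) OVER`, shown via Python's built-in SQLite driver:

```python
# Hypothetical illustration of a SQL window function: a running total
# per customer, computed against an in-memory SQLite database.
# Table name, columns, and data are invented for this example.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (customer TEXT, order_day INTEGER, amount REAL);
    INSERT INTO orders VALUES
        ('alice', 1, 10.0), ('alice', 2, 5.0), ('bob', 1, 7.0);
""")

rows = conn.execute("""
    SELECT customer, order_day, amount,
           SUM(amount) OVER (PARTITION BY customer
                             ORDER BY order_day) AS running_total
    FROM orders
    ORDER BY customer, order_day
""").fetchall()

for row in rows:
    print(row)
# alice's running total grows 10.0 -> 15.0; bob's single order stays 7.0
```

The same `PARTITION BY ... ORDER BY` pattern carries over to Snowflake and BigQuery; SQLite merely makes the sketch self-contained (window functions require SQLite 3.25+, bundled with modern Python).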
Posted 1 month ago
7.0 - 10.0 years
8 - 14 Lacs
Hyderabad
Remote
Employment Type : Contract (Remote). Job Summary : We are looking for a highly skilled Data Engineer / Data Modeler with strong experience in Snowflake, DBT, and GCP to support our data infrastructure and modeling initiatives. The ideal candidate should possess excellent SQL skills, hands-on experience with Erwin Data Modeler, and a strong background in modern data architectures and data modeling techniques. Key Responsibilities : - Design and implement scalable data models using Snowflake and Erwin Data Modeler. - Create, maintain, and enhance data pipelines using DBT and GCP (BigQuery, Cloud Storage, Dataflow). - Perform reverse engineering on existing systems (e.g., Sailfish/DDMS) using DBeaver or similar tools to understand and rebuild data models. - Develop efficient SQL queries and stored procedures for data transformation, quality, and validation. - Collaborate with business analysts and stakeholders to gather data requirements and convert them into physical and logical models. - Ensure performance tuning, security, and optimization of the Snowflake data warehouse. - Document metadata, data lineage, and business logic behind data structures and flows. - Participate in code reviews, enforce coding standards, and provide best practices for data modeling and governance. Must-Have Skills : - Snowflake architecture, schema design, and data warehouse experience. - DBT (Data Build Tool) for data transformation and pipeline development. - Strong expertise in SQL (query optimization, complex joins, window functions, etc.) - Hands-on experience with Erwin Data Modeler (logical and physical modeling). - Experience with GCP (BigQuery, Cloud Composer, Cloud Storage). - Experience in reverse engineering legacy systems like Sailfish or DDMS using DBeaver. Good To Have : - Experience with CI/CD tools and DevOps for data environments. - Familiarity with data governance, security, and privacy practices. - Exposure to Agile methodologies and working in distributed teams. 
- Knowledge of Python for data engineering tasks and orchestration scripts. Soft Skills : - Excellent problem-solving and analytical skills. - Strong communication and stakeholder management. - Self-driven with the ability to work independently in a remote setup.
Posted 1 month ago
7.0 - 10.0 years
8 - 14 Lacs
Bengaluru
Work from Office
Job Title : Data Engineer / Data Modeler. Location : Remote (India). Employment Type : Contract (Remote). Experience Required : 7+ Years. Job Summary : We are looking for a highly skilled Data Engineer / Data Modeler with strong experience in Snowflake, DBT, and GCP to support our data infrastructure and modeling initiatives. The ideal candidate should possess excellent SQL skills, hands-on experience with Erwin Data Modeler, and a strong background in modern data architectures and data modeling techniques. Key Responsibilities : - Design and implement scalable data models using Snowflake and Erwin Data Modeler. - Create, maintain, and enhance data pipelines using DBT and GCP (BigQuery, Cloud Storage, Dataflow). - Perform reverse engineering on existing systems (e.g., Sailfish/DDMS) using DBeaver or similar tools to understand and rebuild data models. - Develop efficient SQL queries and stored procedures for data transformation, quality, and validation. - Collaborate with business analysts and stakeholders to gather data requirements and convert them into physical and logical models. - Ensure performance tuning, security, and optimization of the Snowflake data warehouse. - Document metadata, data lineage, and business logic behind data structures and flows. - Participate in code reviews, enforce coding standards, and provide best practices for data modeling and governance. Must-Have Skills : - Snowflake architecture, schema design, and data warehouse experience. - DBT (Data Build Tool) for data transformation and pipeline development. - Strong expertise in SQL (query optimization, complex joins, window functions, etc.) - Hands-on experience with Erwin Data Modeler (logical and physical modeling). - Experience with GCP (BigQuery, Cloud Composer, Cloud Storage). - Experience in reverse engineering legacy systems like Sailfish or DDMS using DBeaver. Good To Have : - Experience with CI/CD tools and DevOps for data environments. 
- Familiarity with data governance, security, and privacy practices. - Exposure to Agile methodologies and working in distributed teams. - Knowledge of Python for data engineering tasks and orchestration scripts. Soft Skills : - Excellent problem-solving and analytical skills. - Strong communication and stakeholder management. - Self-driven with the ability to work independently in a remote setup.
Posted 1 month ago
7.0 - 10.0 years
8 - 14 Lacs
Jaipur
Work from Office
Job Title : Data Engineer / Data Modeler. Location : Remote (India). Employment Type : Contract (Remote). Experience Required : 7+ Years. Job Summary : We are looking for a highly skilled Data Engineer / Data Modeler with strong experience in Snowflake, DBT, and GCP to support our data infrastructure and modeling initiatives. The ideal candidate should possess excellent SQL skills, hands-on experience with Erwin Data Modeler, and a strong background in modern data architectures and data modeling techniques. Key Responsibilities : - Design and implement scalable data models using Snowflake and Erwin Data Modeler. - Create, maintain, and enhance data pipelines using DBT and GCP (BigQuery, Cloud Storage, Dataflow). - Perform reverse engineering on existing systems (e.g., Sailfish/DDMS) using DBeaver or similar tools to understand and rebuild data models. - Develop efficient SQL queries and stored procedures for data transformation, quality, and validation. - Collaborate with business analysts and stakeholders to gather data requirements and convert them into physical and logical models. - Ensure performance tuning, security, and optimization of the Snowflake data warehouse. - Document metadata, data lineage, and business logic behind data structures and flows. - Participate in code reviews, enforce coding standards, and provide best practices for data modeling and governance. Must-Have Skills : - Snowflake architecture, schema design, and data warehouse experience. - DBT (Data Build Tool) for data transformation and pipeline development. - Strong expertise in SQL (query optimization, complex joins, window functions, etc.) - Hands-on experience with Erwin Data Modeler (logical and physical modeling). - Experience with GCP (BigQuery, Cloud Composer, Cloud Storage). - Experience in reverse engineering legacy systems like Sailfish or DDMS using DBeaver. Good To Have : - Experience with CI/CD tools and DevOps for data environments. 
- Familiarity with data governance, security, and privacy practices. - Exposure to Agile methodologies and working in distributed teams. - Knowledge of Python for data engineering tasks and orchestration scripts. Soft Skills : - Excellent problem-solving and analytical skills. - Strong communication and stakeholder management. - Self-driven with the ability to work independently in a remote setup.
Posted 1 month ago
7.0 - 10.0 years
8 - 14 Lacs
Chennai
Remote
Employment Type : Contract (Remote). Experience Required : 7+ Years. Job Summary : We are looking for a highly skilled Data Engineer / Data Modeler with strong experience in Snowflake, DBT, and GCP to support our data infrastructure and modeling initiatives. The ideal candidate should possess excellent SQL skills, hands-on experience with Erwin Data Modeler, and a strong background in modern data architectures and data modeling techniques. Key Responsibilities : - Design and implement scalable data models using Snowflake and Erwin Data Modeler. - Create, maintain, and enhance data pipelines using DBT and GCP (BigQuery, Cloud Storage, Dataflow). - Perform reverse engineering on existing systems (e.g., Sailfish/DDMS) using DBeaver or similar tools to understand and rebuild data models. - Develop efficient SQL queries and stored procedures for data transformation, quality, and validation. - Collaborate with business analysts and stakeholders to gather data requirements and convert them into physical and logical models. - Ensure performance tuning, security, and optimization of the Snowflake data warehouse. - Document metadata, data lineage, and business logic behind data structures and flows. - Participate in code reviews, enforce coding standards, and provide best practices for data modeling and governance. Must-Have Skills : - Snowflake architecture, schema design, and data warehouse experience. - DBT (Data Build Tool) for data transformation and pipeline development. - Strong expertise in SQL (query optimization, complex joins, window functions, etc.) - Hands-on experience with Erwin Data Modeler (logical and physical modeling). - Experience with GCP (BigQuery, Cloud Composer, Cloud Storage). - Experience in reverse engineering legacy systems like Sailfish or DDMS using DBeaver. Good To Have : - Experience with CI/CD tools and DevOps for data environments. - Familiarity with data governance, security, and privacy practices. 
- Exposure to Agile methodologies and working in distributed teams. - Knowledge of Python for data engineering tasks and orchestration scripts. Soft Skills : - Excellent problem-solving and analytical skills. - Strong communication and stakeholder management. - Self-driven with the ability to work independently in a remote setup.
Posted 1 month ago
7.0 - 10.0 years
8 - 14 Lacs
Lucknow
Remote
Employment Type : Contract (Remote). Experience Required : 7+ Years. Job Summary : We are looking for a highly skilled Data Engineer / Data Modeler with strong experience in Snowflake, DBT, and GCP to support our data infrastructure and modeling initiatives. The ideal candidate should possess excellent SQL skills, hands-on experience with Erwin Data Modeler, and a strong background in modern data architectures and data modeling techniques. Key Responsibilities : - Design and implement scalable data models using Snowflake and Erwin Data Modeler. - Create, maintain, and enhance data pipelines using DBT and GCP (BigQuery, Cloud Storage, Dataflow). - Perform reverse engineering on existing systems (e.g., Sailfish/DDMS) using DBeaver or similar tools to understand and rebuild data models. - Develop efficient SQL queries and stored procedures for data transformation, quality, and validation. - Collaborate with business analysts and stakeholders to gather data requirements and convert them into physical and logical models. - Ensure performance tuning, security, and optimization of the Snowflake data warehouse. - Document metadata, data lineage, and business logic behind data structures and flows. - Participate in code reviews, enforce coding standards, and provide best practices for data modeling and governance. Must-Have Skills : - Snowflake architecture, schema design, and data warehouse experience. - DBT (Data Build Tool) for data transformation and pipeline development. - Strong expertise in SQL (query optimization, complex joins, window functions, etc.) - Hands-on experience with Erwin Data Modeler (logical and physical modeling). - Experience with GCP (BigQuery, Cloud Composer, Cloud Storage). - Experience in reverse engineering legacy systems like Sailfish or DDMS using DBeaver. Good To Have : - Experience with CI/CD tools and DevOps for data environments. - Familiarity with data governance, security, and privacy practices. 
- Exposure to Agile methodologies and working in distributed teams. - Knowledge of Python for data engineering tasks and orchestration scripts. Soft Skills : - Excellent problem-solving and analytical skills. - Strong communication and stakeholder management. - Self-driven with the ability to work independently in a remote setup.
Posted 1 month ago
7.0 - 12.0 years
15 - 30 Lacs
Pune, Chennai
Work from Office
Data Modeler - Primary Skills: Data modeling (conceptual, logical, physical), relational/dimensional/data vault modeling, ERwin/IBM InfoSphere, SQL (Oracle, SQL Server, PostgreSQL, DB2), banking domain data knowledge (Retail, Corporate, Risk, Compliance), data governance (BCBS 239, AML, GDPR), data warehouse/lake design, Azure cloud. Secondary Skills: MDM, metadata management, real-time modeling (payments/fraud), big data (Hadoop, Spark), streaming platforms (Confluent), M&A data integration, data cataloguing, documentation, regulatory trend awareness. Soft skills: Attention to detail, documentation, time management, and teamwork.
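The dimensional-modeling and warehouse-design skills listed above center on the star schema: a central fact table of measures keyed to descriptive dimension tables. A minimal sketch follows; all table names, columns, and data are invented for illustration and do not come from the posting:

```python
# Minimal star-schema sketch: one fact table joined to two dimension
# tables, built in an in-memory SQLite database. Names and values are
# invented for this example.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_customer (
        customer_key INTEGER PRIMARY KEY,
        customer_name TEXT
    );
    CREATE TABLE dim_date (
        date_key INTEGER PRIMARY KEY,
        calendar_date TEXT
    );
    -- The fact table holds measures plus foreign keys to each dimension.
    CREATE TABLE fact_transactions (
        customer_key INTEGER REFERENCES dim_customer(customer_key),
        date_key INTEGER REFERENCES dim_date(date_key),
        amount REAL
    );
    INSERT INTO dim_customer VALUES (1, 'alice');
    INSERT INTO dim_date VALUES (20240101, '2024-01-01');
    INSERT INTO fact_transactions VALUES (1, 20240101, 250.0);
""")

# A typical star join: facts enriched with dimension attributes.
row = conn.execute("""
    SELECT c.customer_name, d.calendar_date, f.amount
    FROM fact_transactions f
    JOIN dim_customer c ON c.customer_key = f.customer_key
    JOIN dim_date d ON d.date_key = f.date_key
""").fetchone()
print(row)
```

The design choice being sketched: measures live only in the fact table, while descriptive attributes live in dimensions, so reports join outward from one fact table rather than across many normalized tables.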
Posted 1 month ago
0.0 - 2.0 years
2 - 4 Lacs
Bengaluru
Work from Office
Synechron is seeking a knowledgeable and proactive Data Modeler to guide the design and development of data structures that support our clients' business objectives. In this role, you will collaborate with cross-functional teams to translate business requirements into scalable and efficient data models, ensuring data accuracy, consistency, and integrity. You will contribute to creating sustainable and compliant data architectures that leverage emerging technologies such as cloud, IoT, mobile, and blockchain. Your work will be instrumental in enabling data-driven decision-making and operational excellence across projects. Software Required Skills: Strong understanding of data modeling concepts, methodologies, and tools. Experience with data modeling for diverse technology platforms including cloud, mobile, IoT, and blockchain. Familiarity with database management systems (e.g., relational, NoSQL). Knowledge of SDLC and Agile development practices. Proficiency in modeling tools such as ERwin, PowerDesigner, or similar. Preferred Skills: Experience with data integration tools and ETL processes. Knowledge of data governance and compliance standards. Familiarity with cloud platforms (AWS, Azure, GCP) and how they impact data architecture. Overall Responsibilities: Collaborate with business analysts, data engineers, and stakeholders to understand data requirements and translate them into robust data models. Design logical and physical data models optimized for performance, scalability, and maintainability. Develop and maintain documentation for data structures, including data dictionaries and metadata. Conduct reviews of data models and code to ensure adherence to quality standards and best practices. Assist in designing data security and privacy measures in alignment with organizational policies. Stay informed about emerging data modeling trends and incorporate best practices into project delivery. Support data migration, integration, and transformation activities as needed. Provide technical guidance and mentorship related to data modeling standards. Technical Skills (By Category): Data Modeling & Data Management: Essential: Logical/physical data modeling, ER diagrams, data dictionaries. Preferred: Dimensional modeling, data warehousing, master data management. Programming Languages: Preferred: SQL (expertise in writing complex queries). Optional: Python, R for data analysis and scripting. Databases & Data Storage Technologies: Essential: Relational databases (e.g., Oracle, SQL Server, MySQL). Preferred: NoSQL (e.g., MongoDB, Cassandra), cloud-native data stores. Cloud Technologies: Preferred: Basic understanding of cloud data solutions (AWS, Azure, GCP). Frameworks & Libraries: Not typically required, but familiarity with data integration frameworks is advantageous. Development Tools & Methodologies: Essential: Data modeling tools (ERwin, PowerDesigner), version control (Git), Agile/Scrum workflows. Security & Compliance: Knowledge of data security best practices and regulatory standards like GDPR and HIPAA. Experience: Minimum of 8+ years of direct experience in data modeling, data architecture, or related roles. Proven experience designing data models for complex systems across multiple platforms (cloud, mobile, IoT, blockchain). Experience working in Agile environments using tools like JIRA, Confluence, Git. Preference for candidates with experience supporting data governance and data quality initiatives. Note: Equivalent demonstrated experience in relevant projects or certifications can qualify candidates. Day-to-Day Activities: Participate in daily stand-ups and project planning sessions. Collaborate with cross-functional teams to understand and analyze business requirements. Create, review, and refine data models and associated documentation. Develop data schemas, dictionaries, and standards to ensure consistency. Support data migration, integration, and performance tuning activities. Conduct peer reviews and provide feedback on data models and solutions. Keep current with the latest industry developments in data architecture and modeling. Troubleshoot and resolve data-related technical issues. Qualifications: Bachelor's or Master's degree in Computer Science, Data Science, Information Technology, or related fields. Demonstrated experience with data modeling tools and techniques in diverse technological environments. Certifications related to data modeling, data management, or cloud platforms (preferred). Professional Competencies: Strong analytical and critical thinking skills to develop optimal data solutions. Effective communication skills for translating technical concepts to non-technical stakeholders. Ability to work independently and in collaborative team environments. Skilled problem solver able to handle complex data challenges. Adaptability to rapidly evolving technologies and project requirements. Excellent time management and prioritization skills to deliver quality outputs consistently.
Posted 1 month ago
11.0 - 16.0 years
13 - 18 Lacs
Bengaluru
Work from Office
Synechron is seeking a knowledgeable and proactive Data Modeler to guide the design and development of data structures that support our clients' business objectives. In this role, you will collaborate with cross-functional teams to translate business requirements into scalable and efficient data models, ensuring data accuracy, consistency, and integrity. You will contribute to creating sustainable and compliant data architectures that leverage emerging technologies such as cloud, IoT, mobile, and blockchain. Your work will be instrumental in enabling data-driven decision-making and operational excellence across projects. Software Required Skills: Strong understanding of data modeling concepts, methodologies, and tools. Experience with data modeling for diverse technology platforms including cloud, mobile, IoT, and blockchain. Familiarity with database management systems (e.g., relational, NoSQL). Knowledge of SDLC and Agile development practices. Proficiency in modeling tools such as ERwin, PowerDesigner, or similar. Preferred Skills: Experience with data integration tools and ETL processes. Knowledge of data governance and compliance standards. Familiarity with cloud platforms (AWS, Azure, GCP) and how they impact data architecture. Overall Responsibilities: Collaborate with business analysts, data engineers, and stakeholders to understand data requirements and translate them into robust data models. Design logical and physical data models optimized for performance, scalability, and maintainability. Develop and maintain documentation for data structures, including data dictionaries and metadata. Conduct reviews of data models and code to ensure adherence to quality standards and best practices. Assist in designing data security and privacy measures in alignment with organizational policies. Stay informed about emerging data modeling trends and incorporate best practices into project delivery. Support data migration, integration, and transformation activities as needed. Provide technical guidance and mentorship related to data modeling standards. Technical Skills (By Category): Data Modeling & Data Management: Essential: Logical/physical data modeling, ER diagrams, data dictionaries. Preferred: Dimensional modeling, data warehousing, master data management. Programming Languages: Preferred: SQL (expertise in writing complex queries). Optional: Python, R for data analysis and scripting. Databases & Data Storage Technologies: Essential: Relational databases (e.g., Oracle, SQL Server, MySQL). Preferred: NoSQL (e.g., MongoDB, Cassandra), cloud-native data stores. Cloud Technologies: Preferred: Basic understanding of cloud data solutions (AWS, Azure, GCP). Frameworks & Libraries: Not typically required, but familiarity with data integration frameworks is advantageous. Development Tools & Methodologies: Essential: Data modeling tools (ERwin, PowerDesigner), version control (Git), Agile/Scrum workflows. Security & Compliance: Knowledge of data security best practices and regulatory standards like GDPR and HIPAA. Experience: Minimum of 8+ years of direct experience in data modeling, data architecture, or related roles. Proven experience designing data models for complex systems across multiple platforms (cloud, mobile, IoT, blockchain). Experience working in Agile environments using tools like JIRA, Confluence, Git. Preference for candidates with experience supporting data governance and data quality initiatives. Note: Equivalent demonstrated experience in relevant projects or certifications can qualify candidates. Day-to-Day Activities: Participate in daily stand-ups and project planning sessions. Collaborate with cross-functional teams to understand and analyze business requirements. Create, review, and refine data models and associated documentation. Develop data schemas, dictionaries, and standards to ensure consistency. Support data migration, integration, and performance tuning activities. Conduct peer reviews and provide feedback on data models and solutions. Keep current with the latest industry developments in data architecture and modeling. Troubleshoot and resolve data-related technical issues. Qualifications: Bachelor's or Master's degree in Computer Science, Data Science, Information Technology, or related fields. Demonstrated experience with data modeling tools and techniques in diverse technological environments. Certifications related to data modeling, data management, or cloud platforms (preferred). Professional Competencies: Strong analytical and critical thinking skills to develop optimal data solutions. Effective communication skills for translating technical concepts to non-technical stakeholders. Ability to work independently and in collaborative team environments. Skilled problem solver able to handle complex data challenges. Adaptability to rapidly evolving technologies and project requirements. Excellent time management and prioritization skills to deliver quality outputs consistently.
Posted 1 month ago
5.0 - 10.0 years
10 - 20 Lacs
Hyderabad, Delhi / NCR
Hybrid
Proficiency in data modeling tools such as ER/Studio, ERwin or similar. Deep understanding of relational database design, normalization/denormalization, and data warehousing principles. Experience with SQL and working knowledge of database platforms like Oracle, SQL Server, PostgreSQL, or Snowflake. Strong knowledge of metadata management, data lineage, and data governance practices. Understanding of data integration, ETL processes, and data quality frameworks. Ability to interpret and translate complex business requirements into scalable data models. Excellent communication and documentation skills to collaborate with cross-functional teams.
Posted 1 month ago
5.0 - 10.0 years
16 - 20 Lacs
Bengaluru
Work from Office
Project description We are seeking a highly skilled Data Modelling Expert with deep experience in the Avaloq Core Banking platform to join our technology team. The ideal candidate will be responsible for designing, maintaining, and optimizing complex data models that support our banking products, client data, and regulatory reporting needs. This role requires a blend of domain expertise in banking and wealth management, strong data architecture capabilities, and hands-on experience working with the Avaloq platform. Responsibilities Design, implement, and maintain conceptual, logical, and physical data models within the Avaloq Core Banking system. Collaborate with business analysts, product owners, and Avaloq parameterisation teams to translate business requirements into robust data models. Ensure alignment of data models with Avaloq's object model and industry best practices. Perform data profiling, quality checks, and lineage tracing to support regulatory and compliance requirements (e.g., Basel III, MiFID II, ESG). Support integration of Avaloq data with downstream systems (e.g., CRM, data warehouses, reporting platforms). Provide expert input on data governance, metadata management, and model documentation. Contribute to change requests, upgrades, and data migration projects involving Avaloq. Collaborate with cross-functional teams to drive data consistency, reusability, and scalability. Review and validate existing data models, identify gaps or optimisation opportunities. Ensure data models meet performance, security, and privacy requirements. Skills Must have Proven experience (5+ years) in data modelling or data architecture, preferably within financial services. 3+ years of hands-on experience with Avaloq Core Banking Platform, especially its data structures and object model. Strong understanding of relational databases and data modelling tools (e.g., ER/Studio, ERwin, or similar). Proficient in SQL and data manipulation in Avaloq environments. 
Knowledge of banking products, client lifecycle data, and regulatory data requirements. Familiarity with data governance, data quality, and master data management concepts. Experience working in Agile or hybrid project delivery environments. Nice to have: Exposure to Avaloq Scripting or parameterisation is a strong plus. Experience integrating Avaloq with data lakes, BI/reporting tools, or regulatory platforms. Understanding of data privacy regulations (GDPR, FINMA, etc.). Certification in Avaloq or relevant financial data management domains is advantageous. Other Languages: English C1 Advanced. Seniority: Senior.
Posted 1 month ago
15.0 - 20.0 years
5 - 9 Lacs
Mumbai
Work from Office
Project Role: Data Modeler. Project Role Description: Work with key business representatives, data owners, end users, application designers, and data architects to model current and new data. Must have skills: Data Modeling Techniques and Methodologies. Good to have skills: NA. Minimum 5 year(s) of experience is required. Educational Qualification: 15 years full time education. Project Role: Data Architect & Modeler. Project Role Description: Data model, design, build, and lead the complex ETL data integration pipelines to meet business process and application requirements. Management Level: 9. Work Experience: 6+ years. Work Location: Any. Must have skills: Data Architecture Principles. Good to have skills: Data Modeling, Data Architect, Informatica PowerCenter, Informatica Data Quality, SAP BusinessObjects Data Services, SQL, PL/SQL, SAP HANA DB, MS Azure, Python, ERwin, SAP PowerDesigner. Job: Data Architect, Modeler, and Data Integration Lead. Key Responsibilities: 1) Build data models, including forward and reverse engineering. 2) Work on data and design analysis, collaborating with the data analysts team on data model design. 3) Prepare presentations on design, end-to-end flow, and data models. 4) Work on new and existing data models using PowerDesigner and other design tools such as Visio. 5) Work with functional SMEs and BAs to review requirements and mapping documents. Technical Experience: 1) Good understanding of ETL design concepts such as CDC, SCD, transpose/pivot, updates, and validation. 2) Strong understanding of SQL concepts and data warehouse concepts, with the ability to understand data both technically and functionally. 3) Good understanding of various file formats such as XML, delimited, and fixed width. 4) Understanding of the concepts of data quality, data cleansing, and data profiling. 5) Python, other new data technologies, and cloud exposure are good to have. 6) An insurance background is a plus. Educational Qualification: 15 years of full-time education with BE/B.Tech or equivalent. Professional & Technical Skills: - Must Have Skills: Proficiency in Data Modeling Techniques and Methodologies. - Strong understanding of relational and non-relational database design principles. - Experience with data integration and ETL processes. - Familiarity with data governance and data quality frameworks. - Ability to translate business requirements into technical specifications. Qualification: 15 years full time education.
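The ETL design concepts named above (CDC, SCD) are commonly illustrated with a Slowly Changing Dimension Type 2 merge: when a tracked attribute changes, the current row is expired and a new versioned row is opened, preserving history. The following is a hedged sketch only; the record layout, keys, and helper function are invented for this example, not the posting's actual pipeline:

```python
# Sketch of an SCD Type 2 merge in plain Python. All names and data
# are hypothetical; real pipelines would do this in SQL or DBT.
from dataclasses import dataclass, replace
from typing import List, Optional

@dataclass
class DimRow:
    natural_key: str
    city: str                  # the tracked attribute
    valid_from: str
    valid_to: Optional[str]    # None means "current version"

def scd2_apply(dim: List[DimRow], key: str, new_city: str, as_of: str) -> List[DimRow]:
    """Expire the current row for `key` if the city changed, then append a new version."""
    out = []
    changed = False
    for row in dim:
        if row.natural_key == key and row.valid_to is None and row.city != new_city:
            out.append(replace(row, valid_to=as_of))  # close the old version
            changed = True
        else:
            out.append(row)
    if changed or not any(r.natural_key == key for r in dim):
        out.append(DimRow(key, new_city, as_of, None))  # open the new version
    return out

dim = [DimRow("cust-1", "Pune", "2023-01-01", None)]
dim = scd2_apply(dim, "cust-1", "Bengaluru", "2024-06-01")
for r in dim:
    print(r)
# Two versions now exist: an expired Pune row and a current Bengaluru row.
```

The design point: unlike an in-place update (SCD Type 1), Type 2 keeps every historical version, which is what makes point-in-time reporting and audit queries possible.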
Posted 1 month ago
5.0 - 10.0 years
4 - 8 Lacs
Bengaluru
Work from Office
Project Role : Commercial PMO Operator Project Role Description : Plan and manage commercial deliverables for client accounts and help reduce overall project costs by improving efficiency and standardizing the processes throughout the contracts life. Assist commercial and/or account leadership in executing the commercial vision for the account. Must have skills : Data Modeling Techniques and Methodologies Good to have skills : NAMinimum 5 year(s) of experience is required Educational Qualification : 15 years full time education Summary :Data Modelling, Data Entity-Relationship design and build, Schema setup Tools:ERWIN, DatabricksAs a Commercial PMO Operator, you will plan and manage commercial deliverables for client accounts, reduce project costs, and standardize processes. Assist in executing the commercial vision for the account, contributing to overall efficiency and success. Roles & Responsibilities:- Expected to be an SME- Collaborate and manage the team to perform- Responsible for team decisions- Engage with multiple teams and contribute on key decisions- Provide solutions to problems for their immediate team and across multiple teams- Implement strategies to enhance project efficiency- Analyze and optimize commercial processes- Develop and maintain project cost reduction initiatives Professional & Technical Skills: - Must To Have Skills: Proficiency in Data Modeling Techniques and Methodologies- Strong understanding of project management principles- Experience in financial analysis and cost reduction strategies- Knowledge of commercial operations and contract management- Excellent communication and interpersonal skills Additional Information:- The candidate should have a minimum of 5 years of experience in Data Modeling Techniques and Methodologies- This position is based at our Bengaluru office- A 15 years full time education is required Qualification 15 years full time education
Posted 1 month ago