12.0 - 15.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Data Modeler
Project Role Description: Work with key business representatives, data owners, end users, application designers, and data architects to model current and new data.
Must-have skills: Data Modeling Techniques and Methodologies
Good-to-have skills: Snowflake Data Warehouse, Oracle Procedural Language Extensions to SQL (PL/SQL)
Minimum 12 year(s) of experience is required
Educational Qualification: 15 years full-time education

Summary: As a Data Modeler, you will engage with key business representatives, data owners, end users, application designers, and data architects to model both current and new data. Your typical day will involve collaborating with various stakeholders to understand their data needs, analyzing existing data structures, and designing effective data models that support business objectives. You will also be responsible for ensuring that the data models align with best practices and methodologies, facilitating discussions to gather requirements, and providing insights that drive data-driven decision-making across the organization. Your role will be pivotal in bridging the gap between technical and non-technical teams, ensuring that data is accurately represented and utilized effectively within the organization.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Expected to provide solutions to problems that apply across multiple teams.
- Facilitate workshops and meetings to gather requirements and feedback from stakeholders.
- Develop and maintain comprehensive documentation of data models and architecture.

Professional & Technical Skills:
- Must-have skills: Proficiency in Data Modeling Techniques and Methodologies.
- Good-to-have skills: Experience with Data Architecture Principles, Snowflake Data Warehouse, Oracle Procedural Language Extensions to SQL (PL/SQL).
- Strong understanding of relational and dimensional data modeling.
- Experience with data integration and ETL processes.
- Familiarity with data governance and data quality principles.

Additional Information:
- The candidate should have a minimum of 12 years of experience in Data Modeling Techniques and Methodologies.
- This position is based at our Bengaluru office.
- A 15 years full-time education is required.
Posted 1 week ago
3.0 - 7.0 years
11 - 15 Lacs
Gurugram
Work from Office
Overview
We are seeking an experienced Data Modeller with expertise in designing and implementing data models for modern data platforms. This role requires deep knowledge of data modeling techniques, healthcare data structures, and experience with Databricks Lakehouse architecture. The ideal candidate will have a proven track record of translating complex business requirements into efficient, scalable data models that support analytics and reporting needs.

About the Role
As a Data Modeller, you will be responsible for designing and implementing data models for our Databricks-based Modern Data Platform. You will work closely with business stakeholders, data architects, and data engineers to create logical and physical data models that support the migration from legacy systems to the Databricks Lakehouse architecture, ensuring data integrity, performance, and compliance with healthcare industry standards.

Key Responsibilities
- Design and implement logical and physical data models for Databricks Lakehouse implementations
- Translate business requirements into efficient, scalable data models
- Create and maintain data dictionaries, entity relationship diagrams, and model documentation
- Develop dimensional models, data vault models, and other modeling approaches as appropriate
- Support the migration of data models from legacy systems to the Databricks platform
- Collaborate with data architects to ensure alignment with the overall data architecture
- Work with data engineers to implement and optimize data models
- Ensure data models comply with healthcare industry regulations and standards
- Implement data modeling best practices and standards
- Provide guidance on data modeling approaches and techniques
- Participate in data governance initiatives and data quality assessments
- Stay current with evolving data modeling techniques and industry trends

Qualifications
- Extensive experience in data modeling for analytics and reporting systems
- Strong knowledge of dimensional modeling, data vault, and other modeling methodologies
- Experience with the Databricks platform and Delta Lake architecture
- Expertise in healthcare data modeling and industry standards
- Experience migrating data models from legacy systems to modern platforms
- Strong SQL skills and experience with data definition languages
- Understanding of data governance principles and practices
- Experience with data modeling tools and technologies
- Knowledge of performance optimization techniques for data models
- Bachelor's degree in Computer Science, Information Systems, or a related field; advanced degree preferred
- Professional certifications in data modeling or related areas

Technical Skills
- Data modeling methodologies (dimensional, data vault, etc.)
- Databricks platform and Delta Lake
- SQL and data definition languages
- Data modeling tools (erwin, ER/Studio, etc.)
- Data warehousing concepts and principles
- ETL/ELT processes and data integration
- Performance tuning for data models
- Metadata management and data cataloging
- Cloud platforms (AWS, Azure, GCP)
- Big data technologies and distributed computing

Healthcare Industry Knowledge
- Healthcare data structures and relationships
- Healthcare terminology and coding systems (ICD, CPT, SNOMED, etc.)
- Healthcare data standards (HL7, FHIR, etc.)
- Healthcare analytics use cases and requirements
- Healthcare regulatory requirements (HIPAA, HITECH, etc.) (optional)
- Clinical and operational data modeling challenges
- Population health and value-based care data needs

Personal Attributes
- Strong analytical and problem-solving skills
- Excellent attention to detail and a focus on data quality
- Ability to translate complex business requirements into technical solutions
- Effective communication skills with both technical and non-technical stakeholders
- Collaborative approach to working with cross-functional teams
- Self-motivated, with the ability to work independently
- Continuous learner who stays current with industry trends

What We Offer
- Opportunity to design data models for cutting-edge healthcare analytics
- Collaborative and innovative work environment
- Competitive compensation package
- Professional development opportunities
- Work with leading technologies in the data space

This position requires a unique combination of data modeling expertise, technical knowledge, and healthcare industry understanding. The ideal candidate will have demonstrated success in designing efficient, scalable data models and a passion for creating data structures that enable powerful analytics and insights.
Posted 1 week ago
7.0 - 12.0 years
12 - 22 Lacs
Chennai
Remote
About Company: Papigen is a fast-growing global technology services company, delivering innovative digital solutions through deep industry experience and cutting-edge expertise. We specialize in technology transformation, enterprise modernization, and dynamic areas like Cloud, Big Data, Java, React, DevOps, and more. Our client-centric approach combines consulting, engineering, and data science to help businesses evolve and scale efficiently.

Job Description: We are looking for an experienced Senior Data Modeler to join our agile team and support enterprise-level data initiatives. The ideal candidate will have a strong background in cloud-based data modeling, preferably within the Azure ecosystem, and be able to design and implement robust data models that support scalable and efficient data pipelines.

Responsibilities:
- Design and implement conceptual, logical, and physical data models based on business needs.
- Work with Azure cloud technologies including Azure Data Lake, Azure Data Factory, and Dremio for data virtualization.
- Create and maintain Low-Level Design (LLD) documents, Unit Test Plans, and related documentation.
- Collaborate with data engineers, developers, and analysts to ensure accurate and scalable data modeling.
- Optimize data-related processes and adhere to coding and modeling standards.
- Conduct integration testing and support bug fixing throughout the SDLC.
- Participate in SCRUM ceremonies and work closely with onshore and offshore teams.
- Manage timelines and deliverables, and communicate blockers or tradeoffs proactively.
- Assist with documentation required for OIS clearance and compliance audits.

Required Skills & Qualifications:
- Bachelor's degree in Computer Science or a related field.
- 8+ years of professional experience in data modeling (logical & physical).
- Strong expertise in SQL and experience with relational databases and data warehouses.
- Hands-on experience with data modeling tools like Erwin or equivalent.
- 5+ years working with Azure Data Lake, Azure Data Factory, and Dremio.
- Solid understanding of data structures, indexing, and optimization techniques.
- Performance tuning skills for models and queries on large datasets.
- Strong communication skills, both verbal and written.
- Highly organized, collaborative, and proactive team player.

Benefits & Perks:
- Opportunity to work with leading global clients
- Flexible work arrangements with remote options
- Exposure to modern technology stacks and tools
- Supportive and collaborative team environment
- Continuous learning and career development opportunities
Posted 1 week ago
3.0 - 6.0 years
5 - 9 Lacs
Pune
Work from Office
Your Role
As a Data Modeler, you will play a critical role in designing and implementing robust data models that support enterprise data warehousing and analytics initiatives. You will:
- Apply your strong foundation in data structures, algorithms, calculus, linear algebra, and machine learning to design scalable and efficient data models.
- Leverage your expertise in data warehousing concepts such as Star Schema, Snowflake Schema, and Data Vault to architect and optimize data marts and enterprise data warehouses.
- Utilize industry-standard data modeling tools like Erwin, ER/Studio, and MySQL Workbench to create and maintain logical and physical data models.
- Thrive in a fast-paced, dynamic environment, collaborating with cross-functional teams to deliver high-quality data solutions under tight deadlines.
- Demonstrate strong conceptual modeling skills, with the ability to see the big picture and design solutions that align with business goals.
- Exhibit excellent communication and stakeholder management skills, effectively translating complex technical concepts into clear, actionable insights for both technical and non-technical audiences.

Your Profile
- Good knowledge of and expertise in data structures, algorithms, calculus, linear algebra, machine learning, and modeling.
- Experience with data warehousing concepts including Star Schema, Snowflake Schema, or Data Vault for data marts or data warehousing.
- Experience using data modeling software like Erwin, ER/Studio, or MySQL Workbench to produce logical and physical data models.
- Experience working in a challenging, fast-paced environment.
- Expertise in conceptual modelling; ability to see the big picture and envision possible solutions.
- Excellent communication and stakeholder management skills.

What you'll love about working here
You can shape your career with us. We offer a range of career paths and internal opportunities within the Capgemini group. You will also get personalized career guidance from our leaders. You will get comprehensive wellness benefits including health checks, telemedicine, insurance with top-ups, elder care, partner coverage, and new-parent support via flexible work. You will have the opportunity to learn on one of the industry's largest digital learning platforms, with access to 250,000+ courses and numerous certifications.

About Capgemini
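The star-schema concept this posting asks for can be illustrated with a minimal sketch: a central fact table joined to dimension tables on surrogate keys. This uses SQLite purely for demonstration; all table and column names here are hypothetical, not from any specific employer's model.

```python
import sqlite3

# Minimal star schema: one fact table surrounded by dimension tables,
# joined on surrogate keys. Names are illustrative only.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_product (
    product_key INTEGER PRIMARY KEY,
    product_name TEXT,
    category TEXT
);
CREATE TABLE dim_date (
    date_key INTEGER PRIMARY KEY,   -- e.g. 20240115 (YYYYMMDD)
    full_date TEXT,
    month INTEGER,
    year INTEGER
);
CREATE TABLE fact_sales (
    product_key INTEGER REFERENCES dim_product(product_key),
    date_key INTEGER REFERENCES dim_date(date_key),
    quantity INTEGER,
    amount REAL
);
""")
cur.execute("INSERT INTO dim_product VALUES (1, 'Widget', 'Hardware')")
cur.execute("INSERT INTO dim_date VALUES (20240115, '2024-01-15', 1, 2024)")
cur.execute("INSERT INTO fact_sales VALUES (1, 20240115, 3, 29.97)")

# Typical dimensional query: aggregate the fact, slice by dimensions.
row = cur.execute("""
    SELECT p.category, d.year, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_product p ON f.product_key = p.product_key
    JOIN dim_date d ON f.date_key = d.date_key
    GROUP BY p.category, d.year
""").fetchone()
print(row)
```

A snowflake schema differs only in that dimensions are further normalized (e.g. `category` split into its own table); a Data Vault decomposes entities into hubs, links, and satellites instead.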
Posted 1 week ago
9.0 - 14.0 years
15 - 30 Lacs
Gurugram
Remote
Job description: Data Modeler (AI/ML Enablement)
Remote | Contract/Freelancer | Duration: 1 to 2 Months
Start: Immediate | Experience: 8+ Years
We're looking for experienced Data Modelers with a strong background in one or more industries: Telecom, Banking/Finance, Media, or Government only.

Key Responsibilities:
- Design conceptual/logical/physical data models
- Collaborate with AI/ML teams to structure data for model training
- Build ontologies, taxonomies, and data schemas
- Ensure compliance with industry-specific data regulations

Must-Have Skills & Experience:
- 7+ years of hands-on experience in data modeling: conceptual, logical, and physical models.
- Proficiency in data modeling tools like Erwin, ER/Studio, or PowerDesigner.
- Strong understanding of data domains like customer, transaction, network, media, or case data.
- Familiarity with AI/ML pipelines and an understanding of how structured data supports model training.
- Knowledge of data governance, quality, and compliance standards (e.g., GDPR, PCI-DSS).
- Ability to work independently and deliver models quickly in a short-term contract environment.
Posted 1 week ago
8.0 - 13.0 years
11 - 21 Lacs
Hyderabad, Pune, Chennai
Hybrid
Role & Responsibilities
Job Summary: We are looking for a highly skilled Senior Data Modeller with a strong foundation in data modeling concepts who is eager to expand into data engineering. This role is ideal for someone who has deep experience designing conceptual, logical, and physical data models and is looking to evolve into a more hybrid role with modern data engineering capabilities.
- 8-15 years of experience in data modelling with a strong understanding of relational and dimensional models.
- Experience with modeling tools (e.g., Erwin, PowerDesigner, dbt, SQLDBM, or similar).
- Proficiency in SQL and strong analytical thinking.
- Familiarity with metadata management, data catalogs, and lineage tracking tools.
- Strong communication and stakeholder management skills.
Posted 1 week ago
2.0 - 7.0 years
9 - 13 Lacs
Gurugram
Work from Office
We are looking for a skilled Database Specialist to join our team at Squareops, focusing on Cloud Infrastructure. The ideal candidate will have 2-7 years of experience in database management and cloud computing.

Roles and Responsibilities
- Design, implement, and manage databases for cloud infrastructure projects.
- Collaborate with cross-functional teams to identify and prioritize database requirements.
- Develop and maintain database documentation and technical specifications.
- Ensure data security, integrity, and compliance with industry standards.
- Troubleshoot and resolve complex database issues efficiently.
- Optimize database performance and scalability for large-scale applications.

Job Requirements
- Strong knowledge of database management systems and cloud computing platforms.
- Experience with designing and implementing scalable and secure databases.
- Excellent problem-solving skills and attention to detail.
- Ability to work collaboratively in a team environment.
- Strong communication and interpersonal skills.
- Familiarity with database development tools and technologies.
Posted 1 week ago
18.0 - 25.0 years
35 - 55 Lacs
Chennai
Work from Office
Architect scalable data solutions for BFSI. Design data models for transactional systems and lead data migration from legacy systems. Experience with financial data modeling is required, along with proficiency in Erwin, PowerDesigner, or Toad Data Modeler.
Posted 2 weeks ago
8.0 - 11.0 years
16 - 20 Lacs
Hyderabad
Remote
US Shift (Night Shift). 8+ yrs in Data Modeling, 3+ yrs in ER Studio (ERwin not preferred), strong in relational & dimensional modeling and normalization. HR & EPM experience is a plus. Skilled in metadata, data dictionaries, documentation, and communication.
Posted 2 weeks ago
5.0 - 10.0 years
10 - 15 Lacs
Bengaluru
Remote
The client requires 5+ years of experience in data modeling and a minimum of 3 years of proficiency using ER Studio. The ideal candidate will possess a deep understanding of data architecture, database design, and conceptual, logical, and physical data models.
Posted 2 weeks ago
7.0 - 12.0 years
20 - 35 Lacs
Hyderabad
Hybrid
Work Mode: Hybrid (3 days WFO & 2 days WFH)

Role & Responsibilities
- Proficiency in data modeling tools such as ER/Studio, ERwin, or similar.
- Deep understanding of relational database design, normalization/denormalization, and data warehousing principles.
- Experience with SQL and working knowledge of database platforms like Oracle, SQL Server, PostgreSQL, or Snowflake.
- Strong knowledge of metadata management, data lineage, and data governance practices.
- Understanding of data integration, ETL processes, and data quality frameworks.
- Ability to interpret and translate complex business requirements into scalable data models.
- Excellent communication and documentation skills to collaborate with cross-functional teams.

Preferred candidate profile: Candidates who can join within 15 days, or immediately, are preferred.
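The normalization/denormalization trade-off this posting names can be sketched in a few lines. A denormalized table repeats customer attributes on every row (risking update anomalies); the normalized form factors them out and answers the same question with a join. SQLite is used for illustration; all names are hypothetical.

```python
import sqlite3

# Denormalized vs. normalized form of the same data.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
-- Denormalized: customer_city is repeated per order.
CREATE TABLE orders_denorm (
    order_id TEXT, customer_id TEXT, customer_city TEXT, amount REAL
);
INSERT INTO orders_denorm VALUES
    ('O1', 'C1', 'Pune', 10.0),
    ('O2', 'C1', 'Pune', 20.0);

-- Normalized (3NF-style): the city is stored once per customer.
CREATE TABLE customers (customer_id TEXT PRIMARY KEY, city TEXT);
CREATE TABLE orders (order_id TEXT PRIMARY KEY,
                     customer_id TEXT REFERENCES customers(customer_id),
                     amount REAL);
INSERT INTO customers VALUES ('C1', 'Pune');
INSERT INTO orders VALUES ('O1', 'C1', 10.0), ('O2', 'C1', 20.0);
""")

# The normalized schema answers the same question with a join.
row = cur.execute("""
    SELECT c.city, SUM(o.amount)
    FROM orders o JOIN customers c ON o.customer_id = c.customer_id
    GROUP BY c.city
""").fetchone()
print(row)
```

Warehouses often deliberately denormalize (as in dimensional models) to cut joins at query time; OLTP schemas usually stay normalized to keep writes consistent.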
Posted 2 weeks ago
5.0 - 10.0 years
2 - 6 Lacs
Bengaluru
Work from Office
Project Role: Commercial Operator
Project Role Description: Plan and manage commercial deliverables for client accounts and help reduce overall project costs by improving efficiency and standardizing processes throughout the contract's life. Assist commercial and/or account leadership in executing the commercial vision for the account.
Must-have skills: Data Modeling Techniques and Methodologies
Good-to-have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full-time education

Summary: Data modelling, data entity-relationship design and build, schema setup. Tools: ERWIN, Databricks. As a Commercial PMO Operator, you will plan and manage commercial deliverables for client accounts, reduce project costs, and standardize processes. Assist in executing the commercial vision for the account, contributing to overall efficiency and success.

Roles & Responsibilities:
- Expected to be an SME
- Collaborate with and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Provide solutions to problems for their immediate team and across multiple teams
- Implement strategies to enhance project efficiency
- Analyze and optimize commercial processes
- Develop and maintain project cost reduction initiatives

Professional & Technical Skills:
- Must-have skills: Proficiency in Data Modeling Techniques and Methodologies
- Strong understanding of project management principles
- Experience in financial analysis and cost reduction strategies
- Knowledge of commercial operations and contract management
- Excellent communication and interpersonal skills

Additional Information:
- The candidate should have a minimum of 5 years of experience in Data Modeling Techniques and Methodologies
- This position is based at our Bengaluru office
- A 15 years full-time education is required
Posted 2 weeks ago
6.0 - 10.0 years
0 Lacs
chennai, tamil nadu
On-site
As a Data Architect specializing in OLTP & OLAP systems, you will play a crucial role in designing, optimizing, and governing data models for both OLTP and OLAP environments. Your responsibilities will include architecting end-to-end data models across different layers, defining conceptual, logical, and physical data models, and collaborating closely with stakeholders to capture functional and performance requirements. You will need to optimize database structures for real-time and analytical workloads, enforce data governance, security, and compliance best practices, and enable schema versioning, lineage tracking, and change control. Additionally, you will review query plans and indexing strategies to enhance performance.

To excel in this role, you must possess a deep understanding of OLTP and OLAP systems architecture, along with proven experience in GCP databases such as BigQuery, CloudSQL, and AlloyDB. Your expertise in database tuning, indexing, sharding, and normalization/denormalization will be critical, as well as proficiency in data modeling tools like DBSchema, ERWin, or equivalent. Familiarity with schema evolution, partitioning, and metadata management is also required.

Experience in the BFSI or mutual fund domain, knowledge of near real-time reporting and streaming analytics architectures, and familiarity with CI/CD for database model deployments are preferred skills that will set you apart. Strong communication, stakeholder management, strategic thinking, and the ability to mentor data modelers and engineers are essential soft skills for success in this position.

By joining our team, you will have the opportunity to own the core data architecture for a cloud-first enterprise, bridge business goals with robust data design, and work with modern data platforms and tools. If you are looking to make a significant impact in the field of data architecture, this role is perfect for you.
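Reviewing query plans and indexing strategies, as this role describes, can be sketched with SQLite's `EXPLAIN QUERY PLAN`; the table is hypothetical, and production work would of course target the posting's platforms (BigQuery, CloudSQL, AlloyDB) with their own plan tooling.

```python
import sqlite3

# Check whether a filter predicate uses an index or falls back to a scan.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute(
    "CREATE TABLE txn (txn_id INTEGER PRIMARY KEY, account_id TEXT, amount REAL)"
)
cur.execute("CREATE INDEX idx_txn_account ON txn(account_id)")

# EXPLAIN QUERY PLAN reports the chosen access path for the query.
plan = cur.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM txn WHERE account_id = ?", ("A1",)
).fetchall()
print(plan)  # the plan detail names idx_txn_account for this lookup
```

Dropping the index and re-running the same statement would show a full-table scan instead, which is the kind of before/after comparison an indexing review relies on.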
Posted 2 weeks ago
12.0 - 16.0 years
0 Lacs
karnataka
On-site
As a Senior Data Modeller, you will be responsible for leading the design and development of conceptual, logical, and physical data models for enterprise and application-level databases. Your expertise in data modeling, data warehousing, and data governance, particularly in cloud environments, Databricks, and Unity Catalog, will be crucial for the role. You should have a deep understanding of business processes related to master data management in a B2B environment and experience with data governance and data quality concepts.

Your key responsibilities will include designing and developing data models, translating business requirements into structured data models, defining and maintaining data standards, collaborating with cross-functional teams to implement models, analyzing existing data systems for optimization, creating entity relationship diagrams and data flow diagrams, supporting data governance initiatives, and ensuring compliance with organizational data policies and security requirements.

To be successful in this role, you should have at least 12 years of experience in data modeling, data warehousing, and data governance. Strong familiarity with Databricks, Unity Catalog, and cloud environments (preferably Azure) is essential. Additionally, you should possess a background in data normalization, denormalization, dimensional modeling, and schema design, along with hands-on experience with data modeling tools like ERwin. Experience in Agile or Scrum environments, proficiency in integration, databases, data warehouses, and data processing, as well as a track record of successfully selling data and analytics software to enterprise customers, are key requirements. Your technical expertise should cover Big Data, streaming platforms, Databricks, Snowflake, Redshift, Spark, Kafka, SQL Server, PostgreSQL, and modern BI tools. Your ability to design and scale data pipelines and architectures in complex environments, along with excellent soft skills including leadership, client communication, and stakeholder management, will be valuable assets in this role.
Posted 2 weeks ago
15.0 - 20.0 years
5 - 9 Lacs
Noida
Work from Office
Project Role: Data Modeler
Project Role Description: Work with key business representatives, data owners, end users, application designers, and data architects to model current and new data.
Must-have skills: Data Modeling Techniques and Methodologies
Good-to-have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full-time education

Summary: As a Data Modeler, you will engage with key business representatives, data owners, end users, application designers, and data architects to model both current and new data. Your typical day will involve collaborating with various stakeholders to understand their data needs, analyzing existing data structures, and designing effective data models that support business objectives. You will also be responsible for ensuring that the data models are aligned with best practices and industry standards, facilitating smooth data integration and accessibility across the organization. This role requires a proactive approach to problem-solving and a commitment to delivering high-quality data solutions that enhance decision-making processes.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate workshops and meetings to gather requirements and feedback from stakeholders.
- Develop and maintain comprehensive documentation of data models and architecture.

Professional & Technical Skills:
- Must-have skills: Proficiency in Data Modeling Techniques and Methodologies, mainly Data Vault 2.0.
- Good-to-have skills: Experience with data governance frameworks, Snowflake warehouse/AWS.
- Strong understanding of relational and non-relational database systems.
- Familiarity with data warehousing concepts and ETL processes.
- Experience in using data modeling tools such as Erwin or IBM InfoSphere Data Architect.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Data Modeling Techniques and Methodologies.
- This position is based at any location.
- A 15 years full-time education is required.
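The Data Vault 2.0 requirement above can be illustrated with a minimal hub-plus-satellite sketch. Data Vault 2.0 commonly derives surrogate keys by hashing the business key; the MD5 choice, table names, and columns here are illustrative assumptions, with SQLite standing in for a real warehouse.

```python
import hashlib
import sqlite3

def hash_key(business_key: str) -> str:
    # DV 2.0-style surrogate key: a hash of the business key.
    return hashlib.md5(business_key.encode()).hexdigest()

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE hub_customer (
    customer_hk TEXT PRIMARY KEY,   -- hash of the business key
    customer_id TEXT,               -- the business key itself
    load_date TEXT,
    record_source TEXT
);
CREATE TABLE sat_customer_details (
    customer_hk TEXT REFERENCES hub_customer(customer_hk),
    load_date TEXT,                 -- satellites are versioned by load date
    name TEXT,
    city TEXT,
    PRIMARY KEY (customer_hk, load_date)
);
""")
hk = hash_key("CUST-001")
cur.execute(
    "INSERT INTO hub_customer VALUES (?, 'CUST-001', '2024-01-01', 'CRM')", (hk,)
)
cur.execute(
    "INSERT INTO sat_customer_details VALUES (?, '2024-01-01', 'Asha', 'Noida')", (hk,)
)
row = cur.execute("""
    SELECT h.customer_id, s.name
    FROM hub_customer h JOIN sat_customer_details s USING (customer_hk)
""").fetchone()
print(row)
```

Link tables (omitted here) would relate hubs to each other the same way, which is what makes the pattern resilient to source-system change.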
Posted 2 weeks ago
8.0 - 13.0 years
15 - 30 Lacs
Gurugram
Remote
Job description: Data Modeler (AI/ML Enablement)
Remote | Contract/Freelancer | Duration: 1 to 2 Months
Start: Immediate | Experience: 7+ Years
We're looking for experienced Data Modelers with a strong background in one or more industries: Telecom, Banking/Finance, Media, or Government only.

Key Responsibilities:
- Design conceptual/logical/physical data models
- Collaborate with AI/ML teams to structure data for model training
- Build ontologies, taxonomies, and data schemas
- Ensure compliance with industry-specific data regulations

Must-Have Skills & Experience:
- 7+ years of hands-on experience in data modeling: conceptual, logical, and physical models.
- Proficiency in data modeling tools like Erwin, ER/Studio, or PowerDesigner.
- Strong understanding of data domains like customer, transaction, network, media, or case data.
- Familiarity with AI/ML pipelines and an understanding of how structured data supports model training.
- Knowledge of data governance, quality, and compliance standards (e.g., GDPR, PCI-DSS).
- Ability to work independently and deliver models quickly in a short-term contract environment.
Posted 2 weeks ago
9.0 - 14.0 years
25 - 40 Lacs
Chennai
Work from Office
Role & Responsibilities
We are seeking a Data Modeller with over 12 years of progressive experience in information technology, including a minimum of 4 years in data migration projects to the cloud (refactor, replatform, etc.) and 2 years of exposure to GCP.

Preferred candidate profile
- In-depth knowledge of Data Warehousing/Lakehouse architectures, Master Data Management, Data Quality Management, Data Integration, and Data Warehouse architecture.
- Work with the business intelligence team to gather requirements for the database design and model.
- Understand the current on-premise DB model and refactor it for Google Cloud for better performance.
- Knowledge of ER modeling, big data, enterprise data, and physical data models; designs and implements data structures to support business processes and analytics, ensuring efficient data storage, retrieval, and management.
- Create a logical data model and validate it to ensure it meets the demands of the business application and its users.
- Experience in developing physical models for SQL, NoSQL, key-value, and document databases like Oracle, BigQuery, Spanner, PostgreSQL, Firestore, MongoDB, etc.
- Understand the data needs of the company or client.
- Collaborate with the development team to design and build the database model for both application and data warehousing development.
- Classify the business needs and build both microservices and reporting database models.
- Strong hands-on experience in SQL and database procedures.
- Work with the development team to develop and implement a phase-wise migration plan covering the co-existence of on-prem and cloud DBs.
- Help determine and manage data cleaning requirements.
Posted 2 weeks ago
4.0 - 9.0 years
10 - 20 Lacs
Pune
Hybrid
Hi, Greetings! This is regarding a job opportunity for the position of Data Modeller with a US-based MNC in the Healthcare Domain. This opportunity is on the direct payroll of the US-based MNC.

Job Location: Pune, Mundhwa
Mode of work: Hybrid (3 days work from office)
Shift timings: 1pm to 10pm

About the Company: The global MNC is a mission-driven startup transforming the healthcare payer industry. Our secure, cloud-enabled platform empowers health insurers to unlock siloed data, improve patient outcomes, and reduce healthcare costs. Since our founding in 2017, we've built a thriving SaaS business and raised over $81 million from top-tier VC firms with profound expertise in the healthcare and technology industries. With our deep expertise in cloud-enabled technologies and knowledge of the healthcare industry, we have built an innovative data integration and management platform that allows healthcare payers access to data that has been historically siloed and inaccessible. As a result, these payers can ingest and manage all the information they need to transform their business by supporting their analytical, operational, and financial needs through our platform. We are solving massive, complex problems in an industry ready for disruption. We're building powerful momentum and would love for you to be a part of it!

Interview process: 5 rounds
- 4 rounds of technical interview
- 1 round of HR or fitment discussion

Job Description: Data Modeller

About the Role: We're seeking a Data Modeler to join our global data modeling team. You'll play a key role in translating business requirements into conceptual and logical data models that support both operational and analytical use cases. This is a high-impact opportunity to work with cutting-edge technologies and contribute to the evolution of healthcare data platforms.

What You'll Do
- Design and build conceptual and logical data models aligned with enterprise architecture and healthcare standards.
- Perform data profiling and apply data integrity principles using SQL.
- Collaborate with cross-functional teams to ensure models meet client and business needs.
- Use tools like Erwin, ER/Studio, DBT, or similar for enterprise data modeling.
- Maintain metadata, business glossaries, and data dictionaries.
- Support client implementation teams with data model expertise.

What We're Looking For
- 2+ years of experience in data modeling and cloud-based data engineering.
- Proficiency in enterprise data modeling tools (Erwin, ER/Studio, DBSchema).
- Experience with Databricks, Snowflake, and data lakehouse architectures.
- Strong SQL skills and familiarity with schema evolution and data versioning.
- Deep understanding of healthcare data domains (Claims, Enrollment, Provider, FHIR, HL7, etc.).
- Excellent collaboration and communication skills.

In case you have any queries, please feel free to contact me via the email, WhatsApp, or phone number mentioned below.

Thanks & Regards,
Priyanka Das
Email: priyanka.das@dctinc.com
Contact Number: 74399 37568
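The SQL data-profiling work this role mentions usually means computing row counts, null rates, and distinct counts per column before modeling. A minimal sketch, using SQLite for illustration; the claims table and its columns are hypothetical, loosely echoing the healthcare domains the posting lists.

```python
import sqlite3

# Tiny data-profiling pass: row count, nulls, distincts, all in one query.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE claims (claim_id TEXT, member_id TEXT, amount REAL);
INSERT INTO claims VALUES ('C1', 'M1', 100.0);
INSERT INTO claims VALUES ('C2', 'M1', NULL);
INSERT INTO claims VALUES ('C3', NULL, 50.0);
""")
profile = cur.execute("""
    SELECT COUNT(*)                                          AS row_count,
           SUM(CASE WHEN member_id IS NULL THEN 1 ELSE 0 END) AS null_member_ids,
           COUNT(DISTINCT member_id)                          AS distinct_members,
           COUNT(amount)                                      AS non_null_amounts
    FROM claims
    -- COUNT(col) skips NULLs, so it doubles as a completeness check.
""").fetchone()
print(profile)  # (3, 1, 1, 2)
```

Profiles like this feed directly into integrity rules (e.g. "member_id must be non-null and unique") that the logical model then formalizes as constraints.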
Posted 2 weeks ago
12.0 - 15.0 years
15 - 20 Lacs
Bengaluru
Work from Office
Project Role : Solution Architect Project Role Description : Translate client requirements into differentiated, deliverable solutions using in-depth knowledge of a technology, function, or platform. Collaborate with the Sales Pursuit and Delivery Teams to develop a winnable and deliverable solution that underpins the client value proposition and business case. Must have skills : Solution Architecture Good to have skills : NAMinimum 12 year(s) of experience is required Educational Qualification : 15 years full time education Summary :As a Solution Architect, you will engage in a dynamic and collaborative environment where you will translate client requirements into innovative and effective solutions. Your typical day will involve working closely with various teams to ensure that the solutions developed are not only deliverable but also align with the client's business objectives. You will leverage your extensive knowledge of technology and platforms to create value propositions that resonate with clients, ensuring that their needs are met with precision and creativity. This role requires a proactive approach to problem-solving and a commitment to delivering high-quality outcomes that drive client satisfaction and business success. Roles & Responsibilities:- Expected to be an SME.- Collaborate and manage the team to perform.- Responsible for team decisions.- Engage with multiple teams and contribute on key decisions.- Expected to provide solutions to problems that apply across multiple teams.- Facilitate workshops and discussions to gather requirements and feedback from stakeholders.- Mentor junior team members to enhance their skills and knowledge in solution architecture. 
Professional & Technical Skills:
- Must Have Skills: Proficiency in Solution Architecture.
- Strong understanding of cloud computing platforms and services.
- Experience with enterprise application integration and API management.
- Ability to design scalable and resilient architectures.
- Familiarity with agile methodologies and project management practices.
Additional Information:
- The candidate should have a minimum of 12 years of experience in Solution Architecture.
- This position is based at our Bengaluru office.
- 15 years of full time education is required.
Qualification: 15 years full time education
Posted 2 weeks ago
7.0 - 10.0 years
5 - 8 Lacs
Gurugram
Remote
Location: Remote (India)
Employment Type: Contract (Remote)
Experience Required: 7+ Years
Job Summary: We are looking for a highly skilled Data Engineer / Data Modeler with strong experience in Snowflake, DBT, and GCP to support our data infrastructure and modeling initiatives. The ideal candidate should possess excellent SQL skills, hands-on experience with Erwin Data Modeler, and a strong background in modern data architectures and data modeling techniques.
Key Responsibilities:
- Design and implement scalable data models using Snowflake and Erwin Data Modeler.
- Create, maintain, and enhance data pipelines using DBT and GCP (BigQuery, Cloud Storage, Dataflow).
- Perform reverse engineering on existing systems (e.g., Sailfish/DDMS) using DBeaver or similar tools to understand and rebuild data models.
- Develop efficient SQL queries and stored procedures for data transformation, quality, and validation.
- Collaborate with business analysts and stakeholders to gather data requirements and convert them into physical and logical models.
- Ensure performance tuning, security, and optimization of the Snowflake data warehouse.
- Document metadata, data lineage, and business logic behind data structures and flows.
- Participate in code reviews, enforce coding standards, and provide best practices for data modeling and governance.
Must-Have Skills:
- Snowflake architecture, schema design, and data warehouse experience.
- DBT (Data Build Tool) for data transformation and pipeline development.
- Strong expertise in SQL (query optimization, complex joins, window functions, etc.).
- Hands-on experience with Erwin Data Modeler (logical and physical modeling).
- Experience with GCP (BigQuery, Cloud Composer, Cloud Storage).
- Experience in reverse engineering legacy systems like Sailfish or DDMS using DBeaver.
Good To Have:
- Experience with CI/CD tools and DevOps for data environments.
- Familiarity with data governance, security, and privacy practices.
- Exposure to Agile methodologies and working in distributed teams.
- Knowledge of Python for data engineering tasks and orchestration scripts.
Soft Skills:
- Excellent problem-solving and analytical skills.
- Strong communication and stakeholder management.
- Self-driven with the ability to work independently in a remote setup.
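The window-function expertise called out above can be sketched in miniature. SQLite stands in for Snowflake or BigQuery here, and the `orders` table is illustrative:

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # stand-in for Snowflake/BigQuery
conn.executescript("""
    CREATE TABLE orders (order_id INTEGER, customer TEXT, amount REAL);
    INSERT INTO orders VALUES (1, 'A', 10.0), (2, 'A', 30.0), (3, 'B', 20.0);
""")

# Rank each customer's orders by amount, and keep a per-customer running total.
rows = conn.execute("""
    SELECT customer, order_id,
           RANK() OVER (PARTITION BY customer ORDER BY amount DESC) AS rnk,
           SUM(amount) OVER (PARTITION BY customer ORDER BY order_id) AS running_total
    FROM orders
    ORDER BY customer, order_id
""").fetchall()

for row in rows:
    print(row)
# ('A', 1, 2, 10.0)
# ('A', 2, 1, 40.0)
# ('B', 3, 1, 20.0)
```

The PARTITION BY / ORDER BY clauses are standard SQL, so the same query runs unchanged on the warehouses named in the posting.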
Posted 2 weeks ago
7.0 - 10.0 years
5 - 8 Lacs
Noida
Remote
Job Summary: We are looking for a highly skilled Data Engineer / Data Modeler with strong experience in Snowflake, DBT, and GCP to support our data infrastructure and modeling initiatives. The ideal candidate should possess excellent SQL skills, hands-on experience with Erwin Data Modeler, and a strong background in modern data architectures and data modeling techniques.
Key Responsibilities:
- Design and implement scalable data models using Snowflake and Erwin Data Modeler.
- Create, maintain, and enhance data pipelines using DBT and GCP (BigQuery, Cloud Storage, Dataflow).
- Perform reverse engineering on existing systems (e.g., Sailfish/DDMS) using DBeaver or similar tools to understand and rebuild data models.
- Develop efficient SQL queries and stored procedures for data transformation, quality, and validation.
- Collaborate with business analysts and stakeholders to gather data requirements and convert them into physical and logical models.
- Ensure performance tuning, security, and optimization of the Snowflake data warehouse.
- Document metadata, data lineage, and business logic behind data structures and flows.
- Participate in code reviews, enforce coding standards, and provide best practices for data modeling and governance.
Must-Have Skills:
- Snowflake architecture, schema design, and data warehouse experience.
- DBT (Data Build Tool) for data transformation and pipeline development.
- Strong expertise in SQL (query optimization, complex joins, window functions, etc.).
- Hands-on experience with Erwin Data Modeler (logical and physical modeling).
- Experience with GCP (BigQuery, Cloud Composer, Cloud Storage).
- Experience in reverse engineering legacy systems like Sailfish or DDMS using DBeaver.
Good To Have:
- Experience with CI/CD tools and DevOps for data environments.
- Familiarity with data governance, security, and privacy practices.
- Exposure to Agile methodologies and working in distributed teams.
- Knowledge of Python for data engineering tasks and orchestration scripts.
Soft Skills:
- Excellent problem-solving and analytical skills.
- Strong communication and stakeholder management.
- Self-driven with the ability to work independently in a remote setup.
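The reverse-engineering responsibility above, recovering a model from an existing database the way one would with DBeaver, can also be approximated programmatically. This sketch walks SQLite's catalog; the `policy`/`claim` tables are hypothetical, and a real legacy system would expose an equivalent catalog (e.g. `information_schema`):

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # stand-in for a legacy system's database
conn.executescript("""
    CREATE TABLE policy (policy_id INTEGER PRIMARY KEY, holder TEXT NOT NULL);
    CREATE TABLE claim (claim_id INTEGER PRIMARY KEY,
                        policy_id INTEGER REFERENCES policy(policy_id));
""")

# Walk the catalog to recover tables, columns, and foreign keys -- the raw
# material for rebuilding a logical model in a tool like Erwin.
model = {}
tables = [r[0] for r in conn.execute(
    "SELECT name FROM sqlite_master WHERE type = 'table'")]
for table in tables:
    columns = [r[1] for r in conn.execute(f"PRAGMA table_info({table})")]
    # foreign_key_list rows: (id, seq, ref_table, from_col, to_col, ...)
    fks = [(r[3], r[2], r[4]) for r in conn.execute(f"PRAGMA foreign_key_list({table})")]
    model[table] = {"columns": columns, "foreign_keys": fks}

print(model["claim"])
```

The resulting dictionary, tables mapped to columns and relationships, is exactly what gets redrawn as entities and relationships when the model is rebuilt.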
Posted 2 weeks ago
6.0 - 10.0 years
0 Lacs
chennai, tamil nadu
On-site
As a Data Modeller specializing in GCP and cloud databases, you will play a crucial role in designing and optimizing data models for both OLTP and OLAP systems. Your expertise in cloud-based databases, data architecture, and modeling will be essential in collaborating with engineering and analytics teams to ensure efficient operational systems and real-time reporting pipelines.

You will be responsible for designing conceptual, logical, and physical data models tailored for OLTP and OLAP systems. Your focus will be on developing and refining models that support performance-optimized cloud data pipelines, implementing models in BigQuery, CloudSQL, and AlloyDB, and designing schemas with indexing, partitioning, and data sharding strategies. Translating business requirements into scalable data architecture and schemas will be a key aspect of your role, along with optimizing for near real-time ingestion, transformation, and query performance. You will use tools like DBSchema for collaborative modeling and documentation, and will create and maintain metadata and documentation around models.

Required skills: hands-on experience with GCP databases (BigQuery, CloudSQL, AlloyDB), a strong understanding of OLTP and OLAP systems, and proficiency in database performance tuning. Familiarity with modeling tools such as DBSchema or Erwin, along with proficiency in SQL, schema definition, and normalization/denormalization techniques, will also be beneficial.

Preferred skills include functional knowledge of the Mutual Fund or BFSI domain, experience integrating with cloud-native ETL and data orchestration pipelines, and familiarity with schema version control and CI/CD in a data context. Beyond technical skills, soft skills such as strong analytical and communication abilities, attention to detail, and a collaborative approach across engineering, product, and analytics teams are highly valued.
Joining this role will provide you with the opportunity to work on enterprise-scale cloud data architectures, drive performance-oriented data modeling for advanced analytics, and collaborate with high-performing cloud-native data teams.
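The indexing side of the schema-design work described above can be illustrated in miniature. SQLite's EXPLAIN QUERY PLAN stands in for plan inspection in BigQuery or CloudSQL, and the `txn` table is hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # stand-in for CloudSQL/AlloyDB
conn.execute("CREATE TABLE txn (txn_id INTEGER PRIMARY KEY, fund TEXT, amount REAL)")
conn.executemany("INSERT INTO txn VALUES (?, ?, ?)",
                 [(i, f"FUND{i % 5}", float(i)) for i in range(1000)])

query = "SELECT SUM(amount) FROM txn WHERE fund = 'FUND3'"

# Without an index, the optimizer must scan the whole table.
before = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()

# An index on the filter column changes the access path to an index search.
conn.execute("CREATE INDEX idx_txn_fund ON txn(fund)")
after = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()

print(before[-1][-1])  # e.g. SCAN txn
print(after[-1][-1])   # e.g. SEARCH txn USING INDEX idx_txn_fund (fund=?)
```

Partitioning plays the analogous role at warehouse scale: a BigQuery table partitioned on a date column lets the engine prune partitions the same way an index prunes rows here.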
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
hyderabad, telangana
On-site
As an Architectural Designer, your primary responsibility will be to engage with Business Stakeholders to gather requirements and translate them into scalable and robust Solution Designs. This will involve Data Modelling, API Design, Infrastructure Design, and App Integration. Having experience in building multi-tenant applications will be advantageous for this role. You will be expected to follow a Cloud First Approach by designing and implementing cloud solutions using Azure native resources, including PaaS, SaaS, and IaaS. Collaboration is key in this role as you will closely work with cross-functional teams such as developers, product managers, and operations staff to ensure seamless integration and alignment with business objectives. Security and Compliance are critical aspects of the job, and you will need to ensure that Solution Designs meet compliance requirements by obtaining approval from the Enterprise Architecture Team and InfoSec team. Additionally, you will provide technical leadership by acting as the Subject Matter Expert during Requirements Gathering, Issue Triaging meetings, and 3rd Party Software evaluations. Key skills required for this role include strong hands-on experience in implementing Microservice based Solutions, expertise in Azure services like Azure Kubernetes Service (AKS), Azure Functions, Azure Blob Storage, Azure SQL, Azure APIM, and Application Gateways. A good understanding of OWASP Vulnerabilities, microservices architecture principles, containerization (e.g., Docker, Kubernetes), and API design is also necessary. Moreover, hands-on experience with design and modelling tools such as Visio, Draw.io, Lucidchart, and Erwin will be beneficial. 
Your key responsibilities will involve designing cloud-native architectures using Azure PaaS/IaaS (AKS, Functions, APIM, Blob, SQL, etc.), leading API, data, and infrastructure design, ensuring security and compliance (OWASP, InfoSec), and collaborating with cross-functional teams while acting as a technical SME. Must-have qualifications for this role include hands-on experience with microservices and containerization (Docker, Kubernetes), strong expertise in Azure, and familiarity with architecture tools like Visio, Draw.io, and Lucidchart. An Azure/Cloud Architect certification, exposure to the EPC domain, and knowledge of TOGAF ADM would be considered nice-to-have for this position.
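The multi-tenant experience the posting values typically comes down to tenant-scoped data access. A minimal sketch of the shared-schema pattern follows; the table, column, and function names are hypothetical, and SQLite is used only for brevity:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Shared-schema multi-tenancy: one table, with a tenant_id discriminator column.
conn.executescript("""
    CREATE TABLE documents (doc_id INTEGER PRIMARY KEY, tenant_id TEXT NOT NULL, title TEXT);
    INSERT INTO documents VALUES (1, 'acme', 'Plan'), (2, 'globex', 'Spec'), (3, 'acme', 'Notes');
""")

def docs_for_tenant(tenant_id):
    # Every query is scoped by tenant_id, so tenants never see each other's rows.
    rows = conn.execute(
        "SELECT title FROM documents WHERE tenant_id = ? ORDER BY doc_id",
        (tenant_id,)).fetchall()
    return [r[0] for r in rows]

print(docs_for_tenant("acme"))    # ['Plan', 'Notes']
print(docs_for_tenant("globex"))  # ['Spec']
```

In an Azure deployment the same discriminator idea appears at different layers (row-level security in Azure SQL, per-tenant databases, or tenant claims enforced at the APIM/gateway level), traded off against isolation and cost requirements.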
Posted 2 weeks ago
4.0 - 8.0 years
7 - 11 Lacs
Hyderabad, Bengaluru, Mumbai (All Areas)
Work from Office
Job Title: Erwin Data Modeler, Insurance domain
Location: Any
Job Type: Full-Time | 2-11 pm Shift
Job Summary: We are seeking a skilled and experienced Data Modeler with hands-on expertise in Erwin Data Modeling to join our team. The ideal candidate will have a strong background in data architecture and modeling, with a minimum of 4 years of relevant experience. Knowledge of the insurance domain is a significant plus.
Key Responsibilities:
- Design, develop, and maintain conceptual, logical, and physical data models using Erwin Data Modeler.
- Collaborate with business analysts, data architects, and developers to understand data requirements and translate them into data models.
- Ensure data models align with enterprise standards and best practices.
- Perform data analysis and profiling to support modeling efforts.
- Maintain metadata and documentation for data models.
- Support data governance and data quality initiatives.
- Participate in reviews and provide feedback on data models and database designs.
Required Skills & Qualifications:
- Strong understanding of data modeling concepts including normalization, denormalization, and dimensional modeling.
- Knowledge of any relational database is an advantage.
- Familiarity with data warehousing and ETL processes.
- Excellent analytical and problem-solving skills.
- Strong communication and collaboration abilities.
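The dimensional-modeling concept listed in the qualifications can be sketched as a tiny star schema. The insurance-flavored table and column names are illustrative only, and SQLite stands in for the target database:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Dimension tables describe the who/what/when of each event.
    CREATE TABLE dim_policyholder (holder_key INTEGER PRIMARY KEY, name TEXT, state TEXT);
    CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, year INTEGER, month INTEGER);
    -- The fact table holds measures plus foreign keys into the dimensions.
    CREATE TABLE fact_claim (
        claim_key INTEGER PRIMARY KEY,
        holder_key INTEGER REFERENCES dim_policyholder(holder_key),
        date_key INTEGER REFERENCES dim_date(date_key),
        paid_amount REAL
    );
    INSERT INTO dim_policyholder VALUES (1, 'Ann', 'KA'), (2, 'Raj', 'MH');
    INSERT INTO dim_date VALUES (20240101, 2024, 1), (20240201, 2024, 2);
    INSERT INTO fact_claim VALUES (1, 1, 20240101, 500.0), (2, 2, 20240101, 300.0),
                                  (3, 1, 20240201, 200.0);
""")

# A typical dimensional query: total paid amount sliced by state and month.
rows = conn.execute("""
    SELECT p.state, d.month, SUM(f.paid_amount)
    FROM fact_claim f
    JOIN dim_policyholder p ON p.holder_key = f.holder_key
    JOIN dim_date d ON d.date_key = f.date_key
    GROUP BY p.state, d.month
    ORDER BY p.state, d.month
""").fetchall()

print(rows)  # [('KA', 1, 500.0), ('KA', 2, 200.0), ('MH', 1, 300.0)]
```

In Erwin, this is the physical model generated from a logical star: one central fact entity with many-to-one relationships out to each dimension.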
Posted 2 weeks ago
8.0 - 10.0 years
25 - 30 Lacs
Bengaluru
Work from Office
We are looking for an experienced Data Modelling professional, proficient in tools such as Erwin and ER/Studio. A strong understanding of Azure Databricks, Snowflake/Redshift, SAP HANA, and advanced SQL is required. Prior experience in leading teams is also preferred.
Posted 2 weeks ago