
100 Data Vault Jobs

Set up a job alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

6.0 - 11.0 years

25 - 40 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Role & responsibilities
- 10+ years of experience in the data space.
- Solid SQL knowledge; able to suggest modeling approaches for a given problem.
- Significant experience in one or more RDBMS (Oracle, DB2, SQL Server).
- Real-time experience working with OLAP & OLTP database models (dimensional models).
- Comprehensive understanding of Star schema, Snowflake schema, and Data Vault modelling, plus any ETL tool, data governance, and data quality.
- An eye for analyzing data and comfort with agile methodology.
- Good understanding of any of the cloud services (Azure, AWS & GCP) preferred.
- Enthusiasm to coach team members, collaborate with stakeholders across the organization, and take complete ownership of deliverables.
- Experience contributing to proposals and RFPs.
- Good experience in stakeholder management.
- Good communication and experience leading a team.

Posted 3 days ago

Apply

15.0 - 20.0 years

5 - 9 Lacs

Pune

Work from Office

About The Role
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: AWS, Python, and Data Vault
Minimum experience: 5 years
Educational Qualification: 15 years full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing solutions, and ensuring that applications are aligned with business objectives. You will engage in problem-solving activities, manage project timelines, and contribute to the overall success of application development initiatives.

Roles & Responsibilities:
- Act as a subject matter expert (SME).
- Collaborate with and manage the team to perform.
- Take responsibility for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure alignment with business goals.

Professional & Technical Skills:
- Proficiency in Snowflake Data Warehouse (must have).
- AWS, Python, and Data Vault skills (good to have).
- Strong understanding of data modeling and ETL processes.
- Experience with SQL and data querying techniques.
- Familiarity with cloud computing concepts and services.
- Ability to troubleshoot and optimize data warehouse performance.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Snowflake Data Warehouse.
- This position is based at our Pune office.
- 15 years of full-time education is required.

Qualification: 15 years full-time education

Posted 5 days ago

Apply

8.0 - 13.0 years

14 - 24 Lacs

Bengaluru

Remote

Role & responsibilities
Primary skills: Data Vault 2.0, Snowflake, and dbt
Secondary skill: Data Modelling

Responsibilities:
- Collaborate with business analysts and stakeholders to understand business needs and data requirements.
- Analyze business processes and identify critical data elements for modeling.
- Conduct data profiling and analysis to identify data quality issues and potential data modeling challenges.
- Develop conceptual data models to capture high-level business entities, attributes, and relationships within the database structure.
- Use Data Vault to create data models that detail the different parts of a business, such as hubs, satellites, and links.
- Build automated loading processes and patterns to minimize development expenses and operational costs.
- Good knowledge of dimensional modeling and design patterns.
- Lead and manage data operations projects, demonstrating strong experience in managing the end-to-end data analytics workflow.
- Lead the data engineering team in designing, building, and maintaining a data lake on Snowflake using ADF.
- Analyze and interpret data from the Snowflake data warehouse to identify key trends and insights.
- Document work processes and maintain clear documentation for future reference.
- Solid understanding of data modeling concepts, including dimensional and fact modeling.
- Bring hands-on experience from at least one implementation project related to reporting and visualization.
- Apply best practices for dimensional and fact modeling for optimal performance and scalability.

Qualifications:
- 8+ years of experience in dimensional modeling and design patterns.
- Proven experience designing and developing data models that detail business parts such as hubs, satellites, and links.
- Familiarity with Snowflake and designing models on top of it.
- Strong SQL skills for data retrieval and manipulation.
- Experience with data visualization best practices and principles.
- Ability to work independently and collaboratively within a team.
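The Data Vault structures this listing asks for (hubs, links, satellites) are conventionally keyed by hashes of business keys rather than by sequence IDs. As a hedged illustration only — the table and column names below are hypothetical, and MD5 is one common but not mandatory choice of hash — a minimal Python sketch of hub and link key generation:

```python
import hashlib


def hub_hash_key(*business_key_parts: str) -> str:
    """Data Vault 2.0-style hash key: trim and upper-case each
    business key part, join with a delimiter, then hash. The
    normalization makes ' cust-1001 ' and 'CUST-1001' collide
    onto the same hub row, which is usually the desired behavior."""
    normalized = "||".join(p.strip().upper() for p in business_key_parts)
    return hashlib.md5(normalized.encode("utf-8")).hexdigest()


# A hub row for a hypothetical Customer hub:
customer_bk = "cust-1001 "
hub_row = {
    "hub_customer_hk": hub_hash_key(customer_bk),
    "customer_bk": customer_bk.strip(),
    "load_dts": "2025-01-01T00:00:00Z",  # load timestamp
    "record_source": "CRM",              # originating system
}

# A link row hashes the combined business keys of the hubs it relates:
order_bk = "ord-77"
link_row = {
    "lnk_customer_order_hk": hub_hash_key(customer_bk, order_bk),
    "hub_customer_hk": hub_hash_key(customer_bk),
    "hub_order_hk": hub_hash_key(order_bk),
}
```

In practice this logic lives in the ELT layer (dbt macros, Snowflake SQL `MD5()`/`SHA1()` expressions, or a vault automation tool) rather than in application Python; the sketch only shows the key-derivation convention.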

Posted 5 days ago

Apply

13.0 - 17.0 years

0 Lacs

Karnataka

On-site

You will be a key member of the technology team, working closely with treasury professionals to understand their needs and translate them into robust and scalable software solutions. The position available is for a Principal Software Engineer at Career Level P4 in the role of Assistant Vice President, with a hybrid work setup in Bangalore. You will be part of the Corporate Functions team responsible for all applications used by the Corporate Finance team, collaborating with functions such as Accounting and Recon, Financial Planning and Analysis, Regulatory Reporting, Procurement, Treasury, and Taxation. Your primary focus will be on designing, developing, and maintaining data models that support Treasury operations, investment management, risk management, financial analysis, and regulatory compliance.

Key responsibilities include creating and maintaining data models for Treasury operations, analyzing data to support decision-making, ensuring data quality and compliance with regulations, integrating Treasury data models with other financial systems, developing reports and dashboards, and providing technical support for deployed solutions.

The ideal candidate should hold a Bachelor's degree in computer science with over 13 years of experience and a strong background in building data platforms, large-scale data modeling, SQL, Python, Java, and/or Spark. Experience with large-scale database technology and cloud certifications is advantageous. Strong communication, collaboration, and problem-solving skills and familiarity with Agile methodologies are preferred. This role requires building relationships with various teams within and outside the Corporate Functions department, emphasizing effective communication and collaboration for success.

We are committed to providing an inclusive and accessible hiring process. If you require accommodations at any stage, please inform us for a seamless experience. First Citizens India LLP is an equal opportunity employer.

Posted 5 days ago

Apply

6.0 - 10.0 years

8 - 13 Lacs

Hyderabad

Work from Office

Job description
Position Title: Data Engineer (Snowflake Lead)
Experience: 7+ years
Shift Schedule: Rotational shifts
Location: Hyderabad

Role Overview
We are seeking an experienced Data Engineer with strong expertise in Snowflake to join the Snowflake Managed Services team. This role involves data platform development, enhancements, and production support across multiple clients. You will be responsible for ensuring the stability, performance, and continuous improvement of Snowflake environments.

Key Responsibilities
- Design, build, and optimize Snowflake data pipelines, data models, and transformations.
- Provide L2/L3 production support for Snowflake jobs, queries, and integrations.
- Troubleshoot job failures, resolve incidents, and perform root cause analysis (RCA).
- Monitor warehouses, tune queries, and optimize Snowflake performance and costs.
- Manage service requests such as user provisioning, access control, and role management.
- Create and maintain documentation, runbooks, and standard operating procedures.

Required Skills & Experience
- 5+ years of hands-on experience in Snowflake development and support.
- Strong expertise in SQL, data modeling, and query performance tuning.
- Experience with ETL/ELT development and orchestration tools (e.g., Azure Data Factory).
- Familiarity with CI/CD pipelines and scripting (Python or PySpark).
- Strong troubleshooting and incident-resolution skills.

Preferred Skills
- SnowPro Core Certification.
- Experience with ticketing systems (ServiceNow, Jira).
- Hands-on experience with Azure cloud services.
- Knowledge of ITIL processes.

Posted 6 days ago

Apply

6.0 - 11.0 years

20 - 35 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Are you ready to make a difference in the data space? Looking for immediate joiners - only candidates available to join in September 2025 are eligible to apply.

Job Title: Data Modeller & Architect
Location: Bengaluru, Chennai, Hyderabad

What do we expect?
- 6-12 years of experience in data modelling.
- Solid SQL knowledge; able to suggest modeling approaches for a given problem.
- Significant experience in one or more RDBMS (Oracle, DB2, SQL Server).
- Real-time experience working with OLAP & OLTP database models (dimensional models).
- Comprehensive understanding of Star schema, Snowflake schema, and Data Vault modelling, plus any ETL tool, data governance, and data quality.
- An eye for analyzing data and comfort with agile methodology.
- Good understanding of any of the cloud services (Azure, AWS & GCP) preferred.
- Enthusiasm to coach team members, collaborate with stakeholders across the organization, and take complete ownership of deliverables.
- Experience contributing to proposals and RFPs.
- Good experience in stakeholder management.
- Good communication and experience leading a team.

Contact Amirtha (HR - Aram Hiring) - WhatsApp your resume to 8122080023 / amirtha@aramhiring.com

Who is our client:
Our client is a global leader in AI and analytics, helping Fortune 1000 companies solve their toughest challenges. They offer full-stack AI and analytics services & solutions to empower businesses to achieve real outcomes and value at scale. They are on a mission to push the boundaries of what AI and analytics can do to help enterprises navigate uncertainty and move forward decisively. Their purpose is to provide certainty to shape a better tomorrow. Our client operates with 4000+ technologists and consultants based in the US, Canada, the UK, India, Singapore, and Australia, working closely with clients across CPG, Retail, Insurance, BFS, Manufacturing, Life Sciences, and Healthcare. Many of their team leaders rank in Top 10 and 40 Under 40 lists, exemplifying their dedication to innovation and excellence. The client is Great Place to Work-Certified (2022-24) and recognized by analyst firms such as Forrester, Gartner, HFS, Everest, ISG, and others. They have been ranked among the best and fastest-growing analytics firms by Inc., Financial Times, Economic Times, and Analytics India Magazine.

Curious about the role? What would your typical day look like?
As an Engineer and Architect, you will work to solve some of the most complex and captivating data management problems that enable the client to operate as a data-driven organization, seamlessly switching between the roles of individual contributor, team member, and data modeling architect as demanded by each project to define, design, and deliver actionable insights.

On a typical day, you might:
- Engage clients and understand the business requirements to translate them into data models.
- Analyze customer problems, propose solutions from a data-structure perspective, and estimate and deliver proposed solutions.
- Create and maintain a Logical Data Model (LDM) and Physical Data Model (PDM) by applying best practices to provide business insights.
- Use a data modelling tool to create appropriate data models.
- Create and maintain the source-to-target data mapping document, including documentation of all entities, attributes, data relationships, primary and foreign key structures, allowed values, codes, business rules, glossary terms, etc.
- Gather and publish data dictionaries.
- Ideate, design, and guide the teams in building automations and accelerators.
- Maintain data models, as well as capture data models from existing databases and record descriptive information.
- Contribute to building data warehouses & data marts (on cloud) while performing data profiling and quality analysis.
- Use version control to maintain versions of data models.
- Collaborate with data engineers to design and develop data extraction and integration code modules.
- Partner with data engineers & testing practitioners to strategize ingestion logic, consumption patterns & testing.
- Ideate to design & develop the next-gen data platform by collaborating with cross-functional stakeholders.
- Work with the client to define, establish, and implement the right modelling approach as per the requirement.
- Help define standards and best practices.
- Monitor project progress to keep leadership teams informed of milestones, impediments, etc.
- Coach team members and review code artifacts.
- Contribute to proposals and RFPs.

You are important to us, let's stay connected! Every individual comes with a different set of skills and qualities, so even if you don't tick all the boxes for the role today, we urge you to apply, as there might be a suitable/unique role for you tomorrow. We are an equal-opportunity employer. Our diverse and inclusive culture and values guide us to listen, trust, respect, and encourage people to grow the way they desire.

Note: The designation will be commensurate with expertise and experience. Compensation packages are among the best in the industry.

Kindly share your resume at amirtha@aramhiring.com / 8122080023

Posted 6 days ago

Apply

12.0 - 16.0 years

35 - 45 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Role - Data Architect - Data Modeling
Exp - 12-16 yrs
Locations - Chennai, Hyderabad, Bengaluru, Delhi, Pune
Position - Permanent FTE
Client - Data analytics global leader

Must-have skills:
- Strong SQL
- Strong data warehousing skills
- ER/Relational/Dimensional data modeling
- Data Vault modeling
- OLAP, OLTP
- Schemas & data marts

Good-to-have skills:
- Data Vault
- ERwin / ER Studio
- Cloud platforms (AWS or Azure)

Posted 6 days ago

Apply

6.0 - 10.0 years

25 - 35 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Role - Data Modeler/Senior Data Modeler
Exp - 6 to 9 yrs
Locations - Chennai, Hyderabad, Bengaluru, Delhi, Pune
Position - Permanent

Must-have skills:
- Strong SQL
- Strong data warehousing skills
- ER/Relational/Dimensional data modeling
- Data Vault modeling
- OLAP, OLTP
- Schemas & data marts

Good-to-have skills:
- Data Vault
- ERwin / ER Studio
- Cloud platforms (AWS or Azure)

Posted 6 days ago

Apply

5.0 - 10.0 years

16 - 31 Lacs

Gurugram, Bengaluru

Hybrid

Role: Data Modeller
Experience: 5-12 years
Location: Gurugram/Bangalore
Notice Period: Immediate to 45 days

Your scope of work / key responsibilities:
- Build and maintain standards-based data models to report disparate data sets in a reliable, consistent, and interpretable manner.
- Gather, distil, and harmonize data requirements, and design coherent conceptual, logical, and physical data models and associated physical feed formats to support these data flows.
- Articulate business requirements and build source-to-target mappings with complex ETL transformations.
- Write complex SQL statements and profile source data to validate data transformations.
- Contribute to requirement analysis and database design - transactional and dimensional data modelling.
- Normalize/de-normalize data structures, and introduce hierarchies and inheritance wherever required in existing/new data models.
- Develop and implement data warehouse projects independently.
- Work with data consumers and data suppliers to understand detailed requirements and propose standardized data models.
- Contribute to improving the Data Management data models.
- Act as an influencer: present and facilitate discussions to understand business requirements, and develop dimensional data models based on these capabilities and industry best practices.
- Understanding of the insurance domain.
- Basic understanding of AWS cloud.
- Good understanding of Master Data Management, data quality, and data governance.
- Basic understanding of data visualization tools like SAS VA and Tableau.
- Good understanding of implementing & architecting data solutions using Informatica and SQL Server/Oracle.

Key qualifications and experience:
- Extensive practical experience in information technology and software development projects, with at least 8+ years of experience designing operational data stores & data warehouses.
- Extensive experience with data modelling tools such as Erwin or SAP PowerDesigner.
- Strong understanding of ETL and data warehouse concepts, processes, and best practices.
- Proficient in data modelling, including conceptual, logical, and physical data modelling for both OLTP and OLAP.
- Ability to write complex SQL for data transformations and data profiling in source and target systems.
- Basic understanding of SQL vs. NoSQL databases.
- A combination of solid business knowledge and technical expertise, with strong communication skills.
- Excellent analytical and logical thinking.
- Good verbal & written communication skills and the ability to work independently as well as in a team environment, providing structure in ambiguous situations.

Interested candidates can share their resume at divya@beanhr.com

Posted 1 week ago

Apply

5.0 - 10.0 years

9 - 13 Lacs

Bengaluru

Work from Office

About The Role
Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must-have skills: Data Modeling Techniques and Methodologies
Minimum experience: 7.5 years
Educational Qualification: 15 years full-time education

Summary:
Experience:
- Overall IT experience: 9+ years
- Data modeling experience: 5+ years
- Data Vault modeling experience: 3+ years

Key Responsibilities:
- Drive discussions with client deal teams to understand business requirements and how the data model fits into implementation and solutioning.
- Develop the solution blueprint, scoping, and estimation in delivery projects and solutioning.
- Drive discovery activities and design workshops with the client, and lead strategic road-mapping and operating model design discussions.
- Design and develop Data Vault 2.0-compliant models, including Hubs, Links, and Satellites.
- Design and develop the Raw Data Vault and Business Data Vault.
- Translate business requirements into conceptual, logical, and physical data models.
- Work with source system analysts to understand data structures and lineage.
- Ensure conformance to data modeling standards and best practices.
- Collaborate with ETL/ELT developers to implement data models in a modern data warehouse environment (e.g., Snowflake, Databricks, Redshift, BigQuery).
- Optimize data architecture for performance, scalability, and reliability.
- Document models, data definitions, and metadata.
- Support data governance, quality, and master data management initiatives.
- Participate in code reviews, modeling workshops, and agile ceremonies (if applicable).

Technical Experience:
- 9+ years overall IT experience, 5+ years in data modeling, and 3+ years in Data Vault modeling.
- Design and development of the Raw Data Vault and Business Data Vault.
- Strong understanding of the Data Vault 2.0 methodology, including business keys, record tracking, and historical tracking.
- Data modeling experience in dimensional modeling/3NF modeling.
- Hands-on experience with data modeling tools (e.g., ER/Studio, ERwin, or similar).
- Hands-on experience with a Data Vault automation tool (e.g., VaultSpeed, WhereScape, biGENIUS-X, dbt, or similar).
- Solid understanding of ETL/ELT processes, data integration, and warehousing concepts.
- Experience with modern cloud data platforms (e.g., Snowflake, Databricks, Azure Synapse, AWS Redshift, or Google BigQuery).
- Familiarity with data architecture principles.
- Excellent SQL skills.

Good to Have Skills:
- Any one of these add-on skills: graph database modelling, RDF, document DB modeling, ontology, semantic data modeling.
- Understanding of the Data Analytics on Cloud landscape and data lake design knowledge.
- Cloud data engineering, cloud data integration.

Professional Experience:
- Strong requirement analysis and technical solutioning skills in data and analytics.
- Excellent writing, communication, and presentation skills.
- Eagerness to learn new skills and develop oneself on an ongoing basis.
- Good client-facing and interpersonal skills.

Educational Qualification:
- B.E. or B.Tech. required.

Qualification: 15 years full-time education
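The Data Vault 2.0 methodology this role calls for (business keys, record tracking, historical tracking) typically tracks history in satellites via a hashdiff: the descriptive attributes are hashed, and a new satellite row is inserted only when the hash differs from the latest stored one. A minimal, hedged Python sketch of that delta-detection step — function, key, and column names here are illustrative, not taken from any specific tool:

```python
import hashlib
from typing import Mapping


def hashdiff(attrs: Mapping[str, str]) -> str:
    """Data Vault 2.0-style hashdiff: concatenate the descriptive
    attributes in a fixed (sorted) column order and hash the result,
    so a changed row can be detected with a single comparison."""
    payload = "||".join(str(attrs[k]).strip() for k in sorted(attrs))
    return hashlib.md5(payload.encode("utf-8")).hexdigest()


def rows_to_insert(incoming, current_hashdiff_by_key):
    """Keep only incoming rows whose hashdiff differs from the latest
    satellite row for the same hub hash key (change detection)."""
    delta = []
    for hub_hk, attrs in incoming:
        hd = hashdiff(attrs)
        if current_hashdiff_by_key.get(hub_hk) != hd:
            delta.append((hub_hk, attrs, hd))
    return delta


# Hypothetical batch: hk1 is unchanged, hk2 changed its city.
incoming = [
    ("hk1", {"name": "Ada", "city": "Pune"}),
    ("hk2", {"name": "Grace", "city": "Chennai"}),
]
current = {
    "hk1": hashdiff({"name": "Ada", "city": "Pune"}),
    "hk2": hashdiff({"name": "Grace", "city": "Delhi"}),
}
delta = rows_to_insert(incoming, current)  # only hk2 needs a new row
```

In production this comparison is usually expressed in set-based SQL or generated by a vault automation tool; the sketch just makes the record-tracking logic explicit.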

Posted 1 week ago

Apply

3.0 - 8.0 years

9 - 13 Lacs

Bengaluru

Work from Office

About The Role
Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must-have skills: Data Modeling Techniques and Methodologies
Minimum experience: 5 years
Educational Qualification: 15 years full-time education

Summary:
As a Data Modeler, you will be responsible for understanding business requirements and data mappings, creating and maintaining data models through different stages using data modeling tools, and handing over the physical design/DDL scripts to the data engineers for implementation of the data models. Your role involves creating and maintaining data models while ensuring the performance and quality of deliverables.

Experience:
- Overall IT experience: 7+ years
- Data modeling experience: 3+ years

Key Responsibilities:
- Drive discussions with client teams to understand business requirements and develop data models that fit the requirements.
- Drive discovery activities and design workshops with the client, and support design discussions.
- Create data modeling deliverables and obtain sign-off.
- Develop the solution blueprint and scoping, and do estimation for delivery projects.

Technical Experience:
Must-Have Skills:
- 7+ years overall IT experience with 3+ years in data modeling.
- Data modeling experience in dimensional modeling/3NF modeling/NoSQL DB modeling.
- Experience on at least one cloud DB design engagement.
- Conversant with modern data platforms.
- Work experience on data transformation and analytics projects; understanding of DWH.
- Instrumental in DB design through all stages of data modeling.
- Experience with at least one leading data modeling tool, e.g., Erwin, ER Studio, or equivalent.

Good to Have Skills:
- Any of these add-on skills: Data Vault modeling, graph database modelling, RDF, document DB modeling, ontology, semantic data modeling.
- Understanding of the Data Analytics on Cloud landscape and data lake design knowledge.
- Cloud data engineering, cloud data integration.
- Familiarity with data architecture principles.

Professional Experience:
- Strong requirement analysis and technical solutioning skills in data and analytics.
- Excellent writing, communication, and presentation skills.
- Eagerness to learn new skills and develop oneself on an ongoing basis.
- Good client-facing and interpersonal skills.

Educational Qualification:
- B.E. or B.Tech. required.

Qualification: 15 years full-time education

Posted 1 week ago

Apply

1.0 - 4.0 years

3 - 7 Lacs

Bengaluru

Work from Office

About The Role
Project Role: Application Support Engineer
Project Role Description: Act as software detectives; provide a dynamic service identifying and solving issues within multiple components of critical business systems.
Must-have skills: Cloud Data Architecture
Minimum experience: 7.5 years
Educational Qualification: 15 years full-time education

Data Architect
Kemper is seeking a Data Architect to join our team. You will work as part of a distributed team and with Infrastructure, Enterprise Data Services, and Application Development teams to coordinate the creation, enrichment, and movement of data throughout the enterprise. Your central responsibility as an architect will be improving the consistency, timeliness, quality, security, and delivery of data as part of Kemper's Data Governance framework. In addition, the Architect must streamline data flows and optimize cost management in a hybrid cloud environment. Your duties may include assessing architectural models and supervising data migrations across IaaS, PaaS, SaaS, and on-premises systems, as well as data platform selection and onboarding of data management solutions that meet the technical and operational needs of the company. To succeed in this role, you should know how to examine new and legacy requirements and define cost-effective patterns to be implemented by other teams. You must then be able to represent the required patterns during implementation projects. The ideal candidate will have proven experience in cloud (Snowflake, AWS, and Azure) architectural analysis and management.

Responsibilities
- Define architectural standards and guidelines for data products and processes. Assess and document when and how to use existing and newly architected producers and consumers, the technologies to be used for various purposes, and models of selected entities and processes. The guidelines should encourage reuse of existing data products, as well as address issues of security, timeliness, and quality.
- Work with Information & Insights, Data Governance, Business Data Stewards, and Implementation teams to define standard and ad hoc data products and data product sets.
- Work with Enterprise Architecture, Security, and Implementation teams to define the transformation of data products throughout hybrid cloud environments, ensuring that both functional and non-functional requirements are addressed. This includes ownership, frequency of movement, the source and destination of each step, how the data is transformed as it moves, and any aggregation or calculations.
- Work with Data Governance and project teams to model and map data sources, including descriptions of the business meaning of the data, its uses, its quality, the applications that maintain it, and the technologies in which it is stored. Documentation of a data source must describe the semantics of the data so that occasional subtle differences in meaning are understood.
- Define integrative views of data to draw together data from across the enterprise. Some views will use data stores of extracted data, and others will bring together data in near real time. Solutions must consider data currency, availability, response times, data volumes, etc.
- Work with modeling and storage teams to define conceptual, logical, and physical data views, limiting technical debt as data flows through transformations.
- Investigate and lead participation in POCs of emerging technologies and practices.
- Leverage and evolve existing [core] data products and patterns.
- Communicate and lead understanding of data architectural services across the enterprise.
- Ensure a focus on data quality by working effectively with data and system stewards.

Qualifications
- Bachelor's degree in computer science or computer engineering, or equivalent experience.
- A minimum of 3 years' experience in a similar role.
- Demonstrable knowledge of secure DevOps and SDLC processes.
- AWS or Azure experience required.
- Experience with Data Vault 2.0 required; Snowflake a plus.
- Familiarity with system concepts and tools within an enterprise architecture framework, including cataloging, MDM, RDM, data lakes, storage patterns, etc.
- Excellent organizational and analytical abilities.
- Outstanding problem solver.
- Good written and verbal communication skills.

Qualification: 15 years full-time education

Posted 1 week ago

Apply

3.0 - 8.0 years

9 - 13 Lacs

Bengaluru

Work from Office

About The Role
Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must-have skills: Data Modeling Techniques and Methodologies
Minimum experience: 5 years
Educational Qualification: 15 years full-time education

Summary:
As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while also engaging in discussions to refine and enhance the overall data architecture. You will be involved in various stages of the data platform lifecycle, ensuring that all components work harmoniously to support the organization's data needs and objectives.

Experience:
- Overall IT experience: 7+ years
- Data modeling experience: 3+ years
- Data Vault modeling experience: 2+ years

Key Responsibilities:
- Drive discussions with client deal teams to understand business requirements and how the data model fits into implementation and solutioning.
- Develop the solution blueprint, scoping, and estimation in delivery projects and solutioning.
- Drive discovery activities and design workshops with the client, and lead strategic road-mapping and operating model design discussions.
- Design and develop Data Vault 2.0-compliant models, including Hubs, Links, and Satellites.
- Design and develop the Raw Data Vault and Business Data Vault.
- Translate business requirements into conceptual, logical, and physical data models.
- Work with source system analysts to understand data structures and lineage.
- Ensure conformance to data modeling standards and best practices.
- Collaborate with ETL/ELT developers to implement data models in a modern data warehouse environment (e.g., Snowflake, Databricks, Redshift, BigQuery).
- Document models, data definitions, and metadata.

Technical Experience:
- 7+ years overall IT experience, 3+ years in data modeling, and 2+ years in Data Vault modeling.
- Design and development of the Raw Data Vault and Business Data Vault.
- Strong understanding of the Data Vault 2.0 methodology, including business keys, record tracking, and historical tracking.
- Data modeling experience in dimensional modeling/3NF modeling.
- Hands-on experience with data modeling tools (e.g., ER/Studio, ERwin, or similar).
- Solid understanding of ETL/ELT processes, data integration, and warehousing concepts.
- Experience with modern cloud data platforms (e.g., Snowflake, Databricks, Azure Synapse, AWS Redshift, or Google BigQuery).
- Excellent SQL skills.

Good to Have Skills:
- Any one of these add-on skills: graph database modelling, RDF, document DB modeling, ontology, semantic data modeling.
- Hands-on experience with a Data Vault automation tool (e.g., VaultSpeed, WhereScape, biGENIUS-X, dbt, or similar).
- Understanding of the Data Analytics on Cloud landscape and data lake design knowledge.
- Cloud data engineering, cloud data integration.

Professional Experience:
- Strong requirement analysis and technical solutioning skills in data and analytics.
- Excellent writing, communication, and presentation skills.
- Eagerness to learn new skills and develop oneself on an ongoing basis.
- Good client-facing and interpersonal skills.

Educational Qualification:
- B.E. or B.Tech. required.

Qualification: 15 years full-time education

Posted 1 week ago

Apply

7.0 - 12.0 years

35 - 60 Lacs

pune, bengaluru, delhi / ncr

Work from Office

Data Modeller Permanent Chennai / Bangalore / Pune / Hyderabad / Delhi NCR Preferred candidate profile 7 - 15+ years of experience in Data Modeling. Significant experience in one or more RDBMS (Oracle, DB2, and SQL Server) Real-time experience working in OLAP & OLTP database models (Dimensional models). Comprehensive understanding of Star schema, Snowflake schema, and Data Vault Modelling. Also, on any ETL tool, Data Governance, and Data quality. Eye to analyze data & comfortable with following agile methodology. Adept understanding of any of the cloud services is preferred (Azure, AWS & GCP) Required Skills: Strong experience in SQL (tables, attributes, joins, queries), Strong DWH Skills, Relational/Dimensional/ER Modeling, Data Vault modeling, OLAP/OLTP, Facts and Dimensions, Normalisation (3NF), Data marts, Schemas, Keys, Dimension types, M:M relationship & Forward/Reverse Engineering etc. Good to Have: Data Vault, Tool experience (Erwin, ER Studio, Visio, Power Designer) & Cloud etc. If interested, kindly share your updated profile along with the below details. Full Name: Contact #: Email ID: Total experience: Key skills: Current Location: Relocation: Current Employer: CTC (Including Variable): ECTC: Notice Period: Holding any offer:

Posted 1 week ago

Apply

7.0 - 12.0 years

3 - 7 Lacs

bengaluru

Work from Office

Project Role : Application Support Engineer Project Role Description : Act as software detectives, provide a dynamic service identifying and solving issues within multiple components of critical business systems. Must have skills : Cloud Data Architecture Good to have skills : Cloud Infrastructure Minimum 7.5 year(s) of experience is required Educational Qualification : 15 years full time education Data Architect Kemper is seeking a Data Architect to join our team. You will work as part of a distributed team and with Infrastructure, Enterprise Data Services and Application Development teams to coordinate the creation, enrichment, and movement of data throughout the enterprise. Your central responsibility as an architect will be improving the consistency, timeliness, quality, security, and delivery of data as part of Kemper's Data Governance framework. In addition, the Architect must streamline data flows and optimize cost management in a hybrid cloud environment. Your duties may include assessing architectural models and supervising data migrations across IaaS, PaaS, SaaS and on-premises systems, as well as data platform selection and on-boarding of data management solutions that meet the technical and operational needs of the company. To succeed in this role, you should know how to examine new and legacy requirements and define cost-effective patterns to be implemented by other teams. You must then be able to represent required patterns during implementation projects. The ideal candidate will have proven experience in cloud (Snowflake, AWS and Azure) architectural analysis and management. Responsibilities Define architectural standards and guidelines for data products and processes. Assess and document when and how to use existing and newly architected producers and consumers, the technologies to be used for various purposes, and models of selected entities and processes. 
The guidelines should encourage reuse of existing data products, as well as address issues of security, timeliness, and quality. Work with Information & Insights, Data Governance, Business Data Stewards, and Implementation teams to define standard and ad-hoc data products and data product sets. Work with Enterprise Architecture, Security, and Implementation teams to define the transformation of data products throughout hybrid cloud environments assuring that both functional and non-functional requirements are addressed. This includes the ownership, frequency of movement, the source and destination of each step, how the data is transformed as it moves, and any aggregation or calculations. Working with Data Governance and project teams to model and map data sources, including descriptions of the business meaning of the data, its uses, its quality, the applications that maintain it and the technologies in which it is stored. Documentation of a data source must describe the semantics of the data so that the occasional subtle differences in meaning are understood. Defining integrative views of data to draw together data from across the enterprise. Some views will use data stores of extracted data and others will bring together data in near real time. Solutions must consider data currency, availability, response times and data volumes, etc. Working with modeling and storage teams to define Conceptual, Logical and Physical data views limiting technical debt as data flows through transformations. Investigating and leading participation in POCs of emerging technologies and practices. Leveraging and evolving existing [core] data products and patterns. Communicate and lead understanding of data architectural services across the enterprise. Ensure a focus on data quality by working effectively with data and system stewards. QualificationsBachelor's degree in computer science, Computer Engineering, or equivalent experience. A minimum of 3 years' experience in a similar role. 
Demonstrable knowledge of Secure DevOps and SDLC processes. Must have AWS or Azure experience. Experience with Data Vault 2.0 required; Snowflake a plus. Familiarity with system concepts and tools within an enterprise architecture framework, including Cataloging, MDM, RDM, Data Lakes, Storage Patterns, etc. Excellent organizational and analytical abilities. Outstanding problem solver. Good written and verbal communication skills.

Posted 1 week ago

Apply

5.0 - 9.0 years

12 - 15 Lacs

noida, chennai, delhi / ncr

Work from Office

Snowflake Developer Location: Chennai/ Noida Experience: 5 to 10 years Joining time: within 10 days Salary : 10-15 LPA Primary Skills: Snowflake, Data Vault, SQL, ETL Job Description: Strong experience in star-schema concepts and in modeling an information mart based on given reporting requirements Strong experience in data vault concepts and the ability to build a business vault based on the raw vault and integration/calculation requirements Expertise in SQL; Experience with large amounts of data, preferably in Snowflake Experience in implementation of complex business logic in the Business Vault or Information Mart Understanding of source data transformations and reporting Ability to work with cross-functional teams and business users to translate business needs into technical solutions. Please share the following details along with the most updated resume to geeta.negi@compunnel.com if you are interested in the opportunity: Total Experience Relevant experience Current CTC Expected CTC Notice Period (Last working day if you are serving the notice period) Current Location SKILL 1 RATING OUT OF 5 SKILL 2 RATING OUT OF 5 SKILL 3 RATING OUT OF 5 (Mention the skill)
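Building an information mart on top of a raw vault, as this role asks, typically means joining hubs to the latest applicable satellite rows and exposing the result as a dimensional view. A minimal sketch using Python's stdlib SQLite for illustration (table and column names are hypothetical, not from any specific system):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
-- Raw vault: hub plus a historized satellite
CREATE TABLE hub_product (hub_product_hk TEXT PRIMARY KEY, product_bk TEXT, load_dts TEXT);
CREATE TABLE sat_product_price (
    hub_product_hk TEXT, load_dts TEXT, price REAL,
    PRIMARY KEY (hub_product_hk, load_dts)
);
INSERT INTO hub_product VALUES ('h1', 'SKU-9', '2024-01-01');
INSERT INTO sat_product_price VALUES ('h1', '2024-01-01', 10.0);
INSERT INTO sat_product_price VALUES ('h1', '2024-06-01', 12.5);

-- Information-mart view: join the hub to the most recent satellite row per key
CREATE VIEW dim_product_current AS
SELECT h.product_bk, s.price
FROM hub_product h
JOIN sat_product_price s ON s.hub_product_hk = h.hub_product_hk
WHERE s.load_dts = (
    SELECT MAX(load_dts) FROM sat_product_price
    WHERE hub_product_hk = h.hub_product_hk
);
""")
print(conn.execute("SELECT product_bk, price FROM dim_product_current").fetchall())
# → [('SKU-9', 12.5)]
```

The vault keeps full history (both price rows survive), while the mart view presents only the current state to reporting users — the "complex business logic" the posting mentions would layer on views like this one.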

Posted 1 week ago

Apply

5.0 - 8.0 years

30 - 40 Lacs

pune

Work from Office

The role focuses on building scalable and secure data solutions using Snowflake, DBT, Azure Data Factory, and Azure services. Required Candidate profile The ideal candidate has a strong foundation in data modeling (Data Vault & Dimensional), data governance, and stakeholder management,

Posted 1 week ago

Apply

4.0 - 7.0 years

10 - 18 Lacs

noida, pune, bangalore rural

Hybrid

Role & responsibilities We're looking for candidates with strong technology and data understanding in the data modelling space, with proven delivery capability. This is a fantastic opportunity to be part of a leading firm as well as a part of a growing Data and Analytics team Your key responsibilities Employ tools and techniques used to understand and analyze how to collect, update, store and exchange data Define and employ data modeling and design standards, tools, best practices, and related development methodologies. Design, review and maintain data models Perform the data analysis activities to capture data requirements and represent them in data model visualizations Manage the life cycle of the data model from requirements to design to implementation to maintenance Work closely with the data engineers to create optimal physical data models of datasets. Identify areas where data can be used to improve business activities Skills and attributes for success Experience: 3 - 7 years; Data modelling (relevant knowledge): 3 years and above Experience in data modeling tools including but not limited to Erwin Data Modeler, ER Studio, Toad etc Strong knowledge in SQL Basic ETL skills to ensure implementation meets the documented specifications for ETL processes including data translation/mapping and transformation Good Datawarehouse knowledge Optional Visualisation skills Knowledge in DQ and data profiling techniques and tools Interested candidates can apply on the below link - https://careers.ey.com/job-invite/1611905/ Regards Aakriti Jain

Posted 1 week ago

Apply

7.0 - 12.0 years

14 - 24 Lacs

pune, bengaluru, mumbai (all areas)

Hybrid

CitiusTech is hiring for Data Modellers; below are the required details. Required Skill: Data Modelling or Data Modeller, Data Warehouse, Data Vault. Total years of Experience: 7 to 10 years. Relevant Years of Experience: minimum 3+ years. Work Location: Pune, Mumbai, Chennai, Bengaluru. Work Mode: Hybrid. Interview Mode: Virtual. Interested candidates kindly share your updated resume to gopinath.r@citiustech.com. Regards, Gopinath R.

Posted 1 week ago

Apply

10.0 - 15.0 years

13 - 17 Lacs

bengaluru

Work from Office

Job Description Reuters Technology Architecture, part of Thomson Reuters, is seeking a Data Architect to lead the data architecture strategy for Reuters. This role is central to enabling Reuters to become a truly data-driven organization. Reporting to the Reuters Director of Architecture, the architect will work closely with the Thomson Reuters Data Architecture team and the Reuters Data & Analytics teams to ensure Reuters' data needs are met through scalable, secure, and governed solutions. About you Reuters Data Architecture Leadership: Own the end-to-end architecture for Reuters data systems, including modeling, organization, and transformation strategies. Platform Integration & Enhancement: Leverage and drive enhancements to the Thomson Reuters Data Platform and ingestion capabilities to support Reuters-specific use cases. Governance Alignment: Ensure data governance practices are aligned with both the Reuters Data Governance team and the Thomson Reuters Data Governance Group. Technical Leadership: Provide architectural guidance and mentorship to Reuters Data Engineering teams, ensuring delivery of high-quality, cost-effective data solutions. Hands-on Prototyping: Design and implement reusable frameworks and proof-of-concepts, particularly for data pipelines and transformation layers (e.g., using dbt). Cloud-Native Design: Architect solutions using AWS services (e.g., Glue, Kinesis, Lambda, S3) and Snowflake, optimizing performance, security, and scalability. Stakeholder Collaboration: Partner with business and product teams to define roadmaps, gather requirements, and deliver impactful data solutions. About the Role 10+ years of IT experience, with at least 5 years in a lead design or architectural role. Deep expertise in data modeling (relational, dimensional, NoSQL, Data Vault) and the data development lifecycle. Proven experience with cloud-native data platforms, especially AWS and Snowflake. 
Strong programming skills in Python or Java for data manipulation and automation. Familiarity with containerization (Docker) and orchestration (Kubernetes). Demonstrated ability to lead cross-functional teams and influence platform evolution. Strong understanding of data governance, security, and compliance (e.g., GDPR, CCPA). Collaborate with global teams across Thomson Reuters. Operate in a hybrid work environment with flexibility and autonomy. Contribute to a mission-driven organization that values truth, transparency, and innovation. What's in it for You? Hybrid Work Model: We've adopted a flexible hybrid working environment (2-3 days a week in the office depending on the role) for our office-based roles while delivering a seamless experience that is digitally and physically connected. Flexibility & Work-Life Balance: Flex My Way is a set of supportive workplace policies designed to help manage personal and professional responsibilities, whether caring for family, giving back to the community, or finding time to refresh and reset. This builds upon our flexible work arrangements, including work from anywhere for up to 8 weeks per year, empowering employees to achieve a better work-life balance. Career Development and Growth: By fostering a culture of continuous learning and skill development, we prepare our talent to tackle tomorrow's challenges and deliver real-world solutions. Our Grow My Way programming and skills-first approach ensures you have the tools and knowledge to grow, lead, and thrive in an AI-enabled future. Industry Competitive Benefits: We offer comprehensive benefit plans to include flexible vacation, two company-wide Mental Health Days off, access to the Headspace app, retirement savings, tuition reimbursement, employee incentive programs, and resources for mental, physical, and financial wellbeing. Culture: Globally recognized, award-winning reputation for inclusion and belonging, flexibility, work-life balance, and more. 
We live by our values: Obsess over our Customers, Compete to Win, Challenge (Y)our Thinking, Act Fast / Learn Fast, and Stronger Together. Social Impact: Make an impact in your community with our Social Impact Institute. We offer employees two paid volunteer days off annually and opportunities to get involved with pro-bono consulting projects and Environmental, Social, and Governance (ESG) initiatives. Making a Real-World Impact: We are one of the few companies globally that helps its customers pursue justice, truth, and transparency. Together, with the professionals and institutions we serve, we help uphold the rule of law, turn the wheels of commerce, catch bad actors, report the facts, and provide trusted, unbiased information to people all over the world.

Posted 2 weeks ago

Apply

2.0 - 5.0 years

5 - 9 Lacs

bengaluru

Work from Office

About The Role Project Role : Application Developer Project Role Description : Design, build and configure applications to meet business process and application requirements. Must have skills : Snowflake Data Warehouse Good to have skills : NA Minimum 5 year(s) of experience is required Educational Qualification : 15 years full time education Summary :As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing solutions that align with business objectives, and ensuring that applications are optimized for performance and usability. You will also engage in problem-solving activities, providing support and enhancements to existing applications while staying updated on industry trends and best practices. Roles & Responsibilities:- Expected to be an SME.- Collaborate and manage the team to perform.- Responsible for team decisions.- Engage with multiple teams and contribute on key decisions.- Provide solutions to problems for their immediate team and across multiple teams.- Facilitate knowledge sharing sessions to enhance team capabilities.- Monitor project progress and ensure timely delivery of application features. Professional & Technical Skills: - Must To Have Skills: Proficiency in Snowflake Data Warehouse.- Good to have AWS, Python and Data Vault skills- Strong understanding of data modeling and ETL processes.- Experience with SQL and database management.- Familiarity with cloud computing concepts and services.- Ability to troubleshoot and optimize application performance. Additional Information:- The candidate should have minimum 5 years of experience in Snowflake Data Warehouse.- This position is based at our Bengaluru office.- A 15 years full time education is required. Qualification 15 years full time education

Posted 2 weeks ago

Apply

6.0 - 10.0 years

12 - 14 Lacs

indi

Work from Office

Hiring Data Engineering Architect/Lead with 5+ yrs exp (3+ in architecture/leadership) in DBT, DataVault (AutomateDV), Snowflake, Airflow & GitLab. Must have expertise in data pipelines & cloud solutions. Microservices, Salesforce, Agile a plus

Posted 2 weeks ago

Apply

15.0 - 20.0 years

10 - 15 Lacs

mumbai, pune, chennai

Work from Office

Type of work: Hybrid/Onsite (4 days/week) Domain: Banking Experience: 12-15+ Years Job Description: We are seeking a skilled Sr. Data Modeler to join our team and contribute to the design and implementation of scalable and efficient data models. The ideal candidate will have experience with Data Vault techniques, proficiency in Cloud Azure, and, ideally, working knowledge of Databricks. Data Modeler: Primary Skills: Data Modeler, Cloud Azure, Data Vault Secondary Skills: Banking domain, Power Designer/ERwin/ER/Studio Responsibilities: Data Modeling: o Design and develop data models using Data Vault methodologies to ensure robust, scalable, and flexible data architectures. o Create and maintain conceptual, logical, and physical data models to support business requirements. o Collaborate with stakeholders to understand data requirements and translate them into effective data models. o Implement best practices for data modeling, including normalization, denormalization, and data integration. Data Vault Implementation: o Utilize Data Vault techniques to build an enterprise data lake and lakehouse and integrate disparate data sources. o Design and implement Hubs, Links, and Satellites to ensure comprehensive data capture and historical tracking. o Optimize data vault models for performance and scalability. Cloud Azure: o Design and deploy data models on Azure Data Services, including Azure SQL Database, Azure Data Lake, and Azure Synapse Analytics. o Ensure data security, compliance, and best practices in Azure environments. 
Collaboration and Communication: o Work closely with data engineers, data scientists, and business analysts to understand data needs and ensure data model alignment. o Provide technical guidance and support on data modeling best practices and Data Vault principles. o Document data models, data flows, and integration processes clearly and comprehensively. Data Modelling tools: data analysis and modeling tools (e.g., Power Designer, ERwin, ER/Studio)

Posted 2 weeks ago

Apply

7.0 - 12.0 years

7 - 12 Lacs

bengaluru

Work from Office

Primary Responsibilities: Understanding Business Requirements and translating them into low level design for implementation Estimate the effort required to execute the business requirements Development of intake pipelines to ingest relevant data into Magnus to support analytics and reporting Track and report status on progress of development activities in team forums such as stand-up meetings Participate in planning activities such as Feature Grooming and PI Planning Driving innovation through delivery focused on reducing processing costs, automation of CI/CD pipelines and effective utilization of Azure resources Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). 
The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so Required Qualifications: Graduate degree or equivalent experience 7+ years of IT experience Programming Languages - Solid experience with languages like Python for data analysis and application development Orchestration - Solid experience in using ADF and Databricks for orchestrating data workflows, data movement, and transformation Cloud Computing - Solid understanding of the Azure platform and their data services ETL (Extract-Transform-Load) - Expertise in ETL processes and tools like DBT (Data Build Tool) and DBT framework/packages (DBT Core) Database Management Systems (DBMS) - Expertise in relational databases (e.g., Microsoft SQL Server) and NoSQL databases Preferred Qualifications: Health Care experience Knowledge of Agile Scrum or Scaled Agile Knowledge of data security best practices, encryption, and access controls Understanding of data warehousing concepts, including data storage, retrieval, and optimization Understanding of the Data Vault methodology for scalable and flexible data warehousing Familiarity with Delta Lake, an open-table format that combines data lake flexibility with ACID transactions. Knowledge of the medallion architecture, which organizes data layers (Bronze, Silver, Gold) in a Lakehouse Proficiency in creating logical and physical data models to represent business requirements and system architecture
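The medallion architecture listed in the preferred qualifications above organizes data in Bronze (raw as ingested), Silver (validated and typed), and Gold (business-level aggregates) layers. A minimal, library-free Python sketch of that flow — record shapes and field names are illustrative assumptions, not from the posting:

```python
# Bronze: raw ingested records, kept as-is (may contain duplicates and bad rows)
bronze = [
    {"member_id": "M1", "claim_amount": "120.50"},
    {"member_id": "M1", "claim_amount": "120.50"},        # duplicate
    {"member_id": "M2", "claim_amount": "not-a-number"},  # bad row
    {"member_id": "M2", "claim_amount": "80.00"},
]

# Silver: deduplicated, validated, typed
seen, silver = set(), []
for row in bronze:
    key = (row["member_id"], row["claim_amount"])
    try:
        amount = float(row["claim_amount"])
    except ValueError:
        continue  # a real pipeline would quarantine bad rows, not drop them
    if key not in seen:
        seen.add(key)
        silver.append({"member_id": row["member_id"], "claim_amount": amount})

# Gold: business-level aggregate ready for reporting
gold = {}
for row in silver:
    gold[row["member_id"]] = gold.get(row["member_id"], 0.0) + row["claim_amount"]

print(gold)  # → {'M1': 120.5, 'M2': 80.0}
```

In a Lakehouse these layers would be Delta tables rather than in-memory lists, but the layering discipline — keep raw data untouched, clean once, aggregate last — is the same.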

Posted 2 weeks ago

Apply

3.0 - 5.0 years

5 - 8 Lacs

pune

Work from Office

Educational Requirements Bachelor of Engineering, Bachelor of Technology Service Line Enterprise Package Application Services Responsibilities A day in the life of an Infoscion As part of the Infosys consulting team, your primary role would be to get to the heart of customer issues, diagnose problem areas, design innovative solutions and facilitate deployment resulting in client delight. You will develop a proposal by owning parts of the proposal document and by giving inputs in solution design based on areas of expertise. You will plan the activities of configuration, configure the product as per the design, conduct conference room pilots and will assist in resolving any queries related to requirements and solution design You will conduct solution/product demonstrations, POC/Proof of Technology workshops and prepare effort estimates which suit the customer's budgetary requirements and are in line with the organization's financial guidelines Actively lead small projects and contribute to unit-level and organizational initiatives with an objective of providing high-quality, value-adding solutions to customers. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you! Additional Responsibilities: Ability to develop value-creating strategies and models that enable clients to innovate, drive growth and increase their business profitability Good knowledge on software configuration management systems Awareness of latest technologies and industry trends Logical thinking and problem-solving skills along with an ability to collaborate Understanding of the financial processes for various types of projects and the various pricing models available Ability to assess the current processes, identify improvement areas and suggest the technology solutions Knowledge of one or two industry domains Client Interfacing skills Project and Team management Location of posting - Infosys Ltd. 
is committed to ensuring you have the best experience throughout your journey with us. We currently have open positions in a number of locations across India - Bangalore, Pune, Hyderabad, Chennai, Chandigarh, Mysore, Kolkata, Trivandrum, Indore, Nagpur, Mangalore, Noida, Bhubaneswar, Coimbatore, Jaipur, Hubli, Vizag. While we work in accordance with business requirements, we shall strive to offer you the location of your choice, where possible. Technical and Professional Requirements: As a Snowflake Data Vault developer, the individual is responsible for designing, implementing, and managing Data Vault 2.0 models on the Snowflake platform. The candidate should have at least one end-to-end Data Vault implementation. Below are the detailed skill requirements: Designing and building flexible and highly scalable Data Vault 1.0 and 2.0 models. Suggest optimization techniques in existing Data Vault models using ghost entries, bridge and PIT tables, reference tables, Satellite split/merge, identification of the correct business key, etc. Design and administer repeating design patterns for quick turnaround Engage and collaborate with customers effectively to understand the Data Vault use cases and brief the technical team on technical specifications Working knowledge of Snowflake is desirable Working knowledge of DBT is desirable Preferred Skills: Technology->Data on Cloud-DataStore->Snowflake
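Among the optimization techniques listed above, a PIT (point-in-time) table pre-resolves, for each snapshot date, which row of each satellite applies, so reporting queries become simple equi-joins instead of repeated "latest row as of date" lookups. A minimal sketch using Python's stdlib SQLite for illustration (all names and dates are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
-- Two satellites off the same (implied) hub, historized independently
CREATE TABLE sat_cust_name (hk TEXT, load_dts TEXT, name TEXT);
CREATE TABLE sat_cust_addr (hk TEXT, load_dts TEXT, city TEXT);
INSERT INTO sat_cust_name VALUES ('h1', '2024-01-01', 'Asha');
INSERT INTO sat_cust_name VALUES ('h1', '2024-05-01', 'Asha R.');
INSERT INTO sat_cust_addr VALUES ('h1', '2024-02-01', 'Pune');

-- PIT table: for each snapshot date, pre-resolve the applicable
-- load_dts in every satellite so downstream queries are equi-joins
CREATE TABLE pit_customer AS
SELECT 'h1' AS hk, snap.d AS snapshot_dts,
  (SELECT MAX(load_dts) FROM sat_cust_name WHERE hk='h1' AND load_dts <= snap.d) AS name_dts,
  (SELECT MAX(load_dts) FROM sat_cust_addr WHERE hk='h1' AND load_dts <= snap.d) AS addr_dts
FROM (SELECT '2024-03-01' AS d UNION SELECT '2024-06-01') snap;
""")
rows = conn.execute("""
SELECT p.snapshot_dts, n.name, a.city
FROM pit_customer p
JOIN sat_cust_name n ON n.hk = p.hk AND n.load_dts = p.name_dts
JOIN sat_cust_addr a ON a.hk = p.hk AND a.load_dts = p.addr_dts
ORDER BY p.snapshot_dts
""").fetchall()
print(rows)
```

At the March snapshot the query returns the original name; at the June snapshot it picks up the renamed satellite row, with no correlated subqueries at report time — that cost was paid once when the PIT table was built.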

Posted 2 weeks ago

Apply