455 Metadata Management Jobs - Page 9

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

3.0 - 7.0 years

0 Lacs

punjab

On-site

As a GCP Data Engineer in Australia, you will be responsible for leveraging your experience in Google Cloud Platform (GCP) to handle various aspects of data engineering. Your role will involve working on data migration projects from legacy systems such as SQL and Oracle. You will also design and build ETL pipelines for data lake and data warehouse solutions on GCP. In this position, your expertise in GCP data and analytics services will be crucial: you will work with tools such as Cloud Dataflow, Cloud Dataprep, Apache Beam/Cloud Composer, BigQuery, Cloud Data Fusion, Cloud Pub/Sub, Cloud Storage, and Cloud Functions. Additionally, you will use GCP-native CLI tools (gcloud/gsutil) for operations, and scripting languages such as Python and SQL to improve data processing efficiency. Furthermore, your experience with data governance practices, metadata management, data masking, and encryption will be essential; you will use GCP tools such as Cloud Data Catalog and Cloud KMS to ensure data security and compliance. Overall, this role requires a strong foundation in GCP technologies and a proactive approach to data engineering challenges in a dynamic environment.
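To give a concrete flavor of the ETL pipelines described above, here is a minimal Apache Beam (Python SDK) sketch that reads CSV rows from Cloud Storage into BigQuery; the project, bucket, table, and schema are placeholders, not details from the posting.

```python
# Minimal Apache Beam ETL sketch: GCS CSV -> BigQuery.
# Project, bucket, table, and schema below are illustrative placeholders.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def parse_row(line):
    # Assumes a simple two-column CSV: id,name
    record_id, name = line.split(",", 1)
    return {"id": int(record_id), "name": name.strip()}

options = PipelineOptions(
    runner="DirectRunner",           # swap to DataflowRunner for GCP execution
    project="my-gcp-project",        # placeholder
    temp_location="gs://my-bucket/tmp",
    region="us-central1",
)

with beam.Pipeline(options=options) as p:
    (
        p
        | "Read" >> beam.io.ReadFromText("gs://my-bucket/input/*.csv")
        | "Parse" >> beam.Map(parse_row)
        | "Load" >> beam.io.WriteToBigQuery(
            "my-gcp-project:analytics.customers",
            schema="id:INTEGER,name:STRING",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )
```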

Posted 1 month ago

Apply

4.0 - 6.0 years

20 - 30 Lacs

Noida

Work from Office

We are seeking a skilled Data Engineer to lead the migration from Hive Catalog to Databricks Unity Catalog on Azure. The Data Engineer will own the end-to-end migration of metadata and access controls from Hive Catalog to Unity Catalog within the Azure cloud environment. The role demands strong expertise in data cataloging, metadata management, Azure cloud infrastructure, and security best practices.

Roles and Responsibilities:
- Analyze the existing Hive Catalog metadata, schema, and security configurations.
- Design and execute a robust migration plan to Unity Catalog with minimal disruption and data integrity assurance.
- Collaborate with Data Governance, Security, and Cloud Infrastructure teams to implement access controls and policies leveraging Azure Active Directory (AAD).
- Develop automation scripts and tools to support migration, validation, and ongoing management (a sketch follows below).
- Troubleshoot migration challenges and provide post-migration support.
- Document migration processes and train stakeholders on Unity Catalog capabilities.
- Integrate Unity Catalog with Azure native services such as Azure Data Lake Storage Gen2, Azure Key Vault, and Azure Active Directory for security and identity management.
- Optimize Azure resource utilization during migration and production workloads.
- Keep current with Azure Databricks Unity Catalog enhancements and Azure cloud best practices.

Requirements:
- Strong knowledge of metadata management, data governance frameworks, and data cataloging.
- Proficient in SQL, Python, and scripting for automation.
- Hands-on experience with Azure Databricks, Apache Spark, and Azure cloud services including Azure Data Lake Storage Gen2, Azure Key Vault, and Azure Active Directory.
- In-depth understanding of Azure cloud infrastructure: compute (VMs, Azure Databricks clusters), storage, networking, and security components.
- Experience integrating data catalog solutions with Azure identity and access management (Azure AD, RBAC).
- Strong grasp of data security, IAM policies, and access control in Azure environments.
- Excellent analytical, problem-solving, and communication skills.
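For illustration only: a heavily simplified sketch of one migration step this role covers (copying Hive metastore tables into Unity Catalog with row-count validation), assuming a Databricks notebook where `spark` is predefined; the catalog and schema names are placeholders, and a real migration would also handle views, grants, and incompatible formats.

```python
# Sketch: copy managed tables from the legacy Hive metastore into Unity Catalog.
# Runs inside a Databricks notebook/job where `spark` is predefined.
# Schema and catalog names are illustrative placeholders.
source_schema = "hive_metastore.sales"
target_schema = "main.sales"

tables = [r.tableName for r in spark.sql(f"SHOW TABLES IN {source_schema}").collect()]

for t in tables:
    # CTAS rewrites the data as a UC-managed Delta table; large tables may
    # warrant SYNC or deep clone instead, depending on format and size.
    spark.sql(
        f"CREATE TABLE IF NOT EXISTS {target_schema}.{t} "
        f"AS SELECT * FROM {source_schema}.{t}"
    )
    # Basic row-count validation of the copy.
    src = spark.sql(f"SELECT COUNT(*) AS c FROM {source_schema}.{t}").first().c
    dst = spark.sql(f"SELECT COUNT(*) AS c FROM {target_schema}.{t}").first().c
    assert src == dst, f"Row count mismatch for {t}: {src} vs {dst}"
```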

Posted 1 month ago

Apply

4.0 - 8.0 years

0 Lacs

pune, maharashtra

On-site

As a Data Governance Specialist, you will be responsible for defining and implementing data governance frameworks, policies, and standards to ensure effective management of data assets. Your role will involve managing metadata, data catalog, and lineage documentation, and monitoring and enforcing data quality, classification, access control, and security measures. It will be crucial to ensure compliance with data protection laws such as GDPR, CCPA, and HIPAA. Collaboration with stakeholders including data owners, IT, and legal teams will be essential to embed governance practices within the organization. You will conduct audits, assess the effectiveness of governance processes, and report on key performance indicators. Leading data stewardship initiatives and providing training to enhance data literacy across the organization will also be part of your responsibilities.

Your role will require expertise in designing and operating data governance programs, and proficiency in metadata management, data catalog tools, and data lineage modeling. Strong SQL skills and familiarity with cloud platforms like AWS, Azure, and Databricks are essential. A solid understanding of data privacy regulations, security measures, and access controls is necessary. Experience with data quality tools, profiling, and remediation processes will be beneficial. Excellent communication skills are crucial for effective policy rollout and stakeholder engagement. You should be comfortable working in agile, cross-functional teams and may consider obtaining certifications such as CDMP, DGSP, or cloud governance credentials to enhance your qualifications.
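As a toy illustration of the classification work described above, the following self-contained Python sketch flags column samples that match naive PII patterns; the patterns are deliberately simplistic and not production detection logic.

```python
# Sketch: naive regex-based PII scan used to flag columns for classification.
# Patterns are illustrative and far from production-grade detection.
import re

PII_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "us_ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def classify_column(values):
    """Return the set of PII types detected in a sample of column values."""
    found = set()
    for v in values:
        for label, pattern in PII_PATTERNS.items():
            if pattern.search(str(v)):
                found.add(label)
    return found

sample = ["alice@example.com", "n/a", "123-45-6789"]
print(classify_column(sample))  # {'email', 'us_ssn'}
```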

Posted 1 month ago

Apply

5.0 - 15.0 years

0 Lacs

karnataka

On-site

The role of Talend Developer and Architect at our company involves designing, developing, testing, and deploying integration processes using Talend. Your responsibilities will include collaborating with team members to understand requirements, coding, debugging, and optimizing code for performance. You will also maintain documentation for processes and contribute to technological improvement initiatives. As a Talend Developer and Architect, you will design and develop robust data integration solutions using Talend Studio to meet business requirements. You will also be responsible for implementing data governance frameworks and policies, configuring Talend Data Catalog, managing metadata repositories, data quality rules, and data dictionaries, and optimizing data pipelines for performance and scalability. To excel in this role, you should have a background in Computer Science, proficiency in back-end web development and software development, strong programming skills with an emphasis on object-oriented programming (OOP), and experience with ETL tools, particularly Talend. Excellent analytical and problem-solving skills, along with good communication and teamwork abilities, are essential. A Bachelor's degree in Computer Science, Information Technology, or a related field is required. You will work closely with data stewards, business analysts, data engineers, data scientists, and business stakeholders to understand and fulfill data integration requirements. If you are looking for a challenging opportunity to showcase your skills and contribute to the success of our organization, this role is perfect for you.

Posted 1 month ago

Apply

7.0 - 15.0 years

0 Lacs

karnataka

On-site

As an expert in AI, Data Governance, and Metadata Management, you will play a key role in architecting, implementing, and maintaining enterprise data governance processes. Your responsibilities will include designing and implementing data quality management, metadata standardization, and stewardship processes across various data domains. You will be tasked with creating and enforcing data governance rules, setting up operational models, and managing stakeholder alignment across IT, legal, compliance, and business teams. Additionally, you will be responsible for supporting compliance with regulations such as GDPR, HIPAA, CCPA, and Indian regulations under the Digital Personal Data Protection Act. You will lead the technical team in delivering roadmap milestones, monitoring Data Governance Key Performance Indicators (KPIs), and guiding the adoption of governance tools and processes.

To excel in this role, you should have 7-15 years of experience in Data Governance, Data Quality, or Metadata Management roles. Hands-on experience with key Data Governance platforms such as Collibra, Informatica, and Alation is essential. Strong technical skills in scripting (Python, SQL), metadata pipeline configuration, automation, data engineering, and ETL/ELT flows are required. Experience in establishing and monitoring data quality metrics and conducting data profiling, audits, and root-cause analysis is crucial. Moreover, familiarity with cloud environments and integration practices (AWS, Snowflake, databases), and an understanding of SDLC, Agile methods, Jira/Confluence, DevOps, and data platform operations are necessary. Exceptional stakeholder management, communication, and leadership abilities are vital for coordinating cross-functional teams. Educating stakeholders through workshops, training, and governance councils, as well as managing governance awareness campaigns, will be part of your responsibilities.

Joining Narwal offers you the opportunity to shape the future of a rapidly growing company. You will enjoy a competitive salary and benefits package, a supportive and inclusive company culture, and a collaborative and innovative work environment. Access to professional development and growth opportunities is provided, and Narwal is certified as a Great Place to Work. Narwal is an equal opportunity employer that celebrates diversity and is dedicated to creating an inclusive environment for all employees. To learn more, please visit: [Narwal Website](https://www.narwalinc.com/).
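Since the role calls for hands-on data profiling and quality metrics, here is a minimal illustrative Python sketch of a column-level profile; the file name and threshold are placeholders.

```python
# Sketch: minimal column-level profile used to seed data quality KPIs.
# The CSV path and threshold are placeholders.
import pandas as pd

df = pd.read_csv("customers.csv")  # placeholder dataset

profile = pd.DataFrame({
    "dtype": df.dtypes.astype(str),
    "null_rate": df.isna().mean(),
    "distinct": df.nunique(),
})
profile["completeness_ok"] = profile["null_rate"] <= 0.05  # example threshold

print(profile)
# Columns failing the completeness KPI become candidates for
# stewardship follow-up and root-cause analysis.
```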

Posted 1 month ago

Apply

5.0 - 10.0 years

15 - 20 Lacs

Bengaluru

Work from Office

About the Role: We are seeking a Staff Software Engineer to lead Growth Data Platform initiatives for our client. This role is ideal for someone with strong hands-on experience, deep technical expertise, and a track record of delivering production-grade healthcare platform services on AWS. You'll work closely with MarTech, Growth Marketing, and Enrollment Marketing teams to design and develop scalable, compliant solutions. You will also mentor engineers, contribute to platform architecture, and champion CI/CD and DevOps best practices.

Key Responsibilities:
- Lead the design and architecture of scalable platform projects
- Streamline and maintain CI/CD pipelines for company applications
- Build and maintain high-availability systems (target: 99.99% uptime; see the quick calculation below)
- Collaborate across pods to tackle complex features and architecture upgrades
- Drive cost-effective AWS infrastructure automation and adoption
- Act as mentor and technical advisor, overseeing design and development practices
- Execute impactful changes to workflows and tools on a quarterly basis
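As a quick worked example of what a 99.99% uptime target implies, this small Python calculation derives the corresponding downtime budget; it is plain arithmetic, not anything specified in the posting.

```python
# Error budget implied by a 99.99% availability target.
MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600

target = 0.9999
budget_min_per_year = MINUTES_PER_YEAR * (1 - target)
budget_min_per_month = budget_min_per_year / 12

print(f"Allowed downtime: {budget_min_per_year:.1f} min/year "
      f"(~{budget_min_per_month:.1f} min/month)")
# -> roughly 52.6 min/year, ~4.4 min/month
```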

Posted 1 month ago

Apply

5.0 - 10.0 years

25 - 35 Lacs

Noida, Pune, Bengaluru

Work from Office

Description: We are seeking a proficient Data Governance Engineer to lead the development and management of robust data governance frameworks on Google Cloud Platform (GCP). The ideal candidate will bring in-depth expertise in data management, metadata frameworks, compliance, and security within cloud environments to ensure high-quality, secure, and compliant data practices aligned with organizational goals.

Requirements:
- 4+ years of experience in data governance, data management, or data security.
- Hands-on experience with Google Cloud Platform (GCP), including BigQuery, Dataflow, Dataproc, and Google Data Catalog.
- Strong command of metadata management, data lineage, and data quality tools (e.g., Collibra, Informatica).
- Deep understanding of data privacy laws and compliance frameworks.
- Proficiency in SQL and Python for governance automation.
- Experience with RBAC, encryption, and data masking techniques.
- Familiarity with ETL/ELT pipelines and data warehouse architectures.

Job Responsibilities:
- Develop and implement comprehensive data governance frameworks, focusing on metadata management, lineage tracking, and data quality.
- Define, document, and enforce data governance policies, access control mechanisms, and security standards using GCP-native services such as IAM, DLP, and KMS.
- Manage metadata repositories using tools like Collibra, Informatica, Alation, or Google Data Catalog.
- Collaborate with data engineering and analytics teams to ensure compliance with GDPR, CCPA, SOC 2, and other regulatory standards.
- Automate processes for data classification, monitoring, and reporting using Python and SQL (see the sketch below).
- Support data stewardship initiatives, including the development of data dictionaries and governance documentation.
- Optimize ETL/ELT pipelines and data workflows to meet governance best practices.

What We Offer:
- Exciting Projects: We focus on industries like high-tech, communication, media, healthcare, retail, and telecom. Our customer list is full of fantastic global brands and leaders who love what we build for them.
- Collaborative Environment: You can expand your skills by collaborating with a diverse team of highly talented people in an open, laid-back environment — or even abroad in one of our global centers or client facilities!
- Work-Life Balance: GlobalLogic prioritizes work-life balance, which is why we offer flexible work schedules, opportunities to work from home, and paid time off and holidays.
- Professional Development: Our dedicated Learning & Development team regularly organizes communication skills training (GL Vantage, Toastmasters), stress management programs, professional certifications, and technical and soft skills training.
- Excellent Benefits: We provide our employees with competitive salaries, family medical insurance, Group Term Life Insurance, Group Personal Accident Insurance, NPS (National Pension Scheme), periodic health awareness programs, extended maternity leave, annual performance bonuses, and referral bonuses.
- Fun Perks: We want you to love where you work, which is why we host sports events and cultural activities, offer food at subsidized rates, and throw corporate parties. Our vibrant offices also include dedicated GL Zones, rooftop decks, and a GL Club where you can have coffee or tea with colleagues over a game of table tennis, and we offer discounts for popular stores and restaurants!
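One responsibility above is automating data classification with Python; below is a minimal sketch using the google-cloud-dlp client library to inspect a text value for sensitive info types. The project ID and sample text are placeholders, and the request shape should be double-checked against the current client docs.

```python
# Sketch: inspect a text value for sensitive info types with Cloud DLP.
# Project ID, info types, and sample text are placeholders.
from google.cloud import dlp_v2

client = dlp_v2.DlpServiceClient()
parent = "projects/my-gcp-project"  # placeholder

response = client.inspect_content(
    request={
        "parent": parent,
        "inspect_config": {
            "info_types": [{"name": "EMAIL_ADDRESS"}, {"name": "PHONE_NUMBER"}],
            "min_likelihood": dlp_v2.Likelihood.POSSIBLE,
        },
        "item": {"value": "Contact jane.doe@example.com or +1 555-0100"},
    }
)

for finding in response.result.findings:
    print(finding.info_type.name, finding.likelihood)
```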

Posted 1 month ago

Apply

2.0 - 7.0 years

13 - 18 Lacs

Bengaluru

Work from Office

Educational Requirements: Master of Business Administration, Master of Commerce, Master of Engineering, Master of Technology, Master of Technology (Integrated), Bachelor of Business Administration, Bachelor of Commerce, Bachelor of Engineering, Bachelor of Technology, Bachelor of Technology (Integrated)

Service Line: Enterprise Package Application Services

Responsibilities: A day in the life of an Infoscion: as part of the Infosys consulting team, your primary role would be to actively aid the consulting team in different phases of the project, including problem definition, effort estimation, diagnosis, solution generation, design, and deployment. You will explore alternatives to the recommended solutions based on research that includes literature surveys, information available in public domains, vendor evaluation information, etc., and build POCs. You will create requirement specifications from the business needs, define the to-be processes, and produce detailed functional designs based on requirements. You will support configuring solution requirements on the products; identify any issues, diagnose their root cause, seek clarifications, and then identify and shortlist solution alternatives. You will also contribute to unit-level and organizational initiatives with the objective of providing high-quality, value-adding solutions to customers. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Additional Responsibilities:
- Ability to work with clients to identify business challenges and contribute to client deliverables by refining, analyzing, and structuring relevant data
- Awareness of the latest technologies and trends
- Logical thinking and problem-solving skills, along with an ability to collaborate
- Ability to assess current processes, identify improvement areas, and suggest technology solutions
- Knowledge of one or two industry domains

Technical and Professional Requirements:
- At least 2 years of configuration and development experience implementing OFSAA solutions (such as ERM, EPM, etc.)
- Expertise in implementing OFSAA technical areas covering OFSAAI and its frameworks: Data Integrator, Metadata Management, Data Modelling
- Perform and understand data mapping from source systems to OFSAA staging; execute OFSAA batches and analyse result area tables and derived entities
- Perform data analysis using OFSAA metadata (i.e., Technical Metadata, Rule Metadata, Business Metadata), identify any data mapping gaps, and report them to stakeholders
- Participate in requirements workshops, help implement the designed solution, perform testing (UT, SIT), and coordinate user acceptance testing
- Knowledge of and experience with the full SDLC lifecycle
- Experience with Lean / Agile development methodologies

Preferred Skills: Technology - Oracle Industry Solutions - Oracle Financial Services Analytical Applications (OFSAA)

Posted 1 month ago

Apply

0.0 - 3.0 years

0 Lacs

hyderabad, telangana

On-site

As a Campaign Management New Associate at Accenture, you will play a crucial role in helping balance increased marketing complexity with diminishing marketing resources. Your primary focus will be to drive marketing performance by leveraging deep functional and technical expertise. You will be responsible for accelerating time-to-market and operating efficiencies at scale through Data and Technology, Next Generation Content Services, Digital Marketing Services & Customer Engagement, and Media Growth Services. Your role will involve the implementation of Digital Marketing Ads & Promotion creation/design, Design Web Content Management, and Commerce solutions for metadata management to enhance visibility in search engine results. We are looking for individuals with a strong background in Digital Marketing, who can perform effectively under pressure and demonstrate adaptability and flexibility. A commitment to quality, process orientation, and excellent written and verbal communication skills are essential for this role. Familiarity with Google Ads will be advantageous. In this position, you will be expected to solve routine problems following established guidelines and protocols. Your interactions will primarily be within your team and with your direct supervisor. Detailed instructions will be provided for all tasks, and your decisions will be closely supervised as they directly impact your own work. As an individual contributor within a team, you will have a predetermined, narrow scope of work. Please note that this role may involve working in rotational shifts. If you are a recent graduate with a keen interest in Marketing Operations and Search Engine Marketing (SEM), and possess the necessary language abilities, this opportunity at Accenture could be the ideal starting point for your career journey.

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

karnataka

On-site

Job Description: As a Data Modeler at PwC, you will play a crucial role in analyzing business needs, developing long-term data models, and ensuring the efficiency and consistency of data systems. Your expertise in data modeling, metadata management, and data system optimization will contribute to enhancing the overall performance of our data infrastructure.

Key responsibilities include:
- Analyzing and translating business needs into comprehensive data models.
- Evaluating existing data systems and recommending improvements for optimization.
- Defining rules to efficiently translate and transform data across various data models.
- Collaborating with the development team to create conceptual data models and data flows.
- Developing best practices for data coding to maintain consistency within the system.
- Reviewing modifications of existing systems for cross-compatibility and efficiency.
- Implementing data strategies and developing physical data models to meet business requirements.
- Utilizing canonical data modeling techniques to enhance the efficiency of data systems.
- Evaluating implemented data systems for variances, discrepancies, and optimal performance.
- Troubleshooting and optimizing data systems to ensure seamless operation.

Key expertise required:
- Strong proficiency in relational and dimensional modeling (OLTP, OLAP); see the small example below.
- Experience with data modeling tools such as Erwin, ER/Studio, Visio, PowerDesigner.
- Proficiency in SQL and database management systems (Oracle, SQL Server, MySQL, PostgreSQL).
- Knowledge of NoSQL databases (MongoDB, Cassandra) and their data structures.
- Familiarity with data warehouses and BI tools (Snowflake, Redshift, BigQuery, Tableau, Power BI).
- Experience with ETL processes, data integration, and data governance frameworks.
- Excellent analytical, problem-solving, and communication skills.

Qualifications:
- Bachelor's degree in Engineering or a related field.
- 3 to 5 years of experience in data modeling or a related field.
- 4+ years of hands-on experience with dimensional and relational data modeling.
- Expert knowledge of metadata management and related tools.
- Proficiency with data modeling tools such as Erwin, PowerDesigner, or Lucid.
- Knowledge of transactional databases and data warehouses.

Preferred Skills:
- Experience in cloud-based data solutions (AWS, Azure, GCP).
- Knowledge of big data technologies (Hadoop, Spark, Kafka).
- Understanding of graph databases and real-time data processing.
- Certifications in data management, modeling, or cloud data engineering.
- Excellent communication and presentation skills.
- Strong interpersonal skills to collaborate effectively with various teams.
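To ground the dimensional-modeling expertise this role asks for, here is a tiny self-contained star-schema example in Python with SQLite; the table and column names are invented, not taken from the posting.

```python
# Sketch: minimal star schema (one fact, two dimensions) in SQLite.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dim_customer (customer_key INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE dim_date     (date_key INTEGER PRIMARY KEY, iso_date TEXT);
CREATE TABLE fact_sales (
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    date_key     INTEGER REFERENCES dim_date(date_key),
    amount       REAL
);
INSERT INTO dim_customer VALUES (1, 'Acme Corp');
INSERT INTO dim_date VALUES (20240101, '2024-01-01');
INSERT INTO fact_sales VALUES (1, 20240101, 99.50);
""")

# Typical analytical query: facts joined to conformed dimensions.
for row in con.execute("""
    SELECT c.name, d.iso_date, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_customer c USING (customer_key)
    JOIN dim_date d USING (date_key)
    GROUP BY c.name, d.iso_date
"""):
    print(row)
```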

Posted 1 month ago

Apply

5.0 - 10.0 years

16 - 20 Lacs

Bengaluru

Work from Office

Project description: We are seeking a highly skilled Data Modelling Expert with deep experience in the Avaloq Core Banking platform to join our technology team. The ideal candidate will be responsible for designing, maintaining, and optimizing complex data models that support our banking products, client data, and regulatory reporting needs. This role requires a blend of domain expertise in banking and wealth management, strong data architecture capabilities, and hands-on experience working with the Avaloq platform.

Responsibilities:
- Design, implement, and maintain conceptual, logical, and physical data models within the Avaloq Core Banking system.
- Collaborate with business analysts, product owners, and Avaloq parameterisation teams to translate business requirements into robust data models.
- Ensure alignment of data models with Avaloq's object model and industry best practices.
- Perform data profiling, quality checks, and lineage tracing to support regulatory and compliance requirements (e.g., Basel III, MiFID II, ESG).
- Support integration of Avaloq data with downstream systems (e.g., CRM, data warehouses, reporting platforms).
- Provide expert input on data governance, metadata management, and model documentation.
- Contribute to change requests, upgrades, and data migration projects involving Avaloq.
- Collaborate with cross-functional teams to drive data consistency, reusability, and scalability.
- Review and validate existing data models, identify gaps or optimisation opportunities.
- Ensure data models meet performance, security, and privacy requirements.

Skills, must have:
- Proven experience (5+ years) in data modelling or data architecture, preferably within financial services.
- 3+ years of hands-on experience with the Avaloq Core Banking Platform, especially its data structures and object model.
- Strong understanding of relational databases and data modelling tools (e.g., ER/Studio, ERwin, or similar).
- Proficient in SQL and data manipulation in Avaloq environments.
- Knowledge of banking products, client lifecycle data, and regulatory data requirements.
- Familiarity with data governance, data quality, and master data management concepts.
- Experience working in Agile or hybrid project delivery environments.

Nice to have:
- Exposure to Avaloq Scripting or parameterisation is a strong plus.
- Experience integrating Avaloq with data lakes, BI/reporting tools, or regulatory platforms.
- Understanding of data privacy regulations (GDPR, FINMA, etc.).
- Certification in Avaloq or relevant financial data management domains is advantageous.

Posted 1 month ago

Apply

5.0 - 10.0 years

12 - 24 Lacs

Kochi

Work from Office

Responsibilities:
* Ensure data accuracy & compliance with regulatory standards.
* Develop data strategy, governance & quality plans.
* Manage metadata, stewardship & lineage.
* Collaborate on enterprise-wide data initiatives.

Perks: Remote work & Saudi | Annual bonus | Health insurance

Posted 1 month ago

Apply

3.0 - 7.0 years

11 - 15 Lacs

Gurugram

Work from Office

Overview: We are seeking an experienced Data Modeller with expertise in designing and implementing data models for modern data platforms. This role requires deep knowledge of data modeling techniques, healthcare data structures, and experience with Databricks Lakehouse architecture. The ideal candidate will have a proven track record of translating complex business requirements into efficient, scalable data models that support analytics and reporting needs.

About the Role: As a Data Modeller, you will be responsible for designing and implementing data models for our Databricks-based Modern Data Platform. You will work closely with business stakeholders, data architects, and data engineers to create logical and physical data models that support the migration from legacy systems to the Databricks Lakehouse architecture, ensuring data integrity, performance, and compliance with healthcare industry standards.

Key Responsibilities:
- Design and implement logical and physical data models for Databricks Lakehouse implementations
- Translate business requirements into efficient, scalable data models
- Create and maintain data dictionaries, entity relationship diagrams, and model documentation
- Develop dimensional models, data vault models, and other modeling approaches as appropriate
- Support the migration of data models from legacy systems to the Databricks platform
- Collaborate with data architects to ensure alignment with overall data architecture
- Work with data engineers to implement and optimize data models
- Ensure data models comply with healthcare industry regulations and standards
- Implement data modeling best practices and standards
- Provide guidance on data modeling approaches and techniques
- Participate in data governance initiatives and data quality assessments
- Stay current with evolving data modeling techniques and industry trends

Qualifications:
- Extensive experience in data modeling for analytics and reporting systems
- Strong knowledge of dimensional modeling, data vault, and other modeling methodologies
- Experience with the Databricks platform and Delta Lake architecture
- Expertise in healthcare data modeling and industry standards
- Experience migrating data models from legacy systems to modern platforms
- Strong SQL skills and experience with data definition languages
- Understanding of data governance principles and practices
- Experience with data modeling tools and technologies
- Knowledge of performance optimization techniques for data models
- Bachelor's degree in Computer Science, Information Systems, or related field; advanced degree preferred
- Professional certifications in data modeling or related areas

Technical Skills:
- Data modeling methodologies (dimensional, data vault, etc.)
- Databricks platform and Delta Lake (see the sketch after this listing)
- SQL and data definition languages
- Data modeling tools (erwin, ER/Studio, etc.)
- Data warehousing concepts and principles
- ETL/ELT processes and data integration
- Performance tuning for data models
- Metadata management and data cataloging
- Cloud platforms (AWS, Azure, GCP)
- Big data technologies and distributed computing

Healthcare Industry Knowledge:
- Healthcare data structures and relationships
- Healthcare terminology and coding systems (ICD, CPT, SNOMED, etc.)
- Healthcare data standards (HL7, FHIR, etc.)
- Healthcare analytics use cases and requirements
- Optionally: healthcare regulatory requirements (HIPAA, HITECH, etc.)
- Clinical and operational data modeling challenges
- Population health and value-based care data needs

Personal Attributes:
- Strong analytical and problem-solving skills
- Excellent attention to detail and data quality focus
- Ability to translate complex business requirements into technical solutions
- Effective communication skills with both technical and non-technical stakeholders
- Collaborative approach to working with cross-functional teams
- Self-motivated with ability to work independently
- Continuous learner who stays current with industry trends

What We Offer:
- Opportunity to design data models for cutting-edge healthcare analytics
- Collaborative and innovative work environment
- Competitive compensation package
- Professional development opportunities
- Work with leading technologies in the data space

This position requires a unique combination of data modeling expertise, technical knowledge, and healthcare industry understanding. The ideal candidate will have demonstrated success in designing efficient, scalable data models and a passion for creating data structures that enable powerful analytics and insights.
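As a rough sketch of day-to-day Lakehouse modeling work like this, the following PySpark snippet upserts records into a Delta dimension table with MERGE; it assumes a Databricks (or delta-spark) session where `spark` is predefined and the target table already exists, and every name is invented.

```python
# Sketch: upsert incoming patient dimension records into a Delta table.
# Assumes a Databricks/delta-spark session `spark` and that the Delta
# table gold.dim_patient already exists; all names are invented.
updates = spark.createDataFrame(
    [(1, "Jane Doe", "2024-01-01")],
    ["patient_id", "name", "effective_date"],
)
updates.createOrReplaceTempView("updates")

spark.sql("""
    MERGE INTO gold.dim_patient AS t
    USING updates AS s
    ON t.patient_id = s.patient_id
    WHEN MATCHED THEN UPDATE SET t.name = s.name,
                                 t.effective_date = s.effective_date
    WHEN NOT MATCHED THEN INSERT *
""")
```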

Posted 1 month ago

Apply

6.0 - 10.0 years

16 - 25 Lacs

Surat

Work from Office

Job Summary: We are looking for a highly skilled Senior Data Engineer with expertise in Snowflake, DBT (Data Build Tool), and SAP Data Services (SAP DS). The ideal candidate will be responsible for building scalable data pipelines, designing robust data models, and ensuring high data quality across enterprise platforms.

Key Responsibilities:
- Design, build, and optimize data pipelines and ETL/ELT workflows using Snowflake and DBT
- Integrate and manage data from various sources using SAP Data Services
- Develop and maintain scalable data models, data marts, and data warehouses
- Work closely with data analysts, business stakeholders, and BI teams to support reporting and analytics needs
- Implement best practices in data governance, data lineage, and metadata management
- Monitor data quality, troubleshoot issues, and ensure data integrity
- Optimize Snowflake data warehouse performance (partitioning, caching, query tuning; see the sketch below)
- Automate data workflows and deploy DBT models with CI/CD tools (e.g., Git, Jenkins)
- Document architecture, data flows, and technical specifications
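To make the Snowflake tuning bullet concrete, here is a minimal, illustrative Python sketch using snowflake-connector-python; all connection parameters and object names are placeholders, not details from the posting.

```python
# Sketch: apply a clustering key and inspect slow queries in Snowflake.
# Connection parameters and object names are placeholders.
import snowflake.connector

con = snowflake.connector.connect(
    account="my_account", user="etl_user", password="***",
    warehouse="TRANSFORM_WH", database="ANALYTICS", schema="MARTS",
)
cur = con.cursor()

# Clustering large fact tables helps prune micro-partitions on range scans.
cur.execute("ALTER TABLE fact_orders CLUSTER BY (order_date)")

# Review the slowest recent queries as tuning candidates.
cur.execute("""
    SELECT query_text, total_elapsed_time
    FROM table(information_schema.query_history())
    ORDER BY total_elapsed_time DESC
    LIMIT 5
""")
for text, elapsed_ms in cur.fetchall():
    print(elapsed_ms, text[:80])
```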

Posted 1 month ago

Apply

6.0 - 10.0 years

16 - 25 Lacs

Varanasi

Work from Office

Job Summary: We are looking for a highly skilled Senior Data Engineer with expertise in Snowflake, DBT (Data Build Tool), and SAP Data Services (SAP DS). The ideal candidate will be responsible for building scalable data pipelines, designing robust data models, and ensuring high data quality across enterprise platforms.

Key Responsibilities:
- Design, build, and optimize data pipelines and ETL/ELT workflows using Snowflake and DBT
- Integrate and manage data from various sources using SAP Data Services
- Develop and maintain scalable data models, data marts, and data warehouses
- Work closely with data analysts, business stakeholders, and BI teams to support reporting and analytics needs
- Implement best practices in data governance, data lineage, and metadata management
- Monitor data quality, troubleshoot issues, and ensure data integrity
- Optimize Snowflake data warehouse performance (partitioning, caching, query tuning)
- Automate data workflows and deploy DBT models with CI/CD tools (e.g., Git, Jenkins)
- Document architecture, data flows, and technical specifications

Posted 1 month ago

Apply

6.0 - 10.0 years

16 - 25 Lacs

Visakhapatnam

Work from Office

Job Summary: We are looking for a highly skilled Senior Data Engineer with expertise in Snowflake, DBT (Data Build Tool), and SAP Data Services (SAP DS). The ideal candidate will be responsible for building scalable data pipelines, designing robust data models, and ensuring high data quality across enterprise platforms.

Key Responsibilities:
- Design, build, and optimize data pipelines and ETL/ELT workflows using Snowflake and DBT
- Integrate and manage data from various sources using SAP Data Services
- Develop and maintain scalable data models, data marts, and data warehouses
- Work closely with data analysts, business stakeholders, and BI teams to support reporting and analytics needs
- Implement best practices in data governance, data lineage, and metadata management
- Monitor data quality, troubleshoot issues, and ensure data integrity
- Optimize Snowflake data warehouse performance (partitioning, caching, query tuning)
- Automate data workflows and deploy DBT models with CI/CD tools (e.g., Git, Jenkins)
- Document architecture, data flows, and technical specifications

Posted 1 month ago

Apply

5.0 - 12.0 years

0 - 20 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

- Familiarity with Data Management Standards
- Ability to work with high volumes of detailed technical & business metadata
- Experience documenting Data Element metadata (Business Elements vs. Technical Data Elements)
- Experience understanding how data transformations materialize, and determining the appropriate controls required to ensure a high level of data quality
- Ability to understand and document application and/or data-element-level flows (i.e., lineage); a small illustration follows below
- Ability to analyze both processes and datasets to identify meaningful, actionable outcomes
- Understand and implement changes to business processes
- Develop and influence business processes necessary to support data governance outcomes
- Manage and influence across vertical organizations to achieve common objectives
- Intermediate to expert-level knowledge of MS products such as Excel, PowerPoint, Word, Skype, & Outlook
- Working knowledge of metadata tools such as Collibra or equivalent
- Familiarity with Data Analytics / BI tools such as Tableau, MicroStrategy, etc.

Communication Skills:
- Create visually and verbally engaging, informative materials for departmental leadership, business partners, executives, and stakeholders
- Ability to tailor communication of topics to various levels of the organization (e.g., technical audiences vs. business stakeholders)

Desired Skills (nice-to-have):
- General knowledge of the Banking industry
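The lineage-documentation bullet can be made concrete with a tiny Python sketch that models element-level lineage as a directed graph; the element names are invented for illustration.

```python
# Sketch: represent data element lineage as a directed graph.
# Element names are invented for illustration.
import networkx as nx

lineage = nx.DiGraph()
lineage.add_edge("core_banking.txn_amt", "warehouse.fact_txn.amount")
lineage.add_edge("warehouse.fact_txn.amount", "report.daily_pnl.total")

# Upstream lineage for a report field: everything it derives from.
print(nx.ancestors(lineage, "report.daily_pnl.total"))
# {'core_banking.txn_amt', 'warehouse.fact_txn.amount'}
```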

Posted 1 month ago

Apply

6.0 - 10.0 years

16 - 25 Lacs

Lucknow

Work from Office

Job Summary: We are looking for a highly skilled Senior Data Engineer with expertise in Snowflake, DBT (Data Build Tool), and SAP Data Services (SAP DS). The ideal candidate will be responsible for building scalable data pipelines, designing robust data models, and ensuring high data quality across enterprise platforms.

Key Responsibilities:
- Design, build, and optimize data pipelines and ETL/ELT workflows using Snowflake and DBT
- Integrate and manage data from various sources using SAP Data Services
- Develop and maintain scalable data models, data marts, and data warehouses
- Work closely with data analysts, business stakeholders, and BI teams to support reporting and analytics needs
- Implement best practices in data governance, data lineage, and metadata management
- Monitor data quality, troubleshoot issues, and ensure data integrity
- Optimize Snowflake data warehouse performance (partitioning, caching, query tuning)
- Automate data workflows and deploy DBT models with CI/CD tools (e.g., Git, Jenkins)
- Document architecture, data flows, and technical specifications

Posted 1 month ago

Apply

6.0 - 10.0 years

16 - 25 Lacs

Ludhiana

Work from Office

Job Summary: We are looking for a highly skilled Senior Data Engineer with expertise in Snowflake, DBT (Data Build Tool), and SAP Data Services (SAP DS). The ideal candidate will be responsible for building scalable data pipelines, designing robust data models, and ensuring high data quality across enterprise platforms.

Key Responsibilities:
- Design, build, and optimize data pipelines and ETL/ELT workflows using Snowflake and DBT
- Integrate and manage data from various sources using SAP Data Services
- Develop and maintain scalable data models, data marts, and data warehouses
- Work closely with data analysts, business stakeholders, and BI teams to support reporting and analytics needs
- Implement best practices in data governance, data lineage, and metadata management
- Monitor data quality, troubleshoot issues, and ensure data integrity
- Optimize Snowflake data warehouse performance (partitioning, caching, query tuning)
- Automate data workflows and deploy DBT models with CI/CD tools (e.g., Git, Jenkins)
- Document architecture, data flows, and technical specifications

Posted 1 month ago

Apply

6.0 - 10.0 years

16 - 25 Lacs

Patna

Work from Office

Job Summary: We are looking for a highly skilled Senior Data Engineer with expertise in Snowflake, DBT (Data Build Tool), and SAP Data Services (SAP DS). The ideal candidate will be responsible for building scalable data pipelines, designing robust data models, and ensuring high data quality across enterprise platforms.

Key Responsibilities:
- Design, build, and optimize data pipelines and ETL/ELT workflows using Snowflake and DBT
- Integrate and manage data from various sources using SAP Data Services
- Develop and maintain scalable data models, data marts, and data warehouses
- Work closely with data analysts, business stakeholders, and BI teams to support reporting and analytics needs
- Implement best practices in data governance, data lineage, and metadata management
- Monitor data quality, troubleshoot issues, and ensure data integrity
- Optimize Snowflake data warehouse performance (partitioning, caching, query tuning)
- Automate data workflows and deploy DBT models with CI/CD tools (e.g., Git, Jenkins)
- Document architecture, data flows, and technical specifications

Posted 1 month ago

Apply

6.0 - 10.0 years

16 - 25 Lacs

Guwahati

Work from Office

Job Summary: We are looking for a highly skilled Senior Data Engineer with expertise in Snowflake, DBT (Data Build Tool), and SAP Data Services (SAP DS). The ideal candidate will be responsible for building scalable data pipelines, designing robust data models, and ensuring high data quality across enterprise platforms.

Key Responsibilities:
- Design, build, and optimize data pipelines and ETL/ELT workflows using Snowflake and DBT
- Integrate and manage data from various sources using SAP Data Services
- Develop and maintain scalable data models, data marts, and data warehouses
- Work closely with data analysts, business stakeholders, and BI teams to support reporting and analytics needs
- Implement best practices in data governance, data lineage, and metadata management
- Monitor data quality, troubleshoot issues, and ensure data integrity
- Optimize Snowflake data warehouse performance (partitioning, caching, query tuning)
- Automate data workflows and deploy DBT models with CI/CD tools (e.g., Git, Jenkins)
- Document architecture, data flows, and technical specifications

Posted 1 month ago

Apply

10.0 - 14.0 years

0 Lacs

karnataka

On-site

We are looking for an experienced Data Governance Architect with deep expertise in Alation and Azure cloud platforms. This role involves partnering with senior stakeholders to define and champion an enterprise data catalog and dictionary strategy, and overseeing the entire lifecycle of the data catalog, from establishing metadata standards and initial MVPs to executing full-scale enterprise rollouts. You should have at least 10 years of experience in data governance and proven expertise in the Alation tool on the Azure platform. Understanding of the Snowflake platform is also required. Additionally, you should have proven expertise in at least two areas such as Data Governance Operating Models, Metadata Management, Data Cataloging, Data Lineage, or Data Quality. A deep understanding of governance frameworks such as DAMA or DCAM, with practical implementation experience, is essential.

In this role, you will be responsible for assessing current cataloging and dictionary capabilities, identifying gaps, and developing actionable roadmaps to enrich metadata quality, accelerate catalog population, and drive adoption. You will also need to identify the different data personas using the data catalog and design persona-specific playbooks to promote adoption. Your responsibilities will include designing, deploying, and managing scalable data catalog and dictionary solutions using platforms like Alation; understanding of leading Data Governance tools like Collibra and Purview will be beneficial.

Furthermore, you will define architecture and best practices for metadata management to ensure consistency, scalability, and sustainability of the catalog and dictionary. You will identify and catalog critical data elements by capturing clear business terms, glossaries, KPIs, lineage, and persona-specific guides to build a trusted, comprehensive data dictionary. Developing and enforcing policies to maintain metadata quality, manage access, and protect sensitive information within the catalog will be part of your responsibilities. You will need to implement robust processes for catalog population, including automated metadata ingestion leveraging APIs, glossary management, lineage tracking, and data classification. Moreover, you will develop a workflow management approach to notify stewards of changes to certified catalog content. Creating reusable frameworks and templates for data definitions and best practices to streamline catalog adoption across teams will also be expected of you.
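One responsibility above is automated metadata ingestion via APIs. The generic Python sketch below pushes table descriptions to a catalog over REST; the base URL, endpoint path, token handling, and payload are hypothetical stand-ins, since each vendor (Alation, Purview, etc.) defines its own API contract.

```python
# Sketch: push table descriptions into a data catalog over REST.
# BASE_URL, endpoint path, token, and payload shape are all hypothetical.
import requests

BASE_URL = "https://catalog.example.com/api"   # hypothetical
TOKEN = "***"                                  # service-account token

tables = [
    {"name": "sales.orders", "description": "One row per customer order."},
]

for t in tables:
    resp = requests.post(
        f"{BASE_URL}/tables",                  # hypothetical endpoint
        json=t,
        headers={"Authorization": f"Bearer {TOKEN}"},
        timeout=30,
    )
    resp.raise_for_status()
    print("ingested", t["name"])
```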

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

pune, maharashtra

On-site

You are a skilled and detail-oriented Data Governance Analyst looking to join the Data Lakehouse program team of a Leading Insurance and Investments Firm in Pune (hybrid; thrice-a-week in-office requirement). Your primary responsibility will be to ensure data integrity, quality, and compliance across the organization, focusing on Data Ownership/Stewardship, Metadata Management, Data Quality, and Reference Data Management. Your key responsibilities will include:

Metadata Management:
- Review and validate metadata documents and ingestion templates (see the sketch below)
- Analyze and recommend improvements to data dictionaries, business glossaries, access controls, etc.
- Ensure metadata accuracy and completeness across all data assets

Data Ownership and Stewardship:
- Collaborate with Data Owners and Stewards to align data governance standards with business requirements
- Facilitate communication between technical teams and business stakeholders

Data Quality:
- Review and enforce data quality requirements
- Develop data quality metrics and monitoring processes
- Identify and address data quality issues in collaboration with relevant teams

Reference Data Management:
- Review and standardize reference data and Lists of Values
- Ensure proper maintenance and version control of reference data
- Collaborate with business units to define and implement reference data standards

Cross-functional Collaboration:
- Work closely with various teams like Business Systems Analysts, Data Architects, etc.
- Participate in data governance meetings and initiatives
- Contribute to the development and implementation of data governance policies and procedures

Preferred Qualifications:
- Professional certifications in data governance or data management
- Experience with data lakehouse architectures and technologies
- Familiarity with Agile methodologies and project management practices
- Experience with data governance tools and applications

Requirements:
- Bachelor's degree in Computer Science, Information Systems, or related field
- 5+ years of experience in data governance, data management, or a similar role
- Strong understanding of data governance principles, metadata management, and data quality concepts
- Experience with data dictionaries, business glossaries, and data classification methodologies
- Familiarity with insurance and investment industry data standards and regulations
- Excellent analytical and problem-solving skills
- Strong communication and interpersonal skills
- Proficiency in data governance tools and technologies
- Knowledge of data privacy regulations and best practices
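Reviewing ingestion templates, as described above, often starts with mechanical completeness checks; here is a small illustrative Python sketch (the file name and required columns are invented).

```python
# Sketch: validate a metadata ingestion template for required fields.
# File name and required columns are invented.
import pandas as pd

REQUIRED = ["table_name", "column_name", "business_definition", "data_owner"]

template = pd.read_csv("ingestion_template.csv")

missing_cols = [c for c in REQUIRED if c not in template.columns]
assert not missing_cols, f"Template missing columns: {missing_cols}"

# Flag rows where mandatory metadata was left blank.
incomplete = template[template[REQUIRED].isna().any(axis=1)]
print(f"{len(incomplete)} rows need steward follow-up")
```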

Posted 1 month ago

Apply

8.0 - 12.0 years

0 Lacs

karnataka

On-site

You are an experienced Senior dbt Engineer with a strong background in Snowflake and Azure cloud platforms. Your primary responsibility will be to lead the design and development of scalable, governed, and efficient data transformation pipelines using dbt. You will collaborate across functions to deliver business-ready data solutions. With at least 8 years of experience in data engineering, analytics engineering, or similar roles, you have proven expertise in dbt (Data Build Tool) and modern data transformation practices. Your advanced proficiency in SQL and deep understanding of dimensional modeling, medallion architecture, and ELT principles will be crucial for success in this role. You must have strong hands-on experience with Snowflake, including query optimization, and be proficient with Azure cloud services such as Azure Data Factory and Blob Storage. Your communication and collaboration skills should be exemplary, and you should also have familiarity with data governance, metadata management, and data quality frameworks.

As a Senior dbt Engineer, your key responsibilities will include leading the design, development, and maintenance of dbt models and transformation layers. You will define and enforce data modeling standards, best practices, and development guidelines, while driving the end-to-end ELT process to ensure reliability and data quality across all layers. Collaboration with data product owners, analysts, and stakeholders to translate complex business needs into clean, reusable data assets is essential. You will apply Snowflake best practices to build scalable and robust dbt models, and integrate dbt workflows with orchestration tools like Azure Data Factory, Apache Airflow, or dbt Cloud for robust monitoring and alerting. Supporting CI/CD implementation for dbt deployments using tools like GitHub Actions, Azure DevOps, or similar will also be part of your responsibilities.

If you are looking for a challenging opportunity to leverage your expertise in dbt, Snowflake, and Azure cloud platforms to drive digital transformation and deliver impactful data solutions, this role is perfect for you.
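As an illustrative sketch of the CI/CD piece mentioned above, the snippet below shells out to dbt's state-based selector to build only modified models and their children; the artifact path and target name are placeholders, and flag behavior should be confirmed for your dbt version.

```python
# Sketch: CI step that builds only modified dbt models and their children.
# Artifact path and target name are placeholders.
import subprocess

subprocess.run(
    [
        "dbt", "build",
        "--select", "state:modified+",   # changed models plus downstream deps
        "--state", "prod-artifacts/",    # manifest.json from the last prod run
        "--target", "ci",
    ],
    check=True,
)
```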

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

hyderabad, telangana

On-site

As the Data Governance Specialist, you will be responsible for developing and maintaining enterprise-wide data governance strategies, standards, and policies. Your role will involve aligning governance practices with business objectives such as regulatory compliance and analytics readiness. You will play a key part in defining roles and responsibilities within the governance operating model, driving governance maturity assessments, and leading change management initiatives. Collaboration will be a crucial aspect of your position, as you will work across IT, legal, business, and compliance teams to ensure alignment on governance priorities. You will be tasked with defining stewardship models, creating enablement programs, conducting training sessions, and developing communication programs to promote governance awareness within the organization.

In terms of architecture design for data governance platforms, you will be responsible for designing scalable and modular data governance architecture. Your expertise will be utilized to evaluate tools such as Microsoft Purview, Collibra, Alation, BigID, and Informatica, ensuring seamless integration with metadata, privacy, quality, and policy systems.

A key focus of your role will be on Microsoft Purview solution architecture. You will lead the end-to-end implementation and management of Microsoft Purview, configuring RBAC, collections, metadata scanning, the business glossary, and classification rules. Additionally, you will implement sensitivity labels, insider risk controls, retention policies, data mapping, and audit dashboards. Your responsibilities will also include architecting metadata repositories and ingestion workflows, ensuring end-to-end lineage (ADF, Synapse, Power BI), and defining governance over the business glossary and approval workflows. Your expertise in metadata, lineage, and glossary management will be essential in maintaining data integrity and accessibility within the organization.
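For flavor, a sketch of talking to Purview's Atlas-style catalog REST API from Python is below. The account name is a placeholder, and the glossary path reflects my understanding of the Atlas v2 API surface; verify both against current Microsoft documentation before relying on them.

```python
# Sketch: list a Purview account's glossaries via its Atlas-style REST API.
# Account name is a placeholder; verify the path against current docs.
import requests
from azure.identity import DefaultAzureCredential

credential = DefaultAzureCredential()
token = credential.get_token("https://purview.azure.net/.default").token

endpoint = "https://my-purview-account.purview.azure.com"  # placeholder
resp = requests.get(
    f"{endpoint}/catalog/api/atlas/v2/glossary",            # assumed path
    headers={"Authorization": f"Bearer {token}"},
    timeout=30,
)
resp.raise_for_status()
for glossary in resp.json():
    print(glossary.get("name"))
```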

Posted 1 month ago

Apply