7.0 - 11.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
You should have a Bachelor's or Master's degree in computer/data science or a related field, or equivalent technical experience, along with a minimum of 7 years of hands-on experience in relational, dimensional, and/or analytic data modeling. Your expertise should include a strong command of SQL and practical experience working with databases such as Oracle, PostgreSQL, Snowflake, and Teradata. Your responsibilities will involve hands-on activities like modeling, design, configuration, installation, performance tuning, and sandbox Proof of Concept (POC) work. Proficiency in metadata management, data modeling, and related tools such as Erwin or ER/Studio is essential.

You should be experienced in data modeling, ER diagramming, and designing enterprise software for OLTP (relational) and analytical systems. It is crucial that you possess a solid understanding of data modeling principles, standard methodologies, semantic data modeling concepts, and multi-fact models. You must be capable of defining data modeling standards and guidelines and assisting teams in implementing complex data-driven solutions at a large scale globally. Your experience should also include supporting history handling, time-series data warehousing, and data transformations through data modeling activities (see the sketch below). Additionally, you should have the ability to quickly comprehend technological and business concepts and key domain entities, and to communicate effectively with engineers, architects, and product management teams.

Your role will involve assessing the accuracy, completeness, and consistency of data models while ensuring the maintenance of relevant documentation. Experience with data cataloging tools like Alation and Collibra to drive data lineage is preferred. A strong understanding of data governance processes and metadata repositories is also expected. You should be comfortable working in a fast-paced environment with short release cycles and an iterative development methodology, handling multiple projects simultaneously with minimal specifications. Knowledge of Python and experience with Informatica would be considered advantageous for this position. Your excellent communication and documentation skills will be essential in this role.
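For illustration only: the history handling and time-series warehousing this posting describes often comes down to maintaining Type 2 slowly changing dimensions. Below is a minimal, hedged sketch of such a load; the engine (sqlite3 as a stand-in for the Oracle/PostgreSQL/Snowflake/Teradata systems named above) and all table and column names (customer_dim, customer_stage, address) are assumptions for the example, not details from the posting.

```python
# Hypothetical Type 2 SCD load: expire changed current rows, then insert
# new versions. sqlite3 stands in for the warehouses named in the posting.
import sqlite3

EXPIRE_CHANGED = """
UPDATE customer_dim
   SET is_current = 0,
       valid_to   = CURRENT_DATE
 WHERE is_current = 1
   AND customer_id IN (
       SELECT s.customer_id
         FROM customer_stage s
         JOIN customer_dim d
           ON d.customer_id = s.customer_id AND d.is_current = 1
        WHERE s.address <> d.address)
"""

INSERT_NEW_VERSIONS = """
INSERT INTO customer_dim (customer_id, address, valid_from, valid_to, is_current)
SELECT s.customer_id, s.address, CURRENT_DATE, NULL, 1
  FROM customer_stage s
  LEFT JOIN customer_dim d
    ON d.customer_id = s.customer_id AND d.is_current = 1
 WHERE d.customer_id IS NULL       -- no current row: new customer, or just expired above
    OR s.address <> d.address      -- defensive, in case the expire step was skipped
"""

def load_scd2(conn: sqlite3.Connection) -> None:
    with conn:  # one transaction so the history stays consistent
        conn.execute(EXPIRE_CHANGED)
        conn.execute(INSERT_NEW_VERSIONS)
```

Warehouses with MERGE support can collapse the two steps into one statement; the two-step form just makes the expire/insert mechanics of history handling explicit.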
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
Haryana
On-site
You will be joining Coders Brain Technology Pvt. Ltd., a global leader in digital services and business solutions. As part of our team, you will work with clients to simplify, strengthen, and transform their businesses. We are committed to ensuring the highest levels of certainty and satisfaction by leveraging our comprehensive industry expertise and a global network of innovation and delivery centers.

Your role will involve utilizing your expertise in AEM applications, including working with AEM components and templates, workflows, taxonomy, metadata management, replication strategies, content authoring, versioning, publishing pages, tagging, and JCR/CRX repository concepts (nodes and properties). Additionally, React knowledge is a must-have for this position.

If you are excited about this opportunity, please provide the following details:
- Current CTC
- Expected CTC
- Notice period
- Current Company
- Current Location
- Preferred Location
- Total experience
- Relevant experience
- Highest Qualification
- Date of Joining (if you have an offer from another company)
- Offer in Hand

Please ensure to share your updated CV at the earliest. Skills required for this role include tagging, CRX, JCR, AEM applications, AEM components and templates, publishing pages, React, workflows, content authoring, properties, metadata management, JCR/CRX repository concepts (nodes), taxonomy, versioning, and replication strategies.
Posted 1 month ago
1.0 - 4.0 years
9 - 14 Lacs
Pune
Work from Office
Your Team Responsibilities

The Data Technology group in MSCI is responsible for building and maintaining a state-of-the-art data management platform that delivers Reference, Market, and other critical data points to various products of the firm. The platform, hosted on the firm's data centers and on the Azure and GCP public clouds, processes 100 TB+ of data and is expected to run 24/7. With an increased focus on automation around systems development and operations, Data Science based quality control, and cloud migration, several tech stack modernization initiatives are currently in progress. To accomplish these initiatives, we are seeking a highly motivated and innovative individual to join the Data Engineering team to support our next generation of developer tools and infrastructure. The team is the hub around which the Engineering and Operations teams revolve for automation and is committed to providing self-serve tools to our internal customers.

Your Key Responsibilities
- Implement & Maintain Data Catalogs: Deploy and manage the data catalog tool Collibra to improve data discoverability and governance.
- Metadata & Lineage Management: Automate metadata collection, establish data lineage, and maintain consistent data definitions across systems.
- Enable Data Governance: Collaborate with governance teams to apply data policies, classifications, and ownership structures in the catalog.
- Support Self-Service & Adoption: Promote catalog usage across teams through training, documentation, and continuous support.
- Cross-Team Collaboration: Work closely with data engineers, analysts, and stewards to align catalog content with business needs.
- Tooling & Automation: Build scripts and workflows for metadata ingestion, tagging, and monitoring of catalog health; leverage AI tools to automate cataloging activities (see the sketch after this posting).
- Reporting & Documentation: Maintain documentation and generate usage metrics, ensuring transparency and operational efficiency.

Your Skills And Experience That Will Help You Excel
- Self-motivated, collaborative individual with a passion for excellence
- B.E. in Computer Science or equivalent with 5+ years of total experience and at least 2 years of experience working with Azure DevOps tools and technologies
- Good working knowledge of source control applications like Git, with prior experience building deployment workflows using this tool
- Good working knowledge of Snowflake, YAML, and Python
- Tools: experience with data catalog platforms (e.g., Collibra, Alation, DataHub)
- Metadata & Lineage: understanding of metadata management and data lineage
- Scripting: proficient in SQL and Python for automation and integration
- APIs & Integration: ability to connect catalog tools with data sources using APIs
- Cloud Knowledge: familiar with cloud data services (Azure, GCP)
- Data Governance: basic knowledge of data stewardship, classification, and compliance
- Collaboration: strong communication skills to work across data and business teams

About MSCI

What we offer you:
- Transparent compensation schemes and comprehensive employee benefits, tailored to your location, ensuring your financial security, health, and overall wellbeing.
- Flexible working arrangements, advanced technology, and collaborative workspaces.
- A culture of high performance and innovation where we experiment with new ideas and take responsibility for achieving results.
- A global network of talented colleagues, who inspire, support, and share their expertise to innovate and deliver for our clients.
- A Global Orientation program to kickstart your journey, followed by access to our Learning@MSCI platform, LinkedIn Learning Pro, and tailored learning opportunities for ongoing skills development.
- Multi-directional career paths that offer professional growth and development through new challenges, internal mobility, and expanded roles.
- An environment that builds a sense of inclusion, belonging, and connection, including eight Employee Resource Groups: All Abilities, Asian Support Network, Black Leadership Network, Climate Action Network, Hola! MSCI, Pride & Allies, Women in Tech, and Women's Leadership Forum.

At MSCI we are passionate about what we do, and we are inspired by our purpose to power better investment decisions. You'll be part of an industry-leading network of creative, curious, and entrepreneurial pioneers. This is a space where you can challenge yourself, set new standards, and perform beyond expectations for yourself, our clients, and our industry.

MSCI is a leading provider of critical decision support tools and services for the global investment community. With over 50 years of expertise in research, data, and technology, we power better investment decisions by enabling clients to understand and analyze key drivers of risk and return and confidently build more effective portfolios. We create industry-leading research-enhanced solutions that clients use to gain insight into and improve transparency across the investment process.

MSCI Inc. is an equal opportunity employer. It is the policy of the firm to ensure equal employment opportunity without discrimination or harassment on the basis of race, color, religion, creed, age, sex, gender, gender identity, sexual orientation, national origin, citizenship, disability, marital and civil partnership/union status, pregnancy (including unlawful discrimination on the basis of a legally protected parental leave), veteran status, or any other characteristic protected by law. MSCI is also committed to working with and providing reasonable accommodations to individuals with disabilities. If you are an individual with a disability and would like to request a reasonable accommodation for any part of the application process, please email Disability Assistance@msci and indicate the specifics of the assistance needed. Please note, this e-mail is intended only for individuals who are requesting a reasonable workplace accommodation; it is not intended for other inquiries.

To all recruitment agencies: MSCI does not accept unsolicited CVs/Resumes. Please do not forward CVs/Resumes to any MSCI employee, location, or website. MSCI is not responsible for any fees related to unsolicited CVs/Resumes.

Note on recruitment scams: We are aware of recruitment scams where fraudsters impersonating MSCI personnel may try to elicit personal information from job seekers. Read our full note on careers msci
Posted 1 month ago
6.0 - 11.0 years
11 - 20 Lacs
Bengaluru
Work from Office
Role & responsibilities Job Description: As a Data Governance Architect, you must be able to manage organization-wide data governance activities and will be responsible for improving the quality and managing the protection of sensitive data and information assets. You will be responsible for preparing a Data Catalog strategy to build out a catalog with data and BI objects and onboard a user base to support the curation of metadata, lineage, and documentation for enabling seamless data discovery at an enterprise level, thereby streamlining data intake, and reducing data duplication throughout the organization. You must be result-oriented, self-motivated and can thrive in a fast-paced environment. This role requires you to serve as a point of escalation for governance, data quality and protection issues and will work closely with Business and Functional area leadership to improve the quality and value of core data assets, respond to regulatory protection requirements as well as support the strategic requirements of the department. Primary Roles and Responsibilities: Looking for a Data Governance expert for the development of a metadata management system solution. Should be able to streamline the curation of metadata with custom scripts to upload available metadata to the API to achieve a deeper understanding of their catalog content and user base using custom dashboards to track adoption. Responsible for the implementation and oversight of the Company data management goals, standards, practices, process, and technologies. Experience in establishing data connections for relevant schemas, defining data stewards role & responsibilities for the scope of the data catalog. Define roles and responsibilities related to data governance and ensure clear accountability for stewardship of the companys principal information assets To properly onboard the data catalog, you should be able to conduct a data domain team assessment, discover the availability & completeness of each teams metadata, and develop a process for working with and onboarding data domain teams. Be the point of contact for Data Governance queries, including escalation point for client concerns. Coordinate the resolution of data integrity gaps by working with the business owners and IT. Ability to work in an agile environment with an iterative approach to development. Skills and Preferred candidate profile Bachelor's and/or master’s degree in computer science or equivalent experience. Must have total 6+ yrs. of IT experience and 4+ years' experience in Data Cataloging & Data Governance projects. Programming skills (sufficient to write SQL queries to validate test results in DW database).
Posted 1 month ago
5.0 - 9.0 years
8 - 16 Lacs
Pune
Hybrid
Job description

Company: Kiya.ai
Work location: Pune
Work Mode: Hybrid

Key Responsibilities:
- Utilize Collibra to manage and maintain data governance workflows, metadata, and data cataloging.
- Support the implementation and enhancement of data lineage within Collibra to provide clear visibility into data flow and dependencies.
- Collaborate with data engineers, architects, and business stakeholders to document and validate data lineage across systems.
- Analyze and map data movement from source to target systems, ensuring accuracy and completeness.
- Gather, review, and analyze business and technical requirements related to data governance and lineage projects.
- Translate requirements into actionable tasks and ensure alignment with organizational data policies.
- Take full ownership of assigned deliverables, ensuring timely and quality completion.
- Track progress, identify risks, and proactively communicate status to stakeholders.
- Manage end-to-end contract execution processes related to data governance initiatives, including vendor coordination, compliance checks, and documentation.
- Ensure contractual obligations are met and escalate issues as necessary.
- Work closely with cross-functional teams including IT, legal, compliance, and business units to support data governance objectives.
- Prepare and present reports, dashboards, and documentation to stakeholders.

Qualifications:
- Master's degree in Computer Science, Information Systems, Data Science, or a related field.
- Proven experience working with Collibra or similar data governance platforms.
- Strong understanding of data lineage concepts and metadata management.
- Experience in requirement gathering, analysis, and documentation.
- Demonstrated ability to manage deliverables and take ownership of projects.
- Familiarity with contract management and execution processes is a plus.
- Excellent analytical, problem-solving, and communication skills.
- Ability to work independently and collaboratively in a fast-paced environment.

Preferred Skills:
- Knowledge of SQL, data modeling, and ETL processes.
- Experience with data quality tools and frameworks.
- Understanding of regulatory requirements related to data governance (e.g., GDPR, CCPA).
- Project management experience or certifications (e.g., PMP, Agile) would be an additional skill.
Posted 1 month ago
0.0 - 2.0 years
1 - 4 Lacs
Mumbai
Work from Office
Data Steward and Analyst

Location: L&T Finance, Mahape, Navi Mumbai, India

We are seeking a proactive and detail-oriented Data Steward and Analyst to drive metadata standardization and data ownership across enterprise systems. This role will be responsible for defining, maintaining, and enriching metadata assets, including business glossaries, data dictionaries, and ownership hierarchies. The ideal candidate will collaborate with business and technical stakeholders to enhance data discoverability, trust, and compliance. Hands-on experience implementing, configuring, and maintaining data governance tools is essential, with preference given to those familiar with Ab Initio (preferred), Informatica, Talend, Atlan, Databricks, or Snowflake. Strong communication and collaboration skills are required to engage effectively with business and application stakeholders across L&T Finance. If you're passionate about building robust data governance frameworks and thrive in a dynamic environment, this is an excellent opportunity to advance your career.

WHO WE ARE
L&T Finance is one of India's leading Non-Banking Financial Companies (NBFCs), known for its innovation-driven lending solutions across retail, rural, and infrastructure finance. With a strong commitment to digital transformation and data-led decision making, we offer a dynamic workplace where your contributions shape the financial future of millions. Join us to be a part of an organization that values growth, integrity, and impact.

PROFESSIONAL SUMMARY
Detail-oriented Data Steward / Analyst with 2-8 years of experience in metadata governance, business glossary management, and data ownership frameworks in financial services. Adept at defining and maintaining enterprise metadata to support regulatory compliance, reporting accuracy, and data democratization. Proven ability to work across business and technology teams; familiarity with Ab Initio is an added advantage.

RESPONSIBILITIES
- Define, maintain, and govern the enterprise-wide Business Glossary and Metadata Hub, ensuring consistent and standardized definitions across domains.
- Collaborate with Data Owners, SMEs, and Data Architects to assign data ownership, stewardship roles, and accountability for critical data assets.
- Support ongoing metadata curation and enrichment efforts, ensuring metadata completeness, quality, and alignment with business objectives.
- Establish workflows for metadata onboarding, change management, versioning, and lifecycle governance.
- Work closely with Data Governance, Risk, Compliance, and IT teams to support data classification, policy enforcement, and regulatory alignment.
- Assist in generating data lineage documentation and support catalogue integration efforts.
- Provide periodic reports and metrics on metadata completeness, glossary usage, and stewardship activities (see the sketch after this posting).
- Champion metadata awareness and drive data literacy initiatives across business units.
- Guide adoption and optimization of data governance tools like Ab Initio (preferred), Informatica, Atlan, or Snowflake.
- Define roles and responsibilities for data stewards and custodians.
- Train and evangelize data governance practices across departments.
- Support audit and compliance reporting through accurate data lineage and usage logs.
- Ensure business continuity with proper data backup, archival, and purging strategies.
- Align governance goals with broader digital transformation and analytics strategies.

TECHNICAL SKILLS
- Strong understanding of Data Governance principles, metadata management, and data stewardship frameworks.
- Hands-on experience with Business Glossary, Metadata Hub, or any metadata management/collaboration tool (e.g., Collibra, Informatica, Alation).
- Excellent documentation, communication, and stakeholder engagement skills.
- Working knowledge of data modelling, data lineage, and data quality concepts.
- Detail-oriented, self-motivated, and comfortable working across business and technology teams.
- Understanding of Indian regulatory compliance standards (e.g., RBI guidelines, CKYC, Aadhaar, credit bureau data usage).
- Familiarity with data modelling, metadata management, and ETL/ELT pipelines.
- Proficient in querying and validating data using SQL or similar languages.
- Experience working in or with data warehouse environments like BigQuery, or GCP-based lakes.
- Shell scripting, Python, JSON, Java 8 & above, Spring Boot, REST APIs, GCP, and debugging skills; design, Gen AI, LLM, RAG, and Agentic RAG are not mandatory but preferred.

Collaboration & Communication
- Proven ability to collaborate with cross-functional teams including business, compliance, risk, and engineering.
- Strong documentation and requirement gathering skills to align technical governance with business objectives.
- Skilled in stakeholder management and articulating data governance value to non-technical audiences.
- Experience driving adoption of governance practices across enterprise functions.

Personality Traits & Leadership
- Detail-oriented with a strong sense of data integrity and accuracy.
- Self-driven, with the ability to take ownership and lead governance initiatives independently.
- Process-oriented thinker with a structured approach to problem-solving.
- Influencer with the ability to promote data discipline across teams.
- Demonstrates professional integrity, especially while handling sensitive financial and customer data.
- Adaptable to change and comfortable working in a fast-paced NBFC environment.

QUALIFICATION
- BE/B.Tech and/or M.Tech in any discipline.
- 2-8 years of familiarity or working knowledge of Ab Initio Metadata Hub and Ab Initio Business Glossary is a strong plus (but not mandatory).
- Exposure to cloud data platforms (AWS, GCP, or Azure) and modern data stacks is an advantage.
- Strong problem-solving, collaboration, and communication skills.
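As a hedged sketch of the metadata-completeness reporting named in the responsibilities above: a small scorer over an exported glossary term list. The required field names (definition, owner, domain) are assumptions and depend on the actual glossary tool's export format.

```python
# Hypothetical completeness metric over exported business-glossary terms.
REQUIRED_FIELDS = ("definition", "owner", "domain")

def completeness(terms: list[dict]) -> dict:
    scores = [
        sum(1 for f in REQUIRED_FIELDS if term.get(f)) / len(REQUIRED_FIELDS)
        for term in terms
    ]
    return {
        "terms": len(terms),
        "avg_completeness": round(sum(scores) / len(scores), 3) if scores else 0.0,
        "fully_documented": sum(1 for s in scores if s == 1.0),
    }

if __name__ == "__main__":
    sample = [
        {"name": "Customer ID", "definition": "Unique customer key", "owner": "Retail"},
        {"name": "NPA Flag", "definition": "", "owner": None, "domain": "Risk"},
    ]
    # -> {'terms': 2, 'avg_completeness': 0.5, 'fully_documented': 0}
    print(completeness(sample))
```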
Posted 1 month ago
6.0 - 10.0 years
0 Lacs
Karnataka
On-site
As a Data Engineer at our company, you will be responsible for designing and implementing Azure Synapse Analytics solutions for data processing and reporting. Your role will involve optimizing ETL pipelines, SQL pools, and Synapse Spark workloads to ensure efficient data processing. It will also be crucial for you to uphold data quality, security, and governance best practices while collaborating with business stakeholders to develop data-driven solutions. Additionally, part of your responsibilities will include mentoring a team of data engineers.

To excel in this role, you should have 6-10 years of experience in Data Engineering, BI, or Cloud Analytics. Your expertise in Azure Synapse, Azure Data Factory, SQL, and ETL processes will be essential. Experience with Fabric is strongly desirable, and possessing strong leadership, problem-solving, and stakeholder management skills is crucial. Knowledge of Power BI, Python, or Spark would be a plus. You should also have deep knowledge of data modelling techniques, design and development of ETL pipelines, Azure resources cost management, and proficiency in writing complex SQL queries.

Furthermore, you are expected to have knowledge and experience in Master Data/metadata management, including Data Governance, Data Quality, Data Catalog, and Data Security. Your ability to manage a complex and rapidly evolving business and to actively lead, develop, and support team members will be key. As an Agile practitioner and advocate, you must be highly dynamic in your approach, adapting to constant changes in risks and forecasts.

Your role will involve ensuring data integrity within the dimensional model by validating data and identifying inconsistencies (see the sketch after this posting). You will also work closely with Product Owners and data engineers to translate business needs into effective dimensional models.

This position offers the opportunity to lead AI-driven data integration projects in real estate technology, work in a collaborative and innovative environment with global teams, and receive competitive compensation, career growth opportunities, and exposure to cutting-edge technologies. Ideally, you should hold a Bachelor's or Master's degree in Software Engineering, Computer Science, or a related area.

Our company offers a range of benefits, including hybrid working arrangements, an annual performance-related bonus, Flexi any days, medical insurance coverage for extended family members, and an engaging, fun, and inclusive culture. MRI Software is dedicated to delivering innovative applications and hosted solutions that empower real estate companies to elevate their business. With a strong focus on meeting the unique needs of real estate businesses globally, we have grown to include offices across various countries with over 4000 team members supporting our clients. MRI is proud to be an Equal Employment Opportunity employer.
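A minimal, hedged sketch of the dimensional-model validation mentioned above: an orphaned-key check between a fact table and one of its dimensions. Table and column names (fact_sales, dim_product, product_key) are illustrative assumptions, not details from the posting.

```python
# Hypothetical referential-integrity check: fact rows whose dimension key
# has no matching dimension row (orphans break joins and reports).
ORPHAN_CHECK = """
SELECT f.sale_id, f.product_key
  FROM fact_sales AS f
  LEFT JOIN dim_product AS p
    ON p.product_key = f.product_key
 WHERE p.product_key IS NULL
"""

def find_orphans(cursor, limit: int = 20) -> list[tuple]:
    cursor.execute(ORPHAN_CHECK)
    orphans = cursor.fetchmany(limit)
    if orphans:
        print(f"found {len(orphans)} (or more) fact rows with missing dim_product keys")
    return orphans
```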
Posted 1 month ago
4.0 - 8.0 years
0 Lacs
Karnataka
On-site
As a Senior Platform Engineer at Kenvue Data Platforms, you will have an exciting opportunity to be part of our growing Data & Analytics product line team. Your role involves collaborating closely with various teams such as Business partners, Product Owners, Data Strategy, Data Platform, Data Science, and Machine Learning (MLOps) to drive innovative data products for end users. You will play a key role in shaping the overall solution and data platforms, ensuring their stability, responsiveness, and alignment with business and cloud computing needs. Your expertise will be crucial in optimizing business outcomes and contributing to the growth and success of the organization.

Your responsibilities will include providing leadership for data platforms in partnership with architecture teams, conducting proof of concepts to deliver secure and scalable platforms, staying updated on emerging technologies, mentoring other platform engineers, and focusing on the execution and delivery of reliable data platforms. You will work closely with Business Analytics leaders to understand business needs and create value through technology. Additionally, you will lead data platforms operations, build next-generation data and analytics capabilities, and drive the adoption and scaling of data products within the organization.

To be successful in this role, you should have an undergraduate degree in Technology, Computer Science, applied data sciences, or related fields, with an advanced degree being preferred. You should possess strong analytical skills, effective communication abilities, and a proven track record in developing and maintaining data platforms. Experience with cloud platforms such as Azure, GCP, AWS, cloud-based databases, data streaming platforms, and Agile methodology will be essential. Your ability to define platforms tech stack, prioritize work items, and work effectively in a diverse and inclusive company culture will be critical to your success in this role.

If you are passionate about leveraging data and technology to drive business growth, make a positive impact on personal health, and shape the future of data platforms, then this role at Kenvue Data Platforms is the perfect opportunity for you. Join us in our mission to empower millions of people every day through insights, innovation, and care. We look forward to welcoming you to our team!

Location: Asia Pacific-India-Karnataka-Bangalore
Function: Digital Product Development

Qualifications:
- Undergraduate degree in Technology, Computer Science, applied data sciences or related fields; advanced degree preferred
- Strong interpersonal and communication skills, ability to explain digital concepts to business leaders and vice versa
- 4 years of data platforms experience in Consumer/Healthcare Goods companies
- 6 years of progressive experience in developing and maintaining data platforms
- Minimum 5 years hands-on experience with Cloud Platforms and cloud-based databases
- Experience with data streaming platforms, microservices, and data integration
- Proficiency in Agile methodology within DevSecOps model
- Ability to define platforms tech stack to address data challenges
- Proven track record of delivering high-profile projects within defined resources
- Commitment to diversity, inclusion, and equal opportunity employment
Posted 1 month ago
6.0 - 12.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
As a Data Modeler, you will be responsible for developing and maintaining conceptual, logical, and physical data models along with their corresponding metadata. Your role will involve performing data mapping based on data source schemas and reverse engineering existing transformations from multiple source database systems on a cloud data platform to align with corporate standards. Additionally, you will conduct data analysis, capture data requirements, and collaborate with squad members and product owners to implement data strategies effectively.

One of your key responsibilities will be to validate logical data models with business subject matter experts and work closely with the development team to ensure that all requirements are captured and reflected in the data model. You will also collaborate with the DBA team to design physical models that optimize performance. Active participation in metadata definition and management will be essential in this role.

To excel in this position, you should be proficient in data modeling techniques using tools such as Erwin, ER/Studio, and PowerDesigner. A willingness to learn and strong communication skills are also important attributes for success in this role.

If you have 6 to 9 years of experience, you can expect a salary of 18 L, while candidates with 9 to 12 years of experience can anticipate a salary of 24 L. This is an excellent opportunity to leverage your skills and expertise as a Data Modeler to contribute to the success of the organization.
Posted 1 month ago
11.0 - 15.0 years
0 Lacs
Hyderabad, Telangana
On-site
As the Data Governance Tooling & Lifecycle Management Lead at McDonald's Corporation in Hyderabad, you will play a crucial role in developing and implementing end-to-end strategies for data governance tooling and processes across the enterprise. Your responsibilities will include owning the architecture, implementation, and administration of enterprise data governance platforms, such as Collibra, and defining governance workflows, metadata curation, and policy enforcement processes. You will work closely with various teams to ensure data is governed, discoverable, and trusted throughout its lifecycle.

In this role, you will be responsible for developing and implementing strategies for data lifecycle governance, from ingestion to archival and deletion, while ensuring compliance with regulations and business needs. You will lead initiatives to automate and visualize end-to-end data lineage across source systems, pipelines, warehouses, and BI tools. Collaborating with legal, compliance, and security teams, you will define and enforce data access, classification, and privacy policies to support regulatory compliance frameworks.

To be successful in this role, you should have at least 11 years of experience in data governance, metadata management, or data operations, with a minimum of 3 years of experience in owning enterprise tooling or lifecycle processes. Deep expertise in data governance platforms, metadata and lineage management, cloud platforms such as GCP and AWS, SQL, ETL/ELT pipelines, and compliance practices is required. You should also possess excellent project management and stakeholder communication skills, along with a degree in Data Management, Information Systems, Computer Science, or a related field.

Preferred experience includes working in Retail or QSR environments managing governance across global data operations, exposure to data product ownership, and familiarity with APIs and automation scripts. Holding a current GCP Associates or Professional Certification would be an added advantage.

This is a full-time, hybrid role based in Hyderabad, India, where you will collaborate with data stewards, engineers, and product teams to ensure governance tooling meets user needs and drives adoption. Your contributions will be vital in reporting on governance adoption, data quality KPIs, and policy coverage to senior leadership and data councils. If you are looking to join a dynamic team at the forefront of innovation in the fast-food industry, this role offers a unique opportunity to make a significant impact on McDonald's global data governance initiatives.
Posted 1 month ago
0.0 - 4.0 years
0 Lacs
Hyderabad, Telangana
On-site
As a Data Management Associate at Wells Fargo, you will be responsible for the collection and extraction of data to support business needs and future trends. Your role will involve analyzing, assessing, and testing data controls and systems to ensure quality and risk compliance standards are met. Additionally, you will perform data quality, metadata analysis, governance activities, and remediation tasks on a tactical and routine basis. You will provide support for communications by documenting requirements, design decisions, issue closures, and remediation updates. Monitoring data governance, data quality, and metadata policies, standards, tools, processes, and procedures will also be part of your responsibilities to maintain common data control. Supporting managers in executing tasks throughout the issue remediation life cycle for data issues and interacting with data provided by third-party data providers are key aspects of this role.

Required qualifications for this position include at least 6 months of experience in Data Management, Business Analysis, Analytics, or Project Management. Equivalent experience demonstrated through work experience, training, military service, or education will also be considered. Desired qualifications involve developing and running reports from Enterprise Data Management Tools such as Tableau and Power BI, analyzing Metadata Quality Assurance findings, and collaborating with Subject Matter Experts to drive remediation efforts. Familiarity with Data Governance, Metadata Management, Data Lineage, Business Analysis, Data Quality, Business Glossary, Data Dictionary, and Basic SQL is preferred.

You will be conducting data governance onboarding meetings with new stakeholders, designing and conducting analysis to identify and remediate data quality or integrity issues, and adhering to data governance standards and procedures. You will play a crucial role in monitoring data governance, data quality, and metadata policies to ensure data control and remediation for companywide data management functions. Supporting regulatory analysis and reporting requirements, recommending plans for assessing the quality of new data sources, documenting business or technical metadata, and working with clients to assess the current state of data quality within your assigned responsibility area are also part of the responsibilities in this role.

Please note that the job posting may be closed early due to the volume of applicants. At Wells Fargo, we value diversity and encourage applications from all qualified candidates, including women, persons with disabilities, aboriginal peoples, and visible minorities. Accommodations for applicants with disabilities are available upon request during the recruitment process.
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
Pune, Maharashtra
On-site
You are an experienced Informatica Master Data Management (MDM) Specialist with over 5 years of total experience, including 3-5 years of direct expertise in MDM design, development, support, and operations using the Informatica MDM tool suite. Your primary responsibility will be to implement and manage MDM services, data quality, and governance to ensure high-quality data across enterprise systems.

In this role, you will lead the design, development, and deployment of Master Data Management (MDM) solutions, ensuring alignment with enterprise data strategies. You will leverage the Informatica MDM tool suite, including Data Controls, Data Director, and Data Quality, to deliver high-quality data management solutions. Your expertise will be crucial in configuring and optimizing Informatica MDM tools and utilities, such as Data Controls and Data Director, to support ongoing MDM processes. You will also be responsible for implementing Data Governance and Data Quality solutions to maintain enterprise-wide consistency, accuracy, and completeness of master data.

Additionally, you will work on various data integration technologies like ETL, data replication, and data services to ensure the successful integration of master data across enterprise systems. Your strong business acumen will enable you to understand business requirements and apply MDM solutions that solve complex business problems, collaborating closely with business and IT teams. Your role will involve performing enterprise-level data analysis and providing data-focused systems integration solutions to ensure seamless integration of master data while identifying data quality gaps and implementing solutions to improve data accuracy.

To qualify for this position, you should have 5+ years of overall experience in data management, with at least 3-5 years of direct experience in MDM design, development, support, and operations using the Informatica MDM tool suite. Strong proficiency in Informatica MDM tools and utilities, along with experience in Master Data Management (MDM) strategies, Metadata Management, Data Governance, and Data Quality solutions, is required. Excellent communication and collaboration skills are essential for working cross-functionally with technical teams and business stakeholders. Preferred qualifications include experience with large-scale data environments and providing enterprise-level data analysis and system integration solutions.
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
Haryana
On-site
You are an experienced Data Modeller with specialized knowledge in designing and implementing data models for modern data platforms, specifically within the healthcare domain. Your expertise includes a deep understanding of data modeling techniques, healthcare data structures, and experience with Databricks Lakehouse architecture. Your role involves translating complex business requirements into efficient and scalable data models that support analytics and reporting needs.

You will be responsible for designing and implementing logical and physical data models for Databricks Lakehouse implementations. Collaboration with business stakeholders, data architects, and data engineers is crucial to create data models that facilitate the migration from legacy systems to the Databricks platform while ensuring data integrity, performance, and compliance with healthcare industry standards.

Key responsibilities include creating and maintaining data dictionaries, entity relationship diagrams, and model documentation. You will develop dimensional models, data vault models, and other relevant modeling approaches. Additionally, supporting the migration of data models, ensuring alignment with overall data architecture, and implementing data modeling best practices are essential aspects of your role.

Your qualifications include extensive experience in data modeling for analytics and reporting systems, and strong knowledge of dimensional modeling, data vault, and other methodologies. Proficiency in the Databricks platform, Delta Lake architecture, healthcare data modeling, and industry standards is required. You should have experience in migrating data models from legacy systems, strong SQL skills, and an understanding of data governance principles.

Technical skills that you must possess include expertise in data modeling methodologies, the Databricks platform, SQL, data definition languages, data warehousing concepts, ETL/ELT processes, performance tuning, metadata management, data cataloging, cloud platforms, big data technologies, and healthcare industry knowledge. Your knowledge should encompass healthcare data structures, terminology, coding systems, data standards, analytics use cases, regulatory requirements, clinical and operational data modeling challenges, and population health and value-based care data needs.

Your educational background should include a Bachelor's degree in Computer Science, Information Systems, or a related field, with an advanced degree being preferred. Professional certifications in data modeling or related areas would be advantageous for this role.
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
Karnataka
On-site
Billigence is a boutique data consultancy with a global presence, dedicated to revolutionizing how organizations interact with data. By harnessing state-of-the-art technologies, we specialize in crafting and executing advanced Business Intelligence solutions that deliver significant value across various domains, from process digitalization to Cloud Data Warehousing, Visualization, Data Science, Engineering, and Data Governance. Our headquarters are in Sydney, Australia, and we have established offices worldwide to assist clients in overcoming business obstacles, streamlining operations, and fostering the widespread adoption of an analytics-driven mindset.

As a Senior Data Governance Consultant at Billigence India, you will spearhead the Data Governance division, ensuring the delivery of top-notch services. Your responsibilities will involve designing and implementing robust data governance strategies that uphold stringent standards of data quality, security, and regulatory adherence. Leveraging your expertise in Collibra, you will create governance frameworks that facilitate business expansion and align with regulatory mandates. In addition to leading projects, you will play a pivotal role in mentoring junior team members and actively contribute to the recruitment process to bolster the governance team.

Your key responsibilities will include:
- Leading the development and execution of data governance frameworks
- Establishing and upholding data governance policies and standards across various business units
- Collaborating with data owners and stakeholders to ensure alignment with organizational goals
- Implementing data stewardship initiatives and data quality management protocols
- Providing strategic counsel on data privacy, security, and regulatory compliance
- Mentoring junior consultants to nurture their skills and expertise in data governance
- Participating in candidate interviews to strengthen the governance team

To excel in this role, you must possess:
- Proficiency in Collibra and other data governance tools
- Extensive experience in implementing data governance frameworks and best practices
- Knowledge of data privacy regulations such as GDPR and CCPA, along with compliance requirements
- Ideally, certifications from DAMA (Data Management Association) like CDMP (Certified Data Management Professional)
- Familiarity with the evolving trends in AI and their implications for data governance frameworks
- Strong communication and collaboration skills to engage effectively with senior stakeholders and cross-functional teams
- Deep understanding of data governance principles encompassing data quality, stewardship, and metadata management
- Demonstrated expertise in data governance within complex, large-scale environments
- Experience in mentoring junior team members to foster their technical growth
- Active involvement in interviewing and assessing potential team members to fortify the governance team

In return for your contributions, we offer:
- Hybrid/remote work setup for a balanced work-life integration
- Competitive compensation package with performance bonuses
- Fitness allowance to support your physical well-being
- Referral bonus scheme
- Coaching, mentoring, and buddy system for seamless onboarding during the probationary period
- Certification opportunities throughout your tenure
- Support for career growth, internal mobility, and advancement prospects
- Team-building activities and networking events to foster a collaborative work culture
Posted 1 month ago
3.0 - 7.0 years
9 - 13 Lacs
Bengaluru
Work from Office
Kaseya is the leading provider of complete IT infrastructure and security management solutions for Managed Service Providers (MSPs) and internal IT organizations worldwide, powered by AI. Kaseya's best-in-breed technologies allow organizations to efficiently manage and secure IT to drive sustained business success. Kaseya has achieved sustained, strong double-digit growth over the past several years and is backed by Insight Venture Partners (insightpartners), a leading global private equity firm investing in high-growth technology and software companies that drive transformative change in the industries they serve.

Founded in 2000, Kaseya currently serves customers in over 20 countries across a wide variety of industries and manages over 15 million endpoints worldwide. To learn more about our company, our award-winning solutions, and Kaseya's culture, go to Kaseya. Kaseya is not your typical company. We are not afraid to tell you exactly who we are and our expectations. The thousands of people that succeed at Kaseya are prepared to go above and beyond for the betterment of our customers.

Role: Sr Data Engineer
Location: Bangalore (Onsite)
Duration: Full-Time
Timings: 1-10 PM IST (cab, food & other facilities provided)
Experience: 6-8 yrs
Skills to crack: Snowflake, SQL, Python, ETL, Matillion / Informatica
Desired Skills: Power BI, CRM (Salesforce), ERP (Oracle)

About the Role: We are seeking an experienced and proactive Senior Data Engineer to drive the design, development, and optimization of our enterprise data infrastructure, with a core focus on Snowflake. You will lead complex data integration projects from CRM (preferably Salesforce), ERP systems (such as Oracle), and other enterprise sources. This role will play a critical part in shaping our data platform, ensuring scalability, performance, and compliance to support advanced analytics and AI initiatives.

Required Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, Information Systems, or a related technical field.
- 6+ years of hands-on experience as a Data Engineer, with a proven track record of delivering production-grade data solutions.
- Expertise in Snowflake: performance tuning, data modelling, security features, and best practices.
- Deep experience integrating complex systems like CRM (Salesforce) and ERP platforms (NetSuite, etc.).
- Advanced proficiency in SQL, including optimization, CTEs, window functions, and complex joins (see the sketch after this posting).
- Strong experience with Python for building and orchestrating data pipelines.
- Expertise with data pipeline tools or custom-built ETL frameworks.
- Solid experience in cloud ecosystems (AWS, Azure, GCP), including storage, compute, and serverless services.
- Strong understanding of data governance, metadata management, and data cataloguing solutions.
- Certifications in Snowflake, Salesforce Data Architecture, or cloud architect certifications (AWS, Azure, GCP).
- Experience working with real-time data ingestion tools (Kafka, Kinesis, Pub/Sub).
- Knowledge of Data Lakehouse architecture and experience with Delta Lake, Apache Iceberg, or similar technologies.
- Familiarity with MLOps or supporting Data Science initiatives is a plus.

Responsibilities:
- Architect, develop, and optimize highly scalable, reliable, and secure data pipelines and workflows on Snowflake.
- Lead and oversee the integration of data from CRM (Salesforce), ERP platforms, and other third-party systems into the enterprise data warehouse.
- Define best practices for data modelling, data quality, security, and operational efficiency.
- Collaborate with cross-functional teams (Product, Engineering, BI, Data Science) to understand business needs and deliver comprehensive data solutions.
- Mentor and guide junior and mid-level data engineers through code reviews, technical guidance, and architectural discussions.
- Evaluate and recommend modern data tools, frameworks, and patterns (e.g., ELT, Data Mesh, Data Vault modelling).
- Implement and manage orchestration tools and CI/CD pipelines for data engineering projects.
- Own monitoring, logging, and alerting strategies for data pipelines, ensuring high uptime and performance.
- Ensure compliance with data governance, privacy, and security standards (GDPR, HIPAA, SOX).
- Contribute to technical documentation, runbooks, and knowledge-sharing initiatives within the team.

Soft Skills:
- Strong leadership capabilities with a hands-on attitude.
- Excellent problem-solving and decision-making abilities.
- Strong written and verbal communication skills; able to explain technical concepts to business stakeholders.
- Ability to balance speed and quality, pushing for sustainable engineering excellence.
- A mindset of continuous learning and innovation.

Join the Kaseya growth rocket ship and see how we are #ChangingLives!

Additional Information: Kaseya provides equal employment opportunity to all employees and applicants without regard to race, religion, age, ancestry, gender, sex, sexual orientation, national origin, citizenship status, physical or mental disability, veteran status, marital status, or any other characteristic protected by applicable law.
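As a hedged illustration of the CTE and window-function proficiency the qualifications above call out: a common dedup pattern that keeps only the latest record per business key from a raw CRM extract. The table and column names (raw_salesforce_accounts, account_id, last_modified) are assumptions for the example.

```python
# Hypothetical dedup over a raw Salesforce extract: one row per account,
# most recent last_modified wins.
LATEST_PER_ACCOUNT = """
WITH ranked AS (
    SELECT
        account_id,
        account_name,
        last_modified,
        ROW_NUMBER() OVER (
            PARTITION BY account_id
            ORDER BY last_modified DESC
        ) AS rn
    FROM raw_salesforce_accounts
)
SELECT account_id, account_name, last_modified
  FROM ranked
 WHERE rn = 1
"""

def latest_accounts(cursor) -> list[tuple]:
    cursor.execute(LATEST_PER_ACCOUNT)
    return cursor.fetchall()
```

Using ROW_NUMBER in a CTE, rather than a vendor-specific shortcut like Snowflake's QUALIFY, keeps the pattern portable across the warehouses the posting mentions.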
Posted 1 month ago
3.0 - 8.0 years
0 Lacs
Delhi
On-site
As a Snowflake Solution Architect, you will be responsible for owning and driving the development of Snowflake solutions and products as part of the COE. Your role will involve working with and guiding the team to build solutions using the latest innovations and features launched by Snowflake. Additionally, you will conduct sessions on the latest and upcoming launches of the Snowflake ecosystem and liaise with Snowflake Product and Engineering to stay ahead of new features, innovations, and updates. You will be expected to publish articles and architectures that solve business problems. Furthermore, you will work on accelerators to demonstrate how Snowflake solutions and tools integrate and compare with other platforms such as AWS, Azure Fabric, and Databricks.

In this role, you will lead the post-sales technical strategy and execution for high-priority Snowflake use cases across strategic customer accounts. You will also be responsible for triaging and resolving advanced, long-running customer issues while ensuring timely and clear communication. Developing and maintaining robust internal documentation, knowledge bases, and training materials to scale support efficiency will also be a part of your responsibilities. You will also support enterprise-scale RFPs focused on Snowflake.

To be successful in this role, you should have at least 8 years of industry experience, including a minimum of 3 years in a Snowflake consulting environment. You should possess experience in implementing and operating Snowflake-centric solutions and proficiency in implementing data security measures, access controls, and design specifically within the Snowflake platform (see the sketch after this posting). An understanding of the complete data analytics stack and workflow, from ETL to data platform design to BI and analytics tools, is essential. Strong skills in databases, data warehouses, and data processing, as well as extensive hands-on expertise with SQL and SQL analytics, are required. Familiarity with data science concepts and Python is a strong advantage. Knowledge of Snowflake components such as Snowpipe, query parsing and optimization, Snowpark, Snowflake ML, authorization and access control management, metadata management, infrastructure management & auto-scaling, the Snowflake Marketplace for datasets and applications, as well as DevOps & orchestration tools like Airflow, dbt, and Jenkins is necessary. Possessing Snowflake certifications would be a good-to-have qualification.

Strong communication and presentation skills are essential in this role, as you will be required to engage with both technical and executive audiences. Moreover, you should be skilled in working collaboratively across engineering, product, and customer success teams. This position is open in all Xebia office locations, including Pune, Bangalore, Gurugram, Hyderabad, Chennai, Bhopal, and Jaipur. If you meet the above requirements and are excited about this opportunity, please share your details here: [Apply Now](https://forms.office.com/e/LNuc2P3RAf)
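A minimal sketch of the column-level security work described above, using Snowflake's documented CREATE MASKING POLICY syntax; the role, table, and column names (PII_ADMIN, customers, email) are hypothetical.

```python
# Hypothetical Snowflake masking policy: only PII_ADMIN sees raw emails.
# Executed through a snowflake.connector cursor, one statement at a time.
STATEMENTS = [
    """
    CREATE MASKING POLICY IF NOT EXISTS email_mask AS (val STRING)
    RETURNS STRING ->
      CASE WHEN CURRENT_ROLE() IN ('PII_ADMIN') THEN val
           ELSE '***MASKED***'
      END
    """,
    """
    ALTER TABLE customers
      MODIFY COLUMN email
      SET MASKING POLICY email_mask
    """,
]

def apply_policies(cursor) -> None:
    for stmt in STATEMENTS:
        cursor.execute(stmt)
```

Defining the policy once and attaching it per column keeps masking rules centralized, which is the usual reason to prefer policies over per-view CASE logic.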
Posted 1 month ago
15.0 - 19.0 years
0 Lacs
Pune, Maharashtra
On-site
Some careers have more impact than others. If you're looking for further opportunities to develop your career, take the next step in fulfilling your potential right here at HSBC.

HSBC is one of the largest banking and financial services organizations in the world, with operations in 62 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realize their ambitions.

We are currently seeking an experienced professional to join our team in the role of Data Technology Lead, Data Catalogue & Data Quality. The role is based in Pune/Hyderabad.

The Opportunity: We are seeking a skilled and driven Data Technology Lead to join the CTO Data Technology team, focusing on enterprise Data Catalogue and Data Quality solutions. This role will be responsible for designing, delivering, and managing modern data control capabilities across the bank's global footprint. This leadership role is crucial in driving adoption of strategic metadata and quality platforms that enable data governance, regulatory compliance, and operational efficiency. The successful candidate will collaborate closely with Group Chief Data Offices (CDOs), Cybersecurity, Risk, Audit, and business-aligned technology teams to provide technical solutions.

What you'll do:
- Lead the design, engineering, and global rollout of strategic Data Catalogue and Data Quality solutions.
- Partner with Group Data Strategy and CDOs to define control requirements, metadata models, and quality rules frameworks.
- Own the platform roadmap and architecture for metadata ingestion, automated data discovery, lineage tracing, and data quality scoring.
- Ensure interoperability with broader data platform architecture.
- Establish automation pipelines for data standards validation, issue detection, and remediation orchestration at scale.
- Drive adoption across thousands of Systems of Record (SoRs), Reference Data Masters, and Data Platforms.
- Ensure adherence to global and local data privacy, residency, and regulatory requirements (e.g., GDPR, BCBS 239, PIPL).
- Deliver service-level management, system resilience, and operational excellence for critical metadata and quality services.
- Drive continuous improvement and technology modernization, including migration to cloud-native or hybrid architectures.
- Support internal audits, regulator reviews, and compliance reporting related to data controls.
- Champion engineering best practices and foster a DevSecOps and agile delivery culture within the team.

Requirements:
- 15+ years of relevant experience in data technology or data governance roles, with leadership experience in large-scale metadata and data quality programs.
- Deep expertise in enterprise metadata management (EMM), data lineage tooling, data quality frameworks, and cataloging systems (e.g., Collibra, PyDQ, graph databases).
- Strong architectural understanding of distributed data platforms, cloud services (e.g., AWS, GCP, Azure), and real-time data pipelines.
- Experience in regulatory data compliance (e.g., CCAR, BCBS 239), including working with second-line risk and audit teams.
- Strong understanding of data modeling, data classification, data lifecycle management, and data integration technologies.
- Leadership experience with matrixed global teams and large-scale platform delivery.
- Excellent stakeholder engagement skills, able to communicate with both senior business and technical stakeholders.

You'll achieve more when you join HSBC. [HSBC Careers Website](www.hsbc.com/careers)

Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website.
Posted 1 month ago
4.0 - 8.0 years
0 Lacs
Karnataka
On-site
We are looking for a skilled Data Governance Engineer to take charge of developing and overseeing robust data governance frameworks on Google Cloud Platform (GCP). Your role will involve leveraging your expertise in data management, metadata frameworks, compliance, and security within cloud environments to ensure the implementation of high-quality, secure, and compliant data practices aligned with organizational objectives.

With a minimum of 4 years of experience in data governance, data management, or data security, you should possess hands-on proficiency with Google Cloud Platform (GCP) tools such as BigQuery, Dataflow, Dataproc, and Google Data Catalog. Additionally, a strong command over metadata management, data lineage, and data quality tools like Collibra and Informatica is crucial. A deep understanding of data privacy laws and compliance frameworks, coupled with proficiency in SQL and Python for governance automation, is essential. Experience with RBAC, encryption, data masking techniques, and familiarity with ETL/ELT pipelines and data warehouse architectures will be advantageous.

Your responsibilities will include developing and executing comprehensive data governance frameworks with a focus on metadata management, lineage tracking, and data quality. You will be tasked with defining, documenting, and enforcing data governance policies, access control mechanisms, and security standards using GCP-native services like IAM, DLP, and KMS. Managing metadata repositories using tools such as Collibra, Informatica, Alation, or Google Data Catalog will also be part of your role. Collaborating with data engineering and analytics teams to ensure compliance with regulatory standards like GDPR, CCPA, and SOC 2, and automating processes for data classification, monitoring, and reporting using Python and SQL will be key responsibilities. Supporting data stewardship initiatives and optimizing ETL/ELT pipelines and data workflows to adhere to governance best practices will also be part of your role.

At GlobalLogic, we offer a culture of caring, emphasizing inclusivity and personal growth. You will have access to continuous learning and development opportunities, engaging and meaningful work, as well as a healthy work-life balance. Join our high-trust organization where integrity is paramount, and collaborate with us to engineer innovative solutions that have a lasting impact on industries worldwide.
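As a hedged illustration of the Python-based classification automation mentioned above: a small Cloud DLP inspection call that follows the documented inspect_content request shape. The project id, the chosen infoTypes, and the likelihood threshold are assumptions for the example.

```python
# Inspect free text for PII infoTypes with Cloud DLP; findings can then
# feed tagging or reporting workflows. Requires google-cloud-dlp.
from google.cloud import dlp_v2

def classify_text(project_id: str, text: str) -> list[tuple[str, str]]:
    client = dlp_v2.DlpServiceClient()
    response = client.inspect_content(
        request={
            "parent": f"projects/{project_id}",
            "inspect_config": {
                "info_types": [
                    {"name": "EMAIL_ADDRESS"},
                    {"name": "PHONE_NUMBER"},
                    {"name": "CREDIT_CARD_NUMBER"},
                ],
                "min_likelihood": dlp_v2.Likelihood.POSSIBLE,
            },
            "item": {"value": text},
        }
    )
    return [(f.info_type.name, f.likelihood.name) for f in response.result.findings]

if __name__ == "__main__":
    print(classify_text("my-project", "Reach me at jane.doe@example.com"))
```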
Posted 1 month ago
8.0 - 12.0 years
0 Lacs
chennai, tamil nadu
On-site
As an experienced Data & Analytics Project Manager, you will play a crucial role in leading end-to-end execution of data and analytics projects. Your expertise in data integration, analytics, and cloud platforms such as AWS and Azure will be essential in ensuring seamless delivery. Collaborating with cross-functional teams, driving innovation, and optimizing data-driven decision-making will be key responsibilities in this role. Our projects utilize a variety of technologies, including internal custom-built solutions, packaged software, ERP solutions, data warehouses, Software as a Service, cloud-based solutions, and BI tools.

You will be responsible for leading project teams from initiation to close, delivering effective solutions that meet approved customer and business needs. You will be accountable for determining and delivering solutions within budget and schedule commitments while maintaining required quality and compliance standards.

Your main focus will be on leading end-to-end project management for Data Engineering & Analytics initiatives. This will involve understanding and managing data pipeline development, DWH design, and BI reporting needs at a high level. Collaborating with technical teams on Snowflake-based solutions, ETL pipelines, and data modeling concepts will be crucial, and overseeing project timelines, risks, and dependencies using Agile/Scrum methodologies will ensure successful delivery. Facilitating communication between stakeholders to keep Data Engineering, Data Analytics, and Power BI initiatives aligned is a key aspect of the role. You will also work with DevOps and engineering teams to streamline CI/CD pipelines and deployment processes, support metadata management and data mesh concepts to maintain an efficient data ecosystem, work closely with Data Engineers, BI Analysts, and business teams to define project scope, objectives, and success criteria, and ensure data governance, security, and compliance best practices are followed.

Key responsibilities include overseeing the full lifecycle of data and analytics projects to ensure scope, quality, and timelines are met; acting as the primary liaison with customers, architects, and internal teams to align on execution strategies; overseeing ETL pipelines, data warehousing, visualization tools (Tableau, Power BI), and cloud-based big data solutions; identifying potential risks, scope changes, and mitigation strategies to ensure smooth execution; guiding workstream leads, supporting PMO updates, and maintaining transparent communication with all stakeholders; and driving innovation and process enhancements in data engineering, BI, and analytics workflows.

To excel in this role, you should have at least 8 years of experience leading data and analytics projects, strong expertise in data integration tools, ETL processes, and big data technologies, and hands-on experience with cloud platforms and visualization tools. A proven ability to mentor teams, manage stakeholders, and drive project success is crucial, as are excellent communication skills with the ability to engage both business and IT executives. Certifications such as PMP, Agile, or Data Analytics will be advantageous.
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
hyderabad, telangana
On-site
As a Collibra Data Governance Specialist, you will play a crucial role in leading and managing enterprise-level data governance initiatives using Collibra. Your deep expertise in Collibra's platform will be essential in configuring, integrating, developing workflows, and engaging stakeholders effectively. Your responsibilities will include implementing and maintaining data governance frameworks, ensuring data quality, and facilitating data stewardship across the organization.

You will be tasked with the end-to-end implementation and administration of the Collibra Data Intelligence Platform. Designing and configuring the Collibra Operating Model, customizing workflows using BPMN and Collibra Workflow Designer, and integrating Collibra with various enterprise systems through APIs and connectors will be part of your daily activities. Collaboration with data stewards, data owners, and business users to establish and enforce data governance policies is crucial for success. Monitoring data quality rules, lineage, and metadata management, as well as providing training and support to both business and technical users on Collibra usage and best practices, will be key aspects of your role. Serving as a Collibra subject matter expert (SME) and advocate within the organization to promote data governance maturity will also be expected, along with maintaining documentation and ensuring compliance with internal and external data governance standards.

To excel in this role, you should possess at least 5 years of experience in data governance, metadata management, or data quality, with a minimum of 3 years of hands-on experience with Collibra. A strong understanding of data governance frameworks, data stewardship, and data lifecycle management is essential. Proficiency in Collibra APIs, BPMN, and scripting languages, along with experience in data cataloging, lineage, and the business glossary in Collibra, will be advantageous. Familiarity with data platforms such as Snowflake, Azure, AWS, Informatica, or similar is preferred, and excellent communication and stakeholder management skills are necessary for effective collaboration. Holding a Collibra Ranger or Solution Architect certification would be a plus, as would experience in enterprise-level deployments of Collibra, knowledge of regulatory compliance (e.g., GDPR, HIPAA, CCPA), and a background in data architecture or data engineering.
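For the Collibra API work mentioned above, a minimal sketch might query the catalogue over REST. The instance URL, credentials, and result fields below are assumptions, and the /rest/2.0 paths and query parameters should be verified against your Collibra version's API documentation.

```python
import requests

BASE = "https://your-instance.collibra.com/rest/2.0"  # hypothetical instance URL
session = requests.Session()
session.auth = ("svc_governance", "********")  # basic auth shown; SSO setups differ

def find_assets(name: str, limit: int = 10) -> list:
    """Look up catalogue assets by name (v2 REST paths assumed)."""
    resp = session.get(
        f"{BASE}/assets",
        params={"name": name, "nameMatchMode": "ANYWHERE", "limit": limit},
    )
    resp.raise_for_status()
    return resp.json().get("results", [])

# Example: list assets whose name contains "customer".
for asset in find_assets("customer"):
    print(asset.get("id"), asset.get("name"))
```

Heavier automation (bulk imports, workflow triggers) is usually done through Collibra's Import API or BPMN workflows rather than one-off REST calls.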
Posted 1 month ago
4.0 - 8.0 years
0 Lacs
noida, uttar pradesh
On-site
We are looking for a skilled Data Governance Engineer to spearhead the development and supervision of robust data governance frameworks on Google Cloud Platform (GCP). You should have a deep understanding of data management, metadata frameworks, compliance, and security within cloud environments to ensure the adoption of high-quality, secure, and compliant data practices aligned with organizational objectives.

The ideal candidate should possess:
- Over 4 years of experience in data governance, data management, or data security.
- Hands-on expertise with Google Cloud Platform (GCP) tools like BigQuery, Dataflow, Dataproc, and Google Data Catalog.
- Proficiency in metadata management, data lineage, and data quality tools such as Collibra and Informatica.
- Comprehensive knowledge of data privacy laws and compliance frameworks.
- Strong skills in SQL and Python for governance automation.
- Experience with RBAC, encryption, and data masking techniques.
- Familiarity with ETL/ELT pipelines and data warehouse architectures.

Your main responsibilities will include:
- Developing and implementing comprehensive data governance frameworks emphasizing metadata management, lineage tracking, and data quality.
- Defining, documenting, and enforcing data governance policies, access control mechanisms, and security standards utilizing GCP-native services like IAM, DLP, and KMS.
- Managing metadata repositories using tools like Collibra, Informatica, Alation, or Google Data Catalog.
- Collaborating with data engineering and analytics teams to ensure compliance with GDPR, CCPA, SOC 2, and other regulatory standards.
- Automating processes for data classification, monitoring, and reporting using Python and SQL.
- Supporting data stewardship initiatives, including the creation of data dictionaries and governance documentation.
- Optimizing ETL/ELT pipelines and data workflows to adhere to governance best practices.

At GlobalLogic, we offer:
- A culture of caring that prioritizes inclusivity, acceptance, and personal connections.
- Continuous learning and development opportunities to enhance your skills.
- Engagement in interesting and meaningful work with cutting-edge solutions.
- Balance and flexibility to help you integrate work and life effectively.
- A high-trust organization committed to integrity and ethical practices.

GlobalLogic, a Hitachi Group Company, is a leading digital engineering partner to world-renowned companies, focusing on creating innovative digital products and experiences. Join us to collaborate on transforming businesses through intelligent products, platforms, and services.
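As one hedged example of the Python- and SQL-based monitoring this role describes, the sketch below computes per-column null rates in BigQuery and applies an assumed 5% threshold. The project, dataset, table, and column names are placeholders.

```python
from google.cloud import bigquery

client = bigquery.Client()  # picks up project and credentials from the environment

# Null-rate check per column (table and columns are placeholders).
sql = """
SELECT
  COUNTIF(email IS NULL) / COUNT(*) AS email_null_rate,
  COUNTIF(country IS NULL) / COUNT(*) AS country_null_rate
FROM `my_project.crm.customers`
"""
row = list(client.query(sql).result())[0]
for field in ("email_null_rate", "country_null_rate"):
    rate = row[field]
    # 5% is an assumed policy threshold; real rules come from the governance framework.
    status = "FAIL" if rate > 0.05 else "PASS"
    print(f"{field}: {rate:.2%} -> {status}")
```

Scheduled in Cloud Composer or a similar orchestrator, checks like this feed the classification, monitoring, and reporting loop the posting refers to.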
Posted 1 month ago
3.0 - 8.0 years
0 Lacs
noida, uttar pradesh
On-site
You should have a total of 8+ years of development/design experience, with a minimum of 3 years' experience in Big Data technologies both on-premises and in the cloud. Proficiency in Snowflake and strong SQL programming skills are required, along with strong experience in data modeling and schema design and extensive experience with data warehousing tools such as Snowflake, BigQuery, or Redshift.

Experience with at least one BI tool such as Tableau, QuickSight, or Power BI is a must. You should also have strong experience implementing ETL/ELT processes and building data pipelines, including workflow management, job scheduling, and monitoring. A good understanding of Data Governance, Security and Compliance, Data Quality, Metadata Management, Master Data Management, and Data Catalogs is essential, as is a strong understanding of cloud services (AWS or Azure), including IAM and log analytics. Excellent interpersonal and teamwork skills are necessary for this role, together with experience leading and mentoring other team members, good knowledge of Agile Scrum, and strong communication skills. Your day-to-day responsibilities will draw directly on the skills outlined above.

At GlobalLogic, we prioritize a culture of caring. From day one, you will experience an inclusive culture of acceptance and belonging, where you will have the chance to build meaningful connections with collaborative teammates, supportive managers, and compassionate leaders. We are committed to your continuous learning and development: you will learn and grow daily in an environment with many opportunities to try new things, sharpen your skills, and advance your career at GlobalLogic.

GlobalLogic is known for engineering impact for and with clients around the world. As part of our team, you will have the chance to work on projects that matter and engage your curiosity and creative problem-solving skills. We believe in the importance of balance and flexibility: with many functional career areas, roles, and work arrangements, you can explore ways of achieving the right balance between your work and life. GlobalLogic is a high-trust organization where integrity is key; by joining us, you are placing your trust in a safe, reliable, and ethical global company.

GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner to the world's largest and most forward-thinking companies. Since 2000, we have been at the forefront of the digital revolution, helping create innovative and widely used digital products and experiences. Today we continue to collaborate with clients in transforming businesses and redefining industries through intelligent products, platforms, and services.
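To illustrate the Snowflake ELT skills described above, a minimal sketch using the Snowflake Python connector might run an idempotent MERGE from a staging table into a dimension. The account, credentials, warehouse, and table names are placeholder assumptions; real deployments typically use key-pair auth or SSO.

```python
import snowflake.connector

# Connection parameters are placeholders.
conn = snowflake.connector.connect(
    account="myorg-myaccount", user="etl_svc", password="********",
    warehouse="LOAD_WH", database="ANALYTICS", schema="STAGING",
)
try:
    cur = conn.cursor()
    # Idempotent ELT step: merge staged rows into the target dimension.
    cur.execute("""
        MERGE INTO dim_customer t
        USING stg_customer s ON t.customer_id = s.customer_id
        WHEN MATCHED THEN UPDATE SET t.email = s.email, t.updated_at = CURRENT_TIMESTAMP()
        WHEN NOT MATCHED THEN INSERT (customer_id, email, updated_at)
             VALUES (s.customer_id, s.email, CURRENT_TIMESTAMP())
    """)
    print(f"rows affected: {cur.rowcount}")
finally:
    conn.close()
```

A MERGE keeps reruns safe, which matters once the step sits inside a scheduled, monitored pipeline of the kind the posting describes.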
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
karnataka
On-site
Choosing Capgemini means choosing a company where you will be empowered to shape your career in the way you'd like, where you'll be supported and inspired by a collaborative community of colleagues around the world, and where you'll be able to reimagine what's possible. Join us and help the world's leading organizations unlock the value of technology and build a more sustainable, more inclusive world.

Your Role:
- Development of the Oracle TRCS module, from translating requirements to design and build and on to testing and training.
- Maintaining metadata, Smart Push, Smart View reports, and objects.
- Designing and creating data forms, tax calculation business rules, and reports.
- Creating and optimizing Calculation Scripts/Rules based on business requirements.
- Designing and creating automation using EPM Automate.

Your Profile:
- Experience in writing complex business rules and member formulas.
- Developing data input forms and security design.
- Data integrations with ERP systems.
- Working with Smart View.
- Configuring Oracle Tax Reconciliation Cloud.
- Experience in Oracle tax provisioning implementation.
- Advanced analytical, diagnostic, and technical skills.
- Understanding of corporate tax provisioning processes.
- Knowledge and ability to comprehend analytical models and their concepts from across a client's entire organization.
- Experience with the EPM Automate scripting language.
- Excellent verbal and written communication skills.
- Experience in at least one successful implementation of TRCS.

What You'll Love About Working Here: You can shape your career with us; we offer a range of career paths and internal opportunities within the Capgemini group, along with personalized career guidance from our leaders. You will get comprehensive wellness benefits, including health checks, telemedicine, insurance with top-ups, elder care, partner coverage, and new-parent support via flexible work. You will also have the opportunity to learn on one of the industry's largest digital learning platforms, with access to 250,000+ courses and numerous certifications.

About Capgemini: Capgemini is a global business and technology transformation partner, helping organizations to accelerate their dual transition to a digital and sustainable world, while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong over-55-year heritage, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fueled by its market-leading capabilities in AI, generative AI, cloud and data, combined with its deep industry expertise and partner ecosystem.
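Where the role calls for EPM Automate automation, a minimal sketch might wrap the CLI from Python. The service URL, user, and file name are placeholder assumptions, credentials would normally come from a vault, and command availability should be checked per EPM service and release.

```python
import subprocess

def epm(*args: str) -> None:
    """Run an EPM Automate command and fail fast on a non-zero exit code."""
    subprocess.run(["epmautomate", *args], check=True)

# URL, user, and file name are placeholders.
epm("login", "svc_epm", "********", "https://example-trcs.epm.ocs.oraclecloud.com")
try:
    epm("uploadfile", "tax_data_2024.zip")  # stage the extract in the cloud instance
finally:
    epm("logout")
```

Wrapping the CLI this way lets a scheduler retry or alert on failures instead of relying on hand-run scripts.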
Posted 1 month ago
6.0 - 10.0 years
0 Lacs
karnataka
On-site
The responsibilities of the role involve designing and implementing Azure Synapse Analytics solutions for data processing and reporting. You will be required to optimize ETL pipelines, SQL pools, and Synapse Spark workloads while ensuring data quality, security, and governance best practices are followed. Collaborating with business stakeholders to develop data-driven solutions and mentoring a team of data engineers are key aspects of this position.

To excel in this role, you should possess 6-10 years of experience in Data Engineering, BI, or Cloud Analytics. Expertise in Azure Synapse, Azure Data Factory, SQL, and ETL processes is essential, and experience with Fabric is strongly desirable. Strong leadership, problem-solving, and stakeholder management skills are required, and knowledge of Power BI, Python, or Spark is a plus. A deep understanding of data modelling techniques, design and development of ETL pipelines, Azure resource cost management, and writing complex SQL queries are important competencies, as is familiarity with authorization and security best practices for Azure components and with master data/metadata management and data governance. You should be able to manage a complex and rapidly evolving business and actively lead, develop, and support team members, with an Agile mindset and the ability to adapt to constant changes in risks and forecasts.

Thorough knowledge of data warehouse architecture, principles, and best practices is necessary, including expertise in designing star and snowflake schemas, identifying facts and dimensions, and selecting appropriate granularity levels. Ensuring data integrity within the dimensional model by validating data and identifying inconsistencies is part of the role, and you will work closely with Product Owners and data engineers to translate business needs into effective dimensional models.

Joining MRI Software offers the opportunity to lead AI-driven data integration projects in real estate technology, work in a collaborative and innovative environment with global teams, and access competitive compensation, career growth opportunities, and exposure to cutting-edge technologies. The ideal candidate should hold a Bachelor's or Master's degree in software engineering, computer science, or a related area. Benefits include hybrid working arrangements, an annual performance-related bonus, 6x Flexi any days, medical insurance coverage for extended family members, and an engaging, fun, and inclusive culture at MRI Software.

MRI Software delivers innovative applications and hosted solutions that empower real estate companies to enhance their business. With a flexible technology platform and an open and connected ecosystem, we cater to the unique needs of real estate businesses globally. With offices across various countries and a diverse team, we provide expertise and insight to support our clients effectively. MRI Software is proud to be an Equal Employment Opportunity employer.
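As a sketch of the star-schema design work described above, the snippet below creates a replicated dimension and a hash-distributed fact table in a Synapse dedicated SQL pool. The server, database, table names, and distribution choices are illustrative assumptions.

```python
import pyodbc

# Connection string is a placeholder for a Synapse dedicated SQL pool.
conn = pyodbc.connect(
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=myworkspace.sql.azuresynapse.net;Database=dwh;"
    "Authentication=ActiveDirectoryInteractive;"
)
cur = conn.cursor()

# Small conformed dimension: replicate to every compute node for cheap joins.
cur.execute("""
CREATE TABLE dbo.DimCustomer (
    CustomerKey INT NOT NULL,
    CustomerId  NVARCHAR(32) NOT NULL,
    Country     NVARCHAR(2)
)
WITH (DISTRIBUTION = REPLICATE, CLUSTERED COLUMNSTORE INDEX)
""")

# Fact at one row per customer per day: hash-distribute on the join key.
cur.execute("""
CREATE TABLE dbo.FactSales (
    DateKey     INT NOT NULL,
    CustomerKey INT NOT NULL,
    Amount      DECIMAL(18, 2) NOT NULL
)
WITH (DISTRIBUTION = HASH(CustomerKey), CLUSTERED COLUMNSTORE INDEX)
""")
conn.commit()
```

Choosing the distribution column is the granularity decision in disguise: it should match the most common join key so large fact-to-dimension joins avoid data movement.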
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
maharashtra
On-site
Join our team at TeamViewer, the leading provider of remote connectivity software. We are at the forefront of innovation, utilizing cutting-edge technologies such as Augmented Reality and Artificial Intelligence to drive digital transformation. Our diverse team of over 1,500 employees from various backgrounds and regions fosters a culture of inclusivity, where individual talents and perspectives are celebrated. We provide a dynamic work environment that encourages the growth of new ideas. Join us in shaping a better working world.

Responsibilities

Platform Operations & Administration:
- Assist in the setup and implementation project of the platform from an IT perspective
- Configure and manage Salesforce objects, flows, permissions, and security settings
- Supervise user access, the security model (IAM), data privacy compliance (GDPR), and license usage
- Establish internal platform standards and contribute to documentation in Confluence
- Evaluate AppExchange apps and mobile enablement for future adoption
- Manage environment strategy, metadata structure, and platform limits
- Monitor and optimize license usage based on Salesforce licensing models and business requirements

Release Management & Deployment:
- Support the technical delivery of the Salesforce implementation
- Implement CI/CD workflows using SFDX and Azure DevOps

Integration & Architecture:
- Coordinate sandbox strategies and ensure release readiness and deployment quality
- Track metadata and configuration changes for smooth deployments
- Ensure consistency and quality across development and production environments
- Collaborate with integration teams on connecting Salesforce to Magento (eCommerce) and D365 FO (ERP)
- Support API-based integrations and middleware, including Azure Integration Services

Governance & Documentation:
- Define internal standards, naming conventions, and configuration rules
- Document all implementation work for internal transition
- Act as the internal technical representative in partnership with external delivery teams

Collaboration & Tools:
- Collaborate with external Salesforce implementation teams and internal business/IT stakeholders
- Utilize Azure DevOps for backlog, releases, and issue tracking
- Collaborate through Confluence and Microsoft Office tools for alignment and reporting

Professional Experience:
- Minimum of 3 years in Salesforce-related roles (Admin, Platform Engineer, Technical Consultant)
- Hands-on experience with platform operations and release management in Salesforce
- Exposure to integrated system environments, ideally Salesforce + Magento + D365 FO
- Experience in managing Salesforce licenses and understanding licensing models

Technical Skills:
- Strong knowledge of Salesforce Sales Cloud and CPQ
- Proficiency in SFDX, Git, CI/CD tools, and Salesforce metadata management
- Experience with DevOps tools, such as Azure DevOps, for issue tracking and backlog management
- Solid understanding of API integrations, middleware, and Azure Integration Services
- Familiarity with metadata management, sandbox strategies, and data flow design
- Experience with IAM, security standards, and platform compliance
- Comfortable with Confluence for technical documentation and collaboration
- Proficient in Microsoft Office, especially Excel, Outlook, PowerPoint, and Word

Soft Skills & Communication:
- Analytical thinker with a structured, solution-oriented mindset
- Ability to communicate technical topics clearly to both IT and business audiences
- Confidence in working with internal and external stakeholders
- Fluent in English
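For the SFDX-based CI/CD workflows mentioned under Release Management, a minimal validation-only deploy step in a pipeline might look like the sketch below. The org alias and manifest path are placeholders, and newer Salesforce CLI releases rename these commands (sf project deploy start).

```python
import subprocess

def run(cmd: list) -> None:
    """Echo and execute a CLI step, failing the build on a non-zero exit code."""
    print("$", " ".join(cmd))
    subprocess.run(cmd, check=True)

# Validation-only deploy against a sandbox, suitable as a pull-request check.
run([
    "sfdx", "force:source:deploy",
    "--manifest", "manifest/package.xml",
    "--targetusername", "uat-sandbox",
    "--checkonly",                 # validate without committing changes
    "--testlevel", "RunLocalTests",
])
```

In Azure DevOps this would typically run inside a pipeline job after authenticating to the org, with the check-only pass gating the real deploy stage.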
Posted 1 month ago