4 - 9 years
10 - 17 Lacs
Pune, Bengaluru, Delhi / NCR
Hybrid
Role & responsibilities
- Design, configure, and develop solutions within the Collibra Data Intelligence Cloud.
- Implement and maintain Collibra workflows using BPMN.
- Build and manage Collibra data models, domains, assets, and communities.
- Integrate Collibra with other enterprise systems (e.g., Snowflake, Informatica, Tableau, ServiceNow) via APIs or connectors.
- Ensure metadata is accurately captured, cataloged, and maintained.
- Collaborate with data stewards, data owners, and other stakeholders to define business glossaries and governance policies.
- Develop documentation and training materials, and provide user support.
- Participate in data governance and data quality initiatives across the organization.
- Monitor performance and troubleshoot issues with Collibra implementations.
Please share your CV @ Ravina.m@vhrsol.com
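As an illustration of the API-based integration work this role describes, here is a minimal sketch of registering an asset through Collibra's REST API from Python. It assumes the Core REST API v2 (`/rest/2.0/...`) with basic authentication; the instance URL, credentials, and domain/type IDs are placeholders, and paths and payload fields should be verified against your Collibra version.

```python
"""Minimal sketch of pushing an asset into Collibra via its REST API.

Assumes the Collibra Core REST API v2 with basic authentication; the
instance URL, credentials, and IDs below are placeholders to adapt.
"""
import requests

COLLIBRA_URL = "https://example.collibra.com"   # hypothetical instance
AUTH = ("svc_governance", "********")           # placeholder credentials


def create_asset(name: str, domain_id: str, type_id: str) -> dict:
    """Register a single asset in a Collibra domain (sketch only)."""
    payload = {"name": name, "domainId": domain_id, "typeId": type_id}
    resp = requests.post(f"{COLLIBRA_URL}/rest/2.0/assets",
                         json=payload, auth=AUTH, timeout=30)
    resp.raise_for_status()
    return resp.json()


if __name__ == "__main__":
    asset = create_asset("CUSTOMER_DIM",
                         domain_id="<domain-uuid>",      # placeholder
                         type_id="<asset-type-uuid>")    # placeholder
    print(asset.get("id"))
```

The same pattern (authenticated HTTP calls to the REST API) underlies connector-style integrations with systems such as Snowflake or ServiceNow.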
Posted 1 month ago
9 years
0 Lacs
Bengaluru, Karnataka
Work from Office
Experience: 9+ years
Location: Bangalore / Chennai / Pune / Mumbai / Noida
Rate: as per market availability of profiles
Job Overview
Primary Skills: Collibra data governance, metadata management, and integrations; SQL; Linux/Windows
Secondary Skills: AWS/Azure, REST API, data quality/lineage tools
Experience: Total experience 9+ years; relevant experience 2+ years in Collibra application support or administration
Job Location: Bangalore, Chennai, Mumbai, Noida, Pune
Posted 1 month ago
5 - 10 years
5 - 15 Lacs
Noida
Work from Office
Hiring with VLink India Pvt Ltd.
Experience in designing and implementing business process workflows using Collibra Workflow Designer. Understanding of Collibra Data Governance Center (DGC) and its modules, including Data Catalog, Business Glossary, and Policy Manager. Experience in metadata harvesting, lineage tracking, and governance to improve data visibility. Proficiency in using Collibra REST APIs for workflow automation, data exchange, and custom integrations with other tools. Familiarity with Collibra Data Quality & Observability, setting up data quality rules and configuring DQ workflows. Familiarity with Groovy & Java for developing custom workflows and scripts within Collibra. Ability to write Python & SQL for data validation, integration scripts, and automation. Understanding of ETL processes and integrating Collibra with cloud/on-prem databases. Familiarity with data governance frameworks (e.g., GDPR, CCPA, HIPAA) and best practices. Experience in managing technical and business metadata effectively. Ability to track data lineage and assess downstream/upstream data impacts.
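To make the "Python & SQL for data validation" requirement concrete, here is a small, runnable sketch of a SQL completeness check of the kind a Collibra data quality rule might encode. An in-memory SQLite table stands in for the governed source; the table and column names are illustrative only.

```python
"""Sketch of a SQL-based completeness check feeding a DQ score."""
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customer (id INTEGER, email TEXT);
    INSERT INTO customer VALUES (1, 'a@x.com'), (2, NULL), (3, 'c@x.com');
""")

total, missing = conn.execute("""
    SELECT COUNT(*) AS total,
           SUM(CASE WHEN email IS NULL OR email = '' THEN 1 ELSE 0 END) AS missing
    FROM customer
""").fetchone()

completeness = 100.0 * (total - missing) / total
print(f"email completeness: {completeness:.1f}%")   # prints 66.7% for this sample
```

In practice the score would be computed against the source database and attached to the relevant asset or DQ rule rather than printed.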
Posted 1 month ago
0 years
0 Lacs
Andhra Pradesh
Work from Office
Design and maintain scalable data architectures supporting analytics, reporting, and data science use cases. Develop and optimize complex SQL queries and Spark jobs for data processing and transformation. Architect and implement data pipelines integrating structured and semi-structured data from diverse sources. Collaborate with BI teams to deliver insightful dashboards and visualizations using Tableau and Power BI. Define data governance practices, including data lineage, cataloging, and quality checks. Guide data modeling efforts (3NF, dimensional, star/snowflake) for warehouses and analytical layers. Enable data integration and deployment on cloud platforms such as Snowflake, BigQuery, or Redshift. Required Skills: Strong hands-on experience in SQL, Python, and distributed data processing using Apache Spark. Proven ability to deliver business insights through data analysis and visualizations using Tableau and Power BI. Proficiency in data modeling techniques and building scalable ETL pipelines. Experience with cloud-based data platforms (Snowflake, BigQuery, Redshift) and associated services. Solid understanding of data governance, metadata management, and data cataloging tools (e.g., Collibra, Alation). Knowledge of best practices in data privacy, security, and compliance frameworks (GDPR, HIPAA, etc.). About Virtusa Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 27,000 people globally that cares about your growth — one that seeks to provide you with exciting projects, opportunities and work with state of the art technologies throughout your career with us. Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence. Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.
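As a concrete illustration of the Spark-based transformation work mentioned in this listing, here is a small PySpark job sketch: read raw events, aggregate, and write a conformed table. The storage paths and column names are hypothetical, and the snippet assumes a Spark runtime is available (e.g., `pip install pyspark`).

```python
"""Illustrative PySpark rollup job: raw sales events -> daily aggregates."""
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily_sales_rollup").getOrCreate()

# Placeholder source path; in practice this would be the raw zone of a lake.
raw = spark.read.parquet("s3://example-bucket/raw/sales/")

daily = (raw
         .withColumn("sale_date", F.to_date("sale_ts"))
         .groupBy("sale_date", "store_id")
         .agg(F.sum("amount").alias("total_amount"),
              F.countDistinct("order_id").alias("order_count")))

# Placeholder target path for the curated/analytical layer.
daily.write.mode("overwrite").partitionBy("sale_date") \
     .parquet("s3://example-bucket/curated/daily_sales/")
```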
Posted 1 month ago
0 - 14 years
0 Lacs
Bengaluru, Karnataka
Work from Office
Job Summary
We are seeking an experienced Architect with 10 to 14 years of experience in Data Management Functional, Collibra Data Intelligence Cloud, DQG Architect, and Data Catalog. The ideal candidate will have domain expertise in Travel Intermediaries. This hybrid role requires a candidate who can work effectively in a day shift. No travel is required. The Architect will play a crucial role in shaping our data management strategies and ensuring the integrity and accessibility of our data assets.

Responsibilities
- Lead the design and implementation of data management solutions to support business objectives.
- Oversee the integration of Collibra Data Intelligence Cloud within the existing data infrastructure.
- Provide expertise in DQG Architect to ensure data quality and governance across all data assets.
- Develop and maintain a comprehensive data catalog to enhance data discoverability and usability.
- Collaborate with cross-functional teams to understand data requirements and deliver appropriate solutions.
- Ensure compliance with data governance policies and standards within the Travel Intermediaries domain.
- Conduct regular assessments of data management processes to identify areas for improvement.
- Implement best practices for data management and governance to enhance data integrity and security.
- Monitor and optimize the performance of data management systems to ensure efficient operation.
- Provide technical guidance and support to team members on data management and governance issues.
- Develop and deliver training programs to enhance data management skills within the organization.
- Stay updated with the latest trends and technologies in data management and governance.
- Contribute to the development of data management strategies to support the company's goals.

Qualifications
- Must have extensive experience in Data Management Functional.
- Should possess strong expertise in Collibra Data Intelligence Cloud.
- Must have experience as a DQG Architect.
- Should be proficient in developing and maintaining data catalogs.
- Must have domain expertise in Travel Intermediaries.
- Should have excellent problem-solving and analytical skills.
- Must have strong communication and collaboration skills.
- Should be able to work effectively in a hybrid work model.
- Must be able to work in a day shift.
- Should have a proactive approach to identifying and addressing data management issues.
- Must be committed to continuous learning and professional development.
- Should be able to work independently and as part of a team.
- Must have a strong understanding of data governance policies and standards.
Posted 1 month ago
10 years
0 Lacs
Hyderabad, Telangana
Work from Office
Job Information
Date Opened: 05/13/2025
Job Type: Full time
Industry: IT Services
City: Hyderabad
State/Province: Telangana
Country: India
Zip/Postal Code: 500059

About Us
About DATAECONOMY: We are a fast-growing data & analytics company headquartered in Dublin with offices in Dublin, OH, Providence, RI, and an advanced technology center in Hyderabad, India. We are clearly differentiated in the data & analytics space via our suite of solutions, accelerators, frameworks, and thought leadership.

Job Description
Job Title: Starburst SME – API Integration
Experience: 5–10 Years
Location: [Hybrid – Hyderabad/Pune]
Employment Type: Full-Time

About the Role: We are seeking a skilled and motivated Starburst SME with hands-on experience in API integrations, distributed SQL engines, and enterprise data platforms. In this role, you will be responsible for designing, optimizing, and integrating Starburst (Trino) into our broader data ecosystem to enable scalable, real-time data access across multiple sources.

Key Responsibilities:
- Act as the subject matter expert (SME) for Starburst and Trino within the organization.
- Design and implement data federation strategies using Starburst across cloud, lake, and warehouse platforms.
- Develop and manage API-based integrations with Starburst for custom applications, BI tools, and automation workflows.
- Optimize distributed queries for performance and cost efficiency across multiple data sources.
- Collaborate with engineering, analytics, and DevOps teams to integrate Starburst into the modern data stack.
- Monitor, troubleshoot, and document performance metrics and usage patterns.
- Stay up to date with Trino and Starburst features, releases, and best practices.

Requirements
Required Skills:
- 5+ years of experience in data engineering or data platform management.
- Strong knowledge of Starburst/Trino, including connectors, coordinator/worker setup, and security.
- Experience working with Starburst REST APIs for query execution, metadata access, and integration.
- Proficient in SQL, query tuning, and working with distributed databases.
- Familiar with data lake and warehouse ecosystems (e.g., Hive, Iceberg, Delta Lake, Snowflake, Redshift, BigQuery).
- Experience with Python or similar scripting languages for API integrations.
- Hands-on with Docker, CI/CD tools, and workflow orchestrators like Airflow or Dagster.

Preferred:
- Experience with data catalogs (Alation, Collibra) and RBAC implementation.
- Familiarity with monitoring tools like Prometheus/Grafana for Starburst.
- Understanding of BI integration (Looker, Tableau, Superset) using Starburst.

Benefits
What We Offer: Opportunity to work with cutting-edge data technologies. A dynamic, fast-paced environment with cross-functional collaboration. Competitive compensation and growth path in a data-driven organization.
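To illustrate the kind of programmatic query execution this role involves, here is a hedged sketch using the open-source `trino` Python client (`pip install trino`) against a Starburst/Trino coordinator. The host, catalog, schema, table, and user are placeholders; Starburst deployments may require different authentication (LDAP, OAuth, etc.) than shown.

```python
"""Sketch of federated query execution against Starburst/Trino."""
import trino

conn = trino.dbapi.connect(
    host="starburst.example.com",   # hypothetical coordinator endpoint
    port=443,
    user="analyst",                 # placeholder user
    catalog="hive",                 # placeholder catalog
    schema="sales",                 # placeholder schema
    http_scheme="https",
)

cur = conn.cursor()
cur.execute("""
    SELECT region, COUNT(*) AS orders
    FROM orders
    WHERE order_date >= DATE '2024-01-01'
    GROUP BY region
""")

for region, orders in cur.fetchall():
    print(region, orders)
```

The same query could span multiple catalogs (e.g., a lake table joined to a warehouse table), which is the federation pattern the listing refers to.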
Posted 1 month ago
3 - 5 years
0 Lacs
Greater Kolkata Area
Hybrid
You Lead the Way. We’ve Got Your Back. With the right backing, people and businesses have the power to progress in incredible ways. When you join Team Amex, you become part of a global and diverse community of colleagues with an unwavering commitment to back our customers, communities and each other. Here, you’ll learn and grow as we help you create a career journey that’s unique and meaningful to you with benefits, programs, and flexibility that support you personally and professionally. At American Express, you’ll be recognized for your contributions, leadership, and impact—every colleague has the opportunity to share in the company’s success. Together, we’ll win as a team, striving to uphold our company values and powerful backing promise to provide the world’s best customer experience every day. And we’ll do it with the utmost integrity, and in an environment where everyone is seen, heard and feels like they belong. Join Team Amex and let’s lead the way together.

How will you make an impact in this role? Responsible for contacting clients with overdue accounts to secure the settlement of the account. They also do preventive work to avoid future overdues on accounts that have a high exposure.

How we serve our customers is constantly evolving and is a challenge we gladly accept. Whether you’re finding new ways to prevent identity fraud or enabling customers to start a new business, you can work with one of the most valuable data sets in the world to identify insights and actions that can have a meaningful impact on our customers and our business. And, with opportunities to learn from leaders who have defined the course of our industry, you can grow your career and define your own path. Find your place in risk and analytics on #TeamAmex.

We are focused on providing the best customer experience every day through a differentiated set of products and services. With our mix of assets like rewards, benefits, and members-only perks, we are re-imagining how commerce and experiences converge in a more modern, digital, and connected world. Enterprise Data Office & Platforms (EDGP) is part of the larger Enterprise Digital and Data Solutions (EDDS) organization. EDGP improves the customer experience and drives business growth through robust enterprise-wide data policies and governance and by enabling a data-driven culture, while developing digital and data platforms that provide insightful customer relationships and allow users to leverage enterprise-wide data capabilities.
The American Express Data Office vision is to realize the potential of our data assets to power the world’s best customer experience. Our mission is to provide the principles, policies, processes, architecture and governance to enable consistent, trusted and usable data to drive accurate data-driven business outcomes. The Enterprise Data Office (EDO) team is part of the Data Office and plays a critical role partnering across the company to deliver on a multi-year roadmap to bring American Express data under governance. The American Express Data Office is a global organization that sits within Enterprise Digital & Data (EDDS) and is American Express’ data center of excellence. The mission of the Data Office is to realize the potential of our data assets to power the world’s best customer experience. The Data Office provides the principles, policies, processes, architecture, and governance to enable consistent, trusted, and usable data to drive accurate data-driven business outcomes. Functional Responsibilities: The Senior Analyst, Enterprise Data Office India Data Activation role will support the activation of Critical data elements (as outlined in AEMP 70 Policy) across business units at American Express. This role will also provide support in training, communications, and change management related to data governance across the enterprise. The Senior Analyst will be responsible for: Data Governance Ownership: Lead the development, implementation, and maintenance of data governance frameworks, policies, and procedures to ensure data integrity, quality, and compliance across the organization. Project Management: Manage data governance projects from inception to completion, including planning, execution, monitoring, and reporting, ensuring alignment with organizational goals and timelines. Risk Data Expertise: Serve as the subject matter expert on risk data management, identifying potential data risks, and implementing mitigation strategies to safeguard the organization's data assets. Collaboration and Communication: Work closely with data stewards, custodians, and various stakeholders to promote data governance awareness, provide training, and ensure adherence to established data management practices. Data Quality Monitoring: Establish and monitor data quality metrics, facilitating the resolution of data quality issues, and implementing continuous improvement initiatives. Regulatory Compliance: Ensure that data governance practices comply with relevant regulations and standards, staying abreast of changes in data protection laws and industry best practices. Tool Implementation: Collaborate with IT teams to implement and manage data governance tools and technologies, such as Collibra, to support data governance activities. Mandatory Skills Required: Expertise in Collibra. Project management skills. Good understanding of Lumi platform. Exposure to RCSA (Risk Control Self-Assessment). Power App Developer Experience. Required Qualifications: Educational Background: Bachelor’s degree in Information Management, Computer Science, Business Administration, or a related field. A Master’s degree is a plus. Experience: Minimum of 3-5 years of experience in data governance, data management, or related roles, preferably within the financial services industry. Project Management Skills: Proven ability to manage multiple projects simultaneously, with strong organizational and time-management skills. Project Management Professional (PMP) certification is desirable. 
Data Governance Expertise: In-depth understanding of data governance principles, including metadata management, data lineage, data quality, and data stewardship. Familiarity with the Data Management Body of Knowledge (DMBOK) is advantageous.
Risk Management Knowledge: Strong knowledge of risk data management practices, including identifying, assessing, and mitigating data-related risks.
Technical Proficiency: Experience with data governance tools (e.g., Collibra) and data visualization tools (e.g., Tableau, Power BI). Proficiency in SQL and familiarity with database structures are beneficial.
Communication Skills: Excellent verbal and written communication skills, with the ability to convey complex data concepts to non-technical stakeholders and influence across all levels of the organization.
Analytical Skills: Strong analytical and problem-solving abilities, with a keen attention to detail.
Adaptability: Ability to work independently and take initiative with minimal supervision, demonstrating flexibility and resilience in a dynamic environment.

Preferred Qualifications:
Certifications: Certified Data Management Professional (CDMP) or equivalent certification.
Technical Skills: Experience with data integration tools and familiarity with cloud platforms (e.g., AWS, Lumi, Cornerstone).
Industry Knowledge: Understanding of the financial services industry's regulatory environment and data-related compliance requirements.

We back our colleagues and their loved ones with benefits and programs that support their holistic well-being. That means we prioritize their physical, financial, and mental health through each stage of life. Benefits include:
- Competitive base salaries
- Bonus incentives
- Support for financial well-being and retirement
- Comprehensive medical, dental, vision, life insurance, and disability benefits (depending on location)
- Flexible working model with hybrid, onsite or virtual arrangements depending on role and business need
- Generous paid parental leave policies (depending on your location)
- Free access to global on-site wellness centers staffed with nurses and doctors (depending on location)
- Free and confidential counseling support through our Healthy Minds program
- Career development and training opportunities

American Express is an equal opportunity employer and makes employment decisions without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status, disability status, age, or any other status protected by law. Offer of employment with American Express is conditioned upon the successful completion of a background verification check, subject to applicable laws and regulations.
Posted 1 month ago
5 - 8 years
0 Lacs
Greater Kolkata Area
On-site
Key Responsibilities
Define, implement, and maintain data standards, policies, and procedures. Ensure data accuracy, consistency, completeness, and integrity across systems. Collaborate with data owners, data architects, and business analysts to resolve data issues and align data across platforms. Monitor and report on data quality metrics and KPIs. Support master data management (MDM) and metadata management efforts. Participate in data governance forums and contribute to documentation and data lineage. Ensure compliance with data privacy regulations and internal policies. Assist in the onboarding of new data sources, ensuring proper classification and cataloging.

Required Skills & Qualifications
Bachelor's degree in Information Systems, Computer Science, Data Management, or a related field. 5+ years of experience in data stewardship, data governance, or data quality roles. Strong knowledge of data governance frameworks, MDM, and data catalog tools (e.g., Collibra, Informatica, Alation). Familiarity with data privacy regulations (GDPR, HIPAA, etc.). Excellent communication and stakeholder management skills. Proficiency in SQL and experience working with data in cloud platforms (AWS, Azure, GCP) is a plus.
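As a small illustration of the SQL-driven stewardship work described above, here is a runnable sketch of a duplicate-detection check a data steward might run before an MDM merge. SQLite stands in for the source system, and the matching key (normalised name plus postcode) is purely illustrative.

```python
"""Sketch of a duplicate-detection check ahead of an MDM merge."""
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE supplier (id INTEGER, name TEXT, postcode TEXT);
    INSERT INTO supplier VALUES
        (1, 'Acme Ltd',  '560001'),
        (2, 'ACME LTD ', '560001'),
        (3, 'Globex',    '110001');
""")

dupes = conn.execute("""
    SELECT LOWER(TRIM(name)) AS match_key, postcode, COUNT(*) AS records
    FROM supplier
    GROUP BY LOWER(TRIM(name)), postcode
    HAVING COUNT(*) > 1
""").fetchall()

for key, postcode, records in dupes:
    print(f"potential duplicate: {key} / {postcode} ({records} records)")
```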
Posted 1 month ago
5 - 8 years
0 Lacs
Kanayannur, Kerala, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

The opportunity
The objective of our Digital Risk Consulting service is to support clients with the development, implementation, improvement, and modernization of their technology risk and compliance programs to address the constantly changing risk and technology landscape. Our solutions can be used by our clients to build confidence and trust with their customers, the overall market, and when required by regulation or contract.

Your Key Responsibilities
You will operate as a team leader for engagements to help our clients develop and strengthen their IT risk and compliance programs. You will work directly with clients to review their IT processes and controls, remediate and implement controls, onboard new tools and services into risk and compliance frameworks, and assist with the readiness and adherence for new compliance regulations. Your responsibilities include both in-person and remote oversight and coaching of engagement team members, reporting to both senior engagement team members and client leadership, as well as partnering with our key client contacts to complete the engagement work.

What You'll Do
- Designing and implementing solutions to various data-related technical/compliance challenges such as DevSecOps, data strategy, data governance, data risks & relevant controls, data testing, data architecture, data platforms, data solution implementation, data quality and data security to manage and mitigate risk.
- Leveraging data analytics tools/software to build robust and scalable solutions through data analysis and data visualizations using SQL, Python and visualization tools.
- Design and implement comprehensive data analytics strategies to support business decision-making.
- Collect, clean, and interpret large datasets from multiple sources, ensuring completeness, accuracy and integrity of data.
- Integrating and/or piloting next-generation technologies such as cloud platforms, machine learning and Generative AI (GenAI).
- Developing custom scripts and algorithms to automate data processing and analysis to generate insights.
- Applying business/domain knowledge, including regulatory requirements and industry standards, to solve complex data-related challenges.
- Analyzing data to uncover trends and generate insights that can inform business decisions.
- Build and maintain relationships across Engineering, Product, Operations, Internal Audit, external audit and other external stakeholders to drive effective financial risk management.
- Work with DevSecOps, Security Assurance, Engineering, and Product teams to improve efficiency of control environments and provide risk management through implementation of automation and process improvement.
- Bridge gaps between IT controls and business controls, including ITGCs and automated business controls. Work with IA to ensure the complete control environment is managed.
- Work with emerging products to understand the risk profile and ensure an appropriate control environment is established.
- Implement new processes and controls in response to changes to the business environment, such as new product introduction, changes in accounting standards, internal process changes or reorganization.
What You'll Need
- Experience in data architecture, data management, data engineering, data science or data analytics.
- Experience in building analytical queries and dashboards using SQL, NoSQL, Python, etc.
- Proficient in SQL and quantitative analysis: you can deep dive into large amounts of data, draw meaningful insights, dissect business issues and draw actionable conclusions.
- Knowledge of tools in the following areas:
  - Scripting and Programming (e.g., Python, SQL, R, Java, Scala, etc.)
  - Big Data Tools (e.g., Hadoop, Hive, Pig, Impala, Mahout, etc.)
  - Data Management (e.g., Informatica, Collibra, SAP, Oracle, IBM, etc.)
  - Predictive Analytics (e.g., Python, IBM SPSS, SAS Enterprise Miner, RPL, Matl, etc.)
  - Data Visualization (e.g., Tableau, Power BI, TIBCO Spotfire, QlikView, SPSS, etc.)
  - Data Mining (e.g., Microsoft SQL Server, etc.)
  - Cloud Platforms (e.g., AWS, Azure, or Google Cloud)
- Ability to analyze complex processes to identify potential financial, operational, systems and compliance risks across major finance cycles.
- Ability to assist management with the integration of security practices in the product development lifecycle (DevSecOps).
- Experience with homegrown applications in a microservices/DevOps environment.
- Experience with identifying potential security risks in platform environments and developing strategies to mitigate them.
- Experience with SOX readiness assessments and control implementation.
- Knowledge of DevOps practices, CI/CD pipelines, code management and automation tools (e.g., Jenkins, Git, Phab, Artifactory, SonarQube, Selenium, Fortify, Acunetix, Prisma Cloud).

Preferred: Experience in:
- Managing technical data projects
- Leveraging data analytics tools/software to develop solutions and scripts
- Developing statistical model tools and techniques
- Developing and executing data governance frameworks or operating models
- Identifying data risks and designing and/or implementing appropriate controls
- Implementation of data quality processes
- Developing data services and solutions in a cloud environment
- Designing data architecture
- Analyzing complex data sets & communicating findings effectively
- Process management experience, including process redesign and optimization
- Experience in scripting languages (e.g., Python, Bash)
- Experience in cloud platforms (e.g., AWS, Azure, GCP) and securing cloud-based applications/services

To qualify for the role, you must have
- A bachelor's or master's degree.
- A minimum of 3 years of experience working as an IT risk consultant or data analytics experience. Bring your experience in applying relevant technical knowledge in at least one of the following engagements: (a) risk consulting, (b) financial statement audits, (c) internal or operational audits, (d) IT compliance, and/or (e) Service Organization Controls Reporting engagements.
- We would expect you to be available to travel outside of your assigned office location at least 50% of the time, plus commute within the region (where public transportation often is not available). Successful candidates must work in excess of standard hours when necessary. A valid passport is required.

Ideally, you'll also have
- A bachelor's or master's degree in business, computer science, information systems, informatics, computer engineering, accounting, or a related discipline.
- CISA, CISSP, CISM, CPA or CA certification is desired; non-certified hires are required to become certified to be eligible for promotion to Manager.
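For readers who want a concrete picture of the "collect, clean, and interpret large datasets" work above, here is a minimal pandas profiling sketch checking completeness and cardinality. The inline frame is a stand-in; in practice it would typically come from `pandas.read_sql` against the client's source system, and the column names are illustrative.

```python
"""Illustrative completeness/cardinality profile of a small dataset."""
import pandas as pd

df = pd.DataFrame({
    "account_id": [101, 102, 103, 104],
    "balance":    [2500.0, None, 1200.5, 980.0],
    "segment":    ["retail", "retail", None, "corporate"],
})

profile = pd.DataFrame({
    "non_null_pct":    df.notna().mean() * 100,   # completeness per column
    "distinct_values": df.nunique(dropna=True),   # cardinality per column
})
print(profile)
```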
EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 1 month ago
5 - 8 years
0 Lacs
Kolkata, West Bengal, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

The opportunity
The objective of our Digital Risk Consulting service is to support clients with the development, implementation, improvement, and modernization of their technology risk and compliance programs to address the constantly changing risk and technology landscape. Our solutions can be used by our clients to build confidence and trust with their customers, the overall market, and when required by regulation or contract.

Your Key Responsibilities
You will operate as a team leader for engagements to help our clients develop and strengthen their IT risk and compliance programs. You will work directly with clients to review their IT processes and controls, remediate and implement controls, onboard new tools and services into risk and compliance frameworks, and assist with the readiness and adherence for new compliance regulations. Your responsibilities include both in-person and remote oversight and coaching of engagement team members, reporting to both senior engagement team members and client leadership, as well as partnering with our key client contacts to complete the engagement work.

What You'll Do
- Designing and implementing solutions to various data-related technical/compliance challenges such as DevSecOps, data strategy, data governance, data risks & relevant controls, data testing, data architecture, data platforms, data solution implementation, data quality and data security to manage and mitigate risk.
- Leveraging data analytics tools/software to build robust and scalable solutions through data analysis and data visualizations using SQL, Python and visualization tools.
- Design and implement comprehensive data analytics strategies to support business decision-making.
- Collect, clean, and interpret large datasets from multiple sources, ensuring completeness, accuracy and integrity of data.
- Integrating and/or piloting next-generation technologies such as cloud platforms, machine learning and Generative AI (GenAI).
- Developing custom scripts and algorithms to automate data processing and analysis to generate insights.
- Applying business/domain knowledge, including regulatory requirements and industry standards, to solve complex data-related challenges.
- Analyzing data to uncover trends and generate insights that can inform business decisions.
- Build and maintain relationships across Engineering, Product, Operations, Internal Audit, external audit and other external stakeholders to drive effective financial risk management.
- Work with DevSecOps, Security Assurance, Engineering, and Product teams to improve efficiency of control environments and provide risk management through implementation of automation and process improvement.
- Bridge gaps between IT controls and business controls, including ITGCs and automated business controls. Work with IA to ensure the complete control environment is managed.
- Work with emerging products to understand the risk profile and ensure an appropriate control environment is established.
- Implement new processes and controls in response to changes to the business environment, such as new product introduction, changes in accounting standards, internal process changes or reorganization.
What You'll Need
- Experience in data architecture, data management, data engineering, data science or data analytics.
- Experience in building analytical queries and dashboards using SQL, NoSQL, Python, etc.
- Proficient in SQL and quantitative analysis: you can deep dive into large amounts of data, draw meaningful insights, dissect business issues and draw actionable conclusions.
- Knowledge of tools in the following areas:
  - Scripting and Programming (e.g., Python, SQL, R, Java, Scala, etc.)
  - Big Data Tools (e.g., Hadoop, Hive, Pig, Impala, Mahout, etc.)
  - Data Management (e.g., Informatica, Collibra, SAP, Oracle, IBM, etc.)
  - Predictive Analytics (e.g., Python, IBM SPSS, SAS Enterprise Miner, RPL, Matl, etc.)
  - Data Visualization (e.g., Tableau, Power BI, TIBCO Spotfire, QlikView, SPSS, etc.)
  - Data Mining (e.g., Microsoft SQL Server, etc.)
  - Cloud Platforms (e.g., AWS, Azure, or Google Cloud)
- Ability to analyze complex processes to identify potential financial, operational, systems and compliance risks across major finance cycles.
- Ability to assist management with the integration of security practices in the product development lifecycle (DevSecOps).
- Experience with homegrown applications in a microservices/DevOps environment.
- Experience with identifying potential security risks in platform environments and developing strategies to mitigate them.
- Experience with SOX readiness assessments and control implementation.
- Knowledge of DevOps practices, CI/CD pipelines, code management and automation tools (e.g., Jenkins, Git, Phab, Artifactory, SonarQube, Selenium, Fortify, Acunetix, Prisma Cloud).

Preferred: Experience in:
- Managing technical data projects
- Leveraging data analytics tools/software to develop solutions and scripts
- Developing statistical model tools and techniques
- Developing and executing data governance frameworks or operating models
- Identifying data risks and designing and/or implementing appropriate controls
- Implementation of data quality processes
- Developing data services and solutions in a cloud environment
- Designing data architecture
- Analyzing complex data sets & communicating findings effectively
- Process management experience, including process redesign and optimization
- Experience in scripting languages (e.g., Python, Bash)
- Experience in cloud platforms (e.g., AWS, Azure, GCP) and securing cloud-based applications/services

To qualify for the role, you must have
- A bachelor's or master's degree.
- A minimum of 3 years of experience working as an IT risk consultant or data analytics experience. Bring your experience in applying relevant technical knowledge in at least one of the following engagements: (a) risk consulting, (b) financial statement audits, (c) internal or operational audits, (d) IT compliance, and/or (e) Service Organization Controls Reporting engagements.
- We would expect you to be available to travel outside of your assigned office location at least 50% of the time, plus commute within the region (where public transportation often is not available). Successful candidates must work in excess of standard hours when necessary. A valid passport is required.

Ideally, you'll also have
- A bachelor's or master's degree in business, computer science, information systems, informatics, computer engineering, accounting, or a related discipline.
- CISA, CISSP, CISM, CPA or CA certification is desired; non-certified hires are required to become certified to be eligible for promotion to Manager.
EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 1 month ago
18 - 22 years
0 Lacs
Hyderabad, Telangana, India
Hybrid
DATAECONOMY is one of the fastest-growing Data & AI companies with a global presence. We are well-differentiated and are known for our thought leadership, out-of-the-box products, cutting-edge solutions, accelerators, innovative use cases, and cost-effective service offerings. We offer products and solutions in Cloud, Data Engineering, Data Governance, AI/ML, DevOps and Blockchain to large corporates across the globe. Strategic partners with AWS, Collibra, Cloudera, Neo4j, DataRobot, Global IDs, Tableau, MuleSoft and Talend.

Job Title: Delivery Head
Experience: 18 - 22 Years
Location: Hyderabad
Notice Period: Immediate joiners are preferred

Job Summary:
We are seeking a seasoned Technical Delivery Manager with deep expertise in Data Engineering and Data Science to lead complex data initiatives and drive successful delivery across cross-functional teams. The ideal candidate brings a blend of strategic thinking, technical leadership, and project execution skills, along with hands-on knowledge of modern data platforms, machine learning, and analytics frameworks.

Key Responsibilities:
Program & Delivery Management
- Oversee end-to-end delivery of large-scale data programs, ensuring alignment with business goals, timelines, and quality standards.
- Manage cross-functional project teams including data engineers, data scientists, analysts, and DevOps personnel.
- Ensure agile delivery through structured sprint planning, backlog grooming, and iterative delivery.
Technical Leadership
- Provide architectural guidance and review of data engineering pipelines and machine learning models.
- Evaluate and recommend modern data platforms (e.g., Snowflake, Databricks, Azure Data Services, AWS Redshift, GCP BigQuery).
- Ensure best practices in data governance, quality, and compliance (e.g., GDPR, HIPAA).
Stakeholder & Client Management
- Act as the primary point of contact for technical discussions with clients, business stakeholders, and executive leadership.
- Translate complex data requirements into actionable project plans.
- Present technical roadmaps and delivery status to stakeholders and C-level executives.
Team Development & Mentoring
- Lead, mentor, and grow a high-performing team of data professionals.
- Conduct code and design reviews; promote innovation and continuous improvement.

Key Skills and Qualifications:
- Bachelor's or master's degree in Computer Science, Data Science, Engineering, or a related field.
- 18–22 years of total IT experience with at least 8–10 years in data engineering, analytics, or data science.
- Proven experience delivering enterprise-scale data platforms, including:
  - ETL/ELT pipelines using tools like Apache Spark, Airflow, Kafka, Talend, or Informatica.
  - Data warehouse and lake architectures (e.g., Snowflake, Azure Synapse, AWS Redshift, Delta Lake).
  - Machine learning lifecycle management (e.g., model training, deployment, MLOps using MLflow, SageMaker, or Vertex AI).
- Strong knowledge of cloud platforms (Azure, AWS, or GCP).
- Deep understanding of Agile, Scrum, and DevOps principles.
- Excellent problem-solving, communication, and leadership skills.

Preferred Certifications (Optional but Beneficial):
- PMP, SAFe Agile, or similar project management certifications.
- Certifications in cloud platforms (e.g., AWS Certified Data Analytics, Azure Data Engineer Associate).
- Certified Scrum Master (CSM) or equivalent.
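To give a flavour of the ETL/ELT orchestration mentioned in the qualifications, here is a minimal Airflow DAG sketch with stubbed extract, transform, and load tasks. It assumes Airflow 2.x; the DAG ID, schedule, and task bodies are placeholders rather than a real pipeline.

```python
"""Minimal Airflow 2.x DAG sketch: extract -> transform -> load (stubs)."""
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**_):
    print("pull increments from source system")      # stub


def transform(**_):
    print("apply business rules / conformance")       # stub


def load(**_):
    print("merge into warehouse target")              # stub


with DAG(
    dag_id="orders_daily_pipeline",       # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    t_extract >> t_transform >> t_load   # linear dependency chain
```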
Posted 1 month ago
8 - 10 years
25 - 27 Lacs
Noida
Work from Office
Experience in designing and implementing business process workflows using Collibra Workflow Designer. Understanding of Collibra Data Governance Center (DGC) and its modules, including Data Catalog, Business Glossary, and Policy Manager. Experience in metadata harvesting, lineage tracking, and governance to improve data visibility. Proficiency in using Collibra REST APIs for workflow automation, data exchange, and custom integrations with other tools. Familiarity with Collibra Data Quality & Observability, setting up data quality rules and configuring DQ workflows. Familiarity with Groovy & Java for developing custom workflows and scripts within Collibra. Ability to write Python & SQL for data validation, integration scripts, and automation. Understanding of ETL processes and integrating Collibra with cloud/on-prem databases. Familiarity with data governance frameworks (e.g., GDPR, CCPA, HIPAA) and best practices. Experience in managing technical and business metadata effectively. Ability to track data lineage and assess downstream/upstream data impacts.
Posted 1 month ago
0 years
0 Lacs
Gurgaon, Haryana, India
Remote
Overview
We are seeking an experienced Data Architect with extensive expertise in designing and implementing modern data architectures. This role requires strong software engineering principles, hands-on coding abilities, and experience building data engineering frameworks. The ideal candidate will have a proven track record of implementing Databricks-based solutions in the healthcare industry, with expertise in data catalog implementation and governance frameworks.

About The Role
As a Data Architect, you will be responsible for designing and implementing scalable, secure, and efficient data architectures on the Databricks platform. You will lead the technical design of data migration initiatives from legacy systems to modern Lakehouse architecture, ensuring alignment with business requirements, industry best practices, and regulatory compliance.

Key Responsibilities
- Design and implement modern data architectures using Databricks Lakehouse platform
- Lead the technical design of Data Warehouse/Data Lake migration initiatives from legacy systems
- Develop data engineering frameworks and reusable components to accelerate delivery
- Establish CI/CD pipelines and infrastructure-as-code practices for data solutions
- Implement data catalog solutions and governance frameworks
- Create technical specifications and architecture documentation
- Provide technical leadership to data engineering teams
- Collaborate with cross-functional teams to ensure alignment of data solutions
- Evaluate and recommend technologies, tools, and approaches for data initiatives
- Ensure data architectures meet security, compliance, and performance requirements
- Mentor junior team members on data architecture best practices
- Stay current with emerging technologies and industry trends

Qualifications
- Extensive experience in data architecture design and implementation
- Strong software engineering background with expertise in Python or Scala
- Proven experience building data engineering frameworks and reusable components
- Experience implementing CI/CD pipelines for data solutions
- Expertise in infrastructure-as-code and automation
- Experience implementing data catalog solutions and governance frameworks
- Deep understanding of Databricks platform and Lakehouse architecture
- Experience migrating workloads from legacy systems to modern data platforms
- Strong knowledge of healthcare data requirements and regulations
- Experience with cloud platforms (AWS, Azure, GCP) and their data services
- Bachelor's degree in Computer Science, Information Systems, or related field; advanced degree preferred

Technical Skills
- Programming languages: Python and/or Scala (required)
- Data processing frameworks: Apache Spark, Delta Lake
- CI/CD tools: Jenkins, GitHub Actions, Azure DevOps
- Infrastructure-as-code (optional): Terraform, CloudFormation, Pulumi
- Data catalog tools: Databricks Unity Catalog, Collibra, Alation
- Data governance frameworks and methodologies
- Data modeling and design patterns
- API design and development
- Cloud platforms: AWS, Azure, GCP
- Container technologies: Docker, Kubernetes
- Version control systems: Git
- SQL and NoSQL databases
- Data quality and testing frameworks

Optional - Healthcare Industry Knowledge
- Healthcare data standards (HL7, FHIR, etc.)
- Clinical and operational data models
- Healthcare interoperability requirements
- Healthcare analytics use cases
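As an illustration of the Lakehouse pattern this role centres on, here is a hedged sketch of landing a curated Delta table under Unity Catalog. It assumes a Databricks runtime where the `spark` session is provided and Unity Catalog is enabled; the source path and the catalog/schema/table names are placeholders.

```python
"""Sketch of a curated Delta table load on Databricks (Unity Catalog)."""
from pyspark.sql import functions as F

# `spark` is the session Databricks provides in notebooks/jobs.
raw = spark.read.format("delta").load("/Volumes/raw/claims/landing")  # placeholder path

curated = (raw
           .dropDuplicates(["claim_id"])                    # illustrative key
           .withColumn("ingested_at", F.current_timestamp()))

(curated.write
        .format("delta")
        .mode("overwrite")
        .saveAsTable("main.healthcare.claims_curated"))     # catalog.schema.table placeholder
```

Registering the table through `saveAsTable` keeps it governed by Unity Catalog, which is also where a catalog tool such as Collibra or Alation would typically harvest its metadata from.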
Posted 1 month ago
8 - 10 years
25 - 30 Lacs
Noida
Work from Office
Experience in designing and implementing business process workflows using Collibra Workflow Designer. Understanding of Collibra Data Governance Center (DGC) and its modules, including Data Catalog, Business Glossary, and Policy Manager. Experience in metadata harvesting, lineage tracking, and governance to improve data visibility. Proficiency in using Collibra REST APIs for workflow automation, data exchange, and custom integrations with other tools. Familiarity with Collibra Data Quality & Observability, setting up data quality rules and configuring DQ workflows. Familiarity with Groovy & Java for developing custom workflows and scripts within Collibra. Ability to write Python & SQL for data validation, integration scripts, and automation. Understanding of ETL processes and integrating Collibra with cloud/on-prem databases. Familiarity with data governance frameworks (e.g., GDPR, CCPA, HIPAA) and best practices. Experience in managing technical and business metadata effectively. Ability to track data lineage and assess downstream/upstream data impacts.
Posted 1 month ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Key Accountabilities
- Investigate, troubleshoot, and resolve data-related production issues.
- Provide timely reporting on data quality metrics and trends.
- Document and maintain support procedures for data quality processes.
- Collaborate with IT and business teams to implement data quality improvements.
- Ensure data validation and reconciliation processes are followed.
- Engage with stakeholders to establish procedures for data validation and quality metrics.
- Track data issues using incident tickets and ensure timely resolution, or escalate issues for immediate attention if not resolved.
- Maintain and update production support dashboards (Microsoft Power BI) to ensure accuracy and meet monitoring requirements.
- Develop Data Quality health reports for stakeholders to monitor and observe data reliability across the platform.
- Create and maintain documentation, procedures, and best practices for data governance and related processes.
- Provide training to users on tools to promote awareness and adherence.
- Collaborate with data owners and data stewards to ensure data governance is implemented and followed.
- Work with the vendor to coordinate and resolve technical platform issues.
- Deliver consistent, accurate and high-quality work while communicating findings and insights in a clear manner.

Experience / Qualifications
- At least 4 years of hands-on experience with a data quality tool (Collibra is preferred), Databricks and Microsoft Power BI.
- Strong technical skills in data and database management, with proficiency in data wrangling, analytics, and transformation using Python and SQL.
- Asset management experience will be beneficial to understand and recommend the required data quality rules and remediation plan to the stakeholders.

Other Attributes
- Curious, analytical, and able to think critically to solve problems.
- Detail-oriented and comfortable dealing with complex structured and unstructured datasets.
- Customer-centric, striving to deliver value by effectively and proactively engaging stakeholders.
- Clear and effective communication skills, with an ability to communicate complex ideas and manage stakeholder expectations.
- Strong organisational and prioritisation skills; adaptable and able to work independently as required.
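To make the "data quality health reports" accountability concrete, here is a small sketch that computes null rates per column, compares them against tolerance thresholds, and emits tidy rows a Power BI dataset could consume. The columns, thresholds, and asset-management flavour of the sample data are illustrative only.

```python
"""Sketch of per-column null-rate checks emitted as dashboard rows."""
from datetime import date

import pandas as pd

df = pd.DataFrame({
    "isin":       ["IE00B4L5Y983", None, "US0378331005"],
    "market_val": [1_000_000.0, 250_000.0, None],
})

THRESHOLDS = {"isin": 0.0, "market_val": 0.05}   # max tolerated null rate (assumed)

rows = []
for col, max_null in THRESHOLDS.items():
    null_rate = df[col].isna().mean()
    rows.append({
        "run_date": date.today().isoformat(),
        "column": col,
        "null_rate": round(null_rate, 3),
        "status": "PASS" if null_rate <= max_null else "FAIL",
    })

print(pd.DataFrame(rows))   # in practice: write to a monitoring table for Power BI
```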
Posted 1 month ago
10 years
0 Lacs
Bengaluru, Karnataka
Work from Office
Wipro Limited (NYSE: WIT, BSE: 507685, NSE: WIPRO) is a leading technology services and consulting company focused on building innovative solutions that address clients’ most complex digital transformation needs. Leveraging our holistic portfolio of capabilities in consulting, design, engineering, and operations, we help clients realize their boldest ambitions and build future-ready, sustainable businesses. With over 230,000 employees and business partners across 65 countries, we deliver on the promise of helping our customers, colleagues, and communities thrive in an ever-changing world. For additional information, visit us at www.wipro.com.

Job Description

Role Purpose
The purpose of the role is overall development of the target architecture through defining the technology roadmap for own business/domain. Recognized as the subject matter expert for a specific domain and provides expert advice and guidance to different business stakeholders.

Do
- Develop architectural applications for new deals/major change requests in existing deals.
- Create enterprise-wide business/domain architecture deliverables (enabling, diagnostic and actionable) focused on the target audience and its issues and opportunities.
- Look for opportunities to use high-level business and operating models (business capability and value chain), combined with or relating to business, people, information, technology and solutions.
- Contribute to the target architecture by developing and maintaining the technology roadmap for the area of expertise and ensuring that the roadmap remains aligned to the business strategy.
- Recognize innovative use of technology for increasing performance measures.
- Work with other IS and business stakeholders to drive the development and adoption of the target architecture for own domain.
- Establish domain-specific standards, near/mid-term strategy, and roadmaps, in adherence to, and in support of, enterprise standards, strategy, and roadmaps.
- Guide a solution from concept to delivery - envision and create solutions that meet requirements.
- Prove the feasibility of a design so that it can ultimately be implemented and supported in the production environment.
- Oversee product/platform engineering, protocol map development, and virtualization as per the business solution requirements.
- Apply architectural and engineering concepts to design a solution that meets operational requirements, such as scalability, maintainability, security, reliability, extensibility, flexibility, availability, and manageability.
- Participate in and lead research and development efforts (proof of concept, prototypes) as a subject matter expert when introducing new technologies, in conjunction with the team and Product Owners.
- Partner with IT and line-of-business functional groups to communicate and clarify business needs, contribute to the development of long-range system plans, and ensure that IT products, services and processes are aligned with line-of-business needs.
- Define high-level migration plans to address the gaps between the current and future state, typically in sync with the IT budgeting or other capital planning processes.
- Provide technology consulting to solution architects, junior staff members, and others who are using or modifying multiple domain technologies within a solution, ensuring the technology operates coherently to meet overall needs.
- Interact with EA, OEMs, and technical leads to define business solutions.
- Depending on the client’s need, create complete RFPs with particular standards and technology stacks.
- Clearly articulate and sell architectural targets, recommendations and reusable patterns, and accordingly propose investment roadmaps.
- Provide solutions for RFPs received from clients and ensure overall design assurance.
- Develop a direction to manage the portfolio of all the business/domain requirements, including systems and shared infrastructure services, in order to better match business outcome objectives.
- Analyse the technology environment, enterprise specifics, and client requirements to set a collaboration solution for big/small data.
- Provide technical leadership to the implementation of custom solutions through thoughtful use of modern technology.
- Define and understand current issues and problems and identify improvements.
- Evaluate and recommend solutions to integrate with the overall technology ecosystem, keeping consistency throughout.
- Understand the root-cause problem in integrating business and product units.
- Validate the solution/prototype from a technology, cost structure and customer differentiation point of view.
- Collaborate with sales and delivery leadership teams to identify future needs and requirements.
- Track industry and application trends and relate these to planning current and future IT needs.
- Understand enterprise requirements and provide solutions for the technical ecosystem.
- Create intellectual property in the form of services, patterns, models and organizational approaches.
- Bring knowledge of automation to applications by embracing Agile and DevOps principles to reduce manual work.
- Be responsible for successfully applying the technology in the domain to solve business problems in a supportable, cost-effective way.
- Analyze the current technology environment to detect critical deficiencies and recommend solutions for improvement. In addition, analyze the technology industry and market trends to determine their potential impact on the enterprise as well as on the enterprise technology architecture.
- Develop a direction to manage the portfolio of to-be solutions, including systems, shared infrastructure services, and applications, in order to better match business outcome objectives.
- Provide a platform where standardized tools, uniform design and techniques are maintained to reduce maintenance costs.
- Seamlessly integrate and advise on new and existing systems to eliminate potential problems, maintain data structure and bring value in terms of development.
- Serve as technical owner and point of contact for domain-specific solutions and provide technical leadership to the design, development and implementation of custom solutions through thoughtful use of modern technology.
- Enable delivery teams by providing optimal delivery solutions/frameworks.
- Build and maintain relationships with executives, technical leaders, product owners, peer architects and other stakeholders to become a trusted advisor.
- Collaborate with enterprise architects for translating business strategy to execution.
- Develop and establish relevant technical, business process and overall support metrics (KPI/SLA) to drive results.
- Bring value in terms of quality in development activities by leveraging cloud-based and scalable infrastructure.
- Identify technical, process, and structural risks and prepare a risk mitigation plan for all projects.
- Ensure quality assurance of all architecture or design decisions and provide technical mitigation support to the delivery teams.
- Recommend tools for reuse and automation for improved productivity and reduced cycle times.
- Lead the development and maintenance of the enterprise framework and related artefacts.
- Develop trust and build effective working relationships through respectful, collaborative engagement across individual product teams.
- Provide architectural oversight of projects; ensure requirements are in alignment with business strategies and the business architecture roadmap/framework.
- Ensure solutions developed across the organization are aligned to enterprise architecture standards and principles, leverage common solutions and services, and meet financial targets (cost and benefits).
- Negotiate, manage and coordinate with client teams to ensure all requirements are met and create an impact with the proposed solution.
- Demonstrate thought leadership with strong technical capability in front of the client to win confidence and act as a trusted advisor.

Mandatory Skills: Collibra.
Experience: >10 years.

Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention. Of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.
Posted 1 month ago
5 - 8 years
25 - 30 Lacs
Noida, Uttar Pradesh, India
On-site
Skills: ETL processes, SQL, data modeling, Python, business intelligence, data quality, DGC, communication skills, data governance, Collibra
Posted 1 month ago
0 years
0 Lacs
Bengaluru, Karnataka
Work from Office
Take a lead role in acquiring, managing, and retaining meaningful relationships that deliver an outstanding experience to our customers. In this role, you will balance your focus on business results by offering options and finding solutions to help our customers with issues.
Job Summary:
As a Payment Lifecycle Associate within JPMorgan Chase, you will play a crucial role in upholding the company's strength and resilience. Your contributions will be instrumental in fostering the firm's growth responsibly, as you anticipate new and emerging risks and apply your expert judgement to tackle real-world challenges impacting our company, customers, and communities. You will be part of a culture that promotes innovative thinking, challenges the status quo, and aims for best-in-class performance.
Job Responsibilities:
Manage the end-to-end onboarding process for new data feeds, including gathering requirements and UAT testing.
Ensure the new feeds meet the necessary standards of accuracy and completeness, and monitor the performance of data feeds post-integration to ensure they continue to function optimally.
Perform data analysis and reconciliations to identify data quality issues and their impact on CAM (a brief reconciliation sketch follows this listing).
Investigate the root causes of data quality issues and work with teams to implement corrective actions.
Manage data issues across CAM by collaborating effectively with CAM service leads and functional leads from LOBs, Tech, and Ops to document data issues and root causes and to communicate remediation plans.
Participate in UAT testing, review test scripts, and create business requirement documentation (BRD).
Generate and present data quality reports, tracking key metrics and providing insights to business stakeholders.
Update and maintain procedure documentation and process flows.
Required Qualifications, Skills and Capabilities:
Excellent written and verbal communication.
Experience building relationships with technology teams to clearly explain requirements, user stories, and functionality.
Understanding of financial products such as equities, fixed income, and payments is required to work closely with external feed providers to onboard data.
Ability to gather, analyze, and interpret large datasets to extract valuable insights that help drive business strategies.
Exemplifies the highest standards of integrity, respects individuals at every level, and can flex style.
Intermediate Excel skills required for data analysis (e.g., use of pivot tables and v-lookups).
Works independently, manages requests from multiple groups, and defines business requirement specifications with users; takes ownership of ensuring the requirement is supported and an end-to-end solution is built.
Ability to think strategically while maintaining strong attention to detail.
Team player with strong interpersonal and networking skills.
Strong organizational skills - ability to take strategic direction and independently develop and manage project plans, multi-task, and deliver against deadlines.
Preferred Qualifications, Skills and Capabilities:
Experience in Cash, Payments, or Finance.
Proficiency in using data quality tools and techniques such as SQL, OWL, Collibra, and others.
Track record of defining and successfully delivering new solutions from concept to launch and rollout.
Problem-solving skills - superior ability to structure and scope complex problems, apply a range of technology/analytical tools, and gain and synthesize insights to develop recommendations.
Helps promote a client/customer-centred organization.
Desire to receive feedback and continuously improve, demonstrating a growth mindset.
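To make the feed-versus-target reconciliation responsibility above concrete, here is a minimal sketch in Python. The connection string, table names (staging_feed, target_positions), and columns (business_date, amount) are hypothetical placeholders, and the tooling (pandas plus SQLAlchemy) is one assumed choice rather than the team's actual stack.

```python
import pandas as pd
from sqlalchemy import create_engine

# Hypothetical connection details and table/column names -- purely illustrative.
engine = create_engine("postgresql://user:password@host:5432/feeds")

# Compare row counts and amount totals between an inbound feed and its target table.
recon_sql = """
SELECT f.business_date,
       COUNT(*)             AS feed_rows,
       SUM(f.amount)        AS feed_amount,
       MAX(t.target_rows)   AS target_rows,
       MAX(t.target_amount) AS target_amount
FROM staging_feed f
JOIN (SELECT business_date,
             COUNT(*)    AS target_rows,
             SUM(amount) AS target_amount
      FROM target_positions
      GROUP BY business_date) t
  ON t.business_date = f.business_date
GROUP BY f.business_date
"""

recon = pd.read_sql(recon_sql, engine)

# Flag any business date where the feed and the target disagree.
breaks = recon[
    (recon["feed_rows"] != recon["target_rows"])
    | (recon["feed_amount"] != recon["target_amount"])
]
print(breaks)
```

In practice the flagged breaks would feed the root-cause investigation and remediation reporting described in the responsibilities above.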
Posted 1 month ago
5 - 8 years
10 - 19 Lacs
Bengaluru
Work from Office
Bangalore - Seeking an individual with 5+ years of experience in data analysis along with hands-on experience in Collibra or a similar metadata management tool.
Your Future Employer - Global insurance broking and risk management firm.
Responsibilities -
Identify and define problems, develop effective solutions, and apply technical skills in data analysis, SQL, programming, statistical analysis, and data visualization.
Work effectively in Agile environments, prioritize tasks, manage time, and meet deadlines while handling multiple projects simultaneously.
Be adaptable and flexible in a fast-paced environment, quickly learn new tools and technologies, and ensure data quality and accuracy in all analyses and documentation.
Requirements -
Skilled in SQL, Python, or R for data analysis.
Agile methodologies and business process understanding: familiarity with agile methodologies and business process modelling to effectively manage and deliver projects.
Requirements management and documentation: proficiency in documenting user stories and managing requirements to ensure clear and actionable project deliverables.
What is in it for you?
A stimulating working environment with equal employment opportunities.
Growth of skills while working with industry leaders and top brands.
A meritocratic culture with great career progression.
Reach us - If you feel that you are the right fit for the role, please share your updated CV at stessy.jimmy@crescendogroup.in
Disclaimer - Crescendo Global specializes in senior to C-level niche recruitment. We are passionate about empowering job seekers and employers with an engaging, memorable job search and leadership hiring experience. Crescendo Global does not discriminate on the basis of race, religion, color, origin, gender, sexual orientation, age, marital status, veteran status, or disability status.
Note: We receive a lot of applications on a daily basis, so it becomes difficult for us to get back to each candidate. Please assume that your profile has not been shortlisted if you don't hear back from us within one week. Your patience is highly appreciated.
Profile Keywords - Data Analysis, SQL, Python, R, Collibra, Metadata Management
Posted 2 months ago
6 - 9 years
8 - 13 Lacs
Chennai
Work from Office
Skills: Proficiency in data quality tools and methodologies. SQL and Snowflake knowledge. ATLAN and Collibra tools. Understanding of data privacy laws and compliance regulations. Ability to work with IT teams to establish best practices for data systems. Strong communication skills for interacting with various stakeholders. At least working knowledge of the ATLAN tool is a must.
Posted 2 months ago
6 - 10 years
18 - 30 Lacs
Bengaluru
Remote
Design, develop, and optimize data pipelines and ETL/ELT workflows on AWS. Experience in data engineering and data pipeline development. AWS services, especially Redshift, Glue, S3, and Athena. Apache Iceberg or similar table formats (such as Delta Lake or Hudi). Required Candidate Profile: Legacy tools like Siebel, Talend, and Informatica. Data governance tools like Atlan, Collibra, or Alation. Data quality checks using Soda or equivalent. Strong SQL, Python, and Spark skills.
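As a minimal illustration of querying an S3-backed table through Athena, the sketch below uses boto3. The region, database name (analytics), table (orders), and results bucket are hypothetical placeholders, and a real pipeline would typically wrap this step in a Glue job or an orchestration tool rather than a bare script.

```python
import time
import boto3

# Hypothetical region, database, table, and results bucket -- adjust for a real environment.
athena = boto3.client("athena", region_name="us-east-1")

response = athena.start_query_execution(
    QueryString="SELECT order_date, COUNT(*) AS orders FROM orders GROUP BY order_date",
    QueryExecutionContext={"Database": "analytics"},
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
)
query_id = response["QueryExecutionId"]

# Poll until the query finishes, then print the first page of results.
while True:
    state = athena.get_query_execution(QueryExecutionId=query_id)["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(2)

if state == "SUCCEEDED":
    rows = athena.get_query_results(QueryExecutionId=query_id)["ResultSet"]["Rows"]
    for row in rows:
        print([col.get("VarCharValue") for col in row["Data"]])
```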
Posted 2 months ago
5 - 8 years
8 - 18 Lacs
Hyderabad
Hybrid
Data Quality Engineer
Exp: 5-8 years
Location: Hyderabad
Summary: We are seeking a motivated and detail-oriented Data Quality Engineer to join our team and contribute to ensuring the accuracy, completeness, and reliability of our data. In this role, you will be responsible for implementing data quality checks, monitoring data quality, and identifying and resolving data quality issues. Experience with SODA (or similar data quality frameworks) is preferred.
Responsibilities:
Implement and maintain data quality checks and validations using various tools and techniques.
Develop and execute data quality test plans and test cases.
Automate data quality processes, including data profiling, data quality checks, and reporting.
Contribute to the implementation and maintenance of data quality frameworks and tools.
(Preferred) Utilize SODA (or similar frameworks) to define data quality checks, configure data sources, and generate data quality reports (see the sketch after this list).
Collaborate with data engineers, data analysts, and other stakeholders to ensure data quality requirements are met.
Communicate data quality issues and findings to relevant stakeholders.
Investigate and analyze data quality problems to determine root causes.
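To make the SODA item above concrete, here is a minimal sketch assuming Soda Core's Python Scan API and SodaCL check syntax; the data source name (warehouse), the configuration file, and the orders table and its columns are hypothetical placeholders.

```python
from soda.scan import Scan  # Soda Core (assumes soda-core plus a warehouse-specific package is installed)

scan = Scan()
scan.set_data_source_name("warehouse")              # hypothetical data source defined in configuration.yml
scan.add_configuration_yaml_file("configuration.yml")

# SodaCL checks against a hypothetical orders table.
scan.add_sodacl_yaml_str("""
checks for orders:
  - row_count > 0
  - missing_count(customer_id) = 0
  - duplicate_count(order_id) = 0
""")

scan.execute()
print(scan.get_logs_text())   # human-readable scan log
scan.assert_no_checks_fail()  # raise if any check failed, useful in CI or an orchestrated pipeline
```

The same checks could equally be kept in a standalone checks.yml file and run from the command line; embedding them in Python is just one way to wire data quality gates into an automated workflow.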
Posted 2 months ago
8 - 12 years
27 - 35 Lacs
Bengaluru
Work from Office
Urgent Hiring: Solution Architect (Data Platforms) | Bangalore
Skills Required:
Data lineage & metadata management
ETL tools & cloud platforms (GCP, AWS, Azure)
Data modeling (conceptual, logical, physical)
Enterprise-level architecture & data flow diagrams
Strong collaboration & documentation skills
Experience with Collibra & data governance tools
Hands-on with GCP, AWS, Azure
Posted 2 months ago
5 - 10 years
0 - 1 Lacs
Mumbai Suburbs, Mumbai, Mumbai (All Areas)
Work from Office
Experience in data stewardship, data management, data governance, data quality, and data engineering, including leadership roles; Collibra, Alation, data privacy, and compliance. Location: Mumbai - Thane. Experience: 4+ yrs. Notice period: 1 month to 45 days. Apply or share your CV at preethi.kumar@harjai.com
Posted 2 months ago
4 - 8 years
9 - 14 Lacs
Pune
Remote
Technical Responsibilities:
Data Quality Assessment: Conduct regular data quality assessments to identify and address data anomalies, inconsistencies, and inaccuracies.
Data Profiling: Utilize data profiling techniques to analyze data attributes, identify data quality issues, and assess data completeness (see the sketch after this listing).
Data Cleansing: Develop and implement data cleansing processes to correct errors, standardize data formats, and improve data quality.
Data Validation: Create and implement data validation rules to ensure data integrity and consistency.
Metadata Management: Manage metadata repositories and ensure accurate and up-to-date metadata is maintained.
Tool Utilization: Proficiently use data governance and data quality tools like IDMC or Collibra to manage data lineage, impact analysis, and data quality metrics.
Data Governance Framework: Contribute to the development and maintenance of a comprehensive data governance framework, including policies, standards, and procedures.
Functional Responsibilities:
Stakeholder Engagement: Collaborate with business users, data owners, and IT teams to understand data requirements and ensure data quality meets business needs.
Data Stewardship: Support data stewards in their role of ensuring data quality and compliance.
Data Quality Metrics: Develop and track key data quality metrics to measure the effectiveness of data governance initiatives.
Issue Resolution: Investigate and resolve data quality issues in a timely manner.
Continuous Improvement: Identify opportunities for process improvement and implement best practices to enhance data quality.
Change Management: Manage changes to data definitions, data sources, and data processes to minimize disruptions.
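As one possible illustration of the profiling and validation responsibilities above, the sketch below uses pandas; the file name (customer_extract.csv) and the columns it checks (customer_id, email, account_balance) are hypothetical, and a production setup would more likely run such rules inside a governed tool or pipeline.

```python
import pandas as pd

# Hypothetical extract of a dataset under assessment.
df = pd.read_csv("customer_extract.csv")

# Basic profiling: completeness and distinct-value counts per column.
profile = pd.DataFrame({
    "null_pct": df.isna().mean().round(3),
    "distinct_count": df.nunique(),
})
print(profile)

# Simple validation rules; each entry counts the offending rows.
issues = {
    "duplicate_customer_id": int(df.duplicated(subset=["customer_id"]).sum()),
    "missing_email": int(df["email"].isna().sum()),
    "negative_account_balance": int((df["account_balance"] < 0).sum()),
}
print(issues)
```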
Posted 2 months ago