
866 Collibra Jobs - Page 18

Set up a job alert
JobPe aggregates these listings for easy access; applications are submitted directly on the original job portal.

5.0 - 8.0 years

7 - 11 Lacs

Mumbai

Work from Office

Position Overview: We are seeking an experienced Data Catalog Lead to lead the implementation and ongoing development of an enterprise data catalog using Collibra. This role focuses specifically on healthcare payer industry requirements, including complex regulatory compliance, member data privacy, and multi-system data integration challenges unique to health plan operations.

Key Responsibilities:

Data Catalog Implementation & Development
- Configure and customize Collibra workflows, data models, and governance processes to support health plan business requirements
- Develop automated data discovery and cataloging processes for healthcare data assets including claims, eligibility, provider networks, and member information
- Design and implement data lineage tracking across complex healthcare data ecosystems spanning core administration systems, data warehouses, and analytics platforms

Healthcare-Specific Data Governance
- Build specialized data catalog structures for healthcare data domains including medical coding systems (ICD-10, CPT, HCPCS), pharmacy data (NDC codes), and provider taxonomies
- Configure data classification and sensitivity tagging for PHI (Protected Health Information) and PII data elements in compliance with HIPAA requirements
- Implement data retention and privacy policies within Collibra that align with healthcare regulatory requirements and member consent management
- Develop metadata management processes for regulatory reporting datasets (HEDIS, Medicare Stars, MLR reporting, risk adjustment)

Technical Integration & Automation
- Integrate Collibra with healthcare payer core systems including claims processing platforms, eligibility systems, provider directories, and clinical data repositories
- Implement automated data quality monitoring and profiling processes that populate the data catalog with technical and business metadata
- Configure Collibra's REST APIs to enable integration with existing data governance tools and business intelligence platforms

Required Qualifications:

Collibra Platform Expertise
- 8+ years of hands-on experience with Collibra Data Intelligence Cloud platform implementation and administration
- Expert knowledge of Collibra's data catalog, data lineage, and data governance capabilities
- Proficiency in Collibra workflow configuration, custom attribute development, and role-based access control setup
- Experience with Collibra Connect for automated metadata harvesting and system integration
- Strong understanding of Collibra's REST APIs and custom development capabilities

Healthcare Payer Industry Knowledge
- 4+ years of experience working with healthcare payer/health plan data environments
- Deep understanding of healthcare data types including claims (professional, institutional, pharmacy), eligibility, provider data, and member demographics
- Knowledge of healthcare industry standards including HL7, X12 EDI transactions, and FHIR specifications
- Familiarity with healthcare regulatory requirements (HIPAA, ACA, Medicare Advantage, Medicaid managed care)
- Understanding of healthcare coding systems (ICD-10-CM/PCS, CPT, HCPCS, NDC, SNOMED CT)

Technical Skills
- Strong SQL skills and experience with healthcare databases (claims databases, clinical data repositories, member systems)
- Knowledge of cloud platforms (AWS, Azure, GCP) and their integration with Collibra cloud services
- Understanding of data modeling principles and healthcare data warehouse design patterns

Data Governance & Compliance
- Experience implementing data governance frameworks in regulated healthcare environments
- Knowledge of data privacy regulations (HIPAA, state privacy laws) and their implementation in data catalog tools
- Understanding of data classification, data quality management, and master data management principles
- Experience with audit trail requirements and compliance reporting in healthcare organizations

Preferred Qualifications:

Advanced Healthcare Experience
- Experience with specific health plan core systems (such as HealthEdge, Facets, QNXT, or similar platforms)
- Knowledge of Medicare Advantage, Medicaid managed care, or commercial health plan operations
- Understanding of value-based care arrangements and their data requirements
- Experience with clinical data integration and population health analytics

Technical Certifications & Skills
- Collibra certification (Data Citizen, Data Steward, or Technical User)
- Experience with additional data catalog tools (Alation, Apache Atlas, IBM Watson Knowledge Catalog)
- Knowledge of data virtualization tools and their integration with data catalog platforms
- Experience with healthcare interoperability standards and API management
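
For readers unfamiliar with the integration work this listing describes, the sketch below is a minimal, non-authoritative illustration of pulling assets from a Collibra Data Intelligence Cloud catalog over its REST API, the integration point called out in the responsibilities above. The base URL, credentials, and search term are placeholders, and the endpoint path and parameters should be verified against the API documentation for the Collibra version actually in use.

```python
# Hypothetical sketch: pull assets from a Collibra Data Intelligence Cloud
# instance via its REST API so they can be cross-checked against a BI or
# governance platform. Base URL and credentials are placeholders; verify the
# endpoint path and parameters against your instance's REST 2.0 documentation.
import requests

COLLIBRA_BASE = "https://your-instance.collibra.com"   # placeholder
SESSION = requests.Session()
SESSION.auth = ("svc_catalog_reader", "********")       # placeholder credentials


def fetch_assets(name_filter, limit=100):
    """Return catalog assets whose name contains `name_filter`."""
    resp = SESSION.get(
        f"{COLLIBRA_BASE}/rest/2.0/assets",
        params={"name": name_filter, "nameMatchMode": "ANYWHERE", "limit": limit},
    )
    resp.raise_for_status()
    return resp.json().get("results", [])


if __name__ == "__main__":
    # e.g. list claims-related assets so their PHI classification can be reviewed
    for asset in fetch_assets("claims"):
        print(asset["id"], asset["name"])
```

In a real implementation the same pattern would typically run behind a service account with read-only catalog permissions and feed the results into the downstream tool's own API.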

Posted 2 months ago

Apply

0 years

0 Lacs

Mumbai Metropolitan Region

On-site

Job Description Are You Ready to Make It Happen at Mondelēz International? Join our Mission to Lead the Future of Snacking. Make It With Pride. You will provide technical contributions to the data science process. In this role, you are the internally recognized expert in data, building infrastructure and data pipelines/retrieval mechanisms to support our data needs How You Will Contribute You will: Operationalize and automate activities for efficiency and timely production of data visuals Assist in providing accessibility, retrievability, security and protection of data in an ethical manner Search for ways to get new data sources and assess their accuracy Build and maintain the transports/data pipelines and retrieve applicable data sets for specific use cases Understand data and metadata to support consistency of information retrieval, combination, analysis, pattern recognition and interpretation Validate information from multiple sources. Assess issues that might prevent the organization from making maximum use of its information assets What You Will Bring A desire to drive your future and accelerate your career and the following experience and knowledge: Extensive experience in data engineering in a large, complex business with multiple systems such as SAP, internal and external data, etc. and experience setting up, testing and maintaining new systems Experience of a wide variety of languages and tools (e.g. script languages) to retrieve, merge and combine data Ability to simplify complex problems and communicate to a broad audience In This Role As a Senior Data Engineer, you will have the opportunity to design and build scalable, secure, and cost-effective cloud-based data solutions. You will develop and maintain data pipelines to extract, transform, and load data into data warehouses or data lakes, ensuring data quality and validation processes to maintain data accuracy and integrity. You will ensure efficient data storage and retrieval for optimal performance, and collaborate closely with data teams, product owners, and other stakeholders to stay updated with the latest cloud technologies and best practices. Role & Responsibilities: Design and Build: Develop and implement scalable, secure, and cost-effective cloud-based data solutions. Manage Data Pipelines: Develop and maintain data pipelines to extract, transform, and load data into data warehouses or data lakes. Ensure Data Quality: Implement data quality and validation processes to ensure data accuracy and integrity. Optimize Data Storage: Ensure efficient data storage and retrieval for optimal performance. Collaborate and Innovate: Work closely with data teams, product owners, and stay updated with the latest cloud technologies and best practices. Technical Requirements: Programming: Python, PySpark, Go/Java Database: SQL, PL/SQL ETL & Integration: DBT, Databricks + DLT, AecorSoft, Talend, Informatica/Pentaho/Ab-Initio, Fivetran. Data Warehousing: SCD, Schema Types, Data Mart. Visualization: Databricks Notebook, PowerBI (Optional), Tableau (Optional), Looker. GCP Cloud Services: Big Query, GCS, Cloud Function, PubSub, Dataflow, DataProc, Dataplex. AWS Cloud Services: S3, Redshift, Lambda, Glue, CloudWatch, EMR, SNS, Kinesis. Azure Cloud Services: Azure Datalake Gen2, Azure Databricks, Azure Synapse Analytics, Azure Data Factory, Azure Stream Analytics. Supporting Technologies: Graph Database/Neo4j, Erwin, Collibra, Ataccama DQ, Kafka, Airflow. 
Soft Skills: Problem-Solving: The ability to identify and solve complex data-related challenges. Communication: Effective communication skills to collaborate with Product Owners, analysts, and stakeholders. Analytical Thinking: The capacity to analyze data and draw meaningful insights. Attention to Detail: Meticulousness in data preparation and pipeline development. Adaptability: The ability to stay updated with emerging technologies and trends in the data engineering field. Within Country Relocation support available and for candidates voluntarily moving internationally some minimal support is offered through our Volunteer International Transfer Policy Business Unit Summary At Mondelēz International, our purpose is to empower people to snack right by offering the right snack, for the right moment, made the right way. That means delivering a broad range of delicious, high-quality snacks that nourish life's moments, made with sustainable ingredients and packaging that consumers can feel good about. We have a rich portfolio of strong brands globally and locally including many household names such as Oreo , belVita and LU biscuits; Cadbury Dairy Milk , Milka and Toblerone chocolate; Sour Patch Kids candy and Trident gum. We are proud to hold the top position globally in biscuits, chocolate and candy and the second top position in gum. Our 80,000 makers and bakers are located in more than 80 countries and we sell our products in over 150 countries around the world. Our people are energized for growth and critical to us living our purpose and values. We are a diverse community that can make things happen—and happen fast. Mondelēz International is an equal opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, religion, gender, sexual orientation or preference, gender identity, national origin, disability status, protected veteran status, or any other characteristic protected by law. Job Type Regular Data Science Analytics & Data Science
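
As context for the pipeline responsibilities listed above, here is a minimal PySpark sketch of the extract-transform-load pattern the role describes: ingest raw files, apply a basic validation rule, and publish a partitioned, analysis-ready dataset to a lake location. All paths, column names, and the quality rule are illustrative assumptions rather than Mondelēz specifics.

```python
# Minimal PySpark sketch of an extract-transform-load step: ingest raw files,
# apply basic validation, and publish an analysis-ready, partitioned table to
# a data lake. Paths, columns, and the quality rule are invented placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_daily_load").getOrCreate()

raw = (
    spark.read.option("header", True)
    .csv("s3://example-landing-zone/orders/2025-06-01/")   # placeholder source
)

clean = (
    raw.dropDuplicates(["order_id"])
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .filter(F.col("order_amount").cast("double") >= 0)      # simple quality rule
    .withColumn("load_date", F.current_date())
)

(
    clean.write.mode("overwrite")
    .partitionBy("load_date")
    .parquet("s3://example-curated-zone/orders/")            # placeholder target
)
```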

Posted 2 months ago

Apply

3.0 years

0 Lacs

India

Remote

AWS Data Engineer | Location: Remote (India) | Experience: 3+ Years | Employment Type: Full-Time

About the Role: We are seeking a talented AWS Data Engineer with at least 3 years of hands-on experience in building and managing data pipelines using AWS services. This role involves working with large-scale data, integrating multiple data sources (including sensor/IoT data), and enabling efficient, secure, and analytics-ready solutions. Experience in the energy industry or working with time-series/sensor data is a strong plus.

Key Responsibilities:
- Build and maintain scalable ETL/ELT data pipelines using AWS Glue, Redshift, Lambda, EMR, S3, and Athena
- Process and integrate structured and unstructured data, including sensor/IoT and real-time streams
- Optimize pipeline performance and ensure reliability and fault tolerance
- Collaborate with cross-functional teams including data scientists and analysts
- Perform data transformations using Python, Pandas, and SQL
- Maintain data integrity, quality, and security across the platform
- Use Terraform and CI/CD tools (e.g., Azure DevOps) for infrastructure and deployment automation
- Support and monitor pipeline workflows, troubleshoot issues, and implement fixes
- Contribute to the adoption of emerging tools like AWS Bedrock, Textract, Rekognition, and GenAI solutions

Required Skills and Qualifications:
- Bachelor’s or Master’s degree in Computer Science, Information Technology, or related field
- 3+ years of experience in data engineering using AWS
- Strong skills in: AWS Glue, Redshift, S3, Lambda, EMR, Athena; Python, Pandas, SQL; RDS, Postgres, SAP HANA
- Solid understanding of data modeling, warehousing, and pipeline orchestration
- Experience with version control (Git) and infrastructure as code (Terraform)

Preferred Skills:
- Experience working with energy sector data or IoT/sensor-based data
- Exposure to machine learning tools and frameworks (e.g., SageMaker, TensorFlow, Scikit-learn)
- Familiarity with big data technologies like Apache Spark, Kafka
- Experience with data visualization tools (Tableau, Power BI, AWS QuickSight)
- Awareness of data governance and catalog tools such as AWS Data Quality, Collibra, and AWS Databrew
- AWS Certifications (Data Analytics, Solutions Architect)
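
To make the AWS stack above concrete, the following is a rough sketch of a Glue PySpark job that reads a table registered in the Glue Data Catalog, filters malformed sensor readings, and writes partitioned Parquet back to S3 for Athena to query. The database, table, and bucket names are invented, and the awsglue module is only available inside the Glue job runtime, so treat this as illustrative rather than locally runnable.

```python
# Sketch of an AWS Glue PySpark job: read a table crawled into the Glue Data
# Catalog, drop malformed records, and write partitioned Parquet back to S3
# for Athena/Redshift Spectrum. Database, table, and bucket names are
# placeholders; awsglue exists only inside the Glue job runtime.
import sys

from awsglue.context import GlueContext
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext
from pyspark.sql import functions as F

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_ctx = GlueContext(SparkContext.getOrCreate())

# Sensor readings previously registered in the Glue Data Catalog (placeholder names)
readings = glue_ctx.create_dynamic_frame.from_catalog(
    database="energy_raw", table_name="sensor_readings"
).toDF()

curated = (
    readings.filter(F.col("reading_value").isNotNull())
    .withColumn("reading_ts", F.to_timestamp("reading_ts"))
)

# Partitioned Parquet output that Athena can query directly
curated.write.mode("append").partitionBy("device_id").parquet(
    "s3://example-curated/sensor_readings/"
)
```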

Posted 2 months ago

Apply

130.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Description Associate Director, Data and Analytics Strategy & Architecture – Enterprise Data Enablement THE OPPORTUNITY Based in Hyderabad, join a global healthcare biopharma company and be part of a 130-year legacy of success backed by ethical integrity, forward momentum, and an inspiring mission to achieve new milestones in global healthcare. Lead an Organization driven by digital technology and data-backed approaches that supports a diversified portfolio of prescription medicines, vaccines, and animal health products. Drive innovation and execution excellence. Be the leaders who have a passion for using data, analytics, and insights to drive decision-making, which will allow us to tackle some of the world's greatest health threats. Our Technology centers focus on creating a space where teams can come together to deliver business solutions that save and improve lives. AN integral part of our company IT operating model, Tech centers are globally distributed locations where each IT division has employees to enable our digital transformation journey and drive business outcomes. These locations, in addition to the other sites, are essential to supporting our business and strategy. A focused group of leaders in each tech center helps to ensure we can manage and improve each location, from investing in growth, success, and well-being of our people, to making sure colleagues from each IT division feel a sense of belonging to managing critical emergencies. And together, we must leverage the strength of our team to collaborate globally to optimize connections and share best practices across the Tech Centers. Role Overview As a Lead Technical Data and Analytics Architect with a primary focus on Enterprise Data Enablement and Data Governance, you will play a pivotal leadership role in shaping the future of our company enterprise data enablement and governance initiatives. This position combines strategic technology leadership with hands-on technical expertise. You will be supporting our Discover product line, which encompasses the Enterprise Data Marketplace, Data Catalog, and Enterprise Data Access Control products. This role is pivotal in understanding the current architecture, adoption patterns, and product strategy while helping to design the architecture for the next generation of Discover. You will create and implement strategic frameworks, ensure their adoption within product teams, and oversee the consistent management of technologies. You will work closely with the product team to establish and govern the future architecture, ensuring it evolves beyond traditional data products to include AI models, visualizations, insights assets, and more. You will play a key role in driving innovation, modularity, and scalability within the Discover ecosystem, aligning with the organization's strategic vision. What Will You Do In The Role Strategic Leadership Develop and maintain a cohesive Data Enablement architecture vision, aligning with our company's business objectives and industry trends. Provide leadership to a team of product owners and engineers in our Discover Product line, mentoring and guiding them to achieve collective goals and deliverables. Foster a collaborative environment where innovation and best practices thrive. Integration and Innovation Design and implement architectural solutions to enable seamless integration between Enterprise Data Marketplace, Data Catalog, and Enterprise Data Access Control products. 
Enhance API usage and drive the transition to a microservice-based architecture for greater modularity and scalability. Support the integration of Collibra and Immuta platforms with compute engines like Glue, Trino Starburst, and Databricks to optimize Discover’s capabilities. Technical Leadership and Collaboration Collaborate with cross-functional teams, including engineering, product management, and other stakeholders, to align on architecture strategy and implementation. Partner with the product team to define roadmaps and ensure architectural alignment with the organization's goals. Act as a trusted advisor, providing technical leadership and driving best practices for architectural governance. Governance and Security Ensure all architectural designs adhere to organizational policies, data governance requirements, and security standards. Evolve data governance practices to accommodate diverse assets, including AI models and visualizations, alongside traditional data products. Optimization and Future-Readiness Identify opportunities for system optimization, modernization, and cost-efficiency. Lead initiatives to future-proof the architecture, supporting scalability for increasing demands across data products and advanced analytics. Framework Development and Governance Create capability and technology maps for Data Enablement and Governance, reference architectures, innovation trend maps, and architecture blueprints and patterns. Ensure the consistent application of frameworks across product teams. Hands-on Contribution Actively participate in technical problem-solving, proof-of-concept development, and implementation activities. Provide hands-on technical leadership to support your team and deliver high-value outcomes. Cross-functional Collaboration Partner with enterprise and product architects to ensure alignment and synergy across the organization. Engage with stakeholders to align architectural decisions with broader business goals. Collaborate with internal Strategy and Architecture team Architecture lead and Architects to ensure the smooth integration of Data Enablement Technologies with other Data and Analytics eco system products What Should You Have Hands-on experience with platforms like Collibra, Immuta, and Databricks, and deep knowledge of data governance and access control frameworks. Strong understanding of architectural principles, API integration strategies, and microservice-based design Proficiency in designing modular, scalable architectures that align with data product and data mesh principles. Expertise in supporting diverse asset types, including AI models, visualizations, and insights assets, within enterprise ecosystems. Knowledge of cloud platforms (AWS preferred) and containerization technologies (Docker, Kubernetes). Proven ability to align technical solutions with business objectives and strategic goals. Strong communication skills, with the ability to engage and influence technical and non-technical stakeholders. Exceptional problem-solving and analytical skills, with a focus on practical, future-ready solutions. Self-driven and adaptable, capable of managing multiple priorities in a fast-paced environment. Our technology teams operate as business partners, proposing ideas and innovative solutions that enable new organizational capabilities. 
We collaborate internationally to deliver services and solutions that help everyone be more productive and enable innovation Who We Are We are known as Merck & Co., Inc., Rahway, New Jersey, USA in the United States and Canada and MSD everywhere else. For more than a century, we have been inventing for life, bringing forward medicines and vaccines for many of the world's most challenging diseases. Today, our company continues to be at the forefront of research to deliver innovative health solutions and advance the prevention and treatment of diseases that threaten people and animals around the world. What We Look For Imagine getting up in the morning for a job as important as helping to save and improve lives around the world. Here, you have that opportunity. You can put your empathy, creativity, digital mastery, or scientific genius to work in collaboration with a diverse group of colleagues who pursue and bring hope to countless people who are battling some of the most challenging diseases of our time. Our team is constantly evolving, so if you are intellectually curious, join us—and start making your impact today. Current Employees apply HERE Current Contingent Workers apply HERE Search Firm Representatives Please Read Carefully Merck & Co., Inc., Rahway, NJ, USA, also known as Merck Sharp & Dohme LLC, Rahway, NJ, USA, does not accept unsolicited assistance from search firms for employment opportunities. All CVs / resumes submitted by search firms to any employee at our company without a valid written search agreement in place for this position will be deemed the sole property of our company. No fee will be paid in the event a candidate is hired by our company as a result of an agency referral where no pre-existing agreement is in place. Where agency agreements are in place, introductions are position specific. Please, no phone calls or emails. Employee Status Regular Relocation VISA Sponsorship Travel Requirements Flexible Work Arrangements Hybrid Shift Valid Driving License Hazardous Material(s) Required Skills Business Enterprise Architecture (BEA), Business Process Modeling, Data Modeling, Emerging Technologies, Requirements Management, Solution Architecture, Stakeholder Relationship Management, Strategic Planning, System Designs Preferred Skills Job Posting End Date 07/30/2025 A job posting is effective until 11 59 59PM on the day BEFORE the listed job posting end date. Please ensure you apply to a job posting no later than the day BEFORE the job posting end date. Requisition ID R345606

Posted 2 months ago

Apply

3.0 years

0 Lacs

India

Remote

Job Title: AWS Data Engineer 📍 Location: Remote (India) 🕒 Experience: 3+ Years 💼 Employment Type: Full-Time About the Role: We’re looking for a skilled AWS Data Engineer with 3+ years of hands-on experience in building and managing robust, scalable data pipelines using AWS services. The ideal candidate will have a strong foundation in processing both structured and unstructured data, particularly from IoT/sensor sources. Experience in the energy sector and with time-series data is highly desirable. Key Responsibilities: Design, develop, and maintain scalable ETL/ELT pipelines using AWS Glue, Redshift, Lambda, EMR, S3, and Athena Integrate and process structured, unstructured, and real-time sensor/IoT data Ensure pipeline performance, reliability, and fault tolerance Collaborate with data scientists, analysts, and engineering teams to build analytics-ready solutions Transform data using Python, Pandas , and SQL Enforce data integrity, quality, and security standards Use Terraform and CI/CD tools (e.g., Azure DevOps) for infrastructure and deployment automation Monitor workflows, troubleshoot pipeline issues, and implement solutions Explore and contribute to the use of modern AWS tools like Bedrock, Textract, Rekognition , and GenAI applications Required Skills & Qualifications: Bachelor’s/Master’s in Computer Science, IT, or related field Minimum 3 years of experience in AWS data engineering Proficient in: AWS Glue, Redshift, S3, Lambda, EMR, Athena Python, Pandas, SQL RDS, Postgres, SAP HANA Strong knowledge of data modeling, warehousing, and pipeline orchestration Experience with Git and Infrastructure as Code using Terraform Preferred Skills: Experience with energy sector data or sensor-based/IoT data Exposure to ML tools like SageMaker, TensorFlow, Scikit-learn Familiarity with Apache Spark, Kafka Experience with data visualization tools: Tableau, Power BI, AWS QuickSight Awareness of data governance tools like AWS Data Quality, Collibra, Databrew AWS certifications (e.g., Data Analytics Specialty, Solutions Architect Associate)
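
Since this listing repeatedly mentions real-time sensor/IoT ingestion alongside Lambda and S3, here is a hypothetical sketch of a Kinesis-triggered Lambda handler that lands validated readings in S3 as JSON lines for downstream Glue/Athena processing. The bucket name and payload fields are assumptions made for illustration.

```python
# Hypothetical AWS Lambda handler for a real-time sensor/IoT ingestion path:
# decode records from a Kinesis trigger, keep well-formed readings, and land
# them in S3 as JSON lines. Bucket name and payload fields are illustrative.
import base64
import datetime
import json

import boto3

s3 = boto3.client("s3")
BUCKET = "example-sensor-landing"   # placeholder bucket


def handler(event, context):
    """Entry point for a Kinesis-triggered Lambda."""
    rows = []
    for record in event.get("Records", []):
        payload = json.loads(base64.b64decode(record["kinesis"]["data"]))
        # Keep only readings that carry the fields downstream jobs expect
        if "device_id" in payload and "reading_value" in payload:
            rows.append(json.dumps(payload))

    if rows:
        key = f"raw/{datetime.datetime.utcnow():%Y/%m/%d/%H%M%S}.jsonl"
        s3.put_object(Bucket=BUCKET, Key=key, Body="\n".join(rows).encode())
    return {"stored": len(rows)}
```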

Posted 2 months ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

About Wolters Kluwer: Wolters Kluwer is a global leader in professional information, software solutions, and services in the following sectors: health, tax & accounting, corporate performance & ESG, financial corporate compliance, and legal & regulatory. Wolters Kluwer, headquartered in Alphen aan den Rijn, the Netherlands, reported 2024 annual revenues of €5.9 billion. The group serves customers in over 180 countries, maintains operations in over 40 countries, and employs approximately 21,700 people worldwide.

Roles & Responsibilities:
- Architecting, designing, and implementing data governance solutions using Microsoft Purview
- Experience in data lifecycle management, including data retention, deletion, and archiving strategies using Microsoft Purview Data Lifecycle Management
- Assist transitions to Microsoft Purview services, including setting up data lifecycle management and eDiscovery configurations
- Maintain accurate documentation of configurations, processes, and procedures related to Microsoft Purview
- Experience in implementing data governance policies and procedures to ensure compliance with regulatory requirements and organizational standards
- Ensure data quality and compliance by applying expertise in MDM and data governance principles, including data governance frameworks and practices, to ensure the relevancy, quality, security, and compliance of master data
- Develop and implement data integration solutions for metadata, data lineage, and data quality

Qualification: At least 3+ years of work experience in Data Governance, including designing and implementing Data Governance solutions. Microsoft Purview experience is preferred; Collibra experience (3 to 8 years) is a must. Applicants may be required to appear onsite at a Wolters Kluwer office as part of the recruitment process.
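
As a rough orientation to the Microsoft Purview work described above, the sketch below searches a Purview catalog by keyword so that retention or classification work can start from an asset inventory. It assumes the account exposes the documented data-plane discovery endpoint; the account name, API version, and response fields are assumptions that should be checked against the current Microsoft Purview REST documentation.

```python
# Rough sketch, under the assumption that the Purview account exposes the
# documented discovery/search data-plane endpoint: search the catalog for
# assets by keyword. Account name, API version, and response fields are
# assumptions to verify against Microsoft's current Purview REST docs.
import requests
from azure.identity import DefaultAzureCredential

ACCOUNT = "your-purview-account"          # placeholder
ENDPOINT = f"https://{ACCOUNT}.purview.azure.com/catalog/api/search/query"

token = DefaultAzureCredential().get_token("https://purview.azure.net/.default")
resp = requests.post(
    ENDPOINT,
    params={"api-version": "2022-08-01-preview"},         # assumed API version
    headers={"Authorization": f"Bearer {token.token}"},
    json={"keywords": "customer", "limit": 10},
)
resp.raise_for_status()
for item in resp.json().get("value", []):                 # assumed response shape
    print(item.get("qualifiedName"), item.get("classification"))
```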

Posted 2 months ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

The IT Business Senior Analyst is an intermediate-level position responsible for liaising between business users and technologists to exchange information in a concise, logical and understandable way in coordination with the Technology team. The overall objective of this role is to contribute to continuous iterative exploration and investigation of business performance and other measures to gain insight and drive business planning. Responsibilities: Formulate and define systems scope and objectives for complex projects and foster communication between business leaders and IT Consult with users and clients to solve complex system issues/problems through in-depth evaluation of business processes, systems and industry standards and recommends solutions Support system change processes from requirements through implementation and provide input based on analysis of information Consult with business clients to determine system functional specifications and provides user and operational support Identify and communicate risks and impacts, considering business implications of the application of technology to the current business environment Act as advisor or coach to new or lower level analysts and work as a team to achieve business objectives, performing other duties and functions as assigned Has the ability to operate with a limited level of direct supervision. Can exercise independence of judgement and autonomy. Acts as SME to senior stakeholders and /or other team members. Appropriately assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients and assets, by driving compliance with applicable laws, rules and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct and business practices, and escalating, managing and reporting control issues with transparency. Qualifications: 5-8 years of relevant experience Experience in data analysis with intermediate/advanced Microsoft Office Suite skills Proven interpersonal, data analysis, diplomatic, management and prioritization skills Consistently demonstrate clear and concise written and verbal communication Proven ability to manage multiple activities and build/develop working relationships Proven self-motivation to take initiative and master new tasks quickly Demonstrated ability to work under pressure to meet tight deadlines and approach work methodically with attention to detail Education: Bachelor's degree/University degree or equivalent experience This job description provides a high-level review of the types of work performed. Other job-related duties may be assigned as required. Position Overview: We are seeking an experienced and dynamic IT Business Senior Analyst (Assistant Vice President) to work on initiatives related to Data Governance and Control Codification projects. The ideal candidate should have a strong understanding of data quality , control frameworks , and codification processes , along with extensive knowledge of the banking and finance domain . This role requires a blend of technical expertise, business acumen, and leadership skills to ensure the successful delivery of data governance initiatives. Key Responsibilities: Senior Analyst in Data Governance and Control Codification projects, ensuring alignment with organizational goals and regulatory requirements. Define and implement data quality frameworks, standards, and processes to ensure the accuracy, consistency, and reliability of data. 
Collaborate with cross-functional teams to identify, document, and codify controls for critical data elements. Work closely with stakeholders to understand business requirements and translate them into actionable technical solutions. Ensure compliance with data governance policies, regulatory standards, and industry best practices. Drive the adoption of data governance tools and technologies to enhance data quality and control processes. Provide subject matter expertise in banking and finance, ensuring that data governance initiatives align with industry-specific requirements. Monitor and report on the effectiveness of data governance and control frameworks, identifying areas for improvement. Mentor and guide team members, fostering a culture of accountability and continuous improvement. Required Skills and Qualifications: 8+ years of experience in IT Business Analysis, with a focus on Data Governance, Data Quality, and Control Codification. Strong understanding of data quality frameworks, data lineage, and metadata management. Experience in the banking and finance domain, with knowledge of regulatory requirements and industry standards. Proficiency in data governance tools (e.g., Collibra, Informatica, or similar) and data quality tools. Strong analytical and problem-solving skills, with the ability to work with large datasets and complex systems. Excellent communication and stakeholder management skills, with the ability to bridge the gap between technical and business teams. Bachelor's or Master's degree in Computer Science, Information Systems, Finance, or a related field. Preferred Qualifications: Experience with control frameworks such as COSO, COBIT, or similar. Knowledge of data privacy regulations (e.g., GDPR, CCPA) and their impact on data governance. Familiarity with data visualization tools (e.g., Tableau, Power BI) for reporting and analysis. Certifications in data governance or related fields (e.g., CDMP, DGSP). ------------------------------------------------------ Job Family Group: Technology ------------------------------------------------------ Job Family: Business Analysis / Client Services ------------------------------------------------------ Time Type: Full time ------------------------------------------------------ Most Relevant Skills Please see the requirements listed above. ------------------------------------------------------ Other Relevant Skills For complementary skills, please see above and/or contact the recruiter. ------------------------------------------------------ Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.
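
The phrase "control codification" in this listing essentially means turning a control into a small, repeatable, identifiable check. As a generic illustration (the control IDs, thresholds, and columns are invented, not Citi standards), a completeness control over a critical data element might be codified like this:

```python
# Illustrative sketch of control codification: express a data quality control
# as a reusable check with an identifier so results can be reported and
# audited consistently. IDs, thresholds, and columns are invented examples.
import pandas as pd


def completeness_control(df: pd.DataFrame, column: str, threshold: float):
    """Completeness check: share of non-null values in a critical data element."""
    score = df[column].notna().mean()
    return {
        "control_id": f"DQ-COMPLETENESS-{column.upper()}",
        "score": round(float(score), 4),
        "threshold": threshold,
        "passed": bool(score >= threshold),
    }


if __name__ == "__main__":
    trades = pd.DataFrame(
        {"trade_id": [1, 2, 3, 4], "counterparty": ["A", None, "C", "D"]}
    )
    print(completeness_control(trades, "counterparty", threshold=0.95))
```

The same structure extends naturally to validity, timeliness, or referential-integrity controls, each carrying its own identifier for lineage back to the governance framework.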

Posted 2 months ago

Apply

0 years

0 Lacs

Greater Hyderabad Area

On-site

Overview We are seeking an experienced Data Architect with extensive expertise in designing and implementing modern data architectures. This role requires strong software engineering principles, hands-on coding abilities, and experience building data engineering frameworks. The ideal candidate will have a proven track record of implementing Databricks-based solutions in the healthcare industry, with expertise in data catalog implementation and governance frameworks. About The Role As a Data Architect, you will be responsible for designing and implementing scalable, secure, and efficient data architectures on the Databricks platform. You will lead the technical design of data migration initiatives from legacy systems to modern Lakehouse architecture, ensuring alignment with business requirements, industry best practices, and regulatory compliance. Key Responsibilities Design and implement modern data architectures using Databricks Lakehouse platform Lead the technical design of Data Warehouse/Data Lake migration initiatives from legacy systems Develop data engineering frameworks and reusable components to accelerate delivery Establish CI/CD pipelines and infrastructure-as-code practices for data solutions Implement data catalog solutions and governance frameworks Create technical specifications and architecture documentation Provide technical leadership to data engineering teams Collaborate with cross-functional teams to ensure alignment of data solutions Evaluate and recommend technologies, tools, and approaches for data initiatives Ensure data architectures meet security, compliance, and performance requirements Mentor junior team members on data architecture best practices Stay current with emerging technologies and industry trends Qualifications Extensive experience in data architecture design and implementation Strong software engineering background with expertise in Python or Scala Proven experience building data engineering frameworks and reusable components Experience implementing CI/CD pipelines for data solutions Expertise in infrastructure-as-code and automation Experience implementing data catalog solutions and governance frameworks Deep understanding of Databricks platform and Lakehouse architecture Experience migrating workloads from legacy systems to modern data platforms Strong knowledge of healthcare data requirements and regulations Experience with cloud platforms (AWS, Azure, GCP) and their data services Bachelor's degree in computer science, Information Systems, or related field; advanced degree preferred Technical Skills Programming languages: Python and/or Scala (required) Data processing frameworks: Apache Spark, Delta Lake CI/CD tools: Jenkins, GitHub Actions, Azure DevOps Infrastructure-as-code (optional): Terraform, CloudFormation, Pulumi Data catalog tools: Databricks Unity Catalog, Collibra, Alation Data governance frameworks and methodologies Data modeling and design patterns API design and development Cloud platforms: AWS, Azure, GCP Container technologies: Docker, Kubernetes Version control systems: Git SQL and NoSQL databases Data quality and testing frameworks Optional - Healthcare Industry Knowledge Healthcare data standards (HL7, FHIR, etc.) Clinical and operational data models Healthcare interoperability requirements Healthcare analytics use cases
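
For orientation on the Lakehouse pattern this role centers on, below is a minimal sketch of writing a curated Delta table and registering it under a three-level Unity Catalog name. It assumes a Databricks runtime (or a Spark session with the Delta Lake libraries configured); the catalog, schema, table, and path names are placeholders.

```python
# Minimal sketch of the Lakehouse pattern: write a curated Delta table and
# register it under a three-level Unity Catalog name. Assumes a Databricks
# runtime or a Spark session with Delta Lake configured; names are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

claims_raw = spark.read.parquet("/mnt/raw/claims/")        # placeholder path

claims_silver = (
    claims_raw.dropDuplicates(["claim_id"])
    .withColumn("ingested_at", F.current_timestamp())
)

(
    claims_silver.write.format("delta")
    .mode("overwrite")
    .saveAsTable("healthcare_dev.silver.claims")            # catalog.schema.table
)
```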

Posted 2 months ago

Apply

0 years

0 Lacs

India

On-site

Overview We are seeking an experienced Data Architect with extensive expertise in designing and implementing modern data architectures. This role requires strong software engineering principles, hands-on coding abilities, and experience building data engineering frameworks. The ideal candidate will have a proven track record of implementing Databricks-based solutions in the healthcare industry, with expertise in data catalog implementation and governance frameworks. About The Role As a Data Architect, you will be responsible for designing and implementing scalable, secure, and efficient data architectures on the Databricks platform. You will lead the technical design of data migration initiatives from legacy systems to modern Lakehouse architecture, ensuring alignment with business requirements, industry best practices, and regulatory compliance. Key Responsibilities Design and implement modern data architectures using Databricks Lakehouse platform Lead the technical design of Data Warehouse/Data Lake migration initiatives from legacy systems Develop data engineering frameworks and reusable components to accelerate delivery Establish CI/CD pipelines and infrastructure-as-code practices for data solutions Implement data catalog solutions and governance frameworks Create technical specifications and architecture documentation Provide technical leadership to data engineering teams Collaborate with cross-functional teams to ensure alignment of data solutions Evaluate and recommend technologies, tools, and approaches for data initiatives Ensure data architectures meet security, compliance, and performance requirements Mentor junior team members on data architecture best practices Stay current with emerging technologies and industry trends Qualifications Extensive experience in data architecture design and implementation Strong software engineering background with expertise in Python or Scala Proven experience building data engineering frameworks and reusable components Experience implementing CI/CD pipelines for data solutions Expertise in infrastructure-as-code and automation Experience implementing data catalog solutions and governance frameworks Deep understanding of Databricks platform and Lakehouse architecture Experience migrating workloads from legacy systems to modern data platforms Strong knowledge of healthcare data requirements and regulations Experience with cloud platforms (AWS, Azure, GCP) and their data services Bachelor's degree in computer science, Information Systems, or related field; advanced degree preferred Technical Skills Programming languages: Python and/or Scala (required) Data processing frameworks: Apache Spark, Delta Lake CI/CD tools: Jenkins, GitHub Actions, Azure DevOps Infrastructure-as-code (optional): Terraform, CloudFormation, Pulumi Data catalog tools: Databricks Unity Catalog, Collibra, Alation Data governance frameworks and methodologies Data modeling and design patterns API design and development Cloud platforms: AWS, Azure, GCP Container technologies: Docker, Kubernetes Version control systems: Git SQL and NoSQL databases Data quality and testing frameworks Optional - Healthcare Industry Knowledge Healthcare data standards (HL7, FHIR, etc.) Clinical and operational data models Healthcare interoperability requirements Healthcare analytics use cases

Posted 2 months ago

Apply

5.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

📌 Job Title: Data Analyst – Cataloging
📍 Location: Mumbai (Work from Office)
🕒 Shift Timing: 10:30 AM – 7:30 PM
🧑‍💼 Experience Required: 3–5 Years
📅 Notice Period: 30 to 45 days

Key Responsibilities: Review, analyze, and resolve data quality issues across IM Data Architecture. Coordinate onboarding of data from internal/external sources into centralized repositories. Collaborate with Data Owners and IT teams to define and implement Data Quality (DQ) rules. Perform end-to-end analysis of business processes and data flows to enhance productivity. Engage with global stakeholders to identify and prioritize data needs. Document business requirements and support data-related project delivery. Manage change control processes and participate in UAT. Implement data governance frameworks and support strategic DAG initiatives.

Key Skills Required: Strong communication and stakeholder management skills. Hands-on experience with SQL for data profiling and analysis. Experience in the Private Equity domain and financial services. Familiarity with cataloging tools such as Collibra.

Good to Have: Exposure to BI tools like Tableau or Power BI. Working knowledge of Python for data scripting and automation.

Note: We are looking for candidates from the Investment Banking (IB) domain. Interested candidates can send their updated resume to: 📧 swagatika.s@twsol.com
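
Since this role emphasizes SQL-based data profiling feeding DQ rules, here is a small, self-contained illustration using SQLite so it runs anywhere; in practice the same query pattern would target the firm's warehouse and inform Collibra DQ rules. The table and column names are invented.

```python
# Small, self-contained illustration of SQL-based data profiling using SQLite
# (standard library) so it runs anywhere; the same query pattern would target
# the actual warehouse dialect in practice. Table and columns are invented.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript(
    """
    CREATE TABLE holdings (fund_id TEXT, security_id TEXT, market_value REAL);
    INSERT INTO holdings VALUES
        ('F1', 'S1', 100.0), ('F1', NULL, 250.0), ('F2', 'S3', NULL);
    """
)

profile_sql = """
SELECT
    COUNT(*)                                       AS row_count,
    SUM(CASE WHEN security_id IS NULL THEN 1 END)  AS null_security_id,
    SUM(CASE WHEN market_value IS NULL THEN 1 END) AS null_market_value,
    COUNT(*) - COUNT(DISTINCT fund_id)             AS repeated_fund_rows
FROM holdings;
"""
columns = ["row_count", "null_security_id", "null_market_value", "repeated_fund_rows"]
print(dict(zip(columns, conn.execute(profile_sql).fetchone())))
```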

Posted 2 months ago

Apply

8.0 years

0 Lacs

India

Remote

About Company Papigen is a fast-growing global technology services company, delivering innovative digital solutions through deep industry experience and cutting-edge expertise. We specialize in technology transformation, enterprise modernization, and dynamic areas like Cloud, Big Data, Java, React, DevOps, and more. Our client-centric approach combines consulting, engineering, and data science to help businesses evolve and scale efficiently. About The Role We are seeking a skilled and experienced Technical Project Manager to lead complex projects in Identity and Access Management (IAM) and Data Privacy domains. This role requires a strong background in Azure Cloud Security , agile project management, and cross-functional stakeholder collaboration. The ideal candidate will oversee multiple concurrent projects, align technology with strategic objectives, and ensure secure, scalable delivery. Key Responsibilities Lead end-to-end project planning, execution, and delivery for security engineering and data privacy initiatives Coordinate Agile ceremonies (Scrum, Sprint Planning, Retrospectives) and ensure SAFe Agile compliance Manage cross-functional teams and align onsite-offshore delivery efforts Design and implement IAM solutions using technologies such as PlainID, Microsoft Entra ID, SailPoint, and AWS IAM Oversee integration of IAM and Data Privacy tools like OneTrust, Microsoft Priva, Informatica DPM, and Collibra Create project documentation, architecture diagrams, risk assessments, and compliance reports Support CI/CD best practices using Azure DevOps Pipelines Enforce data governance policies and access control frameworks Provide technical leadership across solution design and architecture reviews Monitor delivery timelines, budgets, and risk mitigation plans Conduct performance audits, security assessments, and gap analyses Required Skills & Experience Master’s degree in Computer Science, IT, or related field 8+ years of project management experience (including 5+ in IAM/security/data privacy projects) Strong Agile/SAFe project delivery and stakeholder management experience Hands-on experience with IAM tools (e.g. PlainID, Microsoft Entra ID, SailPoint, Okta) Experience with Azure Cloud Security practices and infrastructure Familiarity with Data Privacy platforms like OneTrust, Informatica DPM, Microsoft Priva Proven experience in reference architecture design and secure policy enforcement Excellent communication and leadership skills with ability to manage global delivery teams Familiarity with Azure DevOps for CI/CD, workflow automation, and reporting Benefits & Perks Opportunity to work with leading global clients Flexible work arrangements with remote options Exposure to modern technology stacks and tools Supportive and collaborative team environment Continuous learning and career development opportunities Skills: microsoft priva,plainid,ci/cd,azure cloud security,onetrust,technical project manager,okta,azure cloud,agile project management,informatica dpm,data privacy,sailpoint,microsoft entra id,security engineering,project delivery,collibra,agile,access governance,azure devops,iam

Posted 2 months ago

Apply

2.0 - 7.0 years

6 - 15 Lacs

Chennai

Hybrid

We are hiring for a Collibra role with experience in Data Governance, integration, and workflows. Kindly share your resume at Ravina.m@rbsaltech.com

Posted 2 months ago

Apply

8.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job Summary: We are looking for a seasoned Project Manager with a strong background in Google Cloud Platform (GCP) and DevOps methodologies. The ideal candidate will be responsible for planning, executing, and finalizing projects according to strict deadlines and within budget. This includes acquiring resources and coordinating the efforts of team members and third-party contractors or consultants in order to deliver projects according to plan. The GCP DevOps Project Manager will also define the project’s objectives and oversee quality control throughout its life cycle.

Key Responsibilities:
● Lead end-to-end planning, execution, and delivery of Data Foundation initiatives across multiple workstreams (e.g., Data Lake, Observability, IAM, Metadata, Ingestion Pipelines).
● Coordinate across platform, engineering, data governance, cloud infrastructure, and business teams to ensure alignment on scope, dependencies, and delivery timelines.
● Own program-level tracking of deliverables, milestones, risks, and mitigation plans.
● Drive platform enablement efforts (e.g., GCP/AWS setup, Kafka, BigQuery, Snowflake, IAM, monitoring tooling) and ensure their operational readiness.
● Manage stakeholder communications, steering committee updates, and executive reporting.
● Define and manage program OKRs, KPIs, and success metrics.
● Lead technical discussions to assess readiness, unblock execution, and ensure architectural alignment.
● Support cross-team collaboration on data security, access management, observability (Grafana, Prometheus, SIEM), and operational automation.
● Manage vendor relationships and coordinate delivery with third-party partners where applicable.

Required Skills and Qualifications:
● 8+ years of experience in Technical Program Management or Engineering Program Management roles.
● Proven experience in leading data platform or data foundation programs in a cloud-native environment (GCP, AWS, or Azure).
● Strong knowledge of data platform components: data lakes, ingestion pipelines, metadata tools (e.g., Marquez, Collibra), observability (Grafana, Prometheus), lineage, and data access governance.
● Experience working with DevOps, Security, and Architecture teams to align on infrastructure and platform requirements.
● Familiarity with Agile/Scrum methodologies, Jira/Confluence, and project tracking tools.
● Excellent communication, stakeholder management, and leadership skills.

Preferred Qualifications:
● Experience with GCP-native data services (BigQuery, Dataflow, Dataproc, Pub/Sub).
● Working knowledge of IAM models, RBAC/ABAC, and cloud-native security controls.
● Certification in cloud platforms (GCP, AWS, or Azure) or PMP/CSM.
● Exposure to DataOps, CI/CD pipelines, and infrastructure-as-code tools (e.g., Terraform).

Thanks & Regards
Prashant Awasthi
Vastika Technologies PVT LTD
9711189829
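
As a small illustration of the platform-enablement checks mentioned above, the sketch below runs a row-count query against BigQuery through the official google-cloud-bigquery client. The project, dataset, and table names are placeholders, and application default credentials are assumed to be configured.

```python
# Compact sketch of a platform-readiness check against BigQuery: run a query
# through the official client and inspect recent ingestion volumes. Project,
# dataset, and table names are placeholders; application default credentials
# and the google-cloud-bigquery package are assumed to be available.
from google.cloud import bigquery

client = bigquery.Client(project="example-data-foundation")   # placeholder project

query = """
    SELECT ingestion_date, COUNT(*) AS row_count
    FROM `example-data-foundation.raw_zone.events`
    GROUP BY ingestion_date
    ORDER BY ingestion_date DESC
    LIMIT 7
"""
for row in client.query(query).result():
    print(row.ingestion_date, row.row_count)
```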

Posted 2 months ago

Apply

3.0 - 8.0 years

15 - 30 Lacs

Pune

Hybrid

Dear Candidate,

This is with reference to Senior Business Intelligence Analyst openings at Wolters Kluwer, Pune. Kindly share your resume at jyoti.salvi@wolterskluwer.com

Job Specifications:
Skillset Requirement: Looking for Data Governance professionals with experience in Collibra. Experience in Microsoft Purview is highly preferred.
Experience Range: Candidates with 2 to 8 years of relevant Data Governance experience.
Primary Skills: Data Governance, Microsoft Purview, Collibra
- Architecting, designing, and implementing data governance solutions using Microsoft Purview
- Experience in data lifecycle management, including data retention, deletion, and archiving strategies using Microsoft Purview Data Lifecycle Management
- Assist transitions to Microsoft Purview services, including setting up data lifecycle management and eDiscovery configurations
- Maintain accurate documentation of configurations, processes, and procedures related to Microsoft Purview
- Experience in implementing data governance policies and procedures to ensure compliance with regulatory requirements and organizational standards
- Ensure data quality and compliance by applying expertise in MDM and data governance principles, including data governance frameworks and practices, to ensure the relevancy, quality, security, and compliance of master data
- Develop and implement data integration solutions for metadata, data lineage, and data quality

Posted 2 months ago

Apply

6.0 - 12.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Job Requirements
Role/Job Title: Senior Data Analyst - Data Governance
Business: Data & Analytics
Function/Department: Data & Analytics
Place of Work: Mumbai

Job Purpose: The Senior Data Analyst (DG) will work within the Data & Analytics Office to implement the data governance framework, with a focus on improving data quality, standards, metrics, and processes; aligning data management practices with regulatory requirements; and understanding lineage, i.e. how data is produced, managed, and consumed within the Bank's business processes and systems.

Roles & Responsibilities:
- Demonstrate a strong understanding of data governance, data quality, data lineage, and metadata management concepts
- Participate in the design and optimization of the data quality governance framework, including processes, standards, and rules
- Design and implement data quality rules and monitoring mechanisms
- Analyze data quality issues, collaborate with business stakeholders on their resolution, and build recovery models across the enterprise
- Knowledge of DG technologies for data quality and metadata management (OvalEdge, Talend, Collibra, etc.)
- Support the development of centralized metadata repositories (business glossary, technical metadata, etc.), capture business/data quality rules, and design DQ reports and dashboards
- Improve data literacy among stakeholders
- Minimum 6 to 12 years of experience in data governance; banking domain preferred

Key Success Metrics: Successful implementation of the DQ framework across business lines; successful management of the metadata management program.

Posted 2 months ago

Apply

3.0 - 6.0 years

5 - 12 Lacs

Mumbai

Work from Office

Roles and Responsibilities Develop data models using Collibra, Cataloguing, and SQL to support business requirements. Design and implement data analysis solutions using Python, Power BI, Tableau for private equity investment banking clients. Collaborate with cross-functional teams to identify opportunities for process improvements through data insights. Manage large datasets by creating ETL processes and ensuring data quality through regular checks. Provide technical guidance on best practices for data management and analytics tools. Desired Candidate Profile: 3-6 years of experience in a similar role within Private Equity or Investment Banking industry. Strong proficiency in Python programming language along with expertise in cataloguing and SQL querying. Experience working with popular data visualization tools like Power BI or Tableau is essential.

Posted 2 months ago

Apply

130.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Description Manager Senior Data Engineer The Opportunity Based in Hyderabad, join a global healthcare biopharma company and be part of a 130- year legacy of success backed by ethical integrity, forward momentum, and an inspiring mission to achieve new milestones in global healthcare. Be part of an organisation driven by digital technology and data-backed approaches that support a diversified portfolio of prescription medicines, vaccines, and animal health products. Drive innovation and execution excellence. Be a part of a team with passion for using data, analytics, and insights to drive decision-making, and which creates custom software, allowing us to tackle some of the world's greatest health threats. Our Technology Centers focus on creating a space where teams can come together to deliver business solutions that save and improve lives. An integral part of our company’s IT operating model, Tech Centers are globally distributed locations where each IT division has employees to enable our digital transformation journey and drive business outcomes. These locations, in addition to the other sites, are essential to supporting our business and strategy. A focused group of leaders in each Tech Center helps to ensure we can manage and improve each location, from investing in growth, success, and well-being of our people, to making sure colleagues from each IT division feel a sense of belonging to managing critical emergencies. And together, we must leverage the strength of our team to collaborate globally to optimize connections and share best practices across the Tech Centers. Role Overview Our IT team operates as a business partner proposing ideas and innovative solutions that enable new organizational capabilities. We collaborate internationally to deliver the services and solutions that help everyone to be more productive and enable innovation. The candidate will work with a globally diverse set of teams which includes SAP Basis, Security, ABAP, SAP functional team members, Infrastructure team and other IT process partners providing support for existing and new initiatives. The candidate will work closely with and advise the SAP Technical Architect on architectural topics and new applications / technologies to be integrated. The candidate will lead some cross-functional projects, relied upon to answer complex questions, and assists with program-wide initiatives. Our organization is on a transformation journey and we envision using newer SAP technologies and infrastructure as part of this transformation and the candidate must have exposure to these new technologies. It is expected that the candidate will be able to both lead technical initiatives and be hands-on. Responsibilities Designs, builds, and maintains data pipeline architecture - ingest, process, and publish data for consumption. 
Batch-processes collected data and formats it in an optimized, analysis-ready way Ensures best-practice sharing across the organization Enables delivery of data-analytics projects Develops deep knowledge of the company's supported technology; understands the whole complexity/dependencies between multiple teams, platforms (people, technologies) Communicates intensively with other platforms/competencies to comprehend new trends and methodologies being implemented/considered within the company ecosystem Understands the customer and stakeholders' business needs/priorities and helps build solutions that support our business goals Establishes and manages close relationships with customers/stakeholders Has an overview of data engineering market developments to be able to come up with/explore new ways of delivering pipelines to increase their value/contribution Builds a “community of practice” leveraging experience from delivering complex analytics projects Is accountable for ensuring that the team delivers solutions with high quality standards, timeliness, compliance and excellent user experience Contributes to innovative experiments, specifically to idea generation, idea incubation and/or experimentation, identifying tangible and measurable criteria Qualifications Bachelor's degree in Computer Science, Data Science, Information Technology, Engineering or a related field. 5+ years of experience as a Data Engineer or in a similar role, with a strong portfolio of data projects. 3+ years of SQL experience, with the ability to write and optimize queries for large datasets. 1+ years of experience and proficiency in Python for data manipulation, automation, and pipeline development. Experience with Databricks, including creating notebooks and utilizing Spark for big data processing. Strong experience with data warehousing solutions (such as Snowflake), including schema design and performance optimization. Experience with data governance and quality management tools, particularly Collibra DQ. Strong analytical and problem-solving skills, with attention to detail. Who We Are We are known as Merck & Co., Inc., Rahway, New Jersey, USA in the United States and Canada and MSD everywhere else. For more than a century, we have been inventing for life, bringing forward medicines and vaccines for many of the world's most challenging diseases. Today, our company continues to be at the forefront of research to deliver innovative health solutions and advance the prevention and treatment of diseases that threaten people and animals around the world. What We Look For Imagine getting up in the morning for a job as important as helping to save and improve lives around the world. Here, you have that opportunity. You can put your empathy, creativity, digital mastery, or scientific genius to work in collaboration with a diverse group of colleagues who pursue and bring hope to countless people who are battling some of the most challenging diseases of our time. Our team is constantly evolving, so if you are among the intellectually curious, join us—and start making your impact today. Current Employees apply HERE Current Contingent Workers apply HERE Search Firm Representatives Please Read Carefully Merck & Co., Inc., Rahway, NJ, USA, also known as Merck Sharp & Dohme LLC, Rahway, NJ, USA, does not accept unsolicited assistance from search firms for employment opportunities. 
All CVs / resumes submitted by search firms to any employee at our company without a valid written search agreement in place for this position will be deemed the sole property of our company. No fee will be paid in the event a candidate is hired by our company as a result of an agency referral where no pre-existing agreement is in place. Where agency agreements are in place, introductions are position specific. Please, no phone calls or emails. Employee Status: Regular. Flexible Work Arrangements: Hybrid. Required Skills: Business Intelligence (BI), Database Administration, Data Engineering, Data Management, Data Modeling, Data Visualization, Design Applications, Information Management, Software Development, Software Development Life Cycle (SDLC), System Designs. Job Posting End Date: 07/22/2025. A job posting is effective until 11:59:59 PM on the day before the listed job posting end date; please ensure you apply no later than the day before the job posting end date. Requisition ID: R350689
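For illustration alongside the pipeline, Databricks/Spark, and warehousing requirements above: a minimal sketch of a batch ingest-and-publish step, assuming a Databricks/PySpark environment. The paths, table names, and columns are hypothetical placeholders, not details from the posting.

```python
# Minimal sketch of a batch ingest-and-publish step on Databricks/PySpark.
# Paths, table names, and column names are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("claims_batch_ingest").getOrCreate()

# Ingest: read a day's worth of raw CSV drops (path is an assumption).
raw = spark.read.option("header", True).csv("/mnt/raw/claims/2025-07-01/")

# Format: normalize column names and types so the data is analysis-ready.
clean = (
    raw.select([F.col(c).alias(c.strip().lower().replace(" ", "_")) for c in raw.columns])
       .withColumn("claim_amount", F.col("claim_amount").cast("double"))
       .withColumn("service_date", F.to_date("service_date", "yyyy-MM-dd"))
       .dropDuplicates(["claim_id"])
)

# Publish: write to a Delta table for downstream analytics consumption.
clean.write.format("delta").mode("append").saveAsTable("analytics.claims_daily")
```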

Posted 2 months ago

Apply

3.0+ years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Description: Manager, Senior Data Engineer. The Opportunity: Based in Hyderabad, join a global healthcare biopharma company and be part of a 130-year legacy of success backed by ethical integrity, forward momentum, and an inspiring mission to achieve new milestones in global healthcare. Be part of an organisation driven by digital technology and data-backed approaches that support a diversified portfolio of prescription medicines, vaccines, and animal health products. Drive innovation and execution excellence. Be part of a team with a passion for using data, analytics, and insights to drive decision-making, and which creates custom software, allowing us to tackle some of the world's greatest health threats. Our Technology Centers focus on creating a space where teams can come together to deliver business solutions that save and improve lives. An integral part of our company's IT operating model, Tech Centers are globally distributed locations where each IT division has employees to enable our digital transformation journey and drive business outcomes. These locations, in addition to the other sites, are essential to supporting our business and strategy. A focused group of leaders in each Tech Center helps to ensure we can manage and improve each location, from investing in the growth, success, and well-being of our people, to making sure colleagues from each IT division feel a sense of belonging, to managing critical emergencies. Together, we leverage the strength of our team to collaborate globally, optimize connections, and share best practices across the Tech Centers. Role Overview: Our technology teams operate as business partners, proposing ideas and innovative solutions that enable new organizational capabilities. We collaborate internationally to deliver services and solutions that help everyone be more productive and enable innovation. Responsibilities: Designs, builds, and maintains data pipeline architecture - ingests, processes, and publishes data for consumption. Batch-processes collected data and formats it in an optimized, analysis-ready form. Ensures best-practice sharing across the organization. Enables delivery of data-analytics projects. Develops deep knowledge of the company's supported technology; understands the full complexity of and dependencies between multiple teams and platforms (people, technologies). Communicates intensively with other platforms/competencies to understand new trends and methodologies being implemented or considered within the company ecosystem. Understands customer and stakeholder business needs/priorities and helps build solutions that support our business goals. Establishes and manages close relationships with customers/stakeholders. Maintains an overview of data engineering market developments in order to explore new ways of delivering pipelines and increase their value/contribution. Builds a "community of practice," leveraging experience from delivering complex analytics projects. Is accountable for ensuring that the team delivers solutions with high quality standards, timeliness, compliance, and an excellent user experience. Contributes to innovative experiments, specifically idea generation, idea incubation and/or experimentation, identifying tangible and measurable criteria. Qualifications: Bachelor's degree in Computer Science, Data Science, Information Technology, Engineering, or a related field. 3+ years of experience as a Data Engineer or in a similar role, with a strong portfolio of data projects.
3+ years of SQL experience, with the ability to write and optimize queries for large datasets. 1+ years of experience and proficiency in Python for data manipulation, automation, and pipeline development. Experience with Databricks, including creating notebooks and utilizing Spark for big data processing. Strong experience with data warehousing solutions (such as Snowflake), including schema design and performance optimization. Experience with data governance and quality management tools, particularly Collibra DQ. Strong analytical and problem-solving skills, with attention to detail. Who We Are: We are known as Merck & Co., Inc., Rahway, New Jersey, USA in the United States and Canada and MSD everywhere else. For more than a century, we have been inventing for life, bringing forward medicines and vaccines for many of the world's most challenging diseases. Today, our company continues to be at the forefront of research to deliver innovative health solutions and advance the prevention and treatment of diseases that threaten people and animals around the world. What We Look For: Imagine getting up in the morning for a job as important as helping to save and improve lives around the world. Here, you have that opportunity. You can put your empathy, creativity, digital mastery, or scientific genius to work in collaboration with a diverse group of colleagues who pursue and bring hope to countless people who are battling some of the most challenging diseases of our time. Our team is constantly evolving, so if you are among the intellectually curious, join us and start making your impact today. #HYDIT2025 Current Employees apply HERE. Current Contingent Workers apply HERE. Search Firm Representatives, Please Read Carefully: Merck & Co., Inc., Rahway, NJ, USA, also known as Merck Sharp & Dohme LLC, Rahway, NJ, USA, does not accept unsolicited assistance from search firms for employment opportunities. All CVs / resumes submitted by search firms to any employee at our company without a valid written search agreement in place for this position will be deemed the sole property of our company. No fee will be paid in the event a candidate is hired by our company as a result of an agency referral where no pre-existing agreement is in place. Where agency agreements are in place, introductions are position specific. Please, no phone calls or emails. Employee Status: Regular. Flexible Work Arrangements: Hybrid. Required Skills: Business, Business Intelligence (BI), Business Management, Contractor Management, Cost Reduction, Database Administration, Database Optimization, Data Engineering, Data Flows, Data Infrastructure, Data Management, Data Modeling, Data Optimization, Data Quality, Data Visualization, Design Applications, Information Management, Management Process, Operating Cost Reduction, Productivity Improvements, Project Engineering, Social Collaboration, Software Development, Software Development Life Cycle (SDLC) {+ 1 more}. Job Posting End Date: 08/20/2025. A job posting is effective until 11:59:59 PM on the day before the listed job posting end date; please ensure you apply no later than the day before the job posting end date. Requisition ID: R350684

Posted 2 months ago

Apply

3.0+ years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Description: Manager, Senior Data Engineer. The Opportunity: Based in Hyderabad, join a global healthcare biopharma company and be part of a 130-year legacy of success backed by ethical integrity, forward momentum, and an inspiring mission to achieve new milestones in global healthcare. Be part of an organisation driven by digital technology and data-backed approaches that support a diversified portfolio of prescription medicines, vaccines, and animal health products. Drive innovation and execution excellence. Be part of a team with a passion for using data, analytics, and insights to drive decision-making, and which creates custom software, allowing us to tackle some of the world's greatest health threats. Our Technology Centers focus on creating a space where teams can come together to deliver business solutions that save and improve lives. An integral part of our company's IT operating model, Tech Centers are globally distributed locations where each IT division has employees to enable our digital transformation journey and drive business outcomes. These locations, in addition to the other sites, are essential to supporting our business and strategy. A focused group of leaders in each Tech Center helps to ensure we can manage and improve each location, from investing in the growth, success, and well-being of our people, to making sure colleagues from each IT division feel a sense of belonging, to managing critical emergencies. Together, we leverage the strength of our team to collaborate globally, optimize connections, and share best practices across the Tech Centers. Role Overview: Our IT team operates as a business partner, proposing ideas and innovative solutions that enable new organizational capabilities. We collaborate internationally to deliver the services and solutions that help everyone be more productive and enable innovation. The candidate will work with a globally diverse set of teams that includes SAP Basis, Security, ABAP, SAP functional team members, the Infrastructure team, and other IT process partners, providing support for existing and new initiatives. The candidate will work closely with and advise the SAP Technical Architect on architectural topics and on new applications and technologies to be integrated. The candidate will lead some cross-functional projects, be relied upon to answer complex questions, and assist with program-wide initiatives. Our organization is on a transformation journey, and we envision using newer SAP technologies and infrastructure as part of this transformation; the candidate must have exposure to these new technologies and is expected to both lead technical initiatives and be hands-on. Responsibilities: Designs, builds, and maintains data pipeline architecture - ingests, processes, and publishes data for consumption.
Batch-processes collected data and formats it in an optimized, analysis-ready form. Ensures best-practice sharing across the organization. Enables delivery of data-analytics projects. Develops deep knowledge of the company's supported technology; understands the full complexity of and dependencies between multiple teams and platforms (people, technologies). Communicates intensively with other platforms/competencies to understand new trends and methodologies being implemented or considered within the company ecosystem. Understands customer and stakeholder business needs/priorities and helps build solutions that support our business goals. Establishes and manages close relationships with customers/stakeholders. Maintains an overview of data engineering market developments in order to explore new ways of delivering pipelines and increase their value/contribution. Builds a "community of practice," leveraging experience from delivering complex analytics projects. Is accountable for ensuring that the team delivers solutions with high quality standards, timeliness, compliance, and an excellent user experience. Contributes to innovative experiments, specifically idea generation, idea incubation and/or experimentation, identifying tangible and measurable criteria. Qualifications: Bachelor's degree in Computer Science, Data Science, Information Technology, Engineering, or a related field. 3+ years of experience as a Data Engineer or in a similar role, with a strong portfolio of data projects. 3+ years of SQL experience, with the ability to write and optimize queries for large datasets. 1+ years of experience and proficiency in Python for data manipulation, automation, and pipeline development. Experience with Databricks, including creating notebooks and utilizing Spark for big data processing. Strong experience with data warehousing solutions (such as Snowflake), including schema design and performance optimization. Experience with data governance and quality management tools, particularly Collibra DQ. Strong analytical and problem-solving skills, with attention to detail. Who We Are: We are known as Merck & Co., Inc., Rahway, New Jersey, USA in the United States and Canada and MSD everywhere else. For more than a century, we have been inventing for life, bringing forward medicines and vaccines for many of the world's most challenging diseases. Today, our company continues to be at the forefront of research to deliver innovative health solutions and advance the prevention and treatment of diseases that threaten people and animals around the world. What We Look For: Imagine getting up in the morning for a job as important as helping to save and improve lives around the world. Here, you have that opportunity. You can put your empathy, creativity, digital mastery, or scientific genius to work in collaboration with a diverse group of colleagues who pursue and bring hope to countless people who are battling some of the most challenging diseases of our time. Our team is constantly evolving, so if you are among the intellectually curious, join us and start making your impact today. #HYDIT2025 Current Employees apply HERE. Current Contingent Workers apply HERE. Search Firm Representatives, Please Read Carefully: Merck & Co., Inc., Rahway, NJ, USA, also known as Merck Sharp & Dohme LLC, Rahway, NJ, USA, does not accept unsolicited assistance from search firms for employment opportunities.
All CVs / resumes submitted by search firms to any employee at our company without a valid written search agreement in place for this position will be deemed the sole property of our company. No fee will be paid in the event a candidate is hired by our company as a result of an agency referral where no pre-existing agreement is in place. Where agency agreements are in place, introductions are position specific. Please, no phone calls or emails. Employee Status: Regular. Flexible Work Arrangements: Hybrid. Required Skills: Business, Business Intelligence (BI), Business Management, Contractor Management, Cost Reduction, Database Administration, Database Optimization, Data Engineering, Data Flows, Data Infrastructure, Data Management, Data Modeling, Data Optimization, Data Quality, Data Visualization, Design Applications, Information Management, Management Process, Operating Cost Reduction, Productivity Improvements, Project Engineering, Social Collaboration, Software Development, Software Development Life Cycle (SDLC) {+ 1 more}. Job Posting End Date: 07/22/2025. A job posting is effective until 11:59:59 PM on the day before the listed job posting end date; please ensure you apply no later than the day before the job posting end date. Requisition ID: R350685

Posted 2 months ago

Apply

4.0 - 10.0 years

0 Lacs

India

On-site

Job Description: As a Senior Data Governance Engineer, you will play a crucial role in the development and implementation of our data governance architecture and strategy. Mandatory Skills: ETL, Azure Databricks, PySpark or Python, data modeling, data governance. You will work closely with cross-functional teams to ensure the integrity, quality, and security of our data assets. Your expertise in various data governance tools and custom implementations will be pivotal in driving our data governance initiatives forward. Key areas of expertise include: ● Implement end-to-end data governance in medium to large-sized data projects. ● Implement, configure, and maintain data governance tools such as Collibra, Apache Atlas, Microsoft Purview, and BigID. ● Evaluate and recommend appropriate data governance tools and technologies based on business requirements. ● Define, implement, and monitor data quality rules and standards. ● Collaborate with data stewards, IT, legal, and business units to establish data governance processes. ● Provide guidance and support to data stewards. ● Work with business units to define, develop, and maintain business glossaries. ● Ensure compliance with regulatory requirements and internal data governance frameworks. ● Collaborate with IT, data management teams, and business units to ensure alignment of data governance objectives. ● Communicate data governance initiatives and policies effectively across the organization. Qualifications: ● Bachelor's or Master's degree in Computer Science, Information Systems, Data Management, or a related field. ● 4-10 years of experience in data governance, data management, or a related field. ● Proven experience with data governance tools such as Collibra, Apache Atlas, Microsoft Purview, and BigID, and with end-to-end data governance implementations. ● Experience with cloud data quality monitoring and management. ● Proficiency with cloud-native data services and tools on Azure and AWS. ● Strong understanding of data quality principles and experience in defining and implementing data quality rules. ● Experience in implementing and monitoring data quality remediation workflows to address data quality issues. ● Experience serving in a data steward role, with a thorough understanding of data stewardship responsibilities. ● Demonstrated experience in defining and maintaining business glossaries. ● Excellent analytical, problem-solving, and organizational skills. ● Strong communication and interpersonal skills, with the ability to work effectively with cross-functional teams. ● Knowledge of regulatory requirements related to data governance is a plus.
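As a generic illustration of the "define, implement, and monitor data quality rules" responsibility listed above (not the Collibra DQ or Purview product APIs, which are configured through their own interfaces), a small PySpark sketch of rule evaluation; the dataset, column names, and threshold are assumptions.

```python
# Generic data quality rule sketch (not tied to any specific governance tool):
# flag a governed dataset when completeness or validity falls below a threshold.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq_rules_demo").getOrCreate()
members = spark.table("governed.member_master")  # hypothetical governed dataset

total = members.count()
rules = {
    # rule name -> number of rows passing the rule
    "member_id_not_null": members.filter(F.col("member_id").isNotNull()).count(),
    "dob_not_in_future": members.filter(F.col("date_of_birth") <= F.current_date()).count(),
}

threshold = 0.99  # assumed minimum pass rate per rule
for rule, passed in rules.items():
    rate = passed / total if total else 0.0
    status = "PASS" if rate >= threshold else "FAIL"
    print(f"{rule}: {rate:.2%} ({status})")
```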

Posted 2 months ago

Apply

3.0 - 8.0 years

15 - 30 Lacs

Pune

Hybrid

Dear Candidate, this is with reference to Senior Business Intelligence Analyst openings at Wolters Kluwer, Pune. Kindly share your resume at jyoti.salvi@wolterskluwer.com. Job Specifications: Skillset Requirement: Looking for Data Governance professionals with experience in Collibra; experience in Microsoft Purview is highly preferred. Experience Range: Candidates with 2 to 8 years of relevant Data Governance experience. Primary Skills: Data Governance, Microsoft Purview, Collibra. Architecting, designing, and implementing data governance solutions using Microsoft Purview. Experience in data lifecycle management, including data retention, deletion, and archiving strategies using Microsoft Purview Data Lifecycle Management. Assist transitions to Microsoft Purview services, including setting up data lifecycle management and eDiscovery configurations. Maintain accurate documentation of configurations, processes, and procedures related to Microsoft Purview. Experience in implementing data governance policies and procedures to ensure compliance with regulatory requirements and organizational standards. Ensure data quality and compliance by applying expertise in MDM and data governance principles, including data governance frameworks and practices, to ensure the relevancy, quality, security, and compliance of master data. Develop and implement data integration solutions for metadata, data lineage, and data quality.
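For illustration of the kind of catalog integration work described above: a rough sketch of querying an Atlas-compatible data catalog for assets by keyword. Microsoft Purview's data catalog exposes such an API, but the account name, API version, exact endpoint shape, response fields, and token acquisition here are assumptions and placeholders, not confirmed details.

```python
# Rough sketch: search an Atlas-compatible data catalog for assets by keyword.
# Account name, API version, response fields, and the bearer token are
# placeholders / assumptions; real use would acquire an Azure AD token.
import requests

ACCOUNT = "contoso-purview"                      # hypothetical account name
BASE = f"https://{ACCOUNT}.purview.azure.com/catalog/api"
TOKEN = "<acquired-azure-ad-token>"              # placeholder

resp = requests.post(
    f"{BASE}/search/query?api-version=2022-08-01-preview",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={"keywords": "member_master", "limit": 10},
    timeout=30,
)
resp.raise_for_status()
for item in resp.json().get("value", []):
    print(item.get("qualifiedName"), "-", item.get("entityType"))
```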

Posted 2 months ago

Apply

1.0 - 3.0 years

0 Lacs

Coimbatore, Tamil Nadu, India

Remote

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. The Opportunity: The objective of our Digital Risk Consulting service is to support clients with the development, implementation, improvement, and modernization of their technology risk and compliance programs to address the constantly changing risk and technology landscape. Our solutions can be used by our clients to build confidence and trust with their customers, the overall market, and when required by regulation or contract. Your Key Responsibilities: You will operate as a team leader for engagements to help our clients develop and strengthen their IT risk and compliance programs. You will work directly with clients to review their IT processes and controls, remediate and implement controls, onboard new tools and services into risk and compliance frameworks, and assist with readiness for and adherence to new compliance regulations. Your responsibilities include both in-person and remote oversight and coaching of engagement team members, reporting to both senior engagement team members and client leadership, as well as partnering with our key client contacts to complete the engagement work. What You'll Do: Designing and implementing solutions to various data-related technical/compliance challenges such as DevSecOps, data strategy, data governance, data risks and relevant controls, data testing, data architecture, data platforms, data solution implementation, data quality, and data security to manage and mitigate risk. Leveraging data analytics tools/software to build robust and scalable solutions through data analysis and data visualizations using SQL, Python, and visualization tools. Designing and implementing comprehensive data analytics strategies to support business decision-making. Collecting, cleaning, and interpreting large datasets from multiple sources, ensuring completeness, accuracy, and integrity of data. Integrating and/or piloting next-generation technologies such as cloud platforms, machine learning, and Generative AI (GenAI). Developing custom scripts and algorithms to automate data processing and analysis and generate insights. Applying business/domain knowledge, including regulatory requirements and industry standards, to solve complex data-related challenges. Analyzing data to uncover trends and generate insights that can inform business decisions. Building and maintaining relationships across Engineering, Product, Operations, Internal Audit, external audit, and other external stakeholders to drive effective financial risk management. Working with DevSecOps, Security Assurance, Engineering, and Product teams to improve the efficiency of control environments and provide risk management through automation and process improvement. Bridging gaps between IT controls and business controls, including ITGCs and automated business controls.
Working with IA to ensure the complete control environment is managed. Working with emerging products to understand their risk profile and ensure an appropriate control environment is established. Implementing new processes and controls in response to changes in the business environment, such as new product introductions, changes in accounting standards, internal process changes, or reorganization. What You'll Need: Experience in data architecture, data management, data engineering, data science, or data analytics. Experience in building analytical queries and dashboards using SQL, NoSQL, Python, etc. Proficiency in SQL and quantitative analysis: you can deep-dive into large amounts of data, draw meaningful insights, dissect business issues, and draw actionable conclusions. Knowledge of tools in the following areas: scripting and programming (e.g., Python, SQL, R, Java, Scala); big data tools (e.g., Hadoop, Hive, Pig, Impala, Mahout); data management (e.g., Informatica, Collibra, SAP, Oracle, IBM); predictive analytics (e.g., Python, IBM SPSS, SAS Enterprise Miner, RPL, Matl); data visualization (e.g., Tableau, Power BI, TIBCO Spotfire, QlikView, SPSS); data mining (e.g., Microsoft SQL Server); cloud platforms (e.g., AWS, Azure, or Google Cloud). Ability to analyze complex processes to identify potential financial, operational, systems, and compliance risks across major finance cycles. Ability to assist management with the integration of security practices in the product development lifecycle (DevSecOps). Experience with homegrown applications in a microservices/DevOps environment. Experience with identifying potential security risks in platform environments and developing strategies to mitigate them. Experience with SOX readiness assessments and control implementation. Knowledge of DevOps practices, CI/CD pipelines, code management, and automation tools (e.g., Jenkins, Git, Phab, Artifactory, SonarQube, Selenium, Fortify, Acunetix, Prisma Cloud). Preferred experience in: managing technical data projects; leveraging data analytics tools/software to develop solutions and scripts; developing statistical modeling tools and techniques; developing and executing data governance frameworks or operating models; identifying data risks and designing and/or implementing appropriate controls; implementing data quality processes; developing data services and solutions in a cloud environment; designing data architecture; analyzing complex data sets and communicating findings effectively; process management, including process redesign and optimization; scripting languages (e.g., Python, Bash); and cloud platforms (e.g., AWS, Azure, GCP) and securing cloud-based applications/services. To qualify for the role, you must have: a bachelor's or master's degree; 1-3 years of experience working as an IT risk consultant, or equivalent data analytics experience; and experience applying relevant technical knowledge in at least one of the following engagement types: (a) risk consulting, (b) financial statement audits, (c) internal or operational audits, (d) IT compliance, and/or (e) Service Organization Controls Reporting engagements. We expect you to be available to travel outside of your assigned office location at least 50% of the time, plus commute within the region (where public transportation often is not available). Successful candidates must work in excess of standard hours when necessary. A valid passport is required.
Ideally, you'll also have: a bachelor's or master's degree in business, computer science, information systems, informatics, computer engineering, accounting, or a related discipline. CISA, CISSP, CISM, CPA, or CA certification is desired; non-certified hires are required to become certified to be eligible for promotion to Manager. EY | Building a better working world: EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
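As a small, illustrative sketch of the "collect, clean, and interpret large datasets" work this role describes, using Python/pandas; the file name, columns, and 90-day threshold are hypothetical assumptions, not details from the posting.

```python
# Illustrative only: clean a raw extract and summarize it for a risk review.
# File name, column names, and the 90-day inactivity threshold are hypothetical.
import pandas as pd

df = pd.read_csv("access_review_extract.csv", parse_dates=["last_login"])

# Clean: drop exact duplicates and normalize the department field.
df = df.drop_duplicates()
df["department"] = df["department"].str.strip().str.title()

# Interpret: flag accounts inactive for more than 90 days, by department.
cutoff = pd.Timestamp.today() - pd.Timedelta(days=90)
stale = (
    df[df["last_login"] < cutoff]
      .groupby("department")["user_id"]
      .nunique()
      .sort_values(ascending=False)
)
print(stale.head(10))
```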

Posted 2 months ago

Apply

1.0 - 3.0 years

0 Lacs

Gurugram, Haryana, India

Remote

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. The Opportunity: The objective of our Digital Risk Consulting service is to support clients with the development, implementation, improvement, and modernization of their technology risk and compliance programs to address the constantly changing risk and technology landscape. Our solutions can be used by our clients to build confidence and trust with their customers, the overall market, and when required by regulation or contract. Your Key Responsibilities: You will operate as a team leader for engagements to help our clients develop and strengthen their IT risk and compliance programs. You will work directly with clients to review their IT processes and controls, remediate and implement controls, onboard new tools and services into risk and compliance frameworks, and assist with readiness for and adherence to new compliance regulations. Your responsibilities include both in-person and remote oversight and coaching of engagement team members, reporting to both senior engagement team members and client leadership, as well as partnering with our key client contacts to complete the engagement work. What You'll Do: Designing and implementing solutions to various data-related technical/compliance challenges such as DevSecOps, data strategy, data governance, data risks and relevant controls, data testing, data architecture, data platforms, data solution implementation, data quality, and data security to manage and mitigate risk. Leveraging data analytics tools/software to build robust and scalable solutions through data analysis and data visualizations using SQL, Python, and visualization tools. Designing and implementing comprehensive data analytics strategies to support business decision-making. Collecting, cleaning, and interpreting large datasets from multiple sources, ensuring completeness, accuracy, and integrity of data. Integrating and/or piloting next-generation technologies such as cloud platforms, machine learning, and Generative AI (GenAI). Developing custom scripts and algorithms to automate data processing and analysis and generate insights. Applying business/domain knowledge, including regulatory requirements and industry standards, to solve complex data-related challenges. Analyzing data to uncover trends and generate insights that can inform business decisions. Building and maintaining relationships across Engineering, Product, Operations, Internal Audit, external audit, and other external stakeholders to drive effective financial risk management. Working with DevSecOps, Security Assurance, Engineering, and Product teams to improve the efficiency of control environments and provide risk management through automation and process improvement. Bridging gaps between IT controls and business controls, including ITGCs and automated business controls.
Working with IA to ensure the complete control environment is managed. Working with emerging products to understand their risk profile and ensure an appropriate control environment is established. Implementing new processes and controls in response to changes in the business environment, such as new product introductions, changes in accounting standards, internal process changes, or reorganization. What You'll Need: Experience in data architecture, data management, data engineering, data science, or data analytics. Experience in building analytical queries and dashboards using SQL, NoSQL, Python, etc. Proficiency in SQL and quantitative analysis: you can deep-dive into large amounts of data, draw meaningful insights, dissect business issues, and draw actionable conclusions. Knowledge of tools in the following areas: scripting and programming (e.g., Python, SQL, R, Java, Scala); big data tools (e.g., Hadoop, Hive, Pig, Impala, Mahout); data management (e.g., Informatica, Collibra, SAP, Oracle, IBM); predictive analytics (e.g., Python, IBM SPSS, SAS Enterprise Miner, RPL, Matl); data visualization (e.g., Tableau, Power BI, TIBCO Spotfire, QlikView, SPSS); data mining (e.g., Microsoft SQL Server); cloud platforms (e.g., AWS, Azure, or Google Cloud). Ability to analyze complex processes to identify potential financial, operational, systems, and compliance risks across major finance cycles. Ability to assist management with the integration of security practices in the product development lifecycle (DevSecOps). Experience with homegrown applications in a microservices/DevOps environment. Experience with identifying potential security risks in platform environments and developing strategies to mitigate them. Experience with SOX readiness assessments and control implementation. Knowledge of DevOps practices, CI/CD pipelines, code management, and automation tools (e.g., Jenkins, Git, Phab, Artifactory, SonarQube, Selenium, Fortify, Acunetix, Prisma Cloud). Preferred experience in: managing technical data projects; leveraging data analytics tools/software to develop solutions and scripts; developing statistical modeling tools and techniques; developing and executing data governance frameworks or operating models; identifying data risks and designing and/or implementing appropriate controls; implementing data quality processes; developing data services and solutions in a cloud environment; designing data architecture; analyzing complex data sets and communicating findings effectively; process management, including process redesign and optimization; scripting languages (e.g., Python, Bash); and cloud platforms (e.g., AWS, Azure, GCP) and securing cloud-based applications/services. To qualify for the role, you must have: a bachelor's or master's degree; 1-3 years of experience working as an IT risk consultant, or equivalent data analytics experience; and experience applying relevant technical knowledge in at least one of the following engagement types: (a) risk consulting, (b) financial statement audits, (c) internal or operational audits, (d) IT compliance, and/or (e) Service Organization Controls Reporting engagements. We expect you to be available to travel outside of your assigned office location at least 50% of the time, plus commute within the region (where public transportation often is not available). Successful candidates must work in excess of standard hours when necessary. A valid passport is required.
Ideally, you'll also have: a bachelor's or master's degree in business, computer science, information systems, informatics, computer engineering, accounting, or a related discipline. CISA, CISSP, CISM, CPA, or CA certification is desired; non-certified hires are required to become certified to be eligible for promotion to Manager. EY | Building a better working world: EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 2 months ago

Apply

1.0 - 3.0 years

0 Lacs

Chennai, Tamil Nadu, India

Remote

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. The Opportunity: The objective of our Digital Risk Consulting service is to support clients with the development, implementation, improvement, and modernization of their technology risk and compliance programs to address the constantly changing risk and technology landscape. Our solutions can be used by our clients to build confidence and trust with their customers, the overall market, and when required by regulation or contract. Your Key Responsibilities: You will operate as a team leader for engagements to help our clients develop and strengthen their IT risk and compliance programs. You will work directly with clients to review their IT processes and controls, remediate and implement controls, onboard new tools and services into risk and compliance frameworks, and assist with readiness for and adherence to new compliance regulations. Your responsibilities include both in-person and remote oversight and coaching of engagement team members, reporting to both senior engagement team members and client leadership, as well as partnering with our key client contacts to complete the engagement work. What You'll Do: Designing and implementing solutions to various data-related technical/compliance challenges such as DevSecOps, data strategy, data governance, data risks and relevant controls, data testing, data architecture, data platforms, data solution implementation, data quality, and data security to manage and mitigate risk. Leveraging data analytics tools/software to build robust and scalable solutions through data analysis and data visualizations using SQL, Python, and visualization tools. Designing and implementing comprehensive data analytics strategies to support business decision-making. Collecting, cleaning, and interpreting large datasets from multiple sources, ensuring completeness, accuracy, and integrity of data. Integrating and/or piloting next-generation technologies such as cloud platforms, machine learning, and Generative AI (GenAI). Developing custom scripts and algorithms to automate data processing and analysis and generate insights. Applying business/domain knowledge, including regulatory requirements and industry standards, to solve complex data-related challenges. Analyzing data to uncover trends and generate insights that can inform business decisions. Building and maintaining relationships across Engineering, Product, Operations, Internal Audit, external audit, and other external stakeholders to drive effective financial risk management. Working with DevSecOps, Security Assurance, Engineering, and Product teams to improve the efficiency of control environments and provide risk management through automation and process improvement. Bridging gaps between IT controls and business controls, including ITGCs and automated business controls.
Working with IA to ensure the complete control environment is managed. Working with emerging products to understand their risk profile and ensure an appropriate control environment is established. Implementing new processes and controls in response to changes in the business environment, such as new product introductions, changes in accounting standards, internal process changes, or reorganization. What You'll Need: Experience in data architecture, data management, data engineering, data science, or data analytics. Experience in building analytical queries and dashboards using SQL, NoSQL, Python, etc. Proficiency in SQL and quantitative analysis: you can deep-dive into large amounts of data, draw meaningful insights, dissect business issues, and draw actionable conclusions. Knowledge of tools in the following areas: scripting and programming (e.g., Python, SQL, R, Java, Scala); big data tools (e.g., Hadoop, Hive, Pig, Impala, Mahout); data management (e.g., Informatica, Collibra, SAP, Oracle, IBM); predictive analytics (e.g., Python, IBM SPSS, SAS Enterprise Miner, RPL, Matl); data visualization (e.g., Tableau, Power BI, TIBCO Spotfire, QlikView, SPSS); data mining (e.g., Microsoft SQL Server); cloud platforms (e.g., AWS, Azure, or Google Cloud). Ability to analyze complex processes to identify potential financial, operational, systems, and compliance risks across major finance cycles. Ability to assist management with the integration of security practices in the product development lifecycle (DevSecOps). Experience with homegrown applications in a microservices/DevOps environment. Experience with identifying potential security risks in platform environments and developing strategies to mitigate them. Experience with SOX readiness assessments and control implementation. Knowledge of DevOps practices, CI/CD pipelines, code management, and automation tools (e.g., Jenkins, Git, Phab, Artifactory, SonarQube, Selenium, Fortify, Acunetix, Prisma Cloud). Preferred experience in: managing technical data projects; leveraging data analytics tools/software to develop solutions and scripts; developing statistical modeling tools and techniques; developing and executing data governance frameworks or operating models; identifying data risks and designing and/or implementing appropriate controls; implementing data quality processes; developing data services and solutions in a cloud environment; designing data architecture; analyzing complex data sets and communicating findings effectively; process management, including process redesign and optimization; scripting languages (e.g., Python, Bash); and cloud platforms (e.g., AWS, Azure, GCP) and securing cloud-based applications/services. To qualify for the role, you must have: a bachelor's or master's degree; 1-3 years of experience working as an IT risk consultant, or equivalent data analytics experience; and experience applying relevant technical knowledge in at least one of the following engagement types: (a) risk consulting, (b) financial statement audits, (c) internal or operational audits, (d) IT compliance, and/or (e) Service Organization Controls Reporting engagements. We expect you to be available to travel outside of your assigned office location at least 50% of the time, plus commute within the region (where public transportation often is not available). Successful candidates must work in excess of standard hours when necessary. A valid passport is required.
Ideally, you'll also have: a bachelor's or master's degree in business, computer science, information systems, informatics, computer engineering, accounting, or a related discipline. CISA, CISSP, CISM, CPA, or CA certification is desired; non-certified hires are required to become certified to be eligible for promotion to Manager. EY | Building a better working world: EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 2 months ago

Apply

1.0 - 3.0 years

0 Lacs

Kolkata, West Bengal, India

Remote

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. The Opportunity: The objective of our Digital Risk Consulting service is to support clients with the development, implementation, improvement, and modernization of their technology risk and compliance programs to address the constantly changing risk and technology landscape. Our solutions can be used by our clients to build confidence and trust with their customers, the overall market, and when required by regulation or contract. Your Key Responsibilities: You will operate as a team leader for engagements to help our clients develop and strengthen their IT risk and compliance programs. You will work directly with clients to review their IT processes and controls, remediate and implement controls, onboard new tools and services into risk and compliance frameworks, and assist with readiness for and adherence to new compliance regulations. Your responsibilities include both in-person and remote oversight and coaching of engagement team members, reporting to both senior engagement team members and client leadership, as well as partnering with our key client contacts to complete the engagement work. What You'll Do: Designing and implementing solutions to various data-related technical/compliance challenges such as DevSecOps, data strategy, data governance, data risks and relevant controls, data testing, data architecture, data platforms, data solution implementation, data quality, and data security to manage and mitigate risk. Leveraging data analytics tools/software to build robust and scalable solutions through data analysis and data visualizations using SQL, Python, and visualization tools. Designing and implementing comprehensive data analytics strategies to support business decision-making. Collecting, cleaning, and interpreting large datasets from multiple sources, ensuring completeness, accuracy, and integrity of data. Integrating and/or piloting next-generation technologies such as cloud platforms, machine learning, and Generative AI (GenAI). Developing custom scripts and algorithms to automate data processing and analysis and generate insights. Applying business/domain knowledge, including regulatory requirements and industry standards, to solve complex data-related challenges. Analyzing data to uncover trends and generate insights that can inform business decisions. Building and maintaining relationships across Engineering, Product, Operations, Internal Audit, external audit, and other external stakeholders to drive effective financial risk management. Working with DevSecOps, Security Assurance, Engineering, and Product teams to improve the efficiency of control environments and provide risk management through automation and process improvement. Bridging gaps between IT controls and business controls, including ITGCs and automated business controls.
Working with IA to ensure the complete control environment is managed. Working with emerging products to understand their risk profile and ensure an appropriate control environment is established. Implementing new processes and controls in response to changes in the business environment, such as new product introductions, changes in accounting standards, internal process changes, or reorganization. What You'll Need: Experience in data architecture, data management, data engineering, data science, or data analytics. Experience in building analytical queries and dashboards using SQL, NoSQL, Python, etc. Proficiency in SQL and quantitative analysis: you can deep-dive into large amounts of data, draw meaningful insights, dissect business issues, and draw actionable conclusions. Knowledge of tools in the following areas: scripting and programming (e.g., Python, SQL, R, Java, Scala); big data tools (e.g., Hadoop, Hive, Pig, Impala, Mahout); data management (e.g., Informatica, Collibra, SAP, Oracle, IBM); predictive analytics (e.g., Python, IBM SPSS, SAS Enterprise Miner, RPL, Matl); data visualization (e.g., Tableau, Power BI, TIBCO Spotfire, QlikView, SPSS); data mining (e.g., Microsoft SQL Server); cloud platforms (e.g., AWS, Azure, or Google Cloud). Ability to analyze complex processes to identify potential financial, operational, systems, and compliance risks across major finance cycles. Ability to assist management with the integration of security practices in the product development lifecycle (DevSecOps). Experience with homegrown applications in a microservices/DevOps environment. Experience with identifying potential security risks in platform environments and developing strategies to mitigate them. Experience with SOX readiness assessments and control implementation. Knowledge of DevOps practices, CI/CD pipelines, code management, and automation tools (e.g., Jenkins, Git, Phab, Artifactory, SonarQube, Selenium, Fortify, Acunetix, Prisma Cloud). Preferred experience in: managing technical data projects; leveraging data analytics tools/software to develop solutions and scripts; developing statistical modeling tools and techniques; developing and executing data governance frameworks or operating models; identifying data risks and designing and/or implementing appropriate controls; implementing data quality processes; developing data services and solutions in a cloud environment; designing data architecture; analyzing complex data sets and communicating findings effectively; process management, including process redesign and optimization; scripting languages (e.g., Python, Bash); and cloud platforms (e.g., AWS, Azure, GCP) and securing cloud-based applications/services. To qualify for the role, you must have: a bachelor's or master's degree; 1-3 years of experience working as an IT risk consultant, or equivalent data analytics experience; and experience applying relevant technical knowledge in at least one of the following engagement types: (a) risk consulting, (b) financial statement audits, (c) internal or operational audits, (d) IT compliance, and/or (e) Service Organization Controls Reporting engagements. We expect you to be available to travel outside of your assigned office location at least 50% of the time, plus commute within the region (where public transportation often is not available). Successful candidates must work in excess of standard hours when necessary. A valid passport is required.
Ideally, you'll also have: a bachelor's or master's degree in business, computer science, information systems, informatics, computer engineering, accounting, or a related discipline. CISA, CISSP, CISM, CPA, or CA certification is desired; non-certified hires are required to become certified to be eligible for promotion to Manager. EY | Building a better working world: EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 2 months ago

Apply