
323 Collibra Jobs - Page 9

JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


TCS presents an excellent opportunity for a GCP Data Engineer.
Job Location: Pan India
Experience required: 3-15 Yrs
Skills: GCP Data Engineer

Job Description:

Must-have skills
● Experience with Google Cloud Platform management, BigQuery, Cloud SQL, and the ability to administer Google Cloud solutions (see the sketch below)
● Experience in Collibra and Ataccama
● Ability to provide resolution for critical issues
● Experience on end-to-end development projects
● Exposure to Airflow, Composer, BigQuery, APIs, Cloud Functions and GitHub
● Strong analytical skills to comprehend business requirements using Python
● Strong SQL skills
● Strong communication skills and stakeholder engagement abilities

Key Responsibilities
● Excellent communication and client-facing skills/experience to complement technical skills/experience
● Understand requirements, and design and develop solutions independently
● Coordinate with the SSE team to complete the analysis, design and other phases
● Requirement gathering, analysis, design, and finalization of to-be processes, then creating the appropriate solution design and finalizing the architecture using Google Cloud Platform
● Review project artifacts and handle client interfacing
● Experience in data migration projects involving GCP
● Hands-on experience with scheduler tools and JIRA
● Extensive experience in developing SQL scripts
● Knowledge of GCP architecture
● Experience performing ETL operations through scripts and terminal commands
● Team up to advance organizational data quality and collaboration
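As a small, hedged illustration of the BigQuery work this posting lists, here is a sketch using the google-cloud-bigquery Python client to run one data-quality query; the project, dataset, and table names are hypothetical placeholders.

```python
# A hedged sketch: run one data-quality query through the
# google-cloud-bigquery client. Project, dataset, and table names
# are hypothetical; assumes application-default credentials.
from google.cloud import bigquery

client = bigquery.Client(project="my-analytics-project")  # hypothetical project

sql = """
    SELECT COUNT(*) AS null_ids
    FROM `my-analytics-project.sales.orders`  -- hypothetical table
    WHERE customer_id IS NULL
"""

rows = client.query(sql).result()     # runs the job and waits for it
null_ids = next(iter(rows)).null_ids  # single-row result
print(f"Rows with NULL customer_id: {null_ids}")
```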

Posted 2 weeks ago

Apply

7.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site


Requirements

Description and Requirements

Position Summary
The resource is responsible for managing BMC Defender, BMC MainView, Cubus, Product Builder and the HP PX Calculator, which includes implementation, configuration, maintenance, upgrades, patches, administration, performance tuning, and identifying repetitive tasks to automate.

Job Responsibilities
- Good experience in application builds on both Windows and Linux infrastructure, with the ability to lead a project
- Good knowledge of performance tuning
- Strong collaboration with team members
- Capable of coaching new team members
- Participate in cross-departmental efforts
- Lead initiatives within the community of practice
- Good decision-making skills; take ownership of the deliverables from the entire team
- Able to generate and present reports to leadership
- Coach other team members and bring them up to speed
- Willing to work in rotational shifts
- Good communication skills, with the ability to communicate clearly and effectively

Knowledge, Skills and Abilities

Education: Bachelor's degree in Computer Science, Information Systems, or a related field

Experience: 7+ years of total experience, including at least 4 years of experience in application builds on both Windows and Linux infrastructure with the ability to lead a project
- BMC MainView
- BMC Defender
- Trillium
- Linux administration
- Network DNS and domain setup
- Web server scripting
- Ansible; Ansible Windows
- Azure Pipelines
- Debugging
- Elastic
- Experience creating change tickets and working on tasks in ServiceNow

Good to have: Python; Collibra Data Governance - Edge (K3s application); BMC CorreLog SIEM Agent/Server for IBM z/OS; application debugging; Azure DevOps

About MetLife
Recognized on Fortune magazine's list of the 2024 "World's Most Admired Companies" and Fortune World's 25 Best Workplaces for 2024, MetLife, through its subsidiaries and affiliates, is one of the world's leading financial services companies, providing insurance, annuities, employee benefits and asset management to individual and institutional customers. With operations in more than 40 markets, we hold leading positions in the United States, Latin America, Asia, Europe, and the Middle East. Our purpose is simple: to help our colleagues, customers, communities, and the world at large create a more confident future. United by purpose and guided by empathy, we're inspired to transform the next century in financial services. At MetLife, it's #AllTogetherPossible. Join us!

Posted 2 weeks ago

Apply

0 years

0 Lacs

Gurgaon, Haryana, India

On-site


Key Responsibilities
- Collibra Implementation & Management: Administer and optimize Collibra for insurance data governance, ensuring seamless integration with underwriting, claims, and policy administration systems.
- Regulatory Compliance: Align data governance strategies with insurance regulations, including NAIC, GDPR, and CCPA, ensuring data privacy and security.
- Data Quality & Integrity: Develop frameworks for risk assessment, fraud prevention, and actuarial data management to enhance operational efficiency.
- Metadata & Lineage Management: Establish processes for tracking data flow across policy issuance, claims processing, and financial reporting.
- Stakeholder Collaboration: Work closely with actuaries, underwriters, claims analysts, and IT teams to standardize data governance policies.
- Risk & Analytics Integration: Ensure governance supports AI-driven predictive modeling, risk analytics, and loss mitigation strategies.
- Training & Documentation: Educate employees on best practices in insurance-specific data stewardship and compliance.

Qualifications
- Bachelor's degree in Business, Information Technology, Data Analytics, or a related field (Master's degree preferred).
- Experience in P&C insurance data governance, risk management, and regulatory compliance.
- Strong expertise in Collibra, metadata management, and data lineage tracking.
- Familiarity with insurance underwriting, claims processing, and actuarial data analysis.
- Knowledge of data privacy laws and security protocols (SOX, GDPR, CCPA).

Posted 2 weeks ago

Apply

0 years

0 Lacs

Mumbai Metropolitan Region

On-site


Job Description

Are You Ready to Make It Happen at Mondelēz International?

Join our Mission to Lead the Future of Snacking. Make It With Pride.

Together with analytics team leaders you will support our business with excellent data models to uncover trends that can drive long-term business results.

How You Will Contribute
You will:
- Execute the business analytics agenda in conjunction with analytics team leaders
- Work with best-in-class external partners who leverage analytics tools and processes
- Use models/algorithms to uncover signals, patterns and trends that drive long-term business performance
- Execute the business analytics agenda using a methodical approach that conveys to stakeholders what business analytics will deliver

What You Will Bring
A desire to drive your future and accelerate your career, and the following experience and knowledge:
- Using data analysis to make recommendations to analytics leaders
- Understanding of best-in-class analytics practices
- Knowledge of key performance indicators (KPIs) and scorecards
- Knowledge of BI tools like Tableau, Excel, Alteryx, R, Python, etc. is a plus

In This Role
As a DaaS Data Engineer, you will have the opportunity to design and build scalable, secure, and cost-effective cloud-based data solutions. You will develop and maintain data pipelines to extract, transform, and load data into data warehouses or data lakes, ensuring data quality and validation processes to maintain data accuracy and integrity. You will ensure efficient data storage and retrieval for optimal performance, and collaborate closely with data teams, product owners, and other stakeholders to stay updated with the latest cloud technologies and best practices.

Role & Responsibilities:
- Design and Build: Develop and implement scalable, secure, and cost-effective cloud-based data solutions.
- Manage Data Pipelines: Develop and maintain data pipelines to extract, transform, and load data into data warehouses or data lakes (see the sketch below).
- Ensure Data Quality: Implement data quality and validation processes to ensure data accuracy and integrity.
- Optimize Data Storage: Ensure efficient data storage and retrieval for optimal performance.
- Collaborate and Innovate: Work closely with data teams and product owners, and stay updated with the latest cloud technologies and best practices.

Technical Requirements:
- Programming: Python
- Database: SQL, PL/SQL, PostgreSQL, BigQuery, stored procedures/routines
- ETL & Integration: AecorSoft, Talend, DBT, Databricks (optional), Fivetran
- Data Warehousing: SCD, schema types, data marts
- Visualization: Power BI (optional), Tableau (optional), Looker
- GCP Cloud Services: BigQuery, GCS
- Supply Chain: IMS and shipment functional knowledge good to have
- Supporting Technologies: Erwin, Collibra, Data Governance, Airflow

Soft Skills:
- Problem-Solving: The ability to identify and solve complex data-related challenges.
- Communication: Effective communication skills to collaborate with product owners, analysts, and stakeholders.
- Analytical Thinking: The capacity to analyze data and draw meaningful insights.
- Attention to Detail: Meticulousness in data preparation and pipeline development.
- Adaptability: The ability to stay updated with emerging technologies and trends in the data engineering field.

Within-country relocation support is available, and for candidates voluntarily moving internationally some minimal support is offered through our Volunteer International Transfer Policy.

Business Unit Summary
At Mondelēz International, our purpose is to empower people to snack right by offering the right snack, for the right moment, made the right way. That means delivering a broad range of delicious, high-quality snacks that nourish life's moments, made with sustainable ingredients and packaging that consumers can feel good about. We have a rich portfolio of strong brands globally and locally, including many household names such as Oreo, belVita and LU biscuits; Cadbury Dairy Milk, Milka and Toblerone chocolate; Sour Patch Kids candy and Trident gum. We are proud to hold the top position globally in biscuits, chocolate and candy and the second top position in gum. Our 80,000 makers and bakers are located in more than 80 countries and we sell our products in over 150 countries around the world. Our people are energized for growth and critical to us living our purpose and values. We are a diverse community that can make things happen, and happen fast.

Mondelēz International is an equal opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, religion, gender, sexual orientation or preference, gender identity, national origin, disability status, protected veteran status, or any other characteristic protected by law.

Job Type: Regular. Analytics & Modelling. Analytics & Data Science.
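As a small illustration of the pipeline work described above, here is a minimal Apache Airflow DAG sketch for a daily extract-transform-load run, assuming Airflow 2.x; the DAG id and task bodies are hypothetical placeholders.

```python
# A minimal sketch of a daily ETL DAG, assuming Apache Airflow 2.x.
# DAG id, task names, and task bodies are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull raw shipment data from the source system")

def transform():
    print("clean, validate and conform the records")

def load():
    print("load the conformed records into the warehouse")

with DAG(
    dag_id="daily_shipment_etl",   # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    t_extract >> t_transform >> t_load
```

Splitting the stages into separate operators keeps each one independently retryable, which is the usual reason ETL is broken into tasks rather than run as one script.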

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


About McDonald's: One of the world's largest employers with locations in more than 100 countries, McDonald's Corporation has corporate opportunities in Hyderabad. Our global offices serve as dynamic innovation and operations hubs designed to expand McDonald's global talent base and in-house expertise. Our new office in Hyderabad will bring together knowledge across business, technology, analytics, and AI, accelerating our ability to deliver impactful solutions for the company and our customers across the globe.

Position Summary: Functional Data Stewardship Manager (Manager, Data Stewardship_G4_EDAA0075)
As a Functional Data Stewardship Manager, you will be crucial in implementing McDonald's data governance strategy across various business functions. This position ensures that enterprise data is reliable and clearly defined, aligning with the organization's business priorities and compliance standards. You will collaborate with global data owners and stewards to enforce governance policies, maintain data quality, and serve as a subject matter expert on functional data assets. This role combines strategic thinking with hands-on stewardship, guaranteeing that high-quality data is available to support essential processes and analytics throughout the organization.

Who we're looking for:

Primary Responsibilities:
- Functional Data Domain Leadership: Act as the primary data steward for assigned functional domains (e.g., HR, Finance, Marketing), maintaining oversight of data definitions, ownership, lineage, and business rules.
- Policy Execution & Standards Alignment: Collaborate with data governance leads to implement enterprise data governance policies, standards, and classification frameworks in day-to-day operations.
- Issue Resolution & Continuous Improvement: Proactively identify data issues and inconsistencies, partner with business and technical teams to remediate them, and provide feedback to strategic governance forums for policy and tool enhancement.

Additional Responsibilities:
- Drive adoption of governance practices and act as a data SME for business and technology teams within your domain.
- Document and socialize functional data flows and lifecycle processes, ensuring accuracy and traceability.
- Champion data quality by developing data certification processes and monitoring data health metrics.
- Collaborate with global markets and enterprise teams to ensure functional data consistency and stewardship scalability.
- Maintain expertise in regulatory compliance, security, and privacy standards relevant to your domain.

Mandatory Skills:
- Bachelor's degree in Information Technology, Data Science, Business, or a related field
- 5+ years of experience in data stewardship, governance, or data management roles
- Experience with HR or Finance business areas' data requirements, usage, sources, reports, and metrics
- Strong expertise in any two of Collibra, Reltio, SuccessFactors, or Oracle
- Strong understanding of business processes and how data flows across systems
- Experience defining and maintaining business glossaries and metadata standards (see the sketch below)
- Able to build relationships with stakeholders and conduct information-gathering sessions with business and technical SMEs
- Familiarity with enterprise tools such as Collibra, Reltio, SuccessFactors, or Oracle
- Strong analytical, documentation, and stakeholder engagement skills
- Able to review and validate English-language metric calculation descriptions at a detailed level

Work location: Hyderabad, India
Work pattern: Full-time role
Work mode: Hybrid

Additional Information:
McDonald's is committed to providing qualified individuals with disabilities with reasonable accommodations to perform the essential functions of their jobs. McDonald's provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to sex, sex stereotyping, pregnancy (including pregnancy, childbirth, and medical conditions related to pregnancy, childbirth, or breastfeeding), race, color, religion, ancestry or national origin, age, disability status, medical condition, marital status, sexual orientation, gender, gender identity, gender expression, transgender status, protected military or veteran status, citizenship status, genetic information, or any other characteristic protected by federal, state or local laws. This policy applies to all terms and conditions of employment, including recruiting, hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation and training.

McDonald's Capability Center India Private Limited ("McDonald's in India") is a proud equal opportunity employer and is committed to hiring a diverse workforce and sustaining an inclusive culture. At McDonald's in India, employment decisions are based on merit, job requirements, and business needs, and all qualified candidates are considered for employment. McDonald's in India does not discriminate based on race, religion, colour, age, gender, marital status, nationality, ethnic origin, sexual orientation, political affiliation, veteran status, disability status, medical history, parental status, genetic information, or any other basis protected under state or local laws. Nothing in this job posting or description should be construed as an offer or guarantee of employment.
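As a purely illustrative aside on the business-glossary work this role mentions, here is a small sketch of how a glossary entry might be modeled in code; every field name is a hypothetical choice, not McDonald's or any vendor's schema.

```python
# An illustrative sketch of how a business glossary entry might be
# modeled; the fields are hypothetical, not a real governance schema.
from dataclasses import dataclass, field

@dataclass
class GlossaryTerm:
    name: str                       # business term, e.g. "Net Sales"
    definition: str                 # agreed business definition
    domain: str                     # functional domain, e.g. "Finance"
    steward: str                    # accountable data steward
    source_systems: list[str] = field(default_factory=list)
    status: str = "draft"           # draft / approved / deprecated

term = GlossaryTerm(
    name="Net Sales",
    definition="Gross sales minus returns, discounts and allowances.",
    domain="Finance",
    steward="finance.data.steward@example.com",
    source_systems=["Oracle GL"],
)
print(term)
```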

Posted 2 weeks ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Job Title: Senior Data Quality Assurance Analyst

Career Level: D

Introduction to role:
Are you ready to play a pivotal role in transforming data quality within a global pharmaceutical environment? Join our Operations Global Data Office as a Senior Data Quality Assurance Analyst, where you'll be instrumental in developing solutions to enhance data quality within our SAP systems. Your expertise will drive the creation of data quality rules and dashboards, ensuring alignment with standards and governance policies. Collaborate with data stewards and business owners to develop code for monitoring and measuring data quality, focusing on root cause analysis and prevention of data issues. Are you solution-oriented and passionate about testing? This is your chance to make a significant impact!

Accountabilities:
- Develop and support the creation of data quality dashboards in Power BI by extracting data from various global SAP systems into Snowflake (see the sketch below).
- Work extensively with collaborators to define requirements for continuous data quality monitoring.
- Provide extensive data analysis and profiling across a wide range of data objects.
- Develop and implement the data quality framework and operating model.
- Focus on high levels of process automation to ensure data and results are up to date.
- Conduct extensive data analysis to detect incorrect patterns in critical data early.
- Facilitate matching or linking multiple data sources together for continuous DQ monitoring.
- Embed ongoing data quality monitoring by setting up mechanisms to track issues and trends.
- Conduct root cause analysis to understand the causes of poor-quality data.
- Train, coach, and support data owners and stewards in managing data quality.

Essential Skills/Experience:
- Experience developing and supporting the creation of data quality dashboards in Power BI, extracting data from various global SAP systems into Snowflake, and developing rules for identifying DQ issues using Acceldata or similar.
- Demonstrated experience and domain expertise within data management disciplines, including the three pillars of data quality, data governance and data architecture.
- Advanced programming skills in T-SQL or similar, to support data quality rule creation.
- Advanced data profiling and analysis skills, evidenced by use of at least one data profiling and analysis tool, for example Adera, Dataiku or Acceldata.
- Strong ETL automation and reconciliation experience.
- Expert in extracting, manipulating and joining data in all its various formats.
- Excellent visualisation experience, using Power BI or similar, for monitoring and reporting data quality issues. A key aspect of the role is to create self-serve data quality dashboards for the business to use for defect remediation and trending.
- Excellent written and verbal communication skills with the ability to influence others to achieve objectives.
- Experience in Snowflake or similar for data lakes.
- Strong desire to improve the quality of data and to identify the causes impacting good data quality.
- Experience of business and IT partnering for the implementation of data quality KPIs and visualisations.
- Strong team management skills with good attention to detail.
- Ability to work in a fast-paced, dynamic environment and manage multiple streams of work simultaneously.
- Experience of working in a global organisation, preferably within the pharmaceutical industry.
- Experience of working in global change projects.
- Extensive knowledge of data quality, with the ability to develop and mature the data quality operating model and framework.
- Knowledge of at least one standard data quality tool, for example Acceldata, Alteryx, Aperture, Trillium, Ataccama or SAS Viya.

Desirable Skills/Experience:
- Use of a data lineage or governance tool or similar, for example Talend or Collibra.
- Experience working in a complex MDG SAP data environment.
- Experience of any of the following for data cleansing: Winshuttle or Aperture.
- Working within a lean environment, and knowledge of data governance methodologies and standards.
- Knowledge of automation and scheduling tools.
- Extensive knowledge of risk and data compliance.
- Experience in data observability using AI pattern detection.

When we put unexpected teams in the same room, we unleash bold thinking with the power to inspire life-changing medicines. In-person working gives us the platform we need to connect, work at pace and challenge perceptions. That's why we work, on average, a minimum of three days per week from the office. But that doesn't mean we're not flexible. We balance the expectation of being in the office while respecting individual flexibility. Join us in our unique and ambitious world.

At AstraZeneca, innovation is at the heart of everything we do. We embrace change by trialling new solutions with patients and business needs in mind. Our diverse workforce is united by curiosity, sharing findings, and scaling fast. Be part of a digitally enabled journey that impacts society, the planet, and our business by delivering life-changing medicines. Ready to make a difference? Apply now!
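To make the Snowflake-based data quality rules concrete, here is a minimal sketch assuming the snowflake-connector-python package; every connection parameter, table, and column name is a hypothetical placeholder.

```python
# A minimal sketch of one data quality rule run against Snowflake,
# assuming the snowflake-connector-python package. All connection
# parameters and table names are hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",     # hypothetical
    user="dq_service_user",   # hypothetical
    password="***",
    warehouse="DQ_WH",
    database="SAP_STAGING",
    schema="MATERIAL",
)

# Completeness rule: material master records must carry a base unit.
sql = """
    SELECT COUNT(*) FROM MATERIAL_MASTER
    WHERE BASE_UNIT_OF_MEASURE IS NULL
"""

cur = conn.cursor()
try:
    failing_rows = cur.execute(sql).fetchone()[0]
    print(f"Completeness rule failures: {failing_rows}")
finally:
    cur.close()
    conn.close()
```

In practice the failure count would feed a results table that a Power BI dashboard reads, rather than a print statement.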

Posted 2 weeks ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site


Total Yrs. of Experience*: 10+
Relevant Yrs. of Experience*: 8+

Detailed JD (Roles and Responsibilities): Lead Consultant
An experienced big data professional who will technically lead the team in design and development, and be responsible for team delivery.

Mandatory skills*:
- HDFS, Ozone, Hive, Impala, Spark, Atlas, Ranger
- Kafka, Flink, Spark Streaming (see the sketch below)
- Python/PySpark
- Excellent communication
- Experience in designing data landscapes

Desired skills*:
- GraphQL, Venafi (certificate management), Collibra, Azure DevOps
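To illustrate the streaming side of this stack, a minimal PySpark Structured Streaming sketch reading from Kafka follows; the broker address and topic are hypothetical, and the spark-sql-kafka connector is assumed to be on the classpath.

```python
# A minimal sketch of a Spark Structured Streaming job reading from
# Kafka. Broker address and topic are hypothetical; assumes the
# spark-sql-kafka connector package is available to the session.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("kafka_stream_sketch").getOrCreate()

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
    .option("subscribe", "orders")                     # hypothetical topic
    .load()
    .select(col("key").cast("string"), col("value").cast("string"))
)

# Write the raw stream to the console for inspection.
query = events.writeStream.format("console").outputMode("append").start()
query.awaitTermination()
```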

Posted 2 weeks ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Position Type: Full time
Type of Hire: Experienced (relevant combination of work and education)
Education Desired: Bachelor of Business Administration
Travel Percentage: 0%

Are you curious, motivated, and forward-thinking? At FIS you'll have the opportunity to work on some of the most challenging and relevant issues in financial services and technology. Our talented people empower us, and we believe in being part of a team that is open, collaborative, entrepreneurial, passionate and above all fun.

About The Team
Corporate Compliance operates within the 2nd Line of Defence and is part of the FIS RISC organization. Our mission is to promote ethical business practices and to support the Company and its employees in understanding and complying with the laws and regulations applicable to our business, products and services. Led by the Chief Compliance Officer, we help the FIS LOBs remain compliant in a fast-changing regulatory environment, we assess the controls in place to ensure compliance, and we lead enterprise-wide processes to manage certain regulatory risks such as Privacy, AML, and Code of Ethics compliance.

What You Will Be Doing
- Support the writing of documentation, including data governance workflows, standards, guidelines, taxonomies, operating procedures, and corporate data policies.
- Develop and optimize complex SQL queries to extract, manipulate, and analyze large volumes of financial data.
- Ensure data accuracy and integrity by validating and reconciling data across various sources using SQL (see the sketch below).
- Create and maintain comprehensive data lineage documentation to track the flow and transformation of financial data throughout the organization.
- Collaborate with cross-functional teams of business subject matter experts (SMEs), business data stewards and technical data stewards to establish and enforce data quality standards, metadata management, and data stewardship.
- Generate regular reporting on Financial Crimes Data Governance KPIs, metrics, and activities.
- Track and validate product data compliance deficiencies to completion.
- Streamline processes, particularly technological solutions, in the area of AML compliance.
- Support the design of, and take day-to-day responsibility for, the management of major financial crime and sanctions projects (e.g. the introduction of new technological solutions and processes).
- Assist in the design, testing, tuning and implementation of strategic initiatives in the areas of financial crime and sanctions, particularly transaction monitoring and sanctions screening.
- Assist with the day-to-day aspects of the process for ongoing screening using the Unit 21 software, including helping the Financial Crime & Sanctions teams refine and further develop the use of that technology.
- Provide regular reports and updates on team workload to BAU team management.
- Be flexible in undertaking the above responsibilities and any others so required.

What You Bring
- Bachelor's or Master's degree in a relevant field (e.g., Computer Science, Statistics, Engineering)
- Knowledge of AML/BSA and OFAC laws and regulations, as well as a thorough understanding of and experience in applying such requirements in a business
- Proven experience as a technology and data analyst in the financial services industry, preferably with a focus on Financial Crimes Compliance
- Experience with data governance as a practice, including enterprise data management, data quality, data lineage, metadata management, data methodologies, data analytics and reporting, and the ethical use of data and artificial intelligence (AI)
- Strong proficiency in SQL with the ability to write and optimize complex queries
- Data analysis skills with experience using enterprise tools such as Informatica IDQ, Informatica MDM, Informatica Enterprise Data Catalog, Collibra, DOMO, or other metadata management tools
- Proficiency in data analysis tools (e.g., Python, R, or similar)
- Excellent analytical and problem-solving skills
- Strong attention to detail and commitment to data accuracy
- Ability to build relationships and credibility across the organization
- Effective communication and collaboration skills
- Comprehensive knowledge of at least one substantive area of compliance, such as banking regulations or anti-money laundering

What We Offer You
- A modern, international work environment and a dedicated and motivated team
- A competitive salary and benefits
- The chance to work on some of the most challenging, relevant issues in financial services & technology
- Great workspaces with dedicated and motivated colleagues
- A work environment built on collaboration, flexibility, and respect

Privacy Statement
FIS is committed to protecting the privacy and security of all personal information that we process in order to provide services to our clients. For specific information on how FIS protects personal information online, please see the Online Privacy Notice.

Sourcing Model
Recruitment at FIS works primarily on a direct sourcing model; a relatively small portion of our hiring is through recruitment agencies. FIS does not accept resumes from recruitment agencies which are not on the preferred supplier list and is not responsible for any related fees for resumes submitted to job postings, our employees, or any other part of our company. #pridepass
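To illustrate the SQL-based reconciliation this posting describes, a self-contained sketch follows; it uses sqlite3 purely so it runs anywhere, whereas real reconciliation would point at the production source and target databases.

```python
# A minimal, self-contained sketch of SQL-based reconciliation between
# a source and a target table. Uses sqlite3 so it runs anywhere; real
# work would connect to the production databases instead.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE source_txns (id INTEGER PRIMARY KEY, amount REAL);
    CREATE TABLE target_txns (id INTEGER PRIMARY KEY, amount REAL);
    INSERT INTO source_txns VALUES (1, 100.0), (2, 250.5), (3, 75.0);
    INSERT INTO target_txns VALUES (1, 100.0), (2, 250.5);
""")

# Reconcile on row count and total amount, then list missing keys.
checks = {
    "row_count": "SELECT COUNT(*) FROM {t}",
    "total_amount": "SELECT COALESCE(SUM(amount), 0) FROM {t}",
}
for name, sql in checks.items():
    src = conn.execute(sql.format(t="source_txns")).fetchone()[0]
    tgt = conn.execute(sql.format(t="target_txns")).fetchone()[0]
    print(f"{name}: source={src} target={tgt} match={src == tgt}")

missing = conn.execute("""
    SELECT id FROM source_txns
    EXCEPT SELECT id FROM target_txns
""").fetchall()
print("ids missing from target:", [r[0] for r in missing])
```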

Posted 2 weeks ago

Apply

1.0 - 3.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Position Type: Full time
Type of Hire: Experienced (relevant combination of work and education)
Education Desired: Bachelor of Business Administration
Travel Percentage: 0%

As the world works and lives faster, FIS is leading the way. Our fintech solutions touch nearly every market, company and person on the planet. Our teams are inclusive and diverse. Our colleagues work together and celebrate together. If you want to advance the world of fintech, we'd like to ask you: Are you FIS?

About The Role
This role will support the Sanctions function of the Financial Crimes program and support the Senior Director of the program in establishing strong tech and data knowledge, as well as supporting the various system implementations across Global Financial Crimes Compliance.

About The Team
The Global Financial Crimes (GFC) Sanctions team is responsible for the oversight of the Sanctions compliance function across FIS, including its subsidiaries and affiliates. The Sanctions team's key responsibilities include compliance with sanctions law, risk assessments, committees and forums, training, technology governance, and reporting for all matters impacting Sanctions.

What You Will Be Doing
- Write documentation including sanctions workflows, standards, guidelines, testing procedures, taxonomies, and operating procedures
- Develop and optimize complex SQL queries to extract, manipulate, and analyze large volumes of financial data
- Ensure data accuracy and integrity by validating and reconciling data across various sources using SQL
- Create and maintain comprehensive data lineage documentation to track the flow and transformation of financial data throughout the organization
- Contribute to the development and maintenance of master data management processes to ensure consistency and accuracy of critical data elements
- Generate regular reporting on Financial Crimes Data Governance KPIs, metrics, and activities
- Maintain knowledge of applicable laws and regulations
- Perform periodic reviews and evaluations of FIS products, services and business activities to validate compliance with applicable laws and regulations, or to detect regulatory violations, weak controls or other potential areas of exposure pertaining to data
- Monitor LOB compliance activities to verify that regulatory compliance deadlines and requirements are met
- Track and validate product data compliance deficiencies to completion
- Participate in the development of compliance sessions/presentations

What You Will Need
- Bachelor's or master's degree in a relevant field (e.g., Computer Science, Statistics, Engineering)
- Typically 1-3 years of experience in the regulatory compliance field
- Proven experience as a data analyst in the financial services industry, preferably with a focus on Financial Crimes Compliance
- Experience with data governance as a practice, including enterprise data management, data quality, data lineage, metadata management, data methodologies, data analytics and reporting, and the ethical use of data and artificial intelligence (AI)
- Strong proficiency in SQL with the ability to write and optimize complex queries
- Data analysis skills with experience using enterprise tools such as Informatica IDQ, Informatica MDM, Informatica Enterprise Data Catalog, Collibra, or other metadata management tools
- Proficiency in data analysis tools (e.g., Python, R, or similar)
- Excellent analytical and problem-solving skills
- Strong attention to detail and commitment to data accuracy
- Ability to build relationships and credibility across the organization
- Effective communication and collaboration skills

Added Bonus If You Have
- Experience of regulatory oversight of core or significant high-risk product lines that contain complex banking functions
- Subject matter expertise in sanctions regulatory compliance, including knowledge of banking regulations

What We Offer You
At FIS, you can learn, grow and make an impact in your career. Our benefits include:
- A flexible and creative work environment
- A diverse and collaborative atmosphere
- Professional and personal development resources
- Opportunities to volunteer and support charities
- Competitive salary and benefits

Privacy Statement
FIS is committed to protecting the privacy and security of all personal information that we process in order to provide services to our clients. For specific information on how FIS protects personal information online, please see the Online Privacy Notice.

Sourcing Model
Recruitment at FIS works primarily on a direct sourcing model; a relatively small portion of our hiring is through recruitment agencies. FIS does not accept resumes from recruitment agencies which are not on the preferred supplier list and is not responsible for any related fees for resumes submitted to job postings, our employees, or any other part of our company. #pridepass

Posted 2 weeks ago

Apply

1.0 - 3.0 years

0 Lacs

Noida, Uttar Pradesh, India

Remote


At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

The opportunity
The objective of our Digital Risk Consulting service is to support clients with the development, implementation, improvement, and modernization of their technology risk and compliance programs to address the constantly changing risk and technology landscape. Our solutions can be used by our clients to build confidence and trust with their customers, the overall market, and when required by regulation or contract.

Your Key Responsibilities
You will operate as a team leader for engagements to help our clients develop and strengthen their IT risk and compliance programs. You will work directly with clients to review their IT processes and controls, remediate and implement controls, onboard new tools and services into risk and compliance frameworks, and assist with readiness for and adherence to new compliance regulations. Your responsibilities include both in-person and remote oversight and coaching of engagement team members, reporting to both senior engagement team members and client leadership, as well as partnering with our key client contacts to complete the engagement work.

What You'll Do
- Design and implement solutions to various data-related technical/compliance challenges such as DevSecOps, data strategy, data governance, data risks and relevant controls, data testing, data architecture, data platforms, data solution implementation, data quality and data security to manage and mitigate risk.
- Leverage data analytics tools/software to build robust and scalable solutions through data analysis and data visualization using SQL, Python and visualization tools.
- Design and implement comprehensive data analytics strategies to support business decision-making.
- Collect, clean, and interpret large datasets from multiple sources, ensuring completeness, accuracy and integrity of data.
- Integrate and/or pilot next-generation technologies such as cloud platforms, machine learning and Generative AI (GenAI).
- Develop custom scripts and algorithms to automate data processing and analysis to generate insights.
- Apply business/domain knowledge, including regulatory requirements and industry standards, to solve complex data-related challenges.
- Analyze data to uncover trends and generate insights that can inform business decisions.
- Build and maintain relationships across Engineering, Product, Operations, Internal Audit, external audit and other external stakeholders to drive effective financial risk management.
- Work with DevSecOps, Security Assurance, Engineering, and Product teams to improve the efficiency of control environments and provide risk management through automation and process improvement.
- Bridge gaps between IT controls and business controls, including ITGCs and automated business controls.
- Work with IA to ensure the complete control environment is managed.
- Work with emerging products to understand their risk profile and ensure an appropriate control environment is established.
- Implement new processes and controls in response to changes in the business environment, such as new product introductions, changes in accounting standards, internal process changes or reorganization.

What You'll Need
- Experience in data architecture, data management, data engineering, data science or data analytics
- Experience in building analytical queries and dashboards using SQL, NoSQL, Python, etc.
- Proficiency in SQL and quantitative analysis: you can deep dive into large amounts of data, draw meaningful insights, dissect business issues and draw actionable conclusions
- Knowledge of tools in the following areas:
  - Scripting and programming (e.g., Python, SQL, R, Java, Scala)
  - Big data tools (e.g., Hadoop, Hive, Pig, Impala, Mahout)
  - Data management (e.g., Informatica, Collibra, SAP, Oracle, IBM)
  - Predictive analytics (e.g., Python, IBM SPSS, SAS Enterprise Miner, RPL, Matlab)
  - Data visualization (e.g., Tableau, Power BI, TIBCO Spotfire, QlikView, SPSS)
  - Data mining (e.g., Microsoft SQL Server)
  - Cloud platforms (e.g., AWS, Azure, or Google Cloud)
- Ability to analyze complex processes to identify potential financial, operational, systems and compliance risks across major finance cycles
- Ability to assist management with the integration of security practices in the product development lifecycle (DevSecOps)
- Experience with homegrown applications in a microservices/DevOps environment
- Experience identifying potential security risks in platform environments and developing strategies to mitigate them
- Experience with SOX readiness assessments and control implementation
- Knowledge of DevOps practices, CI/CD pipelines, code management and automation tools (e.g., Jenkins, Git, Phab, Artifactory, SonarQube, Selenium, Fortify, Acunetix, Prisma Cloud)

Preferred: experience in
- Managing technical data projects
- Leveraging data analytics tools/software to develop solutions and scripts
- Developing statistical model tools and techniques
- Developing and executing data governance frameworks or operating models
- Identifying data risks and designing and/or implementing appropriate controls
- Implementing data quality processes
- Developing data services and solutions in a cloud environment
- Designing data architecture
- Analyzing complex data sets and communicating findings effectively
- Process management, including process redesign and optimization
- Scripting languages (e.g., Python, Bash)
- Cloud platforms (e.g., AWS, Azure, GCP) and securing cloud-based applications/services

To qualify for the role, you must have
- A bachelor's or master's degree
- 1-3 years of experience working as an IT risk consultant, or data analytics experience
- Experience applying relevant technical knowledge in at least one of the following engagements: (a) risk consulting, (b) financial statement audits, (c) internal or operational audits, (d) IT compliance, and/or (e) Service Organization Controls Reporting engagements
- Availability to travel outside of your assigned office location at least 50% of the time, plus commute within the region (where public transportation often is not available); successful candidates must work in excess of standard hours when necessary; a valid passport is required

Ideally, you'll also have
- A bachelor's or master's degree in business, computer science, information systems, informatics, computer engineering, accounting, or a related discipline
- CISA, CISSP, CISM, CPA or CA certification (non-certified hires are required to become certified to be eligible for promotion to Manager)

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

20 - 35 Lacs

Bengaluru

Hybrid


Location: Bengaluru (Hybrid) / Remote
Job Type: Full-time
Experience Required: 5+ years
Notice Period: Immediate to 30 days

Role Overview:
As a Collibra Expert, you will be responsible for implementing, maintaining, and optimizing the Collibra Data Governance Platform to ensure data quality, governance, and lineage across the organization. You will partner with cross-functional teams to develop data management strategies and integrate Collibra solutions with Google Cloud Platform (GCP) to create a robust, scalable, and efficient data governance framework for the retail domain.

Key Responsibilities:
- Data Governance Management: Design, implement, and manage the Collibra Data Governance Platform for data cataloging, data quality, and data lineage within the retail domain.
- Collibra Expertise: Utilize Collibra for metadata management, data quality monitoring, policy enforcement, and data stewardship across various business units (see the sketch below).
- Data Cataloging: Lead the implementation and continuous improvement of data cataloging processes to enable a centralized, user-friendly view of the organization's data assets.
- Data Quality Management: Collaborate with business and technical teams to ensure that data is high-quality, accessible, and actionable. Define data quality rules and KPIs to monitor data accuracy, completeness, consistency, and timeliness.
- Data Lineage Implementation: Build and maintain comprehensive data lineage models to visualize the flow of data from source to consumption, ensuring compliance with data governance standards.
- GCP Integration: Architect and implement seamless integrations between Collibra and Google Cloud Platform (GCP) tools such as BigQuery, Dataflow, and Cloud Storage, ensuring data governance policies are enforced in the cloud environment.
- Collaboration & Stakeholder Management: Collaborate with data engineers, analysts, business intelligence teams, and leadership to define and implement data governance best practices and standards.
- Training & Support: Provide ongoing training and support to business users and technical teams on data governance practices, Collibra platform usage, and GCP-based solutions.
- Compliance & Security: Ensure data governance initiatives comply with internal policies, industry standards, and regulations (e.g., GDPR, CCPA).

Key Requirements:
- Proven Expertise in Collibra: Hands-on experience implementing and managing the Collibra Data Governance Platform (cataloging, lineage, data quality).
- Google Cloud Platform (GCP) Proficiency: Strong experience with GCP tools (BigQuery, Dataflow, Pub/Sub, Cloud Storage, etc.) and integrating them with Collibra for seamless data governance.
- Data Quality and Lineage Expertise: In-depth knowledge of data quality frameworks, metadata management, and data lineage implementation.
- Retail Industry Experience: Prior experience in data governance within the retail or eCommerce domain is a plus.
- Technical Skills: Strong understanding of cloud data architecture and best practices for managing data at scale in the cloud (preferably in GCP).
- Problem-Solving and Analytical Skills: Ability to analyze complex data governance issues and find practical solutions to ensure high-quality data management across the organization.
- Excellent Communication Skills: Ability to communicate effectively with both technical and non-technical stakeholders to advocate for data governance best practices.
- Certifications: Relevant certifications in Collibra, Google Cloud, or Data Governance are highly desirable.

Education & Experience:
- Bachelor's degree (B.Tech/BE) mandatory; master's optional
- 5+ years of experience in Data Governance, with at least 3 years of specialized experience in Collibra and GCP
- Experience working with data teams in a retail environment is a plus
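As a hedged illustration of Collibra integration work, the sketch below registers an asset through what Collibra documents as its REST Core API (v2). The endpoint shape, field names, instance URL, and all identifiers here are assumptions to verify against your own instance's API reference.

```python
# A hedged sketch of registering an asset through Collibra's REST
# Core API (v2), as commonly documented; verify the endpoint and
# field names against your own instance's API reference. The
# instance URL, credentials, and all identifiers are hypothetical.
import requests

BASE_URL = "https://example.collibra.com/rest/2.0"  # hypothetical instance
session = requests.Session()
session.auth = ("svc_governance", "***")            # hypothetical service account

payload = {
    "name": "sales_orders",                              # asset to register
    "domainId": "00000000-0000-0000-0000-000000000001",  # hypothetical domain
    "typeId": "00000000-0000-0000-0000-000000000031",    # hypothetical asset type
}

resp = session.post(f"{BASE_URL}/assets", json=payload, timeout=30)
resp.raise_for_status()
print("created asset id:", resp.json().get("id"))
```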

Posted 2 weeks ago

Apply

7.0 - 10.0 years

12 - 15 Lacs

Pune, Bengaluru

Hybrid


DQ, DL, DM, Data Analysis, Data Mapping, Collibra

Introduction:
As a Data Lineage Analyst, you will play a crucial role in enhancing the transparency and understanding of data flow within our organization. This position is ideal for someone who is passionate about data management, data quality, and governance and seeks to leverage these skills to drive meaningful insights that influence strategic decisions.

Key Responsibilities:
- Analyze and Document Data Lineage: Track data elements from their origin through various transformations to their final form, documenting the data flow and any processes that impact data.
- Improve Data Quality: Identify areas where data quality can be improved and collaborate with data governance and IT teams to implement enhancements.
- Collaborate on Data Management Policies: Work with data governance teams to develop guidelines and policies that ensure data accuracy and consistency across the organization.
- Stakeholder Engagement: Regularly engage with business and technical stakeholders to gather requirements and provide updates on data lineage projects.
- Tool Implementation: Utilize and help implement tools and technologies designed to automate and facilitate data lineage tracking and reporting.
- Data Mapping and Metadata Management: Create and maintain mappings of data sources, transformations, and consumers, along with metadata to support data lineage (see the sketch below).
- Troubleshoot Data Issues: Identify and resolve issues related to data flow and lineage in collaboration with data engineering teams.

Qualifications:
- Bachelor's degree in Information Technology, Computer Science, Data Science, or a related field.
- Proven experience in a data-focused role, preferably with specific experience in data lineage, data analysis, or data governance.
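To make the lineage-documentation idea concrete, here is an illustrative sketch that records lineage as a directed graph with networkx and answers a typical upstream-impact question; the system and table names are hypothetical.

```python
# An illustrative sketch of documenting data lineage as a directed
# graph with networkx; the systems and tables named are hypothetical.
import networkx as nx

lineage = nx.DiGraph()

# Edges point from producer to consumer, annotated with the
# transformation applied along the way.
lineage.add_edge("crm.customers", "staging.customers",
                 transform="nightly extract")
lineage.add_edge("staging.customers", "warehouse.dim_customer",
                 transform="deduplicate + conform names")
lineage.add_edge("warehouse.dim_customer", "report.customer_churn",
                 transform="aggregate by month")

# Trace everything upstream of the report, a typical lineage question.
upstream = nx.ancestors(lineage, "report.customer_churn")
print("upstream of report.customer_churn:", sorted(upstream))
```

The upstream query is the common impact-analysis case: when a source field changes, it tells you which downstream reports need review.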

Posted 2 weeks ago

Apply

4.0 - 9.0 years

18 - 33 Lacs

Bengaluru

Work from Office


Job Summary:
We are looking for a detail-oriented Infrastructure Administrator with proven expertise in managing enterprise IT tools and cloud platforms. You will be responsible for setting up, configuring, and maintaining key systems like Tableau, AWS resources, SAP BO, Collibra, SAS, and Alteryx.

Key Responsibilities:
- Manage administration of Tableau Server, SAP BO, Alteryx Server, and AWS services.
- Handle access provisioning, license management, and user governance for Collibra and PowerDesigner.
- Oversee system stability, performance optimization, and backup strategies.
- Monitor tool usage, patch versions, and ensure software compliance.
- Collaborate with IT and Data Governance teams on secure infrastructure practices.

Must-Have Skills:
- Experience administering Tableau Server, including user permissions and dashboards (see the sketch below)
- Proficiency in managing AWS resources (IAM roles, EC2, S3, monitoring tools)
- Strong working knowledge of Collibra and Information Steward for data governance
- Experience managing SAS, Alteryx Server, and other enterprise data tools

Nice-to-Have Skills:
- Familiarity with PowerDesigner, DataGrip, and metadata repository tools
- Knowledge of enterprise security policies, auditing, and patch management
- Experience in regulated sectors such as the insurance and banking domains
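As an illustration of scripted Tableau Server administration, a minimal sketch with the tableauserverclient (TSC) library follows; the server URL and credentials are hypothetical placeholders.

```python
# A minimal sketch of Tableau Server administration via the
# tableauserverclient (TSC) library: sign in and list site users.
# Server URL and credentials are hypothetical placeholders.
import tableauserverclient as TSC

auth = TSC.TableauAuth("admin_user", "***", site_id="")  # hypothetical creds
server = TSC.Server("https://tableau.example.com", use_server_version=True)

with server.auth.sign_in(auth):
    all_users, pagination = server.users.get()
    for user in all_users:
        print(user.name, user.site_role)
```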

Posted 2 weeks ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site


Job Title: Data Engineer - Data Solutions Delivery + Data Catalog & Quality Engineer

About Advanced Energy
Advanced Energy Industries, Inc. (NASDAQ: AEIS) enables design breakthroughs and drives growth for leading semiconductor and industrial customers. Our precision power and control technologies, along with our applications know-how, inspire close partnerships and innovation in thin-film and industrial manufacturing. We are proud of our rich heritage and award-winning technologies, and we value the talents and contributions of all Advanced Energy's employees worldwide.

Department: Data and Analytics
Team: Data Solutions Delivery Team

Job Summary:
We are seeking a highly skilled Data Engineer to join our Data and Analytics team. As a member of the Data Solutions Delivery team, you will be responsible for designing, building, and maintaining scalable data solutions. The ideal candidate should have extensive knowledge of Databricks, Azure Data Factory, and Google Cloud, along with strong data warehousing skills from data ingestion to reporting. Familiarity with the manufacturing and supply chain domains is highly desirable. Additionally, the candidate should be well versed in data engineering, data product, and data platform concepts, data mesh, medallion architecture, and establishing enterprise data catalogs using tools like Ataccama, Collibra, or Microsoft Purview. The candidate should also have proven experience in implementing data quality practices using tools like Great Expectations, Deequ, etc.

Key Responsibilities:
- Design, build, and maintain scalable data solutions using Databricks, ADF, and Google Cloud.
- Develop and implement data warehousing solutions, including ETL processes, data modeling, and reporting.
- Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions.
- Ensure data integrity, quality, and security across all data platforms.
- Provide expertise in data engineering, data product, and data platform concepts.
- Implement data mesh principles and medallion architecture to build scalable data platforms.
- Establish and maintain enterprise data catalogs using tools like Ataccama, Collibra, or Microsoft Purview.
- Implement data quality practices using tools like Great Expectations, Deequ, etc. (see the sketch below).
- Work closely with the manufacturing and supply chain teams to understand domain-specific data requirements.
- Develop and maintain documentation for data solutions, data flows, and data models.
- Act as an individual contributor, picking up tasks from technical solution documents and delivering high-quality results.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Proven experience as a Data Engineer or in a similar role.
- In-depth knowledge of Databricks, Azure Data Factory, and Google Cloud.
- Strong data warehousing skills, including ETL processes, data modeling, and reporting.
- Familiarity with the manufacturing and supply chain domains.
- Proficiency in data engineering, data product, and data platform concepts, data mesh, and medallion architecture.
- Experience establishing enterprise data catalogs using tools like Ataccama, Collibra, or Microsoft Purview.
- Proven experience implementing data quality practices using tools like Great Expectations, Deequ, etc.
- Excellent problem-solving and analytical skills.
- Strong communication and collaboration skills.
- Ability to work independently and as part of a team.

Preferred Qualifications:
- Master's degree in a related field.
- Experience with cloud-based data platforms and tools.
- Certification in Databricks, Azure, or Google Cloud.

As part of our total rewards philosophy, we believe in offering and maintaining competitive compensation and benefits programs for our employees to attract and retain a talented, highly engaged workforce. Our compensation programs are focused on equitable, fair pay practices, including market-based base pay and an annual pay-for-performance incentive plan, and we offer a strong benefits package in each of the countries in which we operate.

Advanced Energy is committed to diversity in its workforce, including Equal Employment Opportunity for Minorities, Females, Protected Veterans, and Individuals with Disabilities. We are committed to protecting and respecting your privacy. We take your privacy seriously and will only use your personal information to administer your application in accordance with RA No. 10173, also known as the Data Privacy Act of 2012.
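Since this posting names Great Expectations for data quality, here is a minimal sketch assuming the classic pandas-based Great Expectations API (pre-1.0); newer releases restructure these entry points, so treat it as illustrative. The data is hypothetical.

```python
# A minimal sketch using the classic pandas API of Great Expectations
# (pre-1.0); newer releases restructure these entry points.
import pandas as pd
import great_expectations as ge

raw = pd.DataFrame({
    "order_id": [101, 102, 103, None],
    "qty": [5, -2, 7, 3],   # a negative qty to trip the rule
})

gdf = ge.from_pandas(raw)

# Two simple expectations: keys must be present, quantities non-negative.
id_check = gdf.expect_column_values_to_not_be_null("order_id")
qty_check = gdf.expect_column_values_to_be_between("qty", min_value=0)

print("order_id complete:", id_check.success)
print("qty non-negative:", qty_check.success)
```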

Posted 2 weeks ago

Apply

0.0 years

0 Lacs

Punjab

On-site


Location: Mohali
City: Mohali
State: Punjab (IN-PB)
Country: India (IN)
Requisition Number: 39025

Business Title: Manager - Data Governance and Data Quality
Global Job Title: Mgr I Strategy & Trans
Global Function: Business Services
Global Department: Strategy and Transformation
Reporting to: Data Governance Lead

Role Purpose Statement:
The person will be part of our Data Governance team and will help drive the successful implementation and adoption of the Collibra Data Governance platform, including Collibra Data Quality. The person needs to understand the Collibra meta model (asset, domain, community) and metadata ingestion using templates.

Main Accountabilities:
- 8-10 years of experience in Data Governance and Data Quality
- Good hands-on experience with the Collibra stack of tools, such as Collibra DIC and DQ
- Establish data quality, data lineage, and metadata management processes within Collibra
- Exposure to GCP, data privacy, data domains, and APIs
- Monitor and report on data governance metrics and KPIs, identifying areas for improvement and implementing corrective actions
- Strong communication and collaboration skills, with the ability to work effectively with cross-functional teams

Knowledge and Skills:
Behavior: Use knowledge of Bunge's business, structure and strategy to develop innovative solutions to improve results or eliminate problems. Build partnerships, appropriately influence to gain commitment, foster talent and coach others to grow in current or future roles. Drive results through high standards, focus on key priorities, organization, and preparing others for change.
Technical:
- Accept responsibility/accountability for assigned tasks
- Break problems into manageable pieces and follow an organized approach to resolve them
- Plan tasks to create deliverables and effectively execute that plan with little direction from a supervisor
- Deliver high-quality results on time
- Show willingness and ability to increase contribution and level of responsibility, and proactively seek to do so

Education & Experience:
- Education: Bachelor of Engineering / Master of Computer Science / Master of Science from premier institutes
- Skills: Collibra stack of tools (DIC, DQ); data warehouse / data modeling; ETL

Bunge (NYSE: BG) is a world leader in sourcing, processing and supplying oilseed and grain products and ingredients. Founded in 1818, Bunge's expansive network feeds and fuels a growing world, creating sustainable products and opportunities for more than 70,000 farmers and the consumers they serve across the globe. The company is headquartered in St. Louis, Missouri and has 25,000 employees worldwide who stand behind more than 350 port terminals, oilseed processing plants, grain facilities, and food and ingredient production and packaging facilities around the world.

Bunge is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, gender expression, transgender status, national origin, citizenship, age, disability or military or veteran status, or any other legally protected status. Minorities/Women/Veterans/Disabled.

Posted 2 weeks ago

Apply

10.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site


Position Overview

Job Title: Data Analyst, GSA Data Divisional Office – Risk, Finance and Treasury
Location: Mumbai, India
Corporate Title: AVP

Role Description

As a member of the Group Chief Operating Office Divisional Data Office (GCOO DDO) in Group Strategic Analytics (GSA), you will be part of the team responsible for the data strategy across these business domains. Your responsibilities will focus on managing change-the-bank activities to uplift our data capabilities to meet recently revised regulatory and internal policies and standards. This is an exciting opportunity to collaborate with groups that actively originate and consume data, including business lines, infrastructure functions and large-scale change programmes. As part of your role, you will gain a thorough understanding of how data is an integral component of our business, with oversight of GCOO data assets, the enterprise data management lifecycle and the standards expected to maintain global regulatory compliance. Deutsche Bank is investing heavily in optimizing our business processes and regulatory outcomes by using data in the best ways possible, and you will be directly shaping the strategy to do so.

Group Strategic Analytics is part of the Group Chief Operating Office (GCOO), which acts as the bridge between the Bank’s businesses and infrastructure functions to help deliver the efficiency, control, and transformation goals of the Bank.

What We’ll Offer You

As part of our flexible scheme, here are just some of the benefits that you’ll enjoy:

Best-in-class leave policy
Gender-neutral parental leave
100% reimbursement under the childcare assistance benefit (gender neutral)
Sponsorship for industry-relevant certifications and education
Employee Assistance Program for you and your family members
Comprehensive hospitalization insurance for you and your dependents
Accident and term life insurance
Complimentary health screening for those aged 35 and above

Your Key Responsibilities

Work on several data strategy projects, including related governance and KPIs
Act as the key GCOO DDO Data Analyst and point of contact for relevant Change-the-Bank and Run-the-Bank data priorities, in partnership with divisional and regional stakeholders
Monitor data-management-related regulatory changes and perform gap analysis against DB processes
Drive implementation of the Data Quality Control Framework to ensure completeness, accuracy and timeliness for the GCOO mandated scope of data, ensuring compliance with Strategic Data KPIs
Identify the most critical and strategic data to be brought under governance and facilitate right-sourcing via strategic platforms
Support relationships with all relevant internal and external stakeholder groups, representing the GCOO data strategy and DDO function

Your Skills And Experience

10+ years of experience in banking, ideally in data-management-related roles
Experience managing complex change programmes (ideally cross-divisional and cross-location)
Data analysis: ability to investigate and present details of lineage, completeness, and transformations performed upon data via flows and processes
Ideally, experience using industry-standard data management tools such as Collibra and Solidatus
Demonstrable experience translating Core Data Standards into practical implementation
Excellent organizational skills and high attention to detail, with the ability to work under pressure and deliver to tight deadlines
Excellent interpersonal skills with a demonstrable ability to engage and influence stakeholders
Strong communication skills, both written and verbal, with the ability to explain complex problems in a clear and structured way

How We’ll Support You

Training and development to help you excel in your career
Coaching and support from experts in your team
A culture of continuous learning to aid progression
A range of flexible benefits that you can tailor to suit your needs

About Us And Our Teams

Please visit our company website for further information: https://www.db.com/company/company.htm

We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
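For illustration only, here is a minimal pandas sketch of the completeness / accuracy / timeliness checks such a Data Quality Control Framework typically codifies. The function, key columns and timestamp column are hypothetical; in practice these rules would live in governed tooling (e.g., Collibra) rather than a standalone script.

```python
import pandas as pd

def run_basic_dq_checks(df: pd.DataFrame, key_cols: list[str],
                        timestamp_col: str, max_age_days: int = 1) -> dict:
    """Compute simple completeness, accuracy and timeliness indicators."""
    now = pd.Timestamp.now(tz="UTC")
    return {
        # Completeness: share of rows with all mandated key columns populated
        "completeness": float(df[key_cols].notna().all(axis=1).mean()),
        # Accuracy proxy: share of rows without duplicated business keys
        "uniqueness": 1.0 - float(df.duplicated(subset=key_cols).mean()),
        # Timeliness: share of records within the agreed freshness window
        "timeliness": float(
            (now - pd.to_datetime(df[timestamp_col], utc=True)
             <= pd.Timedelta(days=max_age_days)).mean()
        ),
    }
```

The returned ratios map naturally onto KPI-style thresholds, e.g., flagging a dataset when completeness falls below an agreed target.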

Posted 2 weeks ago

Apply

4.0 - 9.0 years

15 - 30 Lacs

Pune, Bengaluru

Hybrid


Job Role & Responsibilities:

Understand operational needs by collaborating with specialized teams
Support key business operations; this involves architecting data flows, data lineage and building data systems
Help identify and deploy enterprise data best practices such as data scoping, metadata standardization, lineage, data deduplication, mapping and transformation, and business validations
Own the development and management of data glossaries and data-owner matrices to establish enterprise data standards for the use of critical data
Assist with deploying the data issue capture and resolution process
Engage with key business stakeholders to help establish fundamental data governance processes
Create, prepare, and standardize data quality reports for internal analysis
Define key data quality metrics and indicators and facilitate the development and implementation of supporting standards

Technical Skills, Experience & Qualifications Required:

4-10 years of experience in Data Governance
Proficiency in data management, an understanding of data model frameworks, and practical knowledge of MDM
Hands-on experience with Collibra, including its use as a data catalog
Hands-on experience with data governance and data quality aspects; working with Python
Understanding of cloud services: Azure
Good communication and interpersonal skills
Bachelor's degree in Computer Science or a related field

Soft Skills and Competencies:

Good communication skills to coordinate between business stakeholders and engineers
Strong results-orientation and time management
True team player who is comfortable working in a global team
Ability to establish relationships with stakeholders quickly in order to collaborate on use cases
Autonomy, curiosity and innovation capability
Comfortable working in a multidisciplinary team within a fast-paced environment

* Only immediate joiners will be preferred.
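As an illustration of the standardized data quality reports mentioned above, here is a minimal pandas sketch that profiles a dataset per column. The metric set and the `customers.csv` input are assumptions for the sketch, not part of the role.

```python
import pandas as pd

def data_quality_report(df: pd.DataFrame) -> pd.DataFrame:
    """Summarize per-column quality indicators for an internal report."""
    report = pd.DataFrame({
        "dtype": df.dtypes.astype(str),
        "null_pct": df.isna().mean().round(4) * 100,   # completeness gap
        "distinct_values": df.nunique(),               # cardinality
        "duplicated_rows_pct": round(df.duplicated().mean() * 100, 2),
    })
    return report.sort_values("null_pct", ascending=False)

# Hypothetical usage against a customer extract:
# print(data_quality_report(pd.read_csv("customers.csv")))
```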

Posted 2 weeks ago

Apply

3.0 - 5.0 years

9 - 15 Lacs

Pune

Work from Office


We need a strong business analyst who can work independently and manage a small team. They need a background in the financial domain and, ideally, in data management.

Requirements

Experience leading teams of Business Analysts / Data Analysts
Experience working as a Business Analyst for a product
Experience leading solution design and discussions
Experience with SQL, analysing large data sets, and APIs
Experience understanding ETL workflows and producing BRDs and FRDs

Expertise and Qualifications

3 to 5 years of experience working as a business analyst within the financial sector (ideally in banking)
Understanding of risk, finance and banking processes
Ability to understand business processes, analyse as-is processes, and develop new to-be processes in order to drive operational efficiency
Ability to read and interpret data models and ETL processes; strong data analysis skills are a must
Strong relationship management skills, consultative skills and client-facing experience
Ability to work independently on small to large and complex projects without supervision
Ability to give stakeholders an honest assessment of the expected delivery date for problem resolution, to ensure consistency with their expectations

Posted 3 weeks ago

Apply

8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


About McDonald’s:

One of the world’s largest employers, with locations in more than 100 countries, McDonald’s Corporation has corporate opportunities in Hyderabad. Our global offices serve as dynamic innovation and operations hubs, designed to expand McDonald's global talent base and in-house expertise. Our new office in Hyderabad will bring together knowledge across business, technology, analytics, and AI, accelerating our ability to deliver impactful solutions for the business and our customers across the globe.

Position Summary:

Data Accessibility Lead (Sr Manager, Data Operations & Management_G5_EDAA0488_EDAA0489)

As the Data Accessibility Lead, you will drive the enterprise-wide strategy for enabling secure, governed, and scalable access to data for AI/ML, analytics, and business operations. You will lead cross-functional teams responsible for managing the data lifecycle, enforcing data quality standards, and implementing modern governance tooling such as Collibra. This role is pivotal to operationalizing data accessibility across cloud platforms like GCP and AWS, including BigQuery, Redshift, and other core data infrastructure.

Who we’re looking for:

Primary Responsibilities:

Strategic Data Accessibility Leadership
Set the strategic direction for enterprise data accessibility, ensuring consistent and secure access across teams and platforms
Lead the implementation and adoption of data governance tools (e.g., Collibra) to manage metadata, lineage, and data policies
Champion enterprise adoption of semantic and technical metadata practices for improved discoverability and data use

AI/ML Enablement
Oversee the availability, quality, and governance of data used for AI/ML model development and lifecycle management
Ensure that model training, validation, and deployment pipelines have reliable and timely access to governed datasets
Partner with MLOps, engineering, and product teams to embed data accessibility standards in model workflows

Cloud Platform Integration
Oversee data accessibility initiatives in GCP and AWS, including integration with BigQuery, Redshift, and cloud-native storage
Develop strategies for managing access controls, encryption, and auditability of data assets across cloud environments

Data Governance & Quality Oversight
Define and enforce enterprise data quality standards, including data profiling, validation, and exception-handling workflows
Ensure compliance with internal data policies and external regulations (e.g., GDPR, HIPAA, CCPA)
Lead enterprise initiatives around data lifecycle management, from ingestion and processing to archival and retention

Cross-Functional Collaboration & Leadership
Lead and mentor a team of data operations professionals and collaborate with data engineering, governance, AI/ML, and compliance teams
Provide executive-level insights and recommendations for improving enterprise data accessibility, quality, and governance practices
Drive alignment between business units, technical teams, and compliance functions through effective data stewardship

Skills:

8+ years of experience in data operations, data governance, or data quality management, with at least 3 years in a strategic leadership capacity
Strong hands-on and strategic experience with:
Collibra or similar data governance platforms
Cloud platforms: Google Cloud Platform (GCP), Amazon Web Services (AWS)
Enterprise data warehouses such as BigQuery, Redshift, or Snowflake
AI/ML model lifecycle support and MLOps integration
Data quality frameworks, metadata management, and data access policy enforcement
SQL
Strong analytical and problem-solving skills; ability to work across highly matrixed, global organizations
Exceptional communication, leadership, and stakeholder management skills
Bachelor’s or Master’s degree in Data Science, Information Systems, or a related field

Preferred Experience:

Experience in Retail or Quick Service Restaurant (QSR) environments with operational and real-time analytics needs
Familiarity with data mesh concepts, data product ownership, and domain-based accessibility strategies
Experience navigating privacy, residency, or regulatory compliance in global data environments
Current GCP Associate (or Professional) certification

Work location: Hyderabad, India
Work pattern: Full-time role
Work mode: Hybrid

Additional Information:

McDonald’s is committed to providing qualified individuals with disabilities with reasonable accommodations to perform the essential functions of their jobs. McDonald’s provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to sex, sex stereotyping, pregnancy (including pregnancy, childbirth, and medical conditions related to pregnancy, childbirth, or breastfeeding), race, color, religion, ancestry or national origin, age, disability status, medical condition, marital status, sexual orientation, gender, gender identity, gender expression, transgender status, protected military or veteran status, citizenship status, genetic information, or any other characteristic protected by federal, state or local laws. This policy applies to all terms and conditions of employment, including recruiting, hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation and training.

McDonald’s Capability Center India Private Limited (“McDonald’s in India”) is a proud equal opportunity employer and is committed to hiring a diverse workforce and sustaining an inclusive culture. At McDonald’s in India, employment decisions are based on merit, job requirements, and business needs, and all qualified candidates are considered for employment. McDonald’s in India does not discriminate based on race, religion, colour, age, gender, marital status, nationality, ethnic origin, sexual orientation, political affiliation, veteran status, disability status, medical history, parental status, genetic information, or any other basis protected under state or local laws. Nothing in this job posting or description should be construed as an offer or guarantee of employment.
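A rough sketch of the kind of profiling such data quality standards imply, using the google-cloud-bigquery client library. The project, dataset and table names are placeholders; a production profiler would be driven by the governance catalog rather than hard-coded names.

```python
from google.cloud import bigquery  # pip install google-cloud-bigquery

def profile_table(project: str, dataset: str, table: str) -> None:
    """Profile a governed BigQuery table: row count and per-column null rates."""
    client = bigquery.Client(project=project)
    schema = client.get_table(f"{project}.{dataset}.{table}").schema

    # One aggregate query instead of scanning the table once per column.
    null_exprs = ", ".join(
        f"SAFE_DIVIDE(COUNTIF({field.name} IS NULL), COUNT(*)) AS null_rate_{field.name}"
        for field in schema
    )
    query = (
        f"SELECT COUNT(*) AS row_count, {null_exprs} "
        f"FROM `{project}.{dataset}.{table}`"
    )
    for row in client.query(query).result():
        print(dict(row.items()))

# Hypothetical usage:
# profile_table("example-project", "governed_zone", "orders")
```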

Posted 3 weeks ago

Apply

8.0 years

0 Lacs

Mumbai Metropolitan Region

On-site


Job Description:

Responsibilities

Requirements management | Responsible for requirements management and the rough conception of the assigned solutions, including evaluation of the content and its functional requirements.
Implementation responsibility for the application solutions | Responsible for the implementation and fulfilment of requirements for their solutions; coordinates the corresponding work packages and aligns with the relevant professional and technical roles as well as the requesters. Not responsible for the content of the defined requirements.
Application of architecture guidelines | Applies the adopted architecture guidelines for the overall architecture and the data lake.
Solution approval support | Submits the completed solution to the Business Architect / board for approval and takes care of any adjustments and improvements.
Participation in data governance | Consults the relevant data governance specialists when building the application/solution and ensures the implementation of the relevant data security and protection aspects as well as compliance with legal requirements.
Procurement & commissioning of the infrastructure | Responsible for procuring the relevant infrastructure around the data lake, and for setting up and commissioning the necessary hardware.
Support | Specialist for the solution and thus the central point of contact and second-level support for power users / power users+ of the solution.

Qualifications

8+ years of experience in application development using the following technologies:
HTML, CSS, JavaScript, jQuery, Bootstrap
SQL, stored procedures, triggers, functions, views
PowerShell or C# for background process development
Azure cloud platform
Purview/Collibra or a similar data catalog/governance application
Experience with low-code/no-code platforms (such as MS Power Apps), MS Power Automate, Visual Studio, Git, and Azure DevOps is preferred
Proficient in the Azure platform, including Azure Data Factory, Azure SQL Database, Azure Synapse, Azure Databricks, and Power BI
Good knowledge of data modeling, ETL processes, and data warehouse design principles
Very good technical know-how, including technology and data architecture expertise
Basic knowledge of relevant data security, data protection and legal requirements
Proficient in project management and the use of own reports

Location: DGS India - Mumbai - Thane Ashar IT Park
Brand: Dentsu
Time Type: Full time
Contract Type: Permanent

Posted 3 weeks ago

Apply

15.0 years

0 Lacs

India

Remote


Job Title: Data Engineer Lead - AEP
Location: Remote
Experience Required: 12–15 years overall experience; 8+ years in Data Engineering; 5+ years leading Data Engineering teams; cloud migration and consulting experience (GCP preferred)

Job Summary:

We are seeking a highly experienced and strategic Lead Data Engineer with a strong background in leading data engineering teams, modernizing data platforms, and migrating ETL pipelines and data warehouses to Google Cloud Platform (GCP). You will work directly with enterprise clients, architect scalable data solutions, and ensure successful delivery in high-impact environments.

Key Responsibilities:

Lead end-to-end data engineering projects, including cloud migration of legacy ETL pipelines and data warehouses to GCP (BigQuery)
Design and implement modern ELT/ETL architectures using Dataform, Dataplex, and other GCP-native services
Provide strategic consulting to clients on data platform modernization, governance, and data quality frameworks
Collaborate with cross-functional teams including data scientists, analysts, and business stakeholders
Define and enforce data engineering best practices, coding standards, and CI/CD processes
Mentor and manage a team of data engineers; foster a high-performance, collaborative team culture
Monitor project progress, ensure delivery timelines, and manage client expectations
Engage in technical pre-sales and solutioning, driving excellence in consulting delivery

Technical Skills & Tools:

Cloud platforms: strong experience with Google Cloud Platform (GCP), particularly BigQuery, Dataform, Dataplex, Cloud Composer, Cloud Storage, Pub/Sub
ETL/ELT tools: Apache Airflow, Dataform, dbt (if applicable)
Languages: Python, SQL, shell scripting
Data warehousing: BigQuery, Snowflake (optional), traditional DWs (e.g., Teradata, Oracle)
DevOps: Git, CI/CD pipelines, Docker
Data modeling: dimensional modeling, Data Vault, star/snowflake schemas
Data governance & lineage: Dataplex, Collibra, or equivalent tools
Monitoring & logging: Stackdriver, Datadog, or similar

Preferred Qualifications:

Proven consulting experience with premium clients or Tier 1 consulting firms
Hands-on experience leading large-scale cloud migration projects
GCP certification(s) (e.g., Professional Data Engineer, Cloud Architect)
Strong client communication, stakeholder management, and leadership skills
Experience with agile methodologies and project management tools such as JIRA
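To make the stack concrete, here is a minimal Cloud Composer (Airflow 2.4+) DAG of the GCS-to-BigQuery migration pattern described above. The bucket, project, dataset and table names are invented for the sketch.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="legacy_orders_migration",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Land the day's extract from Cloud Storage into a raw BigQuery table.
    load_raw = GCSToBigQueryOperator(
        task_id="load_raw_orders",
        bucket="example-landing-bucket",
        source_objects=["orders/{{ ds }}/*.csv"],
        destination_project_dataset_table="example_project.raw.orders",
        source_format="CSV",
        write_disposition="WRITE_TRUNCATE",
        autodetect=True,
    )

    # ELT step: build the curated table inside BigQuery itself.
    transform = BigQueryInsertJobOperator(
        task_id="build_curated_orders",
        configuration={
            "query": {
                "query": """
                    CREATE OR REPLACE TABLE example_project.curated.orders AS
                    SELECT * FROM example_project.raw.orders
                    WHERE order_id IS NOT NULL
                """,
                "useLegacySql": False,
            }
        },
    )

    load_raw >> transform
```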

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

Kolkata, West Bengal, India

On-site


Currently hiring for a "Data Governance Analyst" in a leading bank.

Location: Kolkata / work from office

Experience / Skillset / Certifications Required:

Minimum of 5+ years of enterprise data governance experience, working with data warehouse technologies and data governance solutions (e.g., Data Catalog, MDM and Data Quality)
Must have 3+ years of practical experience configuring business glossaries, dashboards, policies, search, and data maps
3+ years of experience in data standardization: cleansing, transforming and parsing data; developing data standardization mapplets and mappings
Good to have: working knowledge of any of the data governance tools such as Informatica, Collibra, etc.
Good to have: certification from DAMA, EDM Council or IQINT
Good to have: knowledge of AI/ML and its application in data governance

If interested, share your resume at bhumika@peoplemint.in
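By way of example, a small pandas sketch of the cleanse/standardize/parse steps the role describes (the kind of logic built as mapplets in Informatica-style tools). The column names, country mapping and six-digit pincode rule are hypothetical.

```python
import pandas as pd

def standardize_customers(df: pd.DataFrame) -> pd.DataFrame:
    out = df.copy()
    # Cleanse: strip stray whitespace and normalize casing on name fields
    out["full_name"] = out["full_name"].str.strip().str.title()
    # Standardize: map free-text country variants onto canonical codes
    country_map = {"india": "IN", "bharat": "IN", "united states": "US", "usa": "US"}
    out["country_code"] = (
        out["country"].str.strip().str.lower().map(country_map).fillna("UNKNOWN")
    )
    # Parse: split a combined "city, pincode" field into separate columns
    parts = out["city_pincode"].str.extract(r"^(?P<city>.+?),\s*(?P<pincode>\d{6})$")
    return pd.concat([out.drop(columns=["city_pincode"]), parts], axis=1)
```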

Posted 3 weeks ago

Apply

7.0 - 12.0 years

10 - 20 Lacs

Hyderabad

Remote


Job Title: Senior Data Engineer
Location: Remote
Job Type: Full-time
Experience Level: 7+ years

About the Role:

We are seeking a highly skilled Senior Data Engineer to join our team in building a modern data platform on AWS. You will play a key role in transitioning from legacy systems to a scalable, cloud-native architecture using technologies like Apache Iceberg, AWS Glue, Redshift, and Atlan for governance. This role requires hands-on experience across both legacy (e.g., Siebel, Talend, Informatica) and modern data stacks.

Responsibilities:

Design, develop, and optimize data pipelines and ETL/ELT workflows on AWS
Migrate legacy data solutions (Siebel, Talend, Informatica) to modern AWS-native services
Implement and manage a data lake architecture using Apache Iceberg and AWS Glue
Work with Redshift for data warehousing solutions, including performance tuning and modeling
Apply data quality and observability practices using Soda or similar tools
Ensure data governance and metadata management using Atlan (or other tools like Collibra or Alation)
Collaborate with data architects, analysts, and business stakeholders to deliver robust data solutions
Build scalable, secure, and high-performing data platforms supporting both batch and real-time use cases
Participate in defining and enforcing data engineering best practices

Required Qualifications:

7+ years of experience in data engineering and data pipeline development
Strong expertise with AWS services, especially Redshift, Glue, S3, and Athena
Proven experience with Apache Iceberg or similar open table formats (such as Delta Lake or Hudi)
Experience with legacy tools like Siebel, Talend, and Informatica
Knowledge of data governance tools like Atlan, Collibra, or Alation
Experience implementing data quality checks using Soda or an equivalent
Strong SQL and Python skills; familiarity with Spark is a plus
Solid understanding of data modeling, data warehousing, and big data architectures
Strong problem-solving skills and the ability to work in an Agile environment
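A minimal PySpark sketch of writing to an Apache Iceberg table registered in the AWS Glue catalog, as the data lake responsibilities above describe. The catalog name, S3 paths and schema are illustrative, and exact Spark/Iceberg package versions depend on the platform (e.g., Glue 4.0 or EMR).

```python
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder.appName("iceberg-demo")
    # Register an Iceberg catalog named "glue", backed by the AWS Glue catalog
    .config("spark.sql.catalog.glue", "org.apache.iceberg.spark.SparkCatalog")
    .config("spark.sql.catalog.glue.catalog-impl", "org.apache.iceberg.aws.glue.GlueCatalog")
    .config("spark.sql.catalog.glue.warehouse", "s3://example-lake/warehouse/")
    .getOrCreate()
)

# Create an Iceberg table (idempotent) for migrated legacy records.
spark.sql("""
    CREATE TABLE IF NOT EXISTS glue.analytics.orders (
        order_id BIGINT, customer_id BIGINT, amount DECIMAL(12,2), order_date DATE
    ) USING iceberg PARTITIONED BY (order_date)
""")

# Append a day's landing-zone extract using the DataFrameWriterV2 API.
incoming = spark.read.parquet("s3://example-landing/orders/2024-01-01/")
incoming.writeTo("glue.analytics.orders").append()
```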

Posted 3 weeks ago

Apply

4.0 years

0 Lacs

Hyderabad, Telangana, India

Remote


About Tide

At Tide, we are building a business management platform designed to save small businesses time and money. We provide our members with business accounts and related banking services, but also a comprehensive set of connected administrative solutions from invoicing to accounting. Launched in 2017, Tide is now used by over 1 million small businesses across the world and is available to UK, Indian and German SMEs. Headquartered in central London, with offices in Sofia, Hyderabad, Delhi, Berlin and Belgrade, Tide employs over 2,000 people. Tide is rapidly growing, expanding into new products and markets and always looking for passionate and driven people. Join us in our mission to empower small businesses and help them save time and money.

About The Role

As part of the team, you will be responsible for building and running the data pipelines and services required to support business functions, reports and dashboards. We are heavily dependent on BigQuery/Snowflake, Airflow, Stitch/Fivetran, dbt and Tableau/Looker for our business intelligence, and we embrace AWS with some GCP.

As a Data Engineer you’ll be:

Developing end-to-end ETL/ELT pipelines, working with the Data Analysts of each business function
Designing, developing, and implementing scalable, automated processes for data extraction, processing, and analysis in a Data Mesh architecture
Mentoring other junior engineers in the team
Being a “go-to” expert for data technologies and solutions
Providing on-the-ground troubleshooting and diagnosis for architecture and design challenges
Troubleshooting and resolving technical issues as they arise
Looking for ways of improving both what and how data pipelines are delivered by the department
Translating business requirements into technical requirements, such as entities that need to be modelled, dbt models that need to be built, timings, tests and reports
Owning the delivery of data models and reports end to end
Performing exploratory data analysis to identify data quality issues early in the process, and implementing tests to prevent them in the future
Working with Data Analysts to ensure that all data feeds are optimised and available at the required times; this can include Change Capture, Change Data Control and other “delta loading” approaches (a minimal sketch follows this posting)
Discovering, transforming, testing, deploying and documenting data sources
Applying, helping define, and championing data warehouse governance: data quality, testing, coding best practices, and peer review
Building Looker dashboards for use cases where required

What We Are Looking For

You have 4+ years of extensive development experience using Snowflake or a similar data warehouse technology
You have working experience with dbt and other technologies of the modern data stack, such as Snowflake, Apache Airflow, Fivetran, AWS, Git and Looker
You have experience in agile processes, such as Scrum
You have extensive experience in writing advanced SQL statements and performance-tuning them
You have experience in data ingestion techniques using custom or SaaS tools like Fivetran
You have experience in data modelling and can optimise existing and new data models
You have experience in data mining, data warehouse solutions, and ETL, and in using databases in a business environment with large-scale, complex datasets
Experience architecting analytical databases (in a Data Mesh architecture) is an added advantage
You have experience working in an agile, cross-functional delivery team
You have high development standards, especially for code quality, code reviews, unit testing, continuous integration and deployment
You have strong technical documentation skills and the ability to be clear and precise with business users
You have a business level of English and good communication skills
You have a basic understanding of various systems across the AWS platform (good to have)
Preferably, you have worked in a digitally native company, ideally fintech
Experience with Python, a governance tool (e.g. Atlan, Alation, Collibra) or a data quality tool (e.g. Great Expectations, Monte Carlo, Soda) is an added advantage

Our Tech Stack

dbt, Snowflake, Airflow, Fivetran, SQL, Looker

What You’ll Get In Return

Make work, work for you! We are embracing new ways of working and support flexible working arrangements. With our Working Out of Office (WOO) policy our colleagues can work remotely from home or anywhere in their assigned Indian state. Additionally, you can work from a different country or Indian state for 90 days of the year. Plus, you’ll get:

Competitive salary
Self & family health insurance
Term & life insurance
OPD benefits
Mental wellbeing support through Plumm
Learning & development budget
WFH setup allowance
15 days of privilege leave
12 days of casual leave
12 days of sick leave
3 paid days off for volunteering or L&D activities
Stock options

Tidean Ways Of Working

At Tide, we champion a flexible workplace model that supports both in-person and remote work to cater to the specific needs of our different teams. While remote work is supported, we believe in the power of face-to-face interactions to foster team spirit and collaboration. Our offices are designed as hubs for innovation and team-building, where we encourage regular in-person gatherings to foster a strong sense of community.

TIDE IS A PLACE FOR EVERYONE

At Tide, we believe that we can only succeed if we let our differences enrich our culture. Our Tideans come from a variety of backgrounds and experience levels. We consider everyone irrespective of their ethnicity, religion, sexual orientation, gender identity, family or parental status, national origin, veteran, neurodiversity or differently-abled status. We celebrate diversity in our workforce as a cornerstone of our success. Our commitment to a broad spectrum of ideas and backgrounds is what enables us to build products that resonate with our members’ diverse needs and lives. We are One Team and foster a transparent and inclusive environment, where everyone’s voice is heard.

Your personal data will be processed by Tide for recruitment purposes and in accordance with Tide's Recruitment Privacy Notice.
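As flagged in the responsibilities above, here is a minimal sketch of a "delta loading" upsert into Snowflake, written with snowflake-connector-python. This is the shape of merge a dbt incremental model would generate; all connection details and table names are placeholders, not Tide's actual schema.

```python
import snowflake.connector  # pip install snowflake-connector-python

# Upsert a change-data batch from staging into the analytics table, keeping
# only the newest version of each order.
MERGE_SQL = """
MERGE INTO analytics.orders AS target
USING staging.orders_delta AS source
  ON target.order_id = source.order_id
WHEN MATCHED AND source.updated_at > target.updated_at THEN UPDATE SET
  target.status = source.status, target.updated_at = source.updated_at
WHEN NOT MATCHED THEN INSERT (order_id, status, updated_at)
  VALUES (source.order_id, source.status, source.updated_at)
"""

with snowflake.connector.connect(
    account="example_account", user="example_user", password="***",
    warehouse="TRANSFORM_WH", database="EXAMPLE_DB",
) as conn:
    conn.cursor().execute(MERGE_SQL)
```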

Posted 3 weeks ago

Apply

4.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Job Title: Decision Science Practitioner Analyst, S&C GN
Management Level: Senior Analyst
Location: Bangalore / Kolkata
Must-have skills: Collibra Data Quality (data profiling, anomaly detection, reconciliation, data validation), Python, SQL
Good-to-have skills: PySpark, Kubernetes, Docker, Git

Job Summary:

We are seeking a highly skilled and motivated Data Science cum Data Engineer Senior Analyst to lead innovative projects and drive impactful solutions in domains such as Consumer Tech, Enterprise Tech, and Semiconductors. This role combines hands-on technical expertise with client delivery management to execute cutting-edge projects in data science and data engineering.

Key Responsibilities

Data Science and Engineering

Implement and manage end-to-end Data Quality frameworks using Collibra Data Quality (CDQ). This includes requirement gathering from the client, code development in SQL, unit testing, client demos, user acceptance testing, documentation, etc.
Work extensively with business users, data analysts, and other stakeholders to understand data quality requirements and business use cases.
Develop data validation, profiling, anomaly detection, and reconciliation processes.
Write SQL queries for simple to complex data quality checks, and Python and PySpark scripts to support data transformation and data ingestion.
Deploy and manage solutions on Kubernetes workloads for scalable execution.
Maintain comprehensive technical documentation of Data Quality processes and implemented solutions.
Work in an Agile environment, leveraging Jira for sprint planning and task management.
Troubleshoot data quality issues and collaborate with engineering teams for resolution.
Provide insights for continuous improvement in data governance and quality processes.
Build and manage robust data pipelines using PySpark and Python to read from and write to databases such as Vertica and PostgreSQL.
Optimize and maintain existing pipelines for performance and reliability.
Build custom solutions using Python, including FastAPI applications and plugins for Collibra Data Quality.
Oversee the infrastructure of the Collibra application in a Kubernetes environment, perform upgrades when required, and troubleshoot and resolve any Kubernetes issues that may affect the application's operation.
Deploy and manage solutions and optimize resources for deployments in Kubernetes, including writing YAML files and managing configurations.
Build and deploy Docker images for various use cases, ensuring efficient and reusable solutions.

Collaboration and Training

Communicate effectively with stakeholders to align technical implementations with business objectives.
Provide training and guidance to stakeholders on Collibra Data Quality usage and help them build and implement data quality rules.

Version Control and Documentation

Use Git for version control to manage code and collaborate effectively.
Document all implementations, including data quality workflows, data pipelines, and deployment processes, ensuring easy reference and knowledge sharing.

Database and Data Model Optimization

Design and optimize data models for efficient storage and retrieval.

Required Qualifications

Experience: 4+ years in data science
Education: B.Tech or M.Tech in Computer Science, Statistics, Applied Mathematics, or a related field
Industry knowledge: experience in Consumer Tech, Enterprise Tech, or Semiconductors is preferred but not mandatory

Technical Skills

Programming: proficiency in Python and SQL for data analysis and transformation
Tools: hands-on experience with Collibra Data Quality (CDQ) or similar data quality tools (e.g., Informatica DQ, Talend, Great Expectations, Ataccama)
Experience working with Kubernetes workloads
Experience with Agile methodologies and task tracking using Jira

Preferred Skills

Strong analytical and problem-solving skills with a results-oriented mindset
Good communication, stakeholder management and requirement gathering capabilities

Additional Information:

The ideal candidate will possess a strong educational background in a quantitative discipline and experience working with Hi-Tech clients. This position is based at our Bengaluru (preferred) and Kolkata offices.

About Our Company | Accenture
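To ground the reconciliation and anomaly-detection responsibilities, here is a generic PySpark sketch. Collibra DQ would express these as managed SQL rules rather than a script, so the table names, columns and three-sigma threshold here are purely illustrative.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq-reconciliation").getOrCreate()

source = spark.table("staging.invoices")    # hypothetical source extract
target = spark.table("warehouse.invoices")  # hypothetical loaded table

# Row-count reconciliation between source and target
src_count, tgt_count = source.count(), target.count()
print(f"row counts match: {src_count == tgt_count} ({src_count} vs {tgt_count})")

# Sum reconciliation on a monetary column, tolerant of rounding
src_sum = source.agg(F.sum("amount")).first()[0]
tgt_sum = target.agg(F.sum("amount")).first()[0]
print(f"amount totals within tolerance: {abs(src_sum - tgt_sum) < 0.01}")

# Simple anomaly check: flag days whose volume deviates >3 sigma from the mean
daily = target.groupBy("invoice_date").count()
stats = daily.agg(F.mean("count").alias("mu"), F.stddev("count").alias("sd")).first()
anomalies = daily.filter(F.abs(F.col("count") - stats.mu) > 3 * stats.sd)
anomalies.show()
```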

Posted 3 weeks ago

Apply