309 Collibra Jobs - Page 5

JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

1.0 - 3.0 years

0 Lacs

Coimbatore, Tamil Nadu, India

Remote


At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

The opportunity
The objective of our Digital Risk Consulting service is to support clients with the development, implementation, improvement, and modernization of their technology risk and compliance programs to address the constantly changing risk and technology landscape. Our solutions help clients build confidence and trust with their customers and the overall market, and meet obligations imposed by regulation or contract.

Your Key Responsibilities
You will operate as a team leader on engagements, helping clients develop and strengthen their IT risk and compliance programs. You will work directly with clients to review their IT processes and controls, remediate and implement controls, onboard new tools and services into risk and compliance frameworks, and support readiness for and adherence to new compliance regulations. Your responsibilities include both in-person and remote oversight and coaching of engagement team members, reporting to senior engagement team members and client leadership, and partnering with key client contacts to complete the engagement work.

What You'll Do
- Design and implement solutions to data-related technical and compliance challenges such as DevSecOps, data strategy, data governance, data risks and controls, data testing, data architecture, data platforms, data solution implementation, data quality and data security, in order to manage and mitigate risk.
- Leverage data analytics tools and software to build robust, scalable solutions through data analysis and visualization using SQL, Python and visualization tools.
- Design and implement comprehensive data analytics strategies to support business decision-making.
- Collect, clean, and interpret large datasets from multiple sources, ensuring completeness, accuracy and integrity of data.
- Integrate and/or pilot next-generation technologies such as cloud platforms, machine learning and Generative AI (GenAI).
- Develop custom scripts and algorithms to automate data processing and analysis and generate insights.
- Apply business and domain knowledge, including regulatory requirements and industry standards, to solve complex data-related challenges.
- Analyze data to uncover trends and generate insights that inform business decisions.
- Build and maintain relationships across Engineering, Product, Operations, Internal Audit, external audit and other external stakeholders to drive effective financial risk management.
- Work with DevSecOps, Security Assurance, Engineering, and Product teams to improve the efficiency of control environments and provide risk management through automation and process improvement.
- Bridge gaps between IT controls and business controls, including ITGCs and automated business controls.
- Work with Internal Audit to ensure the complete control environment is managed.
- Work with emerging products to understand their risk profile and ensure an appropriate control environment is established.
- Implement new processes and controls in response to changes in the business environment, such as new product introductions, changes in accounting standards, internal process changes or reorganizations.

What You'll Need
- Experience in data architecture, data management, data engineering, data science or data analytics.
- Experience building analytical queries and dashboards using SQL, NoSQL, Python, etc.
- Proficiency in SQL and quantitative analysis: the ability to dive deep into large amounts of data, draw meaningful insights, dissect business issues and reach actionable conclusions.
- Knowledge of tools in the following areas: Scripting and Programming (e.g., Python, SQL, R, Java, Scala); Big Data Tools (e.g., Hadoop, Hive, Pig, Impala, Mahout); Data Management (e.g., Informatica, Collibra, SAP, Oracle, IBM); Predictive Analytics (e.g., Python, IBM SPSS, SAS Enterprise Miner, R, MATLAB); Data Visualization (e.g., Tableau, Power BI, TIBCO Spotfire, QlikView, SPSS); Data Mining (e.g., Microsoft SQL Server); Cloud Platforms (e.g., AWS, Azure, or Google Cloud).
- Ability to analyze complex processes to identify potential financial, operational, systems and compliance risks across major finance cycles.
- Ability to assist management with the integration of security practices in the product development lifecycle (DevSecOps).
- Experience with homegrown applications in a microservices/DevOps environment.
- Experience identifying potential security risks in platform environments and developing strategies to mitigate them.
- Experience with SOX readiness assessments and control implementation.
- Knowledge of DevOps practices, CI/CD pipelines, code management and automation tools (e.g., Jenkins, Git, Phab, Artifactory, SonarQube, Selenium, Fortify, Acunetix, Prisma Cloud).

Preferred: experience in
- Managing technical data projects
- Leveraging data analytics tools/software to develop solutions and scripts
- Developing statistical modeling tools and techniques
- Developing and executing data governance frameworks or operating models
- Identifying data risks and designing and/or implementing appropriate controls
- Implementing data quality processes
- Developing data services and solutions in a cloud environment
- Designing data architecture
- Analyzing complex data sets and communicating findings effectively
- Process management, including process redesign and optimization
- Scripting languages (e.g., Python, Bash)
- Cloud platforms (e.g., AWS, Azure, GCP) and securing cloud-based applications/services

To qualify for the role, you must have
- A bachelor's or master's degree
- 1-3 years of experience as an IT risk consultant or in data analytics
- Experience applying relevant technical knowledge in at least one of the following engagement types: (a) risk consulting, (b) financial statement audits, (c) internal or operational audits, (d) IT compliance, and/or (e) Service Organization Controls Reporting engagements
- Availability to travel outside of your assigned office location at least 50% of the time, plus commute within the region (where public transportation often is not available)
- Willingness to work in excess of standard hours when necessary
- A valid passport

Ideally, you'll also have
- A bachelor's or master's degree in business, computer science, information systems, informatics, computer engineering, accounting, or a related discipline
- CISA, CISSP, CISM, CPA or CA certification is desired; non-certified hires are required to become certified to be eligible for promotion to Manager

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
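Purely as an illustration of the kind of SQL/Python data-completeness check this role describes, and not as part of the EY posting itself, a minimal pandas sketch might look like the following. The file name, column names, and the 1% tolerance are hypothetical.

```python
# Illustrative sketch only: a simple completeness check over a hypothetical
# control-testing extract named "user_access_report.csv".
import pandas as pd

REQUIRED_COLUMNS = ["user_id", "role", "last_review_date"]  # assumed columns

def completeness_check(path: str) -> pd.DataFrame:
    """Return a per-column summary of missing values for the extract."""
    df = pd.read_csv(path, parse_dates=["last_review_date"])
    missing = df[REQUIRED_COLUMNS].isna().sum()
    summary = pd.DataFrame({
        "missing_count": missing,
        "missing_pct": (missing / len(df) * 100).round(2),
    })
    # Flag columns breaching a hypothetical 1% completeness tolerance.
    summary["breach"] = summary["missing_pct"] > 1.0
    return summary

if __name__ == "__main__":
    print(completeness_check("user_access_report.csv"))
```

In practice a check like this would be wired into the engagement's control-testing workflow and parameterised per data source, rather than run ad hoc.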

Posted 1 week ago

Apply

1.0 - 3.0 years

0 Lacs

Gurugram, Haryana, India

Remote


At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

The opportunity
The objective of our Digital Risk Consulting service is to support clients with the development, implementation, improvement, and modernization of their technology risk and compliance programs to address the constantly changing risk and technology landscape. Our solutions help clients build confidence and trust with their customers and the overall market, and meet obligations imposed by regulation or contract.

Your Key Responsibilities
You will operate as a team leader on engagements, helping clients develop and strengthen their IT risk and compliance programs. You will work directly with clients to review their IT processes and controls, remediate and implement controls, onboard new tools and services into risk and compliance frameworks, and support readiness for and adherence to new compliance regulations. Your responsibilities include both in-person and remote oversight and coaching of engagement team members, reporting to senior engagement team members and client leadership, and partnering with key client contacts to complete the engagement work.

What You'll Do
- Design and implement solutions to data-related technical and compliance challenges such as DevSecOps, data strategy, data governance, data risks and controls, data testing, data architecture, data platforms, data solution implementation, data quality and data security, in order to manage and mitigate risk.
- Leverage data analytics tools and software to build robust, scalable solutions through data analysis and visualization using SQL, Python and visualization tools.
- Design and implement comprehensive data analytics strategies to support business decision-making.
- Collect, clean, and interpret large datasets from multiple sources, ensuring completeness, accuracy and integrity of data.
- Integrate and/or pilot next-generation technologies such as cloud platforms, machine learning and Generative AI (GenAI).
- Develop custom scripts and algorithms to automate data processing and analysis and generate insights.
- Apply business and domain knowledge, including regulatory requirements and industry standards, to solve complex data-related challenges.
- Analyze data to uncover trends and generate insights that inform business decisions.
- Build and maintain relationships across Engineering, Product, Operations, Internal Audit, external audit and other external stakeholders to drive effective financial risk management.
- Work with DevSecOps, Security Assurance, Engineering, and Product teams to improve the efficiency of control environments and provide risk management through automation and process improvement.
- Bridge gaps between IT controls and business controls, including ITGCs and automated business controls.
- Work with Internal Audit to ensure the complete control environment is managed.
- Work with emerging products to understand their risk profile and ensure an appropriate control environment is established.
- Implement new processes and controls in response to changes in the business environment, such as new product introductions, changes in accounting standards, internal process changes or reorganizations.

What You'll Need
- Experience in data architecture, data management, data engineering, data science or data analytics.
- Experience building analytical queries and dashboards using SQL, NoSQL, Python, etc.
- Proficiency in SQL and quantitative analysis: the ability to dive deep into large amounts of data, draw meaningful insights, dissect business issues and reach actionable conclusions.
- Knowledge of tools in the following areas: Scripting and Programming (e.g., Python, SQL, R, Java, Scala); Big Data Tools (e.g., Hadoop, Hive, Pig, Impala, Mahout); Data Management (e.g., Informatica, Collibra, SAP, Oracle, IBM); Predictive Analytics (e.g., Python, IBM SPSS, SAS Enterprise Miner, R, MATLAB); Data Visualization (e.g., Tableau, Power BI, TIBCO Spotfire, QlikView, SPSS); Data Mining (e.g., Microsoft SQL Server); Cloud Platforms (e.g., AWS, Azure, or Google Cloud).
- Ability to analyze complex processes to identify potential financial, operational, systems and compliance risks across major finance cycles.
- Ability to assist management with the integration of security practices in the product development lifecycle (DevSecOps).
- Experience with homegrown applications in a microservices/DevOps environment.
- Experience identifying potential security risks in platform environments and developing strategies to mitigate them.
- Experience with SOX readiness assessments and control implementation.
- Knowledge of DevOps practices, CI/CD pipelines, code management and automation tools (e.g., Jenkins, Git, Phab, Artifactory, SonarQube, Selenium, Fortify, Acunetix, Prisma Cloud).

Preferred: experience in
- Managing technical data projects
- Leveraging data analytics tools/software to develop solutions and scripts
- Developing statistical modeling tools and techniques
- Developing and executing data governance frameworks or operating models
- Identifying data risks and designing and/or implementing appropriate controls
- Implementing data quality processes
- Developing data services and solutions in a cloud environment
- Designing data architecture
- Analyzing complex data sets and communicating findings effectively
- Process management, including process redesign and optimization
- Scripting languages (e.g., Python, Bash)
- Cloud platforms (e.g., AWS, Azure, GCP) and securing cloud-based applications/services

To qualify for the role, you must have
- A bachelor's or master's degree
- 1-3 years of experience as an IT risk consultant or in data analytics
- Experience applying relevant technical knowledge in at least one of the following engagement types: (a) risk consulting, (b) financial statement audits, (c) internal or operational audits, (d) IT compliance, and/or (e) Service Organization Controls Reporting engagements
- Availability to travel outside of your assigned office location at least 50% of the time, plus commute within the region (where public transportation often is not available)
- Willingness to work in excess of standard hours when necessary
- A valid passport

Ideally, you'll also have
- A bachelor's or master's degree in business, computer science, information systems, informatics, computer engineering, accounting, or a related discipline
- CISA, CISSP, CISM, CPA or CA certification is desired; non-certified hires are required to become certified to be eligible for promotion to Manager

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 1 week ago

Apply

1.0 - 3.0 years

0 Lacs

Kolkata, West Bengal, India

Remote


At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

The opportunity
The objective of our Digital Risk Consulting service is to support clients with the development, implementation, improvement, and modernization of their technology risk and compliance programs to address the constantly changing risk and technology landscape. Our solutions help clients build confidence and trust with their customers and the overall market, and meet obligations imposed by regulation or contract.

Your Key Responsibilities
You will operate as a team leader on engagements, helping clients develop and strengthen their IT risk and compliance programs. You will work directly with clients to review their IT processes and controls, remediate and implement controls, onboard new tools and services into risk and compliance frameworks, and support readiness for and adherence to new compliance regulations. Your responsibilities include both in-person and remote oversight and coaching of engagement team members, reporting to senior engagement team members and client leadership, and partnering with key client contacts to complete the engagement work.

What You'll Do
- Design and implement solutions to data-related technical and compliance challenges such as DevSecOps, data strategy, data governance, data risks and controls, data testing, data architecture, data platforms, data solution implementation, data quality and data security, in order to manage and mitigate risk.
- Leverage data analytics tools and software to build robust, scalable solutions through data analysis and visualization using SQL, Python and visualization tools.
- Design and implement comprehensive data analytics strategies to support business decision-making.
- Collect, clean, and interpret large datasets from multiple sources, ensuring completeness, accuracy and integrity of data.
- Integrate and/or pilot next-generation technologies such as cloud platforms, machine learning and Generative AI (GenAI).
- Develop custom scripts and algorithms to automate data processing and analysis and generate insights.
- Apply business and domain knowledge, including regulatory requirements and industry standards, to solve complex data-related challenges.
- Analyze data to uncover trends and generate insights that inform business decisions.
- Build and maintain relationships across Engineering, Product, Operations, Internal Audit, external audit and other external stakeholders to drive effective financial risk management.
- Work with DevSecOps, Security Assurance, Engineering, and Product teams to improve the efficiency of control environments and provide risk management through automation and process improvement.
- Bridge gaps between IT controls and business controls, including ITGCs and automated business controls.
- Work with Internal Audit to ensure the complete control environment is managed.
- Work with emerging products to understand their risk profile and ensure an appropriate control environment is established.
- Implement new processes and controls in response to changes in the business environment, such as new product introductions, changes in accounting standards, internal process changes or reorganizations.

What You'll Need
- Experience in data architecture, data management, data engineering, data science or data analytics.
- Experience building analytical queries and dashboards using SQL, NoSQL, Python, etc.
- Proficiency in SQL and quantitative analysis: the ability to dive deep into large amounts of data, draw meaningful insights, dissect business issues and reach actionable conclusions.
- Knowledge of tools in the following areas: Scripting and Programming (e.g., Python, SQL, R, Java, Scala); Big Data Tools (e.g., Hadoop, Hive, Pig, Impala, Mahout); Data Management (e.g., Informatica, Collibra, SAP, Oracle, IBM); Predictive Analytics (e.g., Python, IBM SPSS, SAS Enterprise Miner, R, MATLAB); Data Visualization (e.g., Tableau, Power BI, TIBCO Spotfire, QlikView, SPSS); Data Mining (e.g., Microsoft SQL Server); Cloud Platforms (e.g., AWS, Azure, or Google Cloud).
- Ability to analyze complex processes to identify potential financial, operational, systems and compliance risks across major finance cycles.
- Ability to assist management with the integration of security practices in the product development lifecycle (DevSecOps).
- Experience with homegrown applications in a microservices/DevOps environment.
- Experience identifying potential security risks in platform environments and developing strategies to mitigate them.
- Experience with SOX readiness assessments and control implementation.
- Knowledge of DevOps practices, CI/CD pipelines, code management and automation tools (e.g., Jenkins, Git, Phab, Artifactory, SonarQube, Selenium, Fortify, Acunetix, Prisma Cloud).

Preferred: experience in
- Managing technical data projects
- Leveraging data analytics tools/software to develop solutions and scripts
- Developing statistical modeling tools and techniques
- Developing and executing data governance frameworks or operating models
- Identifying data risks and designing and/or implementing appropriate controls
- Implementing data quality processes
- Developing data services and solutions in a cloud environment
- Designing data architecture
- Analyzing complex data sets and communicating findings effectively
- Process management, including process redesign and optimization
- Scripting languages (e.g., Python, Bash)
- Cloud platforms (e.g., AWS, Azure, GCP) and securing cloud-based applications/services

To qualify for the role, you must have
- A bachelor's or master's degree
- 1-3 years of experience as an IT risk consultant or in data analytics
- Experience applying relevant technical knowledge in at least one of the following engagement types: (a) risk consulting, (b) financial statement audits, (c) internal or operational audits, (d) IT compliance, and/or (e) Service Organization Controls Reporting engagements
- Availability to travel outside of your assigned office location at least 50% of the time, plus commute within the region (where public transportation often is not available)
- Willingness to work in excess of standard hours when necessary
- A valid passport

Ideally, you'll also have
- A bachelor's or master's degree in business, computer science, information systems, informatics, computer engineering, accounting, or a related discipline
- CISA, CISSP, CISM, CPA or CA certification is desired; non-certified hires are required to become certified to be eligible for promotion to Manager

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 1 week ago

Apply

1.0 - 3.0 years

0 Lacs

Kanayannur, Kerala, India

Remote


At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

The opportunity
The objective of our Digital Risk Consulting service is to support clients with the development, implementation, improvement, and modernization of their technology risk and compliance programs to address the constantly changing risk and technology landscape. Our solutions help clients build confidence and trust with their customers and the overall market, and meet obligations imposed by regulation or contract.

Your Key Responsibilities
You will operate as a team leader on engagements, helping clients develop and strengthen their IT risk and compliance programs. You will work directly with clients to review their IT processes and controls, remediate and implement controls, onboard new tools and services into risk and compliance frameworks, and support readiness for and adherence to new compliance regulations. Your responsibilities include both in-person and remote oversight and coaching of engagement team members, reporting to senior engagement team members and client leadership, and partnering with key client contacts to complete the engagement work.

What You'll Do
- Design and implement solutions to data-related technical and compliance challenges such as DevSecOps, data strategy, data governance, data risks and controls, data testing, data architecture, data platforms, data solution implementation, data quality and data security, in order to manage and mitigate risk.
- Leverage data analytics tools and software to build robust, scalable solutions through data analysis and visualization using SQL, Python and visualization tools.
- Design and implement comprehensive data analytics strategies to support business decision-making.
- Collect, clean, and interpret large datasets from multiple sources, ensuring completeness, accuracy and integrity of data.
- Integrate and/or pilot next-generation technologies such as cloud platforms, machine learning and Generative AI (GenAI).
- Develop custom scripts and algorithms to automate data processing and analysis and generate insights.
- Apply business and domain knowledge, including regulatory requirements and industry standards, to solve complex data-related challenges.
- Analyze data to uncover trends and generate insights that inform business decisions.
- Build and maintain relationships across Engineering, Product, Operations, Internal Audit, external audit and other external stakeholders to drive effective financial risk management.
- Work with DevSecOps, Security Assurance, Engineering, and Product teams to improve the efficiency of control environments and provide risk management through automation and process improvement.
- Bridge gaps between IT controls and business controls, including ITGCs and automated business controls.
- Work with Internal Audit to ensure the complete control environment is managed.
- Work with emerging products to understand their risk profile and ensure an appropriate control environment is established.
- Implement new processes and controls in response to changes in the business environment, such as new product introductions, changes in accounting standards, internal process changes or reorganizations.

What You'll Need
- Experience in data architecture, data management, data engineering, data science or data analytics.
- Experience building analytical queries and dashboards using SQL, NoSQL, Python, etc.
- Proficiency in SQL and quantitative analysis: the ability to dive deep into large amounts of data, draw meaningful insights, dissect business issues and reach actionable conclusions.
- Knowledge of tools in the following areas: Scripting and Programming (e.g., Python, SQL, R, Java, Scala); Big Data Tools (e.g., Hadoop, Hive, Pig, Impala, Mahout); Data Management (e.g., Informatica, Collibra, SAP, Oracle, IBM); Predictive Analytics (e.g., Python, IBM SPSS, SAS Enterprise Miner, R, MATLAB); Data Visualization (e.g., Tableau, Power BI, TIBCO Spotfire, QlikView, SPSS); Data Mining (e.g., Microsoft SQL Server); Cloud Platforms (e.g., AWS, Azure, or Google Cloud).
- Ability to analyze complex processes to identify potential financial, operational, systems and compliance risks across major finance cycles.
- Ability to assist management with the integration of security practices in the product development lifecycle (DevSecOps).
- Experience with homegrown applications in a microservices/DevOps environment.
- Experience identifying potential security risks in platform environments and developing strategies to mitigate them.
- Experience with SOX readiness assessments and control implementation.
- Knowledge of DevOps practices, CI/CD pipelines, code management and automation tools (e.g., Jenkins, Git, Phab, Artifactory, SonarQube, Selenium, Fortify, Acunetix, Prisma Cloud).

Preferred: experience in
- Managing technical data projects
- Leveraging data analytics tools/software to develop solutions and scripts
- Developing statistical modeling tools and techniques
- Developing and executing data governance frameworks or operating models
- Identifying data risks and designing and/or implementing appropriate controls
- Implementing data quality processes
- Developing data services and solutions in a cloud environment
- Designing data architecture
- Analyzing complex data sets and communicating findings effectively
- Process management, including process redesign and optimization
- Scripting languages (e.g., Python, Bash)
- Cloud platforms (e.g., AWS, Azure, GCP) and securing cloud-based applications/services

To qualify for the role, you must have
- A bachelor's or master's degree
- 1-3 years of experience as an IT risk consultant or in data analytics
- Experience applying relevant technical knowledge in at least one of the following engagement types: (a) risk consulting, (b) financial statement audits, (c) internal or operational audits, (d) IT compliance, and/or (e) Service Organization Controls Reporting engagements
- Availability to travel outside of your assigned office location at least 50% of the time, plus commute within the region (where public transportation often is not available)
- Willingness to work in excess of standard hours when necessary
- A valid passport

Ideally, you'll also have
- A bachelor's or master's degree in business, computer science, information systems, informatics, computer engineering, accounting, or a related discipline
- CISA, CISSP, CISM, CPA or CA certification is desired; non-certified hires are required to become certified to be eligible for promotion to Manager

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 1 week ago

Apply

1.0 - 3.0 years

0 Lacs

Trivandrum, Kerala, India

Remote


At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

The opportunity
The objective of our Digital Risk Consulting service is to support clients with the development, implementation, improvement, and modernization of their technology risk and compliance programs to address the constantly changing risk and technology landscape. Our solutions help clients build confidence and trust with their customers and the overall market, and meet obligations imposed by regulation or contract.

Your Key Responsibilities
You will operate as a team leader on engagements, helping clients develop and strengthen their IT risk and compliance programs. You will work directly with clients to review their IT processes and controls, remediate and implement controls, onboard new tools and services into risk and compliance frameworks, and support readiness for and adherence to new compliance regulations. Your responsibilities include both in-person and remote oversight and coaching of engagement team members, reporting to senior engagement team members and client leadership, and partnering with key client contacts to complete the engagement work.

What You'll Do
- Design and implement solutions to data-related technical and compliance challenges such as DevSecOps, data strategy, data governance, data risks and controls, data testing, data architecture, data platforms, data solution implementation, data quality and data security, in order to manage and mitigate risk.
- Leverage data analytics tools and software to build robust, scalable solutions through data analysis and visualization using SQL, Python and visualization tools.
- Design and implement comprehensive data analytics strategies to support business decision-making.
- Collect, clean, and interpret large datasets from multiple sources, ensuring completeness, accuracy and integrity of data.
- Integrate and/or pilot next-generation technologies such as cloud platforms, machine learning and Generative AI (GenAI).
- Develop custom scripts and algorithms to automate data processing and analysis and generate insights.
- Apply business and domain knowledge, including regulatory requirements and industry standards, to solve complex data-related challenges.
- Analyze data to uncover trends and generate insights that inform business decisions.
- Build and maintain relationships across Engineering, Product, Operations, Internal Audit, external audit and other external stakeholders to drive effective financial risk management.
- Work with DevSecOps, Security Assurance, Engineering, and Product teams to improve the efficiency of control environments and provide risk management through automation and process improvement.
- Bridge gaps between IT controls and business controls, including ITGCs and automated business controls.
- Work with Internal Audit to ensure the complete control environment is managed.
- Work with emerging products to understand their risk profile and ensure an appropriate control environment is established.
- Implement new processes and controls in response to changes in the business environment, such as new product introductions, changes in accounting standards, internal process changes or reorganizations.

What You'll Need
- Experience in data architecture, data management, data engineering, data science or data analytics.
- Experience building analytical queries and dashboards using SQL, NoSQL, Python, etc.
- Proficiency in SQL and quantitative analysis: the ability to dive deep into large amounts of data, draw meaningful insights, dissect business issues and reach actionable conclusions.
- Knowledge of tools in the following areas: Scripting and Programming (e.g., Python, SQL, R, Java, Scala); Big Data Tools (e.g., Hadoop, Hive, Pig, Impala, Mahout); Data Management (e.g., Informatica, Collibra, SAP, Oracle, IBM); Predictive Analytics (e.g., Python, IBM SPSS, SAS Enterprise Miner, R, MATLAB); Data Visualization (e.g., Tableau, Power BI, TIBCO Spotfire, QlikView, SPSS); Data Mining (e.g., Microsoft SQL Server); Cloud Platforms (e.g., AWS, Azure, or Google Cloud).
- Ability to analyze complex processes to identify potential financial, operational, systems and compliance risks across major finance cycles.
- Ability to assist management with the integration of security practices in the product development lifecycle (DevSecOps).
- Experience with homegrown applications in a microservices/DevOps environment.
- Experience identifying potential security risks in platform environments and developing strategies to mitigate them.
- Experience with SOX readiness assessments and control implementation.
- Knowledge of DevOps practices, CI/CD pipelines, code management and automation tools (e.g., Jenkins, Git, Phab, Artifactory, SonarQube, Selenium, Fortify, Acunetix, Prisma Cloud).

Preferred: experience in
- Managing technical data projects
- Leveraging data analytics tools/software to develop solutions and scripts
- Developing statistical modeling tools and techniques
- Developing and executing data governance frameworks or operating models
- Identifying data risks and designing and/or implementing appropriate controls
- Implementing data quality processes
- Developing data services and solutions in a cloud environment
- Designing data architecture
- Analyzing complex data sets and communicating findings effectively
- Process management, including process redesign and optimization
- Scripting languages (e.g., Python, Bash)
- Cloud platforms (e.g., AWS, Azure, GCP) and securing cloud-based applications/services

To qualify for the role, you must have
- A bachelor's or master's degree
- 1-3 years of experience as an IT risk consultant or in data analytics
- Experience applying relevant technical knowledge in at least one of the following engagement types: (a) risk consulting, (b) financial statement audits, (c) internal or operational audits, (d) IT compliance, and/or (e) Service Organization Controls Reporting engagements
- Availability to travel outside of your assigned office location at least 50% of the time, plus commute within the region (where public transportation often is not available)
- Willingness to work in excess of standard hours when necessary
- A valid passport

Ideally, you'll also have
- A bachelor's or master's degree in business, computer science, information systems, informatics, computer engineering, accounting, or a related discipline
- CISA, CISSP, CISM, CPA or CA certification is desired; non-certified hires are required to become certified to be eligible for promotion to Manager

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 1 week ago

Apply

1.0 - 3.0 years

0 Lacs

Noida, Uttar Pradesh, India

Remote


At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

The opportunity
The objective of our Digital Risk Consulting service is to support clients with the development, implementation, improvement, and modernization of their technology risk and compliance programs to address the constantly changing risk and technology landscape. Our solutions help clients build confidence and trust with their customers and the overall market, and meet obligations imposed by regulation or contract.

Your Key Responsibilities
You will operate as a team leader on engagements, helping clients develop and strengthen their IT risk and compliance programs. You will work directly with clients to review their IT processes and controls, remediate and implement controls, onboard new tools and services into risk and compliance frameworks, and support readiness for and adherence to new compliance regulations. Your responsibilities include both in-person and remote oversight and coaching of engagement team members, reporting to senior engagement team members and client leadership, and partnering with key client contacts to complete the engagement work.

What You'll Do
- Design and implement solutions to data-related technical and compliance challenges such as DevSecOps, data strategy, data governance, data risks and controls, data testing, data architecture, data platforms, data solution implementation, data quality and data security, in order to manage and mitigate risk.
- Leverage data analytics tools and software to build robust, scalable solutions through data analysis and visualization using SQL, Python and visualization tools.
- Design and implement comprehensive data analytics strategies to support business decision-making.
- Collect, clean, and interpret large datasets from multiple sources, ensuring completeness, accuracy and integrity of data.
- Integrate and/or pilot next-generation technologies such as cloud platforms, machine learning and Generative AI (GenAI).
- Develop custom scripts and algorithms to automate data processing and analysis and generate insights.
- Apply business and domain knowledge, including regulatory requirements and industry standards, to solve complex data-related challenges.
- Analyze data to uncover trends and generate insights that inform business decisions.
- Build and maintain relationships across Engineering, Product, Operations, Internal Audit, external audit and other external stakeholders to drive effective financial risk management.
- Work with DevSecOps, Security Assurance, Engineering, and Product teams to improve the efficiency of control environments and provide risk management through automation and process improvement.
- Bridge gaps between IT controls and business controls, including ITGCs and automated business controls.
- Work with Internal Audit to ensure the complete control environment is managed.
- Work with emerging products to understand their risk profile and ensure an appropriate control environment is established.
- Implement new processes and controls in response to changes in the business environment, such as new product introductions, changes in accounting standards, internal process changes or reorganizations.

What You'll Need
- Experience in data architecture, data management, data engineering, data science or data analytics.
- Experience building analytical queries and dashboards using SQL, NoSQL, Python, etc.
- Proficiency in SQL and quantitative analysis: the ability to dive deep into large amounts of data, draw meaningful insights, dissect business issues and reach actionable conclusions.
- Knowledge of tools in the following areas: Scripting and Programming (e.g., Python, SQL, R, Java, Scala); Big Data Tools (e.g., Hadoop, Hive, Pig, Impala, Mahout); Data Management (e.g., Informatica, Collibra, SAP, Oracle, IBM); Predictive Analytics (e.g., Python, IBM SPSS, SAS Enterprise Miner, R, MATLAB); Data Visualization (e.g., Tableau, Power BI, TIBCO Spotfire, QlikView, SPSS); Data Mining (e.g., Microsoft SQL Server); Cloud Platforms (e.g., AWS, Azure, or Google Cloud).
- Ability to analyze complex processes to identify potential financial, operational, systems and compliance risks across major finance cycles.
- Ability to assist management with the integration of security practices in the product development lifecycle (DevSecOps).
- Experience with homegrown applications in a microservices/DevOps environment.
- Experience identifying potential security risks in platform environments and developing strategies to mitigate them.
- Experience with SOX readiness assessments and control implementation.
- Knowledge of DevOps practices, CI/CD pipelines, code management and automation tools (e.g., Jenkins, Git, Phab, Artifactory, SonarQube, Selenium, Fortify, Acunetix, Prisma Cloud).

Preferred: experience in
- Managing technical data projects
- Leveraging data analytics tools/software to develop solutions and scripts
- Developing statistical modeling tools and techniques
- Developing and executing data governance frameworks or operating models
- Identifying data risks and designing and/or implementing appropriate controls
- Implementing data quality processes
- Developing data services and solutions in a cloud environment
- Designing data architecture
- Analyzing complex data sets and communicating findings effectively
- Process management, including process redesign and optimization
- Scripting languages (e.g., Python, Bash)
- Cloud platforms (e.g., AWS, Azure, GCP) and securing cloud-based applications/services

To qualify for the role, you must have
- A bachelor's or master's degree
- 1-3 years of experience as an IT risk consultant or in data analytics
- Experience applying relevant technical knowledge in at least one of the following engagement types: (a) risk consulting, (b) financial statement audits, (c) internal or operational audits, (d) IT compliance, and/or (e) Service Organization Controls Reporting engagements
- Availability to travel outside of your assigned office location at least 50% of the time, plus commute within the region (where public transportation often is not available)
- Willingness to work in excess of standard hours when necessary
- A valid passport

Ideally, you'll also have
- A bachelor's or master's degree in business, computer science, information systems, informatics, computer engineering, accounting, or a related discipline
- CISA, CISSP, CISM, CPA or CA certification is desired; non-certified hires are required to become certified to be eligible for promotion to Manager

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 1 week ago

Apply

5.0 - 10.0 years

0 Lacs

Pune, Maharashtra, India

On-site


About Advanced Energy
Advanced Energy Industries, Inc. (NASDAQ: AEIS) enables design breakthroughs and drives growth for leading semiconductor and industrial customers. Our precision power and control technologies, along with our applications know-how, inspire close partnerships and innovation in thin-film and industrial manufacturing. We are proud of our rich heritage and award-winning technologies, and we value the talents and contributions of all Advanced Energy employees worldwide.

Department: Data and Analytics
Team: Data Solutions Delivery Team

Job Summary
We are seeking a highly skilled Data Engineer with 5-10 years of experience to join our Data and Analytics team. As a member of the Data Solutions Delivery team, you will be responsible for designing, building, and maintaining scalable data solutions. The ideal candidate has extensive knowledge of Databricks, Azure Data Factory, and Google Cloud, along with strong data warehousing skills spanning data ingestion through reporting. Familiarity with the manufacturing and supply chain domains is highly desirable. The candidate should also be well versed in data engineering, data product and data platform concepts, data mesh, and medallion architecture, and in establishing enterprise data catalogs using tools like Ataccama, Collibra, or Microsoft Purview, with proven experience implementing data quality practices using tools like Great Expectations, Deequ, etc.

Key Responsibilities
- Design, build, and maintain scalable data solutions using Databricks, ADF, and Google Cloud.
- Develop and implement data warehousing solutions, including ETL processes, data modeling, and reporting.
- Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions.
- Ensure data integrity, quality, and security across all data platforms.
- Provide expertise in data engineering, data product, and data platform concepts.
- Implement data mesh principles and medallion architecture to build scalable data platforms.
- Establish and maintain enterprise data catalogs using tools like Ataccama, Collibra, or Microsoft Purview.
- Implement data quality practices using tools like Great Expectations, Deequ, etc.
- Work closely with the manufacturing and supply chain teams to understand domain-specific data requirements.
- Develop and maintain documentation for data solutions, data flows, and data models.
- Act as an individual contributor, picking up tasks from technical solution documents and delivering high-quality results.

Qualifications
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Proven experience as a Data Engineer or in a similar role.
- In-depth knowledge of Databricks, Azure Data Factory, and Google Cloud.
- Strong data warehousing skills, including ETL processes, data modeling, and reporting.
- Familiarity with the manufacturing and supply chain domains.
- Proficiency in data engineering, data product and data platform concepts, data mesh, and medallion architecture.
- Experience establishing enterprise data catalogs using tools like Ataccama, Collibra, or Microsoft Purview.
- Proven experience implementing data quality practices using tools like Great Expectations, Deequ, etc.
- Excellent problem-solving and analytical skills.
- Strong communication and collaboration skills.
- Ability to work independently and as part of a team.

Preferred Qualifications
- Master's degree in a related field.
- Experience with cloud-based data platforms and tools.
- Certification in Databricks, Azure, or Google Cloud.

As part of our total rewards philosophy, we believe in offering and maintaining competitive compensation and benefits programs for our employees to attract and retain a talented, highly engaged workforce. Our compensation programs are focused on equitable, fair pay practices, including market-based base pay and an annual pay-for-performance incentive plan, and we offer a strong benefits package in each of the countries in which we operate. Advanced Energy is committed to diversity in its workforce, including Equal Employment Opportunity for Minorities, Females, Protected Veterans, and Individuals with Disabilities. We are committed to protecting and respecting your privacy. We take your privacy seriously and will only use your personal information to administer your application in accordance with RA No. 10173, also known as the Data Privacy Act of 2012.
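As a purely illustrative sketch, and not part of the Advanced Energy posting, the bronze-to-silver step of a medallion-style pipeline on Databricks might look roughly like the following in PySpark. The Delta table paths, column names, and quality rules are assumptions, and a real implementation would typically layer a framework such as Great Expectations or Deequ on top for richer validation.

```python
# Illustrative medallion-architecture sketch: promote raw bronze records to a
# cleaned silver table, quarantining rows that fail basic quality gates.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("bronze_to_silver_orders").getOrCreate()

# Read raw ingested data from a hypothetical bronze Delta table.
bronze = spark.read.format("delta").load("/mnt/lake/bronze/orders")

# Quality gate: mandatory keys must be present; failing rows go to quarantine.
null_keys = bronze.filter(F.col("order_id").isNull() | F.col("order_date").isNull())
valid = (
    bronze.filter(F.col("order_id").isNotNull() & F.col("order_date").isNotNull())
          .dropDuplicates(["order_id"])
)

valid.write.format("delta").mode("overwrite").save("/mnt/lake/silver/orders")
null_keys.write.format("delta").mode("append").save("/mnt/lake/quarantine/orders")
```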

Posted 1 week ago

Apply

0 years

0 Lacs

Mumbai, Maharashtra, India

On-site


Position Overview
Job Title: Market Risk Data Governance, AVP
Location: Mumbai, India
Corporate Title: AVP

Role Description
Market & Valuation Risk Management (MVRM) provides an independent view of market risks to Deutsche Bank's senior management and manages Deutsche Bank's Market Risk position in an independent and neutral way. The Market Risk Analysis and Control (MRAC) function is responsible for the provision of all official market risk metrics and core analysis in support of risk management decision making, on behalf of the Market Risk Management department. The team has a global presence, with staff located in London, New York, Berlin, Singapore, Mumbai and Pune.

This role sits within the Market Risk team in Mumbai, supporting data quality initiatives in the Data Quality and Governance team. The team is responsible for data governance, specifically ensuring BCBS 239 compliance for existing and new processes, data management initiatives, automation of current manual processes, and analysing and implementing governance processes for any changes to production processes or policies, in support of compliance with the BCBS 239 regulation.

What We'll Offer You
As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
- Best in class leave policy
- Gender neutral parental leaves
- 100% reimbursement under childcare assistance benefit (gender neutral)
- Sponsorship for industry relevant certifications and education
- Employee Assistance Program for you and your family members
- Comprehensive hospitalization insurance for you and your dependents
- Accident and term life insurance
- Complimentary health screening for those aged 35 years and above

Your Key Responsibilities
- Ensure BCBS 239 compliant status for all processes at Market Risk.
- Drive accurate and timely completion of recertifications (compliance standards, process modelling, data management artefacts, stress crisis protocols, lineage).
- Perform the annual BCBS 239 self-assessment for Market Risk metrics across legal entities.
- Analyse KPI trends, define remediations for non-green trends, and present the information to management.
- Log data quality issues and track them to remediation.
- Liaise with multiple teams, both internal and external, to identify changes required in the governance processes for any changes or updates in the metrics' production process, ensuring compliance with the RDARR framework; assess and document tangible benefits from the change process.
- Evaluate production and governance processes, driving rationalization and automation; identify gaps in current processes and ensure fixes are implemented; drive automation of manual processes in the governance framework.
- Prepare and track the plan to ensure efficient and effective execution of changes, and present regular updates.

Data Management
- Collaborate with cross-functional teams to promote data stewardship.
- Understand and implement Core Data Standards.
- Assist in the documentation and maintenance of data dictionaries and metadata repositories for Market Risk.
- Ensure data management artefacts are documented and kept up to date.
- Perform data analysis: investigate and present details of lineage, completeness, and transformations via flows and processes; compile reports.
- Implement the governance fora, including scheduling meetings, preparing decks, taking minutes and following up on open actions.
- Provide ad hoc reporting to support management requests.
- Ensure governance documentation (policies, DTP, etc.) is updated regularly.

Your Skills And Experience
- University degree and appropriate professional experience.
- Experience working with Market Risk, whether from a data management, risk data aggregation or risk reporting perspective.
- A strong understanding of the regulatory environment, frameworks and compliance requirements associated with financial services.
- Excellent knowledge of analysis and communication tools.
- Excellent data analytical and problem-solving skills.
- Excellent communication and interpersonal skills for collaboration with stakeholders.
- Ability to work independently, manage multiple projects simultaneously and deliver high quality results under tight deadlines.
- Experience working with BCBS 239, data lineage, and upstream data providers.
- Experience establishing governance frameworks for effective process and performance oversight.
- Experience with industry standard data management tools such as Sparx, Collibra and Solidatus is preferable.
- Experience translating Core Data Standards into practical implementation.

How We'll Support You
- Training and development to help you excel in your career
- Coaching and support from experts in your team
- A culture of continuous learning to aid progression
- A range of flexible benefits that you can tailor to suit your needs

About Us And Our Teams
Please visit our company website for further information: https://www.db.com/company/company.htm
We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
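For illustration only, and not drawn from the Deutsche Bank posting, the kind of KPI RAG (red/amber/green) classification this role monitors for BCBS 239 reporting could be sketched in Python as below. The thresholds, metric names, and scores are hypothetical.

```python
# Toy RAG classification for monthly data-quality KPIs (hypothetical thresholds).
from dataclasses import dataclass

GREEN_THRESHOLD = 0.98   # assumed: >= 98% of records passing is green
AMBER_THRESHOLD = 0.95   # assumed: >= 95% is amber, below is red

@dataclass
class KpiReading:
    metric: str      # e.g. "completeness" or "timeliness"
    month: str       # e.g. "2024-05"
    score: float     # fraction of records passing the quality rule

def rag_status(score: float) -> str:
    if score >= GREEN_THRESHOLD:
        return "green"
    if score >= AMBER_THRESHOLD:
        return "amber"
    return "red"

def non_green(readings: list[KpiReading]) -> list[tuple[str, str, str]]:
    """Return (metric, month, status) for readings that need a remediation entry."""
    return [(r.metric, r.month, rag_status(r.score))
            for r in readings if rag_status(r.score) != "green"]

if __name__ == "__main__":
    sample = [KpiReading("completeness", "2024-04", 0.991),
              KpiReading("completeness", "2024-05", 0.962),
              KpiReading("timeliness", "2024-05", 0.941)]
    print(non_green(sample))  # [('completeness', '2024-05', 'amber'), ('timeliness', '2024-05', 'red')]
```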

Posted 1 week ago

Apply

0 years

0 Lacs

Gurgaon, Haryana, India

On-site


Overview We are seeking an experienced Data Architect with extensive expertise in designing and implementing modern data architectures. This role requires strong software engineering principles, hands-on coding abilities, and experience building data engineering frameworks. The ideal candidate will have a proven track record of implementing Databricks-based solutions in the healthcare industry, with expertise in data catalog implementation and governance frameworks. About The Role As a Data Architect, you will be responsible for designing and implementing scalable, secure, and efficient data architectures on the Databricks platform. You will lead the technical design of data migration initiatives from legacy systems to modern Lakehouse architecture, ensuring alignment with business requirements, industry best practices, and regulatory compliance. Key Responsibilities Design and implement modern data architectures using Databricks Lakehouse platform Lead the technical design of Data Warehouse/Data Lake migration initiatives from legacy systems Develop data engineering frameworks and reusable components to accelerate delivery Establish CI/CD pipelines and infrastructure-as-code practices for data solutions Implement data catalog solutions and governance frameworks Create technical specifications and architecture documentation Provide technical leadership to data engineering teams Collaborate with cross-functional teams to ensure alignment of data solutions Evaluate and recommend technologies, tools, and approaches for data initiatives Ensure data architectures meet security, compliance, and performance requirements Mentor junior team members on data architecture best practices Stay current with emerging technologies and industry trends Qualifications Extensive experience in data architecture design and implementation Strong software engineering background with expertise in Python or Scala Proven experience building data engineering frameworks and reusable components Experience implementing CI/CD pipelines for data solutions Expertise in infrastructure-as-code and automation Experience implementing data catalog solutions and governance frameworks Deep understanding of Databricks platform and Lakehouse architecture Experience migrating workloads from legacy systems to modern data platforms Strong knowledge of healthcare data requirements and regulations Experience with cloud platforms (AWS, Azure, GCP) and their data services Bachelor's degree in Computer Science, Information Systems, or related field; advanced degree preferred Technical Skills Programming languages: Python and/or Scala (required) Data processing frameworks: Apache Spark, Delta Lake CI/CD tools: Jenkins, GitHub Actions, Azure DevOps Infrastructure-as-code (optional): Terraform, CloudFormation, Pulumi Data catalog tools: Databricks Unity Catalog, Collibra, Alation Data governance frameworks and methodologies Data modeling and design patterns API design and development Cloud platforms: AWS, Azure, GCP Container technologies: Docker, Kubernetes Version control systems: Git SQL and NoSQL databases Data quality and testing frameworks Optional - Healthcare Industry Knowledge Healthcare data standards (HL7, FHIR, etc.) Clinical and operational data models Healthcare interoperability requirements Healthcare analytics use cases About Rackspace Technology We are the multicloud solutions experts. We combine our expertise with the world’s leading technologies — across applications, data and security — to deliver end-to-end solutions. 
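As a rough illustration of the Lakehouse pattern referenced above, the sketch below lands a raw file into a Delta table using a Unity Catalog style three-level name; the paths and table names are placeholders, and the exact setup depends on the Databricks workspace.

# Minimal sketch: ingest a raw file into a Delta table. Catalog, schema and
# source path are placeholders; assumes a Databricks/Delta-enabled Spark session.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("lakehouse-ingest").getOrCreate()

raw = spark.read.option("header", True).csv("/mnt/landing/claims/*.csv")  # assumed landing path

(raw.write
    .format("delta")
    .mode("overwrite")
    .option("overwriteSchema", "true")
    .saveAsTable("healthcare_dev.bronze.claims_raw"))  # catalog.schema.table (placeholder)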
We have a proven record of advising customers based on their business challenges, designing solutions that scale, building and managing those solutions, and optimizing returns into the future. Named a best place to work, year after year according to Fortune, Forbes and Glassdoor, we attract and develop world-class talent. Join us on our mission to embrace technology, empower customers and deliver the future. More on Rackspace Technology Though we’re all different, Rackers thrive through our connection to a central goal: to be a valued member of a winning team on an inspiring mission. We bring our whole selves to work every day. And we embrace the notion that unique perspectives fuel innovation and enable us to best serve our customers and communities around the globe. We welcome you to apply today and want you to know that we are committed to offering equal employment opportunity without regard to age, color, disability, gender reassignment or identity or expression, genetic information, marital or civil partner status, pregnancy or maternity status, military or veteran status, nationality, ethnic or national origin, race, religion or belief, sexual orientation, or any legally protected characteristic. If you have a disability or special need that requires accommodation, please let us know. Show more Show less

Posted 1 week ago

Apply

0 years

0 Lacs

India

On-site


Overview We are seeking an experienced Data Architect with extensive expertise in designing and implementing modern data architectures. This role requires strong software engineering principles, hands-on coding abilities, and experience building data engineering frameworks. The ideal candidate will have a proven track record of implementing Databricks-based solutions in the healthcare industry, with expertise in data catalog implementation and governance frameworks. About The Role As a Data Architect, you will be responsible for designing and implementing scalable, secure, and efficient data architectures on the Databricks platform. You will lead the technical design of data migration initiatives from legacy systems to modern Lakehouse architecture, ensuring alignment with business requirements, industry best practices, and regulatory compliance. Key Responsibilities Design and implement modern data architectures using Databricks Lakehouse platform Lead the technical design of Data Warehouse/Data Lake migration initiatives from legacy systems Develop data engineering frameworks and reusable components to accelerate delivery Establish CI/CD pipelines and infrastructure-as-code practices for data solutions Implement data catalog solutions and governance frameworks Create technical specifications and architecture documentation Provide technical leadership to data engineering teams Collaborate with cross-functional teams to ensure alignment of data solutions Evaluate and recommend technologies, tools, and approaches for data initiatives Ensure data architectures meet security, compliance, and performance requirements Mentor junior team members on data architecture best practices Stay current with emerging technologies and industry trends Qualifications Extensive experience in data architecture design and implementation Strong software engineering background with expertise in Python or Scala Proven experience building data engineering frameworks and reusable components Experience implementing CI/CD pipelines for data solutions Expertise in infrastructure-as-code and automation Experience implementing data catalog solutions and governance frameworks Deep understanding of Databricks platform and Lakehouse architecture Experience migrating workloads from legacy systems to modern data platforms Strong knowledge of healthcare data requirements and regulations Experience with cloud platforms (AWS, Azure, GCP) and their data services Bachelor's degree in Computer Science, Information Systems, or related field; advanced degree preferred Technical Skills Programming languages: Python and/or Scala (required) Data processing frameworks: Apache Spark, Delta Lake CI/CD tools: Jenkins, GitHub Actions, Azure DevOps Infrastructure-as-code (optional): Terraform, CloudFormation, Pulumi Data catalog tools: Databricks Unity Catalog, Collibra, Alation Data governance frameworks and methodologies Data modeling and design patterns API design and development Cloud platforms: AWS, Azure, GCP Container technologies: Docker, Kubernetes Version control systems: Git SQL and NoSQL databases Data quality and testing frameworks Optional - Healthcare Industry Knowledge Healthcare data standards (HL7, FHIR, etc.) Clinical and operational data models Healthcare interoperability requirements Healthcare analytics use cases About Rackspace Technology We are the multicloud solutions experts. We combine our expertise with the world’s leading technologies — across applications, data and security — to deliver end-to-end solutions. 
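One common step in the legacy-to-Lakehouse migrations described above is an incremental upsert; the sketch below shows the Delta Lake merge API with invented table names and an assumed join key.

# Sketch of an incremental upsert into a Delta table during a migration.
# Table names and the join key are assumptions.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()
updates = spark.table("healthcare_dev.staging.member_updates")  # assumed staging table

target = DeltaTable.forName(spark, "healthcare_dev.silver.members")
(target.alias("t")
    .merge(updates.alias("s"), "t.member_id = s.member_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute())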
We have a proven record of advising customers based on their business challenges, designing solutions that scale, building and managing those solutions, and optimizing returns into the future. Named a best place to work, year after year according to Fortune, Forbes and Glassdoor, we attract and develop world-class talent. Join us on our mission to embrace technology, empower customers and deliver the future. More on Rackspace Technology Though we’re all different, Rackers thrive through our connection to a central goal: to be a valued member of a winning team on an inspiring mission. We bring our whole selves to work every day. And we embrace the notion that unique perspectives fuel innovation and enable us to best serve our customers and communities around the globe. We welcome you to apply today and want you to know that we are committed to offering equal employment opportunity without regard to age, color, disability, gender reassignment or identity or expression, genetic information, marital or civil partner status, pregnancy or maternity status, military or veteran status, nationality, ethnic or national origin, race, religion or belief, sexual orientation, or any legally protected characteristic. If you have a disability or special need that requires accommodation, please let us know. Show more Show less

Posted 1 week ago

Apply

8.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Company Description
At Western Digital, our vision is to power global innovation and push the boundaries of technology to make what you thought was once impossible, possible. At our core, Western Digital is a company of problem solvers. People achieve extraordinary things given the right technology. For decades, we’ve been doing just that. Our technology helped people put a man on the moon. We are a key partner to some of the largest and highest growth organizations in the world. From energizing the most competitive gaming platforms, to enabling systems to make cities safer and cars smarter and more connected, to powering the data centers behind many of the world’s biggest companies and public cloud, Western Digital is fueling a brighter, smarter future. Binge-watch any shows, use social media or shop online lately? You’ll find Western Digital supporting the storage infrastructure behind many of these platforms. And, that flash memory card that captures and preserves your most precious moments? That’s us, too. We offer an expansive portfolio of technologies, storage devices and platforms for business and consumers alike. Our data-centric solutions are comprised of the Western Digital®, G-Technology™, SanDisk® and WD® brands. Today’s exceptional challenges require your unique skills. It’s You & Western Digital. Together, we’re the next BIG thing in data.
Job Description
ESSENTIAL DUTIES AND RESPONSIBILITIES
Gathering data from diverse sources, including databases, APIs, and web scraping.
Possessing deep knowledge of data analytics principles, tools, and technologies.
Handling missing values, correcting errors, and ensuring data consistency, data quality, and optimized data performance.
Create and maintain data models that structure and organize data within a domain, ensuring clarity and consistency.
Create data products designed to solve specific business problems within a domain.
Performing statistical analysis and modeling to identify trends, patterns, and relationships in the data.
Preparing reports, presentations, and dashboards to communicate insights and findings.
Using data to identify and solve business problems, improve processes, and make data-driven decisions.
Collaborating with cross-functional teams including Data Engineers, Data Scientists, Business Analysts, Solution Architects, and IT & Business teams.
Document and maintain end-to-end data flows, data lineage and the data catalog for various data marts.
Be a liaison between solution architects, BSAs and data engineers to ensure compliance with standards for data integration and data management, and review data solutions.
Stay updated with the latest industry trends and best practices, sharing knowledge and encouraging the team to continuously improve their skills.
Qualifications
REQUIRED
Bachelor’s degree or higher in Computer Science, Engineering or a related field
Minimum 8+ years of experience working with diverse data platform technologies.
Experience with data quality tools such as Informatica Data Quality, Atlan, Collibra, etc.
Demonstrable experience working with data structures coming from a variety of ERP, CRM and other data sources.
Experience working with at least one major cloud data platform such as AWS, Azure or Google Cloud.
Experience working with at least one modern Lakehouse platform such as Databricks or Snowflake.
Experience working with Tableau, Power BI, or other data visualization tools.
Develop entity relationship diagrams using data modeling tools.
Functional domain knowledge in finance and supply chain, with a manufacturing background.
Responsible for metadata management of the data domain, inclusive of data definitions, data catalog, data lineage, documentation of data flows for critical processes, and SOX compliance.
Partner with the data governance team and business analysts.
Maintain an in-depth understanding of business functions, processes, and relationships as they relate to data.
Skills
Proven experience in data analysis, business intelligence, or related roles.
Proficiency in data analysis tools and programming languages (e.g., SQL, Python, R).
Proficiency in statistical analysis techniques and tools.
Knowledge of machine learning and data mining techniques is desirable.
Ability to analyze, communicate and solve complex problems.
Additional Information
Western Digital is committed to providing equal opportunities to all applicants and employees and will not discriminate based on their race, color, ancestry, religion (including religious dress and grooming standards), sex (including pregnancy, childbirth or related medical conditions, breastfeeding or related medical conditions), gender (including a person’s gender identity, gender expression, and gender-related appearance and behavior, whether or not stereotypically associated with the person’s assigned sex at birth), age, national origin, sexual orientation, medical condition, marital status (including domestic partnership status), physical disability, mental disability, genetic information, protected medical and family care leave, Civil Air Patrol status, military and veteran status, or other legally protected characteristics. We also prohibit harassment of any individual on any of the characteristics listed above. Our non-discrimination policy applies to all aspects of employment. We comply with the laws and regulations set forth in the Equal Employment Opportunity is the Law poster. Western Digital thrives on the power and potential of diversity. As a global company, we believe the most effective way to embrace the diversity of our customers and communities is to mirror it from within. We believe the fusion of various perspectives results in the best outcomes for our employees, our company, our customers, and the world around us. We are committed to an inclusive environment where every individual can thrive through a sense of belonging, respect and contribution. Western Digital is committed to offering opportunities to applicants with disabilities and ensuring all candidates can successfully navigate our careers website and our hiring process. Please contact us at jobs.accommodations@wdc.com to advise us of your accommodation request. In your email, please include a description of the specific accommodation you are requesting as well as the job title and requisition number of the position for which you are applying. Based on our experience, we anticipate that the application deadline will be 10/25/2024, although we reserve the right to close the application process sooner if we hire an applicant for this position before the application deadline. If we are not able to hire someone for this role before the application deadline, we will update this posting with a new anticipated application deadline.
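As an illustration of the profiling and cleansing duties listed above, the following sketch computes a few basic quality indicators with pandas; the ERP-style columns are invented for the example.

# Illustrative profiling pass: missing values, duplicates, simple consistency checks.
# The columns and sample rows are assumptions, not from the posting.
import pandas as pd

orders = pd.DataFrame({
    "order_id": [1001, 1002, 1002, 1003],
    "amount_usd": [250.0, None, 180.0, -40.0],
    "plant_code": ["IN01", "IN01", "IN02", None],
})

report = {
    "null_rate_pct": (orders.isna().mean() * 100).round(1).to_dict(),
    "duplicate_order_ids": int(orders["order_id"].duplicated().sum()),
    "negative_amounts": int((orders["amount_usd"] < 0).sum()),
}
print(report)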

Posted 1 week ago

Apply

5.0 years

0 Lacs

Indore, Madhya Pradesh, India

On-site


Project Role : Data Platform Engineer
Project Role Description : Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must have skills : Collibra Data Governance
Good to have skills : NA
Minimum 5 Year(s) Of Experience Is Required
Educational Qualification : Bachelor in Computer Science
Summary: As a Data Platform Engineer, you will be responsible for assisting with the blueprint and design of the data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models using Collibra Data Governance.
Key Responsibilities:
Knowledge of the Collibra operating model, workflow BPMN development, and how to integrate various applications or systems with Collibra.
Good communication skills.
Design of the Data Governance organization, including the steering committee, data governance office, stewardship layer and other working groups.
Set up people and processes, including relevant roles, responsibilities and controls, data ownership, workflows and common processes.
Technical Experience
Experience in data governance across a wide variety of data types (structured, semi-structured and unstructured data) and a wide variety of data sources (HDFS, S3, Kafka, Cassandra, Hive, HBase, Elasticsearch).
Working experience of the Collibra operating model, workflow BPMN development, and how to integrate various applications or systems with Collibra.
Experience in setting up people's roles, responsibilities and controls, data ownership, workflows and common processes.
Integrate Collibra with other enterprise tools: data quality tools, data catalog tools, master data management solutions.
Develop and configure all Collibra customized workflows.
Develop APIs (REST, SOAP) to expose metadata functionalities to end users.
Professional Experience
Working as an SME in data governance, metadata management and data catalog solutions, specifically on Collibra Data Governance.
Client interface and consulting skills required.
Partner with Data Stewards for requirements, integrations and processes; participate in meetings and working sessions.
Partner with Data Management and integration leads to continuously improve data management technologies and processes.
Bachelor in Computer Science
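The REST integration work described above might look roughly like the sketch below, which pulls assets from a Collibra instance with the requests library; the base URL, credentials and endpoint path are assumptions, so check the API reference for your Collibra version.

# Hedged sketch: listing assets via a Collibra-style REST API.
# Base URL, credentials and the exact endpoint path are placeholders/assumptions.
import requests

BASE_URL = "https://your-instance.collibra.com"    # placeholder
session = requests.Session()
session.auth = ("svc_governance", "app-password")  # placeholder credentials

resp = session.get(
    f"{BASE_URL}/rest/2.0/assets",                 # assumed endpoint; verify in your API docs
    params={"name": "Customer", "limit": 50},
)
resp.raise_for_status()
for asset in resp.json().get("results", []):       # response shape assumed
    print(asset.get("id"), asset.get("name"))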

Posted 1 week ago

Apply

10.0 years

0 Lacs

Pune, Maharashtra, India

On-site


The IT Business Lead Analyst is a senior-level position responsible for liaising between business users and technologists to exchange information in a concise, logical and understandable way in coordination with the Technology team. The overall objective of this role is to contribute to continuous iterative exploration and investigation of business performance and other measures to gain insight and drive business planning. Responsibilities: Provide input during development and implementation phases, including formulation and definition of systems scope, objectives and necessary system enhancements for complex, high-impact projects Identify and communicate risks and impacts and propose risk mitigation options, considering business implications of the application of technology to the current business environment Consult with business clients to determine system functional specifications and partner with multiple management teams and other units to meet organizational objectives Evaluate new IT developments and evolving business requirements and recommend appropriate systems alternatives and/or enhancements to current systems by analyzing business processes, systems and industry standards Provide in-depth and sophisticated analyses with interpretive thinking to define problems, develop innovative solutions and influence strategic functional decisions Supervise day-to-day staff management issues, including resource management, work allocation, mentoring/coaching and other duties and functions as assigned Appropriately assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients and assets, by driving compliance with applicable laws, rules and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct and business practices, and escalating, managing and reporting control issues with transparency. Qualifications: 10-15 years of experience Proficiency in MS Office (Word, Excel, Visio, PowerPoint) with extensive experience using Excel for data analysis Experience with all phases of Software Development Life Cycle Comprehensive knowledge of the principles of business analysis Education: Bachelor's degree/University degree or equivalent experience Master's degree preferred This job description provides a high-level review of the types of work performed. Other job-related duties may be assigned as required. Position Overview: We are seeking an experienced and dynamic IT Business Lead Analyst (Vice President) to lead and manage initiatives related to Data Governance and Control Codification projects. The ideal candidate should have a strong understanding of data quality , control frameworks , and codification processes , along with extensive knowledge of the banking and finance domain . This role requires a blend of technical expertise, business acumen, and leadership skills to ensure the successful delivery of data governance initiatives. Key Responsibilities: Lead and manage Data Governance and Control Codification projects, ensuring alignment with organizational goals and regulatory requirements. Define and implement data quality frameworks, standards, and processes to ensure the accuracy, consistency, and reliability of data. Collaborate with cross-functional teams to identify, document, and codify controls for critical data elements. Work closely with stakeholders to understand business requirements and translate them into actionable technical solutions. 
Ensure compliance with data governance policies, regulatory standards, and industry best practices. Drive the adoption of data governance tools and technologies to enhance data quality and control processes. Provide subject matter expertise in banking and finance, ensuring that data governance initiatives align with industry-specific requirements. Monitor and report on the effectiveness of data governance and control frameworks, identifying areas for improvement. Mentor and guide team members, fostering a culture of accountability and continuous improvement. Required Skills and Qualifications: 10+ years of experience in IT Business Analysis, with a focus on Data Governance, Data Quality, and Control Codification. Strong understanding of data quality frameworks, data lineage, and metadata management. Experience in the banking and finance domain, with knowledge of regulatory requirements and industry standards. Proficiency in data governance tools (e.g., Collibra, Informatica, or similar) and data quality tools. Strong analytical and problem-solving skills, with the ability to work with large datasets and complex systems. Excellent communication and stakeholder management skills, with the ability to bridge the gap between technical and business teams. Bachelor's or Master's degree in Computer Science, Information Systems, Finance, or a related field. Preferred Qualifications: Experience with control frameworks such as COSO, COBIT, or similar. Knowledge of data privacy regulations (e.g., GDPR, CCPA) and their impact on data governance. Familiarity with data visualization tools (e.g., Tableau, Power BI) for reporting and analysis. Certifications in data governance or related fields (e.g., CDMP, DGSP). ------------------------------------------------------ Job Family Group: Technology ------------------------------------------------------ Job Family: Business Analysis / Client Services ------------------------------------------------------ Time Type: Full time ------------------------------------------------------ Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster. Show more Show less
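One possible way to codify a data control so it can be executed and evidenced, as described above, is sketched below with invented control IDs and checks; it is an illustration, not Citi's framework.

# Sketch: represent data controls as executable checks over a dataset.
# Control IDs, descriptions and columns are invented for the example.
from dataclasses import dataclass
from typing import Callable
import pandas as pd

@dataclass
class DataControl:
    control_id: str
    description: str
    check: Callable[[pd.DataFrame], bool]

positions = pd.DataFrame({"cusip": ["037833100", None], "notional": [5e6, 2e6]})

controls = [
    DataControl("DQ-001", "CUSIP must be populated for every position",
                lambda df: bool(df["cusip"].notna().all())),
    DataControl("DQ-002", "Notional must be non-negative",
                lambda df: bool((df["notional"] >= 0).all())),
]

for c in controls:
    print(c.control_id, "PASS" if c.check(positions) else "FAIL", "-", c.description)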

Posted 1 week ago

Apply

2.0 - 4.0 years

0 Lacs

Gurugram, Haryana, India

On-site


Company Overview KKR is a leading global investment firm that offers alternative asset management as well as capital markets and insurance solutions. KKR aims to generate attractive investment returns by following a patient and disciplined investment approach, employing world-class people, and supporting growth in its portfolio companies and communities. KKR sponsors investment funds that invest in private equity, credit and real assets and has strategic partners that manage hedge funds. KKR’s insurance subsidiaries offer retirement, life and reinsurance products under the management of Global Atlantic Financial Group. References to KKR’s investments may include the activities of its sponsored funds and insurance subsidiaries. KKR's Gurugram office provides best in class services and solutions to our internal stakeholders and clients, drive organization wide process efficiency and transformation, and reflect KKR's global culture and values of teamwork and innovation. The office will contain multifunctional business capabilities and will be integral in furthering the growth and transformation of KKR. Team Overview KKR has built out an enterprise Data Operations group to collect, manage, and harness the power of data across our diverse business activities. The Data Operation Centre of Excellence (CoE) is a cross functional team dedicated to formulating & driving KKR’s enterprise data management strategy while also providing the operation leverage required to bring these strategies/frameworks to life. The Data Ops CoE consists of 2 focus areas (pillars); Data Management and Data Delivery. Position Overview The role is responsible for executing data management processes aimed at ensuring clean and quality data in the KKR data ecosystem. They will be part of KKR’s enterprise data group which collects, manages, and harnesses the power of data across our diverse portfolio investments. They will work collaboratively across the firm to set standards & best practices for data management while providing the operating leverage to centrally support the roll-out/ execution of these frameworks Roles & Responsibilities Operational Excellence Develop specifications as well as testing and enhancing tools/applications in conjunction with the IT team to maintain complete, accurate and up to date data Maintain consistent, accurate and complete data within KKR’s data ecosystem Implement data quality controls leveraging industry best tools i.e. 
Collibra.
Create and maintain data quality reporting functionality as per business needs.
Ensure data governance practices and activities are embedded across business units.
Execute and manage ad hoc data related projects within specified deadlines.
Collibra workflow development and maintenance.
Stakeholder Management
Collaborate with engineering and IT to support and make recommendations for enhanced digital reporting capabilities and automated data reconciliation.
Communicate and work closely with relevant teams to close data gaps in a clear and timely manner.
Serve as point of contact for data-related questions and updates from various internal and external groups, delivering ad-hoc analytics to answer key business questions in a timely manner.
Reporting & Governance
Design and document standard operating procedures for data management.
Implement and own best in class data governance practices, ensuring that data is well defined and transparently documented.
Qualifications
Bachelor’s Degree or equivalent work experience required
2-4 years of data operations experience in financial services
Experience in a multinational financial services organization and/or private equity preferred
Ability to manage standard reports, templates and dashboards
Ability to validate and review data
Ability to provide support for internal stakeholders by sending email reminders, filling timesheets and collecting information as per service requests
Ability to adhere to the compliance requirements of processes
Ability to develop and enhance data protection and management tools or applications
Ability to design and execute data management processes focusing on data governance and data quality activities; experience using tools like Collibra is a must
Systems/Tools/Application knowledge:
Experience with process design and process enhancement
Proficiency in data operations and data management
Advanced proficiency in Excel
Skills in a BI tool such as Power BI
Advanced SQL skills
Experience with Python is a plus
Displays high attention to detail
Demonstrates outstanding initiative and strong work ethic
Focuses on delivery excellence and accountability
Displays team-work orientation and is highly collaborative
Displays strong integrity and professionalism
Builds strong relationships with local and global colleagues
Demonstrates strong track record in accuracy and organization
Demonstrates excellent written, verbal, and interpersonal communication skills
KKR is an equal opportunity employer. Individuals seeking employment are considered without regard to race, color, religion, national origin, age, sex, marital status, ancestry, physical or mental disability, veteran status, sexual orientation, or any other category protected by applicable law.
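A minimal sketch of the automated data reconciliation mentioned above, with invented column names and an assumed tolerance:

# Sketch: reconcile a source extract against its warehouse copy and report breaks.
# Columns, sample values and the tolerance are assumptions.
import pandas as pd

source = pd.DataFrame({"fund_id": ["F1", "F2"], "nav": [100.0, 250.5]})
target = pd.DataFrame({"fund_id": ["F1", "F2"], "nav": [100.0, 250.0]})

recon = source.merge(target, on="fund_id", suffixes=("_src", "_tgt"), how="outer")
recon["diff"] = (recon["nav_src"] - recon["nav_tgt"]).abs()
breaks = recon[recon["diff"] > 0.01]  # tolerance is an assumption

print(f"{len(breaks)} reconciliation break(s)")
print(breaks)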

Posted 1 week ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site


Job Title: Data Quality Lead
Department: Data Governance / IT
Location: Pune / Bangalore
Experience: 6-8 yrs
Notice period: 30 days
Key Responsibilities:
Lead the development and implementation of the enterprise-wide data quality framework.
Define and monitor key data quality metrics across various business domains.
Collaborate with IT and data governance teams to establish and enforce data governance policies and frameworks.
Conduct regular data quality assessments to identify gaps and areas for improvement.
Implement data cleansing, validation, and enrichment processes to enhance data accuracy and reliability.
Preferred Skills:
Experience with tools like Informatica, Talend, Collibra, or similar.
Familiarity with regulatory requirements.
Certification in Data Management or Data Governance.
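For illustration, the kind of data quality metrics referred to above (completeness, uniqueness, validity) can be computed along these lines; the columns and rules are assumptions.

# Sketch: basic completeness, uniqueness and validity metrics with pandas.
# Column names, sample data and the email rule are invented for the example.
import pandas as pd

customers = pd.DataFrame({
    "customer_id": [1, 2, 2, 4],
    "email": ["a@example.com", "b@x", None, "d@example.com"],
})

metrics = {
    "completeness_email_pct": round(100 * customers["email"].notna().mean(), 1),
    "uniqueness_customer_id_pct": round(100 * (1 - customers["customer_id"].duplicated().mean()), 1),
    "validity_email_pct": round(100 * customers["email"].str.contains(r"@.+\..+", na=False).mean(), 1),
}
print(metrics)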

Posted 1 week ago

Apply

130.0 years

0 Lacs

Pune, Maharashtra, India

On-site


About Northern Trust Northern Trust, a Fortune 500 company, is a globally recognized, award-winning financial institution that has been in continuous operation since 1889. Northern Trust is proud to provide innovative financial services and guidance to the world’s most successful individuals, families, and institutions by remaining true to our enduring principles of service, expertise, and integrity. With more than 130 years of financial experience and over 22,000 partners, we serve the world’s most sophisticated clients using leading technology and exceptional service. Key Responsibilities ER/Studio Support & Administration Provide day-to-day support for the ER/Studio platform, including troubleshooting issues and resolving system errors. Maintain and upgrade the ER/Studio environment, ensuring compatibility with database systems and tools. Monitor system performance and address any bottlenecks or downtime. Configuration & Customization Configure ER/Studio repositories and user roles to meet organizational requirements. Customize templates, metadata, and data dictionaries within ER/Studio. Create and maintain naming standards and business glossaries. Development & Integration Design and implement logical and physical data models using ER/Studio. Integrate ER/Studio with other data tools, such as Collibra, Snowflake, or SQL-based platforms. Collaboration & Documentation Collaborate with data architects, business analysts, and development teams to gather requirements. Document all system changes, configurations, and data models. Train and support end-users in utilizing ER/Studio features effectively. Technical Skills And Requirements Experience with ER/Studio Hands-on experience with ER/Studio Enterprise Team Edition or equivalent. Proficiency in data modeling (conceptual, logical, and physical) using ER/Studio. Knowledge of repository management and collaboration features in ER/Studio. Database Knowledge Strong understanding of RDBMS (e.g., SQL Server, Oracle, PostgreSQL, or Snowflake) Familiarity with SQL scripting for queries and optimization. Metadata Management Experience managing metadata, data dictionaries, and glossaries within ER/Studio. Ability to map metadata between ER/Studio and external systems or tools. Integration and Scripting Knowledge of integrating ER/Studio with BI tools, data governance platforms (e.g., Collibra), or ETL tools. Familiarity with APIs and scripting languages (e.g., Python) for automation. Version Control & Repository Management Experience managing version control and branching for data models in ER/Studio. Understanding of team collaboration and repository synchronization. Soft Skills Strong problem-solving skills and attention to detail. Ability to communicate complex technical concepts to non-technical stakeholders. Collaborative mindset and ability to work in a team-oriented environment. Working With Us As a Northern Trust partner, greater achievements await. You will be part of a flexible and collaborative work culture in an organization where financial strength and stability is an asset that emboldens us to explore new ideas. Movement within the organization is encouraged, senior leaders are accessible, and you can take pride in working for a company committed to assisting the communities we serve! Join a workplace with a greater purpose. We’d love to learn more about how your interests and experience could be a fit with one of the world’s most admired and sustainable companies! Build your career with us and apply today. 
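As a hedged illustration of the scripting and metadata-mapping work mentioned above, the sketch below flattens a model export into a data dictionary CSV; the input layout is assumed and is not ER/Studio's native format.

# Sketch: turn an exported model file into a flat data dictionary.
# The export layout and file names are assumptions; a tiny sample file is
# generated here so the example is self-contained.
import csv

with open("erstudio_model_export.csv", "w", newline="") as f:
    f.write("EntityName,AttributeName,DataType,Definition\n")
    f.write("Account,AccountId,INTEGER,Surrogate key for an account\n")
    f.write("Account,OpenDate,DATE,Date the account was opened\n")

data_dictionary = []
with open("erstudio_model_export.csv", newline="") as f:
    for row in csv.DictReader(f):
        data_dictionary.append({
            "entity": row["EntityName"],
            "attribute": row["AttributeName"],
            "datatype": row["DataType"],
            "definition": row.get("Definition", "").strip(),
        })

with open("data_dictionary.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["entity", "attribute", "datatype", "definition"])
    writer.writeheader()
    writer.writerows(data_dictionary)
print(f"Wrote {len(data_dictionary)} attribute definitions to data_dictionary.csv")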
#MadeForGreater
Reasonable accommodation
Northern Trust is committed to working with and providing reasonable accommodations to individuals with disabilities. If you need a reasonable accommodation for any part of the employment process, please email our HR Service Center at MyHRHelp@ntrs.com. We hope you’re excited about the role and the opportunity to work with us. We value an inclusive workplace and understand flexibility means different things to different people. Apply today and talk to us about your flexible working requirements and together we can achieve greater.

Posted 1 week ago

Apply

7.0 - 12.0 years

0 Lacs

Delhi, India

On-site


Job Req Number: 63252
Time Type: Full Time
Are you ready to develop your career in a rapidly growing and successful global company, in a fast-paced role? If you are someone looking for a position that challenges and inspires you in a successful international company, DSV is your place. The DSV group, headquartered in Denmark, is one of the biggest transport and logistics companies in the world, with more than 75,000 dedicated employees operating in more than 90 countries. You will join a global and determined group driven forward by the desire to grow, and will be part of a dynamic team characterized by a high level of professionalism and constant improvement.
An international team of specialists
Join DSV Global IT Commercial Applications & Integrations, based out of our Delhi office in India. Here, you will join our Solutions IT, Global IT team.
Roles & Responsibilities:
Work in collaboration with geographically distributed teams.
Communicate effectively, providing software design feedback.
Write code to implement new technologies, solutions and functionalities, fix defects, etc., following an Agile SDLC.
Write clean, elegant, reusable code following advanced principles such as Behaviour and Test Driven Development (BDD, TDD).
Assist the operations team in diagnosing functional and technical incidents on a daily basis.
Work independently on assignments and prompt initiatives that improve user experiences.
Be able to think out of the box, innovate and contribute towards project success.
Technical Requirements:
Experience developing Collibra workflows and UI, and working with the Collibra SDK.
Familiarity building and customizing Off-The-Shelf (OTS) Data Catalogue/Metadata Repository solutions.
Experience in Linux-based software development environments, importing vendor-based SDK libraries and other dependencies.
Java/JavaScript/Node.js experience.
Practical experience developing Docker containers for deployment to Kubernetes cloud environments.
Well versed in data modeling techniques and schemas (XSD, DFDL and others).
Experience with database connectivity, drivers, security, connection pooling, etc.
Understand and write SQL queries and stored procedures, with debugging and execution planning.
Experience in object-oriented programming following SOLID design principles.
Knowledge of CDC and other data gathering techniques.
Must have experience building and inline consumption of services using REST.
Familiarity with Agile methodologies and CI/CD practices.
Experience with tools such as: Visual Studio, Visual Studio Code, SQL Server Management Studio, JIRA, Confluence, Microsoft Teams, Azure DevOps, GIT, BitBucket.
Behavioural Skills:
You are passionate about development, self-driven and highly motivated.
You care about the customer experience and have experience building enterprise/customer facing applications.
You are extremely collaborative and enjoy working with team members across the globe.
You are proactive, a fast learner and enjoy problem solving.
You have excellent written and verbal communication skills.
Required Experience: Ideally 7 to 12 years of total experience as a software developer
Location: New Delhi
We offer you: The opportunity to expand your experience in a truly international, world-class company whose philosophy is that your everyday work should be both varied and full of professional challenges, with wide opportunities for constant professional and personal development.
Additionally, we offer the following: Medical Insurance including family and Parents up to 10 Lakh INR per year. 2.5 days earned leave per month which is 30 days in a calendar year & 10 days Sick Leave in a year. Free parking for 4-wheeler or 2-wheeler vehicle. Personal Mobile and Internet expenses reimbursed per calendar month subject to maximum cap. Want to know more and apply? We will be happy to answer any questions you may have regarding the position and about your options in DSV. You are welcome to send an email to our recruitment team HR Menka Pundir at Menka.Pundir@in.dsv.com . DSV – Global Transport and Logistics DSV is a dynamic workplace that fosters inclusivity and diversity. We conduct our business with integrity, respecting different cultures and the dignity and rights of individuals. When you join DSV, you are working for one of the very best performing companies in the transport and logistics industry. You’ll join a talented team of more than 75,000 employees in over 80 countries, working passionately to deliver great customer experiences and high-quality services. DSV aspires to lead the way towards a more sustainable future for our industry and are committed to trading on nature’s terms. We promote collaboration and transparency and strive to attract, motivate and retain talented people in a culture of respect. If you are driven, talented and wish to be part of a progressive and versatile organisation, we’ll support you and your need to achieve your potential and forward your career. Visit dsv.com and follow us on LinkedIn, Facebook and Twitter. Show more Show less

Posted 1 week ago

Apply

3.0 years

0 Lacs

Hyderābād

Remote

ABOUT TIDE
At Tide, we are building a business management platform designed to save small businesses time and money. We provide our members with business accounts and related banking services, but also a comprehensive set of connected administrative solutions from invoicing to accounting. Launched in 2017, Tide is now used by over 1 million small businesses across the world and is available to UK, Indian and German SMEs. Headquartered in central London, with offices in Sofia, Hyderabad, Delhi, Berlin and Belgrade, Tide employs over 2,000 employees. Tide is rapidly growing, expanding into new products and markets and always looking for passionate and driven people. Join us in our mission to empower small businesses and help them save time and money.
ABOUT THE ROLE:
As part of the team, you will be responsible for building and running the data pipelines and services that are required to support business functions/reports/dashboards. We are heavily dependent on BigQuery/Snowflake, Airflow, Stitch/Fivetran, DBT and Tableau/Looker for our business intelligence, and embrace AWS with some GCP.
As a Data Engineer you'll be:
Developing end to end ETL/ELT pipelines, working with the Data Analysts of the business function.
Designing, developing, and implementing scalable, automated processes for data extraction, processing, and analysis in a Data Mesh architecture.
Mentoring other junior engineers in the team.
Being a "go-to" expert for data technologies and solutions.
Providing on the ground troubleshooting and diagnosis for architecture and design challenges.
Troubleshooting and resolving technical issues as they arise.
Looking for ways of improving both what and how data pipelines are delivered by the department.
Translating business requirements into technical requirements, such as entities that need to be modelled, DBT models that need to be built, timings, tests and reports.
Owning the delivery of data models and reports end to end.
Performing exploratory data analysis in order to identify data quality issues early in the process and implementing tests to prevent them in the future.
Working with Data Analysts to ensure that all data feeds are optimised and available at the required times.
This can include Change Capture, Change Data Control and other "delta loading" approaches Discovering, transforming, testing, deploying and documenting data sources Applying, help defining, and championing data warehouse governance: data quality, testing, coding best practices, and peer review Building Looker Dashboard for use cases if required WHAT WE ARE LOOKING FOR: You have 3+ years of extensive development experience using snowflake or similar data warehouse technology You have working experience with DBT and other technologies of the modern data stack, such as Snowflake, Apache Airflow, Fivetran, AWS, Git ,Looker You have experience in agile processes, such as SCRUM You have extensive experience in writing advanced SQL statements and performance tuning them You have experience in Data Ingestion techniques using custom or SAAS tool like Fivetran You have experience in data modelling and can optimize existing/new data models You have experience in data mining, data warehouse solutions, and ETL, and using databases in a business environment with large-scale, complex datasets You have experience architecting analytical databases (in Data Mesh architecture) is added advantage You have experience working in agile cross-functional delivery team You have high development standards, especially for code quality, code reviews, unit testing, continuous integration and deployment You have strong technical documentation skills and the ability to be clear and precise with business users You have business-level of English and good communication skills You have basic understanding of various systems across the AWS platform ( Good to have ) Preferably, you have worked in a digitally native company, ideally fintech Experience with python, governance tool (e.g. Atlan, Alation, Collibra) or data quality tool (e.g. Great Expectations, Monte Carlo, Soda) will be added advantage Our Tech Stack: DBT Snowflake Airflow Fivetran SQL Looker WHAT YOU'LL GET IN RETURN: Make work, work for you! We are embracing new ways of working and support flexible working arrangements. With our Working Out of Office (WOO) policy our colleagues can work remotely from home or anywhere in their assigned Indian state. Additionally, you can work from a different country or Indian state for 90 days of the year. Plus, you'll get: Competitive salary Self & Family Health Insurance Term & Life Insurance OPD Benefits Mental wellbeing through Plumm Learning & Development Budget WFH Setup allowance 15 days of Privilege leaves 12 days of Casual leaves 12 days of Sick leaves 3 paid days off for volunteering or L&D activities Stock Options TIDEAN WAYS OF WORKING: At Tide, we champion a flexible workplace model that supports both in-person and remote work to cater to the specific needs of our different teams. While remote work is supported, we believe in the power of face-to-face interactions to foster team spirit and collaboration. Our offices are designed as hubs for innovation and team-building, where we encourage regular in-person gatherings to foster a strong sense of community. #LI-NN1 TIDE IS A PLACE FOR EVERYONE At Tide, we believe that we can only succeed if we let our differences enrich our culture. Our Tideans come from a variety of backgrounds and experience levels. We consider everyone irrespective of their ethnicity, religion, sexual orientation, gender identity, family or parental status, national origin, veteran, neurodiversity or differently-abled status. We celebrate diversity in our workforce as a cornerstone of our success. 
Our commitment to a broad spectrum of ideas and backgrounds is what enables us to build products that resonate with our members' diverse needs and lives. We are One Team and foster a transparent and inclusive environment, where everyone's voice is heard. At Tide, we thrive on diversity, embracing various backgrounds and experiences. We welcome all individuals regardless of ethnicity, religion, sexual orientation, gender identity, or disability. Our inclusive culture is key to our success, helping us build products that meet our members' diverse needs. We are One Team, committed to transparency and ensuring everyone's voice is heard. Your personal data will be processed by Tide for recruitment purposes and in accordance with Tide's Recruitment Privacy Notice.
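A minimal Airflow 2.x DAG sketch matching the stack described in this posting (a dbt run and test step scheduled daily); the project path, tags and schedule are placeholders.

# Sketch of a daily Airflow DAG that runs dbt models and then dbt tests.
# Paths, selectors and the schedule are assumptions. Requires Airflow 2.4+ for `schedule`.
from datetime import datetime
from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_finance_models",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    run_models = BashOperator(
        task_id="dbt_run",
        bash_command="cd /opt/dbt/finance && dbt run --select tag:daily",
    )
    test_models = BashOperator(
        task_id="dbt_test",
        bash_command="cd /opt/dbt/finance && dbt test --select tag:daily",
    )
    run_models >> test_models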

Posted 1 week ago

Apply

3.0 years

0 Lacs

Delhi

Remote

ABOUT TIDE
At Tide, we are building a business management platform designed to save small businesses time and money. We provide our members with business accounts and related banking services, but also a comprehensive set of connected administrative solutions from invoicing to accounting. Launched in 2017, Tide is now used by over 1 million small businesses across the world and is available to UK, Indian and German SMEs. Headquartered in central London, with offices in Sofia, Hyderabad, Delhi, Berlin and Belgrade, Tide employs over 2,000 employees. Tide is rapidly growing, expanding into new products and markets and always looking for passionate and driven people. Join us in our mission to empower small businesses and help them save time and money.
ABOUT THE ROLE:
As part of the team, you will be responsible for building and running the data pipelines and services that are required to support business functions/reports/dashboards. We are heavily dependent on BigQuery/Snowflake, Airflow, Stitch/Fivetran, DBT and Tableau/Looker for our business intelligence, and embrace AWS with some GCP.
As a Data Engineer you'll be:
Developing end to end ETL/ELT pipelines, working with the Data Analysts of the business function.
Designing, developing, and implementing scalable, automated processes for data extraction, processing, and analysis in a Data Mesh architecture.
Mentoring other junior engineers in the team.
Being a "go-to" expert for data technologies and solutions.
Providing on the ground troubleshooting and diagnosis for architecture and design challenges.
Troubleshooting and resolving technical issues as they arise.
Looking for ways of improving both what and how data pipelines are delivered by the department.
Translating business requirements into technical requirements, such as entities that need to be modelled, DBT models that need to be built, timings, tests and reports.
Owning the delivery of data models and reports end to end.
Performing exploratory data analysis in order to identify data quality issues early in the process and implementing tests to prevent them in the future.
Working with Data Analysts to ensure that all data feeds are optimised and available at the required times.
This can include Change Capture, Change Data Control and other "delta loading" approaches Discovering, transforming, testing, deploying and documenting data sources Applying, help defining, and championing data warehouse governance: data quality, testing, coding best practices, and peer review Building Looker Dashboard for use cases if required WHAT WE ARE LOOKING FOR: You have 3+ years of extensive development experience using snowflake or similar data warehouse technology You have working experience with DBT and other technologies of the modern data stack, such as Snowflake, Apache Airflow, Fivetran, AWS, Git ,Looker You have experience in agile processes, such as SCRUM You have extensive experience in writing advanced SQL statements and performance tuning them You have experience in Data Ingestion techniques using custom or SAAS tool like Fivetran You have experience in data modelling and can optimize existing/new data models You have experience in data mining, data warehouse solutions, and ETL, and using databases in a business environment with large-scale, complex datasets You have experience architecting analytical databases (in Data Mesh architecture) is added advantage You have experience working in agile cross-functional delivery team You have high development standards, especially for code quality, code reviews, unit testing, continuous integration and deployment You have strong technical documentation skills and the ability to be clear and precise with business users You have business-level of English and good communication skills You have basic understanding of various systems across the AWS platform ( Good to have ) Preferably, you have worked in a digitally native company, ideally fintech Experience with python, governance tool (e.g. Atlan, Alation, Collibra) or data quality tool (e.g. Great Expectations, Monte Carlo, Soda) will be added advantage Our Tech Stack: DBT Snowflake Airflow Fivetran SQL Looker WHAT YOU'LL GET IN RETURN: Make work, work for you! We are embracing new ways of working and support flexible working arrangements. With our Working Out of Office (WOO) policy our colleagues can work remotely from home or anywhere in their assigned Indian state. Additionally, you can work from a different country or Indian state for 90 days of the year. Plus, you'll get: Competitive salary Self & Family Health Insurance Term & Life Insurance OPD Benefits Mental wellbeing through Plumm Learning & Development Budget WFH Setup allowance 15 days of Privilege leaves 12 days of Casual leaves 12 days of Sick leaves 3 paid days off for volunteering or L&D activities Stock Options TIDEAN WAYS OF WORKING: At Tide, we champion a flexible workplace model that supports both in-person and remote work to cater to the specific needs of our different teams. While remote work is supported, we believe in the power of face-to-face interactions to foster team spirit and collaboration. Our offices are designed as hubs for innovation and team-building, where we encourage regular in-person gatherings to foster a strong sense of community. #LI-NN1 TIDE IS A PLACE FOR EVERYONE At Tide, we believe that we can only succeed if we let our differences enrich our culture. Our Tideans come from a variety of backgrounds and experience levels. We consider everyone irrespective of their ethnicity, religion, sexual orientation, gender identity, family or parental status, national origin, veteran, neurodiversity or differently-abled status. We celebrate diversity in our workforce as a cornerstone of our success. 
Our commitment to a broad spectrum of ideas and backgrounds is what enables us to build products that resonate with our members' diverse needs and lives. We are One Team and foster a transparent and inclusive environment, where everyone's voice is heard. At Tide, we thrive on diversity, embracing various backgrounds and experiences. We welcome all individuals regardless of ethnicity, religion, sexual orientation, gender identity, or disability. Our inclusive culture is key to our success, helping us build products that meet our members' diverse needs. We are One Team, committed to transparency and ensuring everyone's voice is heard. Your personal data will be processed by Tide for recruitment purposes and in accordance with Tide's Recruitment Privacy Notice.
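As an illustration of the advanced SQL work described in this posting, the sketch below runs and times a window-function query on Snowflake using the official Python connector; credentials and object names are placeholders.

# Sketch: run and time an analytical query on Snowflake.
# Account, credentials, warehouse and table names are placeholders.
import time
import snowflake.connector

conn = snowflake.connector.connect(
    account="your_account",       # placeholder
    user="analytics_user",        # placeholder
    password="********",          # placeholder
    warehouse="ANALYTICS_WH",
    database="ANALYTICS",
    schema="MARTS",
)

query = """
    select member_id,
           sum(amount)                             as total_spend,
           rank() over (order by sum(amount) desc) as spend_rank
    from   fct_transactions
    group  by member_id
    qualify spend_rank <= 100
"""

start = time.time()
cur = conn.cursor()
cur.execute(query)
rows = cur.fetchall()
print(f"{len(rows)} rows in {time.time() - start:.1f}s")
cur.close()
conn.close()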

Posted 1 week ago

Apply

8.0 - 11.0 years

0 Lacs

Kolkata, West Bengal, India

On-site


At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. EY- GDS - Data and Analytics – Full stack Developer – Manager Job Description: We are seeking a highly skilled and motivated Full stack Developer with Data marketplace experience to design, develop, and manage enterprise-grade data marketplaces. The ideal candidate will have strong programming skills, experience with APIs, and a deep understanding of cloud platforms such as Azure, AWS, and Snowflake. Key Responsibilities: Design and develop data marketplace solutions to enable secure data sharing and consumption. Build and maintain RESTful APIs for data access, integration, and governance. Develop backend components and data pipelines using Java and Python. Integrate marketplace platforms with cloud environments (Azure, AWS) and data warehousing tools like Snowflake. Design and implement database schemas and manage data storage solutions (e.g., SQL, NoSQL). Implement data cataloguing, metadata management, and user access controls. Collaborate with data engineers, architects, and business stakeholders to understand requirements and deliver scalable solutions. Ensure performance optimization, security compliance, and maintainability of the platform. Required Qualifications: 8-11years of experience in software development, preferably in data-intensive environments. Proficient in Java and Python. Strong experience in building and integrating APIs (REST/GraphQL). Hands-on experience with at least one cloud platform (Azure or AWS). Working knowledge of Snowflake or other cloud data warehouses. Experience with CI/CD pipelines, containerization (Docker/Kubernetes) is a plus. Strong problem-solving and communication skills. Preferred Qualifications: Experience with data catalog tools (e.g., Alation, Collibra) or marketplace platforms. Understanding of data governance, security, and compliance practices. Familiarity with Agile methodologies. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today. Show more Show less
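The RESTful data-access APIs mentioned above could start from something like the Flask sketch below; the catalogue is a stub and the endpoints are illustrative only.

# Sketch: a tiny data-marketplace style API that lists datasets and serves one by id.
# The in-memory catalogue stands in for a real metadata store; all names are invented.
from flask import Flask, jsonify, abort

app = Flask(__name__)

CATALOGUE = {
    "sales_daily": {"owner": "finance", "format": "parquet", "rows": 1_200_000},
    "customers":   {"owner": "crm",     "format": "delta",   "rows": 85_000},
}

@app.get("/datasets")
def list_datasets():
    return jsonify(sorted(CATALOGUE.keys()))

@app.get("/datasets/<name>")
def get_dataset(name):
    if name not in CATALOGUE:
        abort(404)
    return jsonify({"name": name, **CATALOGUE[name]})

if __name__ == "__main__":
    app.run(port=8080)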

Posted 1 week ago

Apply

9.0 years

0 Lacs

Hyderabad, Telangana, India

Remote


About Tide
At Tide, we are building a business management platform designed to save small businesses time and money. We provide our members with business accounts and related banking services, but also a comprehensive set of connected administrative solutions from invoicing to accounting. Launched in 2017, Tide is now used by over 1 million small businesses across the world and is available to UK, Indian and German SMEs. Headquartered in central London, with offices in Sofia, Hyderabad, Delhi, Berlin and Belgrade, Tide employs over 2,000 people. Tide is rapidly growing, expanding into new products and markets and always looking for passionate and driven people. Join us in our mission to empower small businesses and help them save time and money.

About The Team
As part of the team, you will be responsible for building and running the data pipelines and services required to support business functions, reports and dashboards. We are heavily dependent on BigQuery/Snowflake, Airflow, Stitch/Fivetran, dbt and Tableau/Looker for our business intelligence, and we embrace AWS with some GCP.

About The Role
As a Staff Data Engineer you'll be: Developing end-to-end ETL/ELT pipelines, working with Data Analysts of the business function. Designing, developing, and implementing scalable, automated processes for data extraction, processing, and analysis in a Data Mesh architecture. Mentoring other junior engineers in the team. Being a "go-to" expert for data technologies and solutions. Providing on-the-ground troubleshooting and diagnosis of architecture and design challenges. Troubleshooting and resolving technical issues as they arise. Looking for ways of improving both what and how data pipelines are delivered by the department. Translating business requirements into technical requirements, such as entities that need to be modelled, dbt models that need to be built, timings, tests and reports. Owning the delivery of data models and reports end to end. Performing exploratory data analysis to identify data quality issues early in the process and implementing tests to prevent them in the future. Working with Data Analysts to ensure that all data feeds are optimised and available at the required times; this can include Change Data Capture, change data control and other "delta loading" approaches. Discovering, transforming, testing, deploying and documenting data sources. Applying, helping define, and championing data warehouse governance: data quality, testing, coding best practices, and peer review. Building Looker dashboards for use cases if required.

What We Are Looking For
You have 9+ years of extensive development experience using Snowflake or a similar data warehouse technology. You have working experience with dbt and other technologies of the modern data stack, such as Snowflake, Apache Airflow, Fivetran, AWS, git and Looker. You have experience in agile processes, such as Scrum. You have extensive experience in writing advanced SQL statements and performance-tuning them. You have experience in data ingestion techniques using custom or SaaS tools like Fivetran. You have experience in data modelling and can optimise existing and new data models. You have experience in data mining, data warehouse solutions, and ETL, and in using databases in a business environment with large-scale, complex datasets. Experience architecting analytical databases (in a Data Mesh architecture) is an added advantage. You have experience working in an agile, cross-functional delivery team. You have high development standards, especially for code quality, code reviews, unit testing, continuous integration and deployment. You have strong technical documentation skills and the ability to be clear and precise with business users. You have a business level of English and good communication skills. You have a basic understanding of various systems across the AWS platform (good to have). Preferably, you have worked in a digitally native company, ideally fintech. Experience with Python, governance tools (e.g. Atlan, Alation, Collibra) or data quality tools (e.g. Great Expectations, Monte Carlo, Soda) is an added advantage.

Our Tech Stack
dbt, Snowflake, Airflow, Fivetran, SQL, Looker

What You Will Get In Return
Competitive salary, self & family health insurance, term & life insurance, OPD benefits, mental wellbeing through Plumm, learning & development budget, WFH setup allowance, 15 days of privilege leave, 12 days of casual leave, 12 days of sick leave, 3 paid days off for volunteering or L&D activities, and stock options.

Tidean Ways Of Working
At Tide, we champion a flexible workplace model that supports both in-person and remote work to cater to the specific needs of our different teams. While remote work is supported, we believe in the power of face-to-face interactions to foster team spirit and collaboration. Our offices are designed as hubs for innovation and team-building, where we encourage regular in-person gatherings to foster a strong sense of community.

TIDE IS A PLACE FOR EVERYONE
At Tide, we believe that we can only succeed if we let our differences enrich our culture. Our Tideans come from a variety of backgrounds and experience levels. We consider everyone irrespective of their ethnicity, religion, sexual orientation, gender identity, family or parental status, national origin, veteran, neurodiversity or differently-abled status. We celebrate diversity in our workforce as a cornerstone of our success. Our commitment to a broad spectrum of ideas and backgrounds is what enables us to build products that resonate with our members' diverse needs and lives. We are One Team and foster a transparent, inclusive environment where everyone's voice is heard. Your personal data will be processed by Tide for recruitment purposes and in accordance with Tide's Recruitment Privacy Notice.
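The stack listed above (Airflow orchestrating dbt on Snowflake) typically looks something like the minimal DAG sketched below. This is an illustrative example only: the DAG id, schedule, project path and dbt commands are hypothetical placeholders, not Tide's actual pipelines.

```python
# Illustrative Airflow 2.x DAG that runs dbt models and tests on a daily schedule.
# All names (dag_id, project path, schedule) are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_dbt_build",          # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="0 2 * * *",     # run once a day at 02:00
    catchup=False,
    tags=["dbt", "example"],
) as dag:
    # Build the dbt models (the project path is a placeholder).
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="cd /opt/analytics/dbt_project && dbt run --target prod",
    )

    # Run dbt tests after the models have been built.
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="cd /opt/analytics/dbt_project && dbt test --target prod",
    )

    dbt_run >> dbt_test
```

Keeping dbt run and dbt test as separate tasks makes failures easier to triage: a model build error and a failing data test show up as distinct task states in the Airflow UI.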

Posted 1 week ago

Apply

6.0 - 8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Job Description
Senior Manager, Tech Lead

The Opportunity
Based in Hyderabad, join a global healthcare biopharma company and be part of a 130-year legacy of success backed by ethical integrity, forward momentum, and an inspiring mission to achieve new milestones in global healthcare. Be part of an organisation driven by digital technology and data-backed approaches that support a diversified portfolio of prescription medicines, vaccines, and animal health products. Drive innovation and execution excellence. Be a part of a team with a passion for using data, analytics, and insights to drive decision-making, and which creates custom software, allowing us to tackle some of the world's greatest health threats. Our Technology Centers focus on creating a space where teams can come together to deliver business solutions that save and improve lives. An integral part of our company's IT operating model, Tech Centers are globally distributed locations where each IT division has employees to enable our digital transformation journey and drive business outcomes. These locations, in addition to the other sites, are essential to supporting our business and strategy. A focused group of leaders in each Tech Center helps to ensure we can manage and improve each location, from investing in the growth, success, and well-being of our people, to making sure colleagues from each IT division feel a sense of belonging, to managing critical emergencies. And together, we must leverage the strength of our team to collaborate globally to optimize connections and share best practices across the Tech Centers.

Role Overview
Join our company as we transform and innovate. We are at the forefront of research to deliver innovative health solutions that advance the prevention and treatment of diseases in people and animals. We are currently seeking a Technology Engineer to help deliver our Data and Analytics Platform product. This is also an exciting opportunity to contribute to the development of our broader company's Data and Analytics practice inside our team.

What Will You Do In This Role
Contributes to the architecture, design, and engineering of the Global Support Functions Data Engineering, Data Integration and Data Visualization service. Contributes to the architecture, design, and engineering of global data engineering/integration/visualization services. Defines best practices and guidelines for the delivery of analytic solutions. Consumes practices from, and collaborates to improve practices amongst, our emerging engineering community. Contributes to identifying gaps in product capabilities and designing solutions to address them. Executes on opportunities to automate and simplify the maintenance and lifecycle of platform services. Maintains current industry knowledge of cloud-native concepts, best practices, and technologies. Prioritizes workloads, commitments, and scheduled timelines. Interacts frequently with product managers and engineering teams to onboard their new deliveries to our central platforms. Documents, reviews, and ensures that all System Development Lifecycle (SDLC) and company policy standards are met. Provides a point of escalation for the product customer support team.

What Should You Have
BS degree or equivalent in Computer Science, Computer Engineering, Information Systems, or equivalent experience.

Required
Relevant certification or completion of an equivalent program in areas such as Software Development, Computer Science, or Computer Engineering. Hands-on experience with various data engineering/integration/BI platforms (e.g. AWS Glue, Athena, S3 storage, Apache Airflow, Redshift, Snowflake, Databricks, Collibra). Understanding of cloud service providers (e.g. AWS, Azure). Understanding of web and network protocols such as HTTP/S, TCP/IP, DNS. Understanding of basic routing concepts. Experience with a scripting language such as Python or Unix shell scripts, with a strong focus on automation. Experience with software solution design and documentation. Strong knowledge and experience in IT, specifically in designing, developing, modifying, and implementing solutions. Proficiency in working with both new and existing applications, systems architecture, and network systems. Ability to review and understand system requirements and business processes. Proficient in coding, testing, debugging, and implementing software solutions. Expertise in the engineering, delivery, and management of cloud solutions, including cloud platforms and cloud-native services. Experience in monitoring the consumption of cloud resources and managing application performance. Ability to oversee request fulfillment turnaround efficiently. Strong understanding of maintaining system security posture. Strong leadership skills including, but not limited to, strategic planning, entrepreneurship, innovation, and business savviness. Strong commitment to diversity, equity, and inclusion, and the ability to influence and motivate others. Excellent emotional intelligence, decision-making skills, and a strong sense of ownership and accountability.

Preferred
6 to 8 years of experience in the IT field and/or a related program. Ability to work in a matrixed and highly concurrent environment. Demonstrated ability to plan and execute a project or experiment, including milestones and endpoints. Experience working as part of a global, diverse team. Experience using, implementing and/or operating data warehousing or a broad range of analytic solutions. Experience with Amazon Web Services such as VPC, Route53, EC2, ALB, S3, RDS, IAM, etc. Relevant certification (e.g. AWS, Azure). Experience debugging software and/or scripting errors. Experience with the Go programming language. Experience with infrastructure, network, database, or security troubleshooting. Experience delivering products and features using Agile/Scrum methodologies. Experience with DevOps tools such as git, Terraform, Jira, Jenkins, CloudBees, GitHub Actions, etc. Experience in System Development Lifecycle (SDLC) documentation.

Our technology teams operate as business partners, proposing ideas and innovative solutions that enable new organizational capabilities. We collaborate internationally to deliver services and solutions that help everyone be more productive and enable innovation.

Who We Are
We are known as Merck & Co., Inc., Rahway, New Jersey, USA in the United States and Canada and MSD everywhere else. For more than a century, we have been inventing for life, bringing forward medicines and vaccines for many of the world's most challenging diseases. Today, our company continues to be at the forefront of research to deliver innovative health solutions and advance the prevention and treatment of diseases that threaten people and animals around the world.

What We Look For
Imagine getting up in the morning for a job as important as helping to save and improve lives around the world. Here, you have that opportunity. You can put your empathy, creativity, digital mastery, or scientific genius to work in collaboration with a diverse group of colleagues who pursue and bring hope to countless people who are battling some of the most challenging diseases of our time. Our team is constantly evolving, so if you are among the intellectually curious, join us and start making your impact today. #HYDIT2025

Current Employees apply HERE
Current Contingent Workers apply HERE

Search Firm Representatives Please Read Carefully
Merck & Co., Inc., Rahway, NJ, USA, also known as Merck Sharp & Dohme LLC, Rahway, NJ, USA, does not accept unsolicited assistance from search firms for employment opportunities. All CVs/resumes submitted by search firms to any employee at our company without a valid written search agreement in place for this position will be deemed the sole property of our company. No fee will be paid in the event a candidate is hired by our company as a result of an agency referral where no pre-existing agreement is in place. Where agency agreements are in place, introductions are position specific. Please, no phone calls or emails.

Employee Status: Regular
Relocation:
VISA Sponsorship:
Travel Requirements:
Flexible Work Arrangements: Hybrid
Shift:
Valid Driving License:
Hazardous Material(s):
Required Skills: Business Intelligence (BI), Database Administration, Data Engineering, Data Management, Data Modeling, Data Visualization, Design Applications, Information Management, Software Development, Software Development Life Cycle (SDLC), System Designs
Preferred Skills:
Job Posting End Date: 06/9/2025
A job posting is effective until 11:59:59 PM on the day BEFORE the listed job posting end date. Please ensure you apply to a job posting no later than the day before the job posting end date.
Requisition ID: R342324
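As a concrete, purely illustrative example of the platform automation mentioned above, the sketch below uses boto3 to inventory tables registered in an AWS Glue Data Catalog database. The database name and the idea of flagging tables without a description are hypothetical choices made for this example, not part of the role description.

```python
# Illustrative sketch: inventory AWS Glue Data Catalog tables and flag those
# missing a description. Assumes boto3 is installed and AWS credentials are
# configured; the database name below is a hypothetical placeholder.
import boto3

glue = boto3.client("glue")


def list_tables(database_name: str):
    """Yield all table definitions in the given Glue database, handling pagination."""
    paginator = glue.get_paginator("get_tables")
    for page in paginator.paginate(DatabaseName=database_name):
        for table in page["TableList"]:
            yield table


if __name__ == "__main__":
    database = "analytics_curated"  # placeholder database name
    for table in list_tables(database):
        columns = table.get("StorageDescriptor", {}).get("Columns", [])
        flag = "" if table.get("Description") else "  <-- missing description"
        print(f"{table['Name']}: {len(columns)} columns{flag}")
```

Running something like this against a catalog used by Athena or Redshift Spectrum gives a quick view of which datasets lack documentation, which is one small, automatable slice of the platform lifecycle work described in the posting.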

Posted 1 week ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

Remote


Job Description
For our technology center in Prague, or alternatively Hyderabad, we are looking for a System Engineer/Architect. At our company, we aspire to be the premier research-intensive biopharmaceutical company. We're at the forefront of research to deliver innovative health solutions that advance the prevention and treatment of diseases in people and animals. Join our team and use the power of leading-edge science to save and improve lives around the world.

Responsibilities
Conduct architecture evaluations and collaborate with delivery squads to develop business solution architectures that fit the business needs. Define and implement integrations to systems and solutions that meet business requirements. Communicate with all stakeholders, including internal and external developers, end users, product line/product owners and architects, and update stakeholders on the status of product development processes. Establish information architecture standards and practices across capability areas (data collection, ontology creation, data mapping, taxonomies). Analyze the data landscape across the enterprise and assess the health and risks of the current state. Design and implement AWS cloud solutions for genomics and chemistry research. Develop and maintain data governance frameworks for our research & development division. Collaborate with cross-functional teams to ensure data integrity and security. Provide technical leadership and guidance on best practices for cloud architecture. Analyze and optimize existing systems to improve performance and scalability.

Requirements
(Education minimum requirements subject to change based on country.) Minimum bachelor's degree in computer science or a related STEM (science, technology, engineering, and mathematics) field. Information and solution architecture / requirements management / system engineering / data management / data science. Proven experience with AWS cloud architecture. Experience in data governance, ideally in R&D data governance. Excellent problem-solving and analytical skills. Strong understanding of genomics and chemistry data. Business Enterprise Architecture (BEA). Certifications in AWS or related technologies. Experience with technologies like Collibra, Python, R and R Shiny. Used to working in an agile way and with tools like GitHub, JIRA, Confluence. Used to working remotely in global environments and with different cultures.

Our Offer
(Please note, the primary location is Czechia; benefits in other locations may vary.) Exciting work in a great team, global projects, international environment. Opportunity to learn and grow professionally within the company globally. Hybrid working model, flexible role pattern. Pension and health insurance contributions. Internal reward system plus referral programme. 5 weeks annual leave, 5 sick days, 15 days of certified sick leave paid above statutory requirements annually, 40 paid hours annually for volunteering activities, 12 weeks of parental contribution. Cafeteria for tax-free benefits according to your choice (meal vouchers, Lítačka, sport, culture, health, travel, etc.), Multisport Card. Vodafone, Raiffeisen Bank, Foodora, and Mall.cz discount programmes. Up-to-date laptop and iPhone. Parking in the garage, showers, refreshments, library, music corner. Competitive salary, incentive pay, and many more.

Ready to take up the challenge? Apply now! Know anybody who might be interested? Refer this job!

Current Employees apply HERE
Current Contingent Workers apply HERE

Search Firm Representatives Please Read Carefully
Merck & Co., Inc., Rahway, NJ, USA, also known as Merck Sharp & Dohme LLC, Rahway, NJ, USA, does not accept unsolicited assistance from search firms for employment opportunities. All CVs/resumes submitted by search firms to any employee at our company without a valid written search agreement in place for this position will be deemed the sole property of our company. No fee will be paid in the event a candidate is hired by our company as a result of an agency referral where no pre-existing agreement is in place. Where agency agreements are in place, introductions are position specific. Please, no phone calls or emails.

Employee Status: Regular
Relocation:
VISA Sponsorship:
Travel Requirements:
Flexible Work Arrangements: Hybrid
Shift:
Valid Driving License:
Hazardous Material(s):
Required Skills:
Preferred Skills: Database Design, Emerging Technologies, Hardware Design, Management System Development, Network Design, Radio Frequency Engineering, Real-Time Programming, Requirements Management, Software Development, Software Development Life Cycle (SDLC), Solution Architecture, System Designs, Systems Integration, Technical Advice, Testing
Job Posting End Date: 06/6/2025
A job posting is effective until 11:59:59 PM on the day BEFORE the listed job posting end date. Please ensure you apply to a job posting no later than the day before the job posting end date.
Requisition ID: R336867
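To make the data governance responsibilities above a bit more concrete, here is a small, hypothetical sketch of one automatable control: checking that S3 buckets carry a data-classification tag. The tag key, and the idea of treating untagged buckets as findings, are assumptions made for the example rather than anything prescribed by the posting.

```python
# Illustrative governance check: list S3 buckets missing a "data-classification" tag.
# Assumes boto3 is installed and AWS credentials are configured; the tag key is a
# hypothetical convention chosen for this example.
import boto3
from botocore.exceptions import ClientError

REQUIRED_TAG = "data-classification"  # placeholder tag key

s3 = boto3.client("s3")


def buckets_missing_classification():
    """Return names of buckets that have no tags or lack the required tag key."""
    findings = []
    for bucket in s3.list_buckets()["Buckets"]:
        name = bucket["Name"]
        try:
            tag_set = s3.get_bucket_tagging(Bucket=name)["TagSet"]
        except ClientError as err:
            # Buckets with no tags at all raise NoSuchTagSet.
            if err.response["Error"]["Code"] == "NoSuchTagSet":
                findings.append(name)
                continue
            raise
        if not any(tag["Key"] == REQUIRED_TAG for tag in tag_set):
            findings.append(name)
    return findings


if __name__ == "__main__":
    for name in buckets_missing_classification():
        print(f"Bucket without {REQUIRED_TAG} tag: {name}")
```

In practice a check like this would feed a governance dashboard or ticketing workflow rather than printing to the console, but the pattern of encoding a policy as a small, repeatable script is the same.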

Posted 1 week ago

Apply

1.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


JOB_POSTING-3-70939

Job Description
Role Title: Analyst, Data Sourcing – Metadata (L08)

Company Overview: Synchrony (NYSE: SYF) is a premier consumer financial services company delivering one of the industry's most complete digitally enabled product suites. Our experience, expertise and scale encompass a broad spectrum of industries including digital, health and wellness, retail, telecommunications, home, auto, outdoors, pet and more. We have recently been ranked #2 among India's Best Companies to Work for by Great Place to Work. We were among the Top 50 India's Best Workplaces in Building a Culture of Innovation by All by GPTW and Top 25 among Best Workplaces in BFSI by GPTW. We have also been recognized by AmbitionBox Employee Choice Awards among the Top 20 Mid-Sized Companies, ranked #3 among Top Rated Companies for Women, and Top-Rated Financial Services Companies. Synchrony celebrates ~51% women diversity, 105+ people with disabilities, and ~50 veterans and veteran family members. We offer Flexibility and Choice for all employees and provide best-in-class employee benefits and programs that cater to work-life integration and overall well-being. We provide career advancement and upskilling opportunities, focusing on Advancing Diverse Talent to take up leadership roles.

Organizational Overview: Our Analytics organization comprises data analysts who focus on enabling strategies to enhance customer and partner experience and optimize business performance through data management and the development of full-stack descriptive-to-prescriptive analytics solutions using cutting-edge technologies, thereby enabling business growth.

Role Summary/Purpose: The Analyst, Data Sourcing – Metadata (Individual Contributor) role is located in the India Analytics Hub (IAH) as part of Synchrony's enterprise Data Office. This role is responsible for supporting metadata management processes within Synchrony's public and private cloud and on-prem environments within the Chief Data Office. The role focuses on assisting with metadata harvesting, maintaining data dictionaries, and supporting the tracking of data lineage. The analyst will collaborate closely with senior team members to ensure access to accurate, well-governed metadata for analytics and reporting.

Key Responsibilities: Implement and maintain metadata management processes across Synchrony's public and private cloud and on-prem environments, ensuring accurate integration with technical and business metadata catalogs. Work with the Data Architecture and Data Usage teams to track data lineage, traceability, and compliance, identifying and escalating metadata-related issues. Document technical specifications, support solution design, and participate in agile development and release cycles for metadata initiatives. Adhere to data management policies, track KPIs for metadata effectiveness, and assist in the assessment of metadata risks to strengthen governance. Maintain stable operations, troubleshoot metadata and lineage issues, and contribute to continuous process improvements to improve data accessibility.

Required Skills & Knowledge: Bachelor's degree, preferably in Engineering or Computer Science, with more than 1 year of hands-on data management experience, or, in lieu of a degree, more than 3 years of experience. Minimum of 1 year of experience in data management, focusing on metadata management, data governance, or data lineage, with exposure to cloud environments (AWS, Azure, or Google Cloud) and on-premise infrastructure. Basic understanding of metadata management concepts, familiarity with data cataloging tools (e.g., AWS Glue Data Catalog, Ab Initio, Collibra), basic proficiency in data lineage tracking tools (e.g., Apache Atlas, Ab Initio, Collibra), and understanding of data integration technologies (e.g., ETL, APIs, data pipelines). Good communication and collaboration skills, strong analytical thinking and problem-solving abilities, ability to work independently and manage multiple tasks, and attention to detail.

Desired Skills & Knowledge: AWS certifications such as AWS Cloud Practitioner or AWS Certified Data Analytics – Specialty.

Preferred Qualifications: Familiarity with hybrid cloud environments (a combination of cloud and on-prem). Skilled in Ab Initio Metadata Hub development and support, including importers, extractors, Metadata Hub database extensions, technical lineage, QueryIT, Ab Initio graph development, Ab Initio Control Center and Express IT. Experience with harvesting technical lineage and producing lineage diagrams. Familiarity with Unix, Linux, Stonebranch, and database platforms such as Oracle and Hive. Basic knowledge of SQL and data query languages for managing and retrieving metadata. Understanding of data governance frameworks (e.g., EDMC DCAM, GDPR compliance). Familiarity with Collibra.

Eligibility Criteria: Bachelor's degree, preferably in Engineering or Computer Science, with more than 1 year of hands-on data management experience, or, in lieu of a degree, more than 3 years of experience.

Work Timings: This role qualifies for Enhanced Flexibility and Choice offered in Synchrony India and will require the incumbent to be available between 06:00 AM Eastern Time and 11:30 AM Eastern Time (timings are anchored to US Eastern hours and will adjust twice a year locally). This window is for meetings with the India and US teams. The remaining hours are flexible for the employee to choose. Exceptions may apply periodically due to business needs. Please discuss this with the hiring manager for more details.

For Internal Applicants: Understand the criteria or mandatory skills required for the role before applying. Inform your manager and HRM before applying for any role on Workday. Ensure that your professional profile is updated (fields such as education, prior experience, other skills); it is mandatory to upload your updated resume (Word or PDF format). You must not be on any corrective action plan (First Formal/Final Formal, LPP). Only L4 to L7 employees who have completed 12 months in the organization and 12 months in their current role and level are eligible. Only L8 employees who have completed 18 months in the organization and 12 months in their current role and level are eligible. L04+ employees can apply.

Grade/Level: 08
Job Family Group: Information Technology
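As an illustration of what basic metadata harvesting looks like in practice, the sketch below builds a tiny data dictionary from a relational database's own catalog. It uses an in-memory SQLite database so it is self-contained and runnable; the table and the dictionary fields are hypothetical stand-ins for whatever a real harvesting job would pull from a warehouse or cataloging tool.

```python
# Illustrative metadata-harvesting sketch: build a simple data dictionary from a
# database's catalog. SQLite is used only so the example is self-contained; a real
# job would read from a warehouse's information schema or a catalog tool's API.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    """
    CREATE TABLE customer_accounts (
        account_id   INTEGER PRIMARY KEY,
        customer_id  INTEGER NOT NULL,
        open_date    TEXT,
        status       TEXT
    )
    """
)


def harvest_data_dictionary(connection, table_name):
    """Return one data-dictionary entry per column: name, type, nullability, key flag."""
    entries = []
    for cid, name, col_type, notnull, default, pk in connection.execute(
        f"PRAGMA table_info({table_name})"
    ):
        entries.append(
            {
                "table": table_name,
                "column": name,
                "type": col_type,
                "nullable": not notnull,
                "primary_key": bool(pk),
            }
        )
    return entries


for entry in harvest_data_dictionary(conn, "customer_accounts"):
    print(entry)
```

A production version of this would write the entries into a metadata catalog and record lineage between source and target columns, but the core step of reading structural metadata programmatically is the same.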

Posted 1 week ago

Apply

1.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


JOB_POSTING-3-70891

Job Description
Role Title: Analyst, Analytics – Data Quality Developer (L08)

Company Overview: Synchrony (NYSE: SYF) is a premier consumer financial services company delivering one of the industry's most complete digitally enabled product suites. Our experience, expertise and scale encompass a broad spectrum of industries including digital, health and wellness, retail, telecommunications, home, auto, outdoors, pet and more. We have recently been ranked #2 among India's Best Companies to Work for by Great Place to Work. We were among the Top 50 India's Best Workplaces in Building a Culture of Innovation by All by GPTW and Top 25 among Best Workplaces in BFSI by GPTW. We have also been recognized by AmbitionBox Employee Choice Awards among the Top 20 Mid-Sized Companies, ranked #3 among Top Rated Companies for Women, and Top-Rated Financial Services Companies. Synchrony celebrates ~51% women diversity, 105+ people with disabilities, and ~50 veterans and veteran family members. We offer Flexibility and Choice for all employees and provide best-in-class employee benefits and programs that cater to work-life integration and overall well-being. We provide career advancement and upskilling opportunities, focusing on Advancing Diverse Talent to take up leadership roles.

Organizational Overview: Our Analytics organization comprises data analysts who focus on enabling strategies to enhance customer and partner experience and optimize business performance through data management and the development of full-stack descriptive-to-prescriptive analytics solutions using cutting-edge technologies, thereby enabling business growth.

Role Summary/Purpose: The Analyst, Analytics – Data Quality Developer (Individual Contributor) role is located in the India Analytics Hub (IAH) as part of Synchrony's enterprise Data Office. This role will be responsible for the proactive design, implementation, execution, and monitoring of Data Quality process capabilities within Synchrony's public and private cloud and on-prem environments within the Chief Data Office. The Data Quality Developer – Analyst will work within the IT organization to support and participate in build and run activities and environments (e.g. DevOps) for Data Quality.

Key Responsibilities: Monitor and maintain Data Quality and Data Issue Management operating level agreements in support of data quality rule execution and reporting. Assist in performing root cause analysis for data quality issues and data usage challenges, particularly for the workload migration to the public cloud. Recommend, design, implement and refine or remediate data quality specifications within Synchrony's approved Data Quality platforms. Participate in the solution design of data quality and data issue management technical and procedural solutions, including metric reporting. Work closely with Technology teams and key stakeholders to ensure data quality issues are prioritized, analyzed and addressed. Regularly communicate the status of data quality issues and progress to key stakeholders. Participate in the planning and execution of agile release cycles and iterations.

Qualifications/Requirements: Minimum of 1 year of experience in data quality management, including implementing data quality rules, data profiling and root cause analysis for data issues, with exposure to cloud environments (AWS, Azure, or Google Cloud) and on-premise infrastructure. Minimum of 1 year of experience with data quality or data integration tools such as Ab Initio, Informatica, Collibra, Stonebranch or Tableau, gained through hands-on experience or projects. Good communication and collaboration skills, strong analytical thinking and problem-solving abilities, ability to work independently and manage multiple tasks, and attention to detail.

Desired Characteristics: Broad understanding of banking, credit card, payment solutions, collections, marketing, risk and regulatory & compliance. Experience using data governance and data quality tools such as Collibra, Ab Initio Express>IT and Ab Initio MetaHub. Proficient in writing and understanding SQL. Experience querying and analyzing data in cloud-based environments (e.g., AWS, Redshift). AWS certifications such as AWS Cloud Practitioner or AWS Certified Data Analytics – Specialty. Intermediate to advanced MS Office Suite skills including PowerPoint, Excel, Access and Visio. Strong relationship management and influencing skills to build enduring and productive alliances across matrix organizations. Demonstrated success in managing multiple deliverables concurrently, often within aggressive timeframes; ability to cope under time pressure. Experience in partnering with a diverse team composed of staff and consultants located in multiple locations and time zones.

Eligibility Criteria: Bachelor's degree, preferably in Engineering or Computer Science, with more than 1 year of hands-on data management experience, or, in lieu of a degree, more than 3 years of experience.

Work Timings: This role qualifies for Enhanced Flexibility and Choice offered in Synchrony India and will require the incumbent to be available between 06:00 AM Eastern Time and 11:30 AM Eastern Time (timings are anchored to US Eastern hours and will adjust twice a year locally). This window is for meetings with the India and US teams. The remaining hours are flexible for the employee to choose. Exceptions may apply periodically due to business needs. Please discuss this with the hiring manager for more details.

For Internal Applicants: Understand the criteria or mandatory skills required for the role before applying. Inform your manager and HRM before applying for any role on Workday. Ensure that your professional profile is updated (fields such as education, prior experience, other skills); it is mandatory to upload your updated resume (Word or PDF format). You must not be on any corrective action plan (Formal/Final Formal) or PIP. L4 to L7 employees who have completed 12 months in the organization and 12 months in their current role and level are eligible. L8+ employees who have completed 18 months in the organization and 12 months in their current role and level are eligible.

Grade/Level: 08
Job Family Group: Information Technology
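The data quality rules referred to above are often just parameterised checks run against a table and reported as pass/fail metrics. Below is a small, self-contained Python sketch of that idea; the sample records, rule names and thresholds are hypothetical and are not tied to any Synchrony platform or tool.

```python
# Illustrative data quality rule sketch: completeness and uniqueness checks over
# in-memory sample records. Real implementations would run against warehouse tables
# via a DQ platform; the records and thresholds here are placeholders.
from collections import Counter

records = [
    {"account_id": 101, "customer_id": "C-1", "status": "OPEN"},
    {"account_id": 102, "customer_id": None,  "status": "OPEN"},
    {"account_id": 102, "customer_id": "C-3", "status": "CLOSED"},
]


def completeness(rows, column):
    """Fraction of rows where the column is populated."""
    populated = sum(1 for r in rows if r.get(column) not in (None, ""))
    return populated / len(rows)


def uniqueness(rows, column):
    """Fraction of rows whose value in the column is unique."""
    counts = Counter(r.get(column) for r in rows)
    return sum(1 for r in rows if counts[r.get(column)] == 1) / len(rows)


# Hypothetical rule definitions: (rule name, metric function, column, minimum score).
rules = [
    ("customer_id completeness >= 0.99", completeness, "customer_id", 0.99),
    ("account_id uniqueness == 1.0", uniqueness, "account_id", 1.0),
]

for name, metric, column, threshold in rules:
    score = metric(records, column)
    outcome = "PASS" if score >= threshold else "FAIL"
    print(f"{outcome}  {name}  (observed {score:.2f})")
```

In a real workflow these scores would be written to a metrics table and fed into the issue-management and reporting process described in the posting rather than printed to the console.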

Posted 1 week ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies