3.0 years
0 Lacs
Pune, Maharashtra, India
Remote
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

The opportunity
The objective of our Digital Risk Consulting service is to support clients with the development, implementation, improvement, and modernization of their technology risk and compliance programs to address the constantly changing risk and technology landscape. Our solutions help clients build confidence and trust with their customers and the overall market, and meet obligations where required by regulation or contract.

Your Key Responsibilities
You will operate as a team leader for engagements to help our clients develop and strengthen their IT risk and compliance programs. You will work directly with clients to review their IT processes and controls, remediate and implement controls, onboard new tools and services into risk and compliance frameworks, and assist with readiness for, and adherence to, new compliance regulations. Your responsibilities include both in-person and remote oversight and coaching of engagement team members, reporting to both senior engagement team members and client leadership, and partnering with our key client contacts to complete the engagement work.

What You'll Do
- Design and implement solutions to data-related technical and compliance challenges such as DevSecOps, data strategy, data governance, data risks and relevant controls, data testing, data architecture, data platforms, data solution implementation, data quality, and data security, to manage and mitigate risk.
- Leverage data analytics tools to build robust, scalable solutions through data analysis and data visualization using SQL, Python, and visualization tools.
- Design and implement comprehensive data analytics strategies to support business decision-making.
- Collect, clean, and interpret large datasets from multiple sources, ensuring completeness, accuracy, and integrity of data.
- Integrate and/or pilot next-generation technologies such as cloud platforms, machine learning, and Generative AI (GenAI).
- Develop custom scripts and algorithms to automate data processing and analysis and generate insights.
- Apply business and domain knowledge, including regulatory requirements and industry standards, to solve complex data-related challenges.
- Analyze data to uncover trends and generate insights that inform business decisions.
- Build and maintain relationships across Engineering, Product, Operations, Internal Audit, external audit, and other external stakeholders to drive effective financial risk management.
- Work with DevSecOps, Security Assurance, Engineering, and Product teams to improve the efficiency of control environments and provide risk management through automation and process improvement.
- Bridge gaps between IT controls and business controls, including ITGCs and automated business controls, and work with Internal Audit to ensure the complete control environment is managed.
- Work with emerging products to understand their risk profile and ensure an appropriate control environment is established.
- Implement new processes and controls in response to changes in the business environment, such as new product introductions, changes in accounting standards, internal process changes, or reorganizations.
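To make the SQL-and-Python side of this work concrete, here is a minimal sketch of an automated control test of the kind described above. It assumes a local SQLite evidence extract named it_controls.db with a hypothetical user_access table; the file, table, and column names are invented for illustration and do not describe EY's actual tooling.

```python
# Minimal sketch of an automated control test: flag accounts whose access
# was not revoked within 5 days of termination. Table and column names
# are hypothetical; a real engagement would use the client's schema.
import sqlite3
import pandas as pd

conn = sqlite3.connect("it_controls.db")  # assumed local evidence extract

query = """
SELECT user_id, termination_date, access_revoked_date
FROM user_access
WHERE access_revoked_date IS NULL
   OR julianday(access_revoked_date) - julianday(termination_date) > 5
"""
exceptions = pd.read_sql_query(query, conn)

total = pd.read_sql_query("SELECT COUNT(*) AS n FROM user_access", conn)["n"][0]
print(f"{len(exceptions)} exceptions out of {total} terminated users")
exceptions.to_csv("access_revocation_exceptions.csv", index=False)
```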
What You'll Need
- Experience in data architecture, data management, data engineering, data science, or data analytics.
- Experience building analytical queries and dashboards using SQL, NoSQL, Python, etc.
- Proficiency in SQL and quantitative analysis: the ability to deep-dive into large amounts of data, draw meaningful insights, dissect business issues, and draw actionable conclusions.
- Knowledge of tools in the following areas: scripting and programming (e.g., Python, SQL, R, Java, Scala); big data tools (e.g., Hadoop, Hive, Pig, Impala, Mahout); data management (e.g., Informatica, Collibra, SAP, Oracle, IBM); predictive analytics (e.g., Python, IBM SPSS, SAS Enterprise Miner, RPL, Matlab); data visualization (e.g., Tableau, Power BI, TIBCO Spotfire, QlikView, SPSS); data mining (e.g., Microsoft SQL Server); cloud platforms (e.g., AWS, Azure, or Google Cloud).
- Ability to analyze complex processes to identify potential financial, operational, systems, and compliance risks across major finance cycles.
- Ability to assist management with the integration of security practices in the product development lifecycle (DevSecOps).
- Experience with homegrown applications in a microservices/DevOps environment.
- Experience identifying potential security risks in platform environments and developing strategies to mitigate them.
- Experience with SOX readiness assessments and control implementation.
- Knowledge of DevOps practices, CI/CD pipelines, code management, and automation tools (e.g., Jenkins, Git, Phabricator, Artifactory, SonarQube, Selenium, Fortify, Acunetix, Prisma Cloud).

Preferred: experience in:
- Managing technical data projects
- Leveraging data analytics tools to develop solutions and scripts
- Developing statistical modeling tools and techniques
- Developing and executing data governance frameworks or operating models
- Identifying data risks and designing and/or implementing appropriate controls
- Implementing data quality processes
- Developing data services and solutions in a cloud environment
- Designing data architecture
- Analyzing complex data sets and communicating findings effectively
- Process management, including process redesign and optimization
- Scripting languages (e.g., Python, Bash)
- Cloud platforms (e.g., AWS, Azure, GCP) and securing cloud-based applications/services

To qualify for the role, you must have:
- A bachelor's or master's degree.
- A minimum of 3 years of experience as an IT risk consultant or in data analytics.
- Experience applying relevant technical knowledge in at least one of the following engagements: (a) risk consulting, (b) financial statement audits, (c) internal or operational audits, (d) IT compliance, and/or (e) Service Organization Controls Reporting.
- Availability to travel outside of your assigned office location at least 50% of the time, plus commute within the region (where public transportation often is not available); successful candidates must work in excess of standard hours when necessary.
- A valid passport.

Ideally, you’ll also have:
- A bachelor's or master's degree in business, computer science, information systems, informatics, computer engineering, accounting, or a related discipline.
- CISA, CISSP, CISM, CPA, or CA certification (non-certified hires are required to become certified to be eligible for promotion to Manager).
EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 1 week ago
3.0 years
0 Lacs
Noida, Uttar Pradesh, India
Remote
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

The opportunity
The objective of our Digital Risk Consulting service is to support clients with the development, implementation, improvement, and modernization of their technology risk and compliance programs to address the constantly changing risk and technology landscape. Our solutions help clients build confidence and trust with their customers and the overall market, and meet obligations where required by regulation or contract.

Your Key Responsibilities
You will operate as a team leader for engagements to help our clients develop and strengthen their IT risk and compliance programs. You will work directly with clients to review their IT processes and controls, remediate and implement controls, onboard new tools and services into risk and compliance frameworks, and assist with readiness for, and adherence to, new compliance regulations. Your responsibilities include both in-person and remote oversight and coaching of engagement team members, reporting to both senior engagement team members and client leadership, and partnering with our key client contacts to complete the engagement work.

What You'll Do
- Design and implement solutions to data-related technical and compliance challenges such as DevSecOps, data strategy, data governance, data risks and relevant controls, data testing, data architecture, data platforms, data solution implementation, data quality, and data security, to manage and mitigate risk.
- Leverage data analytics tools to build robust, scalable solutions through data analysis and data visualization using SQL, Python, and visualization tools.
- Design and implement comprehensive data analytics strategies to support business decision-making.
- Collect, clean, and interpret large datasets from multiple sources, ensuring completeness, accuracy, and integrity of data.
- Integrate and/or pilot next-generation technologies such as cloud platforms, machine learning, and Generative AI (GenAI).
- Develop custom scripts and algorithms to automate data processing and analysis and generate insights.
- Apply business and domain knowledge, including regulatory requirements and industry standards, to solve complex data-related challenges.
- Analyze data to uncover trends and generate insights that inform business decisions.
- Build and maintain relationships across Engineering, Product, Operations, Internal Audit, external audit, and other external stakeholders to drive effective financial risk management.
- Work with DevSecOps, Security Assurance, Engineering, and Product teams to improve the efficiency of control environments and provide risk management through automation and process improvement.
- Bridge gaps between IT controls and business controls, including ITGCs and automated business controls, and work with Internal Audit to ensure the complete control environment is managed.
- Work with emerging products to understand their risk profile and ensure an appropriate control environment is established.
- Implement new processes and controls in response to changes in the business environment, such as new product introductions, changes in accounting standards, internal process changes, or reorganizations.
What You'll Need
- Experience in data architecture, data management, data engineering, data science, or data analytics.
- Experience building analytical queries and dashboards using SQL, NoSQL, Python, etc.
- Proficiency in SQL and quantitative analysis: the ability to deep-dive into large amounts of data, draw meaningful insights, dissect business issues, and draw actionable conclusions.
- Knowledge of tools in the following areas: scripting and programming (e.g., Python, SQL, R, Java, Scala); big data tools (e.g., Hadoop, Hive, Pig, Impala, Mahout); data management (e.g., Informatica, Collibra, SAP, Oracle, IBM); predictive analytics (e.g., Python, IBM SPSS, SAS Enterprise Miner, RPL, Matlab); data visualization (e.g., Tableau, Power BI, TIBCO Spotfire, QlikView, SPSS); data mining (e.g., Microsoft SQL Server); cloud platforms (e.g., AWS, Azure, or Google Cloud).
- Ability to analyze complex processes to identify potential financial, operational, systems, and compliance risks across major finance cycles.
- Ability to assist management with the integration of security practices in the product development lifecycle (DevSecOps).
- Experience with homegrown applications in a microservices/DevOps environment.
- Experience identifying potential security risks in platform environments and developing strategies to mitigate them.
- Experience with SOX readiness assessments and control implementation.
- Knowledge of DevOps practices, CI/CD pipelines, code management, and automation tools (e.g., Jenkins, Git, Phabricator, Artifactory, SonarQube, Selenium, Fortify, Acunetix, Prisma Cloud).

Preferred: experience in:
- Managing technical data projects
- Leveraging data analytics tools to develop solutions and scripts
- Developing statistical modeling tools and techniques
- Developing and executing data governance frameworks or operating models
- Identifying data risks and designing and/or implementing appropriate controls
- Implementing data quality processes
- Developing data services and solutions in a cloud environment
- Designing data architecture
- Analyzing complex data sets and communicating findings effectively
- Process management, including process redesign and optimization
- Scripting languages (e.g., Python, Bash)
- Cloud platforms (e.g., AWS, Azure, GCP) and securing cloud-based applications/services

To qualify for the role, you must have:
- A bachelor's or master's degree.
- A minimum of 3 years of experience as an IT risk consultant or in data analytics.
- Experience applying relevant technical knowledge in at least one of the following engagements: (a) risk consulting, (b) financial statement audits, (c) internal or operational audits, (d) IT compliance, and/or (e) Service Organization Controls Reporting.
- Availability to travel outside of your assigned office location at least 50% of the time, plus commute within the region (where public transportation often is not available); successful candidates must work in excess of standard hours when necessary.
- A valid passport.

Ideally, you’ll also have:
- A bachelor's or master's degree in business, computer science, information systems, informatics, computer engineering, accounting, or a related discipline.
- CISA, CISSP, CISM, CPA, or CA certification (non-certified hires are required to become certified to be eligible for promotion to Manager).
EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 1 week ago
3.0 years
0 Lacs
Mumbai, Maharashtra, India
Remote
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

The opportunity
The objective of our Digital Risk Consulting service is to support clients with the development, implementation, improvement, and modernization of their technology risk and compliance programs to address the constantly changing risk and technology landscape. Our solutions help clients build confidence and trust with their customers and the overall market, and meet obligations where required by regulation or contract.

Your Key Responsibilities
You will operate as a team leader for engagements to help our clients develop and strengthen their IT risk and compliance programs. You will work directly with clients to review their IT processes and controls, remediate and implement controls, onboard new tools and services into risk and compliance frameworks, and assist with readiness for, and adherence to, new compliance regulations. Your responsibilities include both in-person and remote oversight and coaching of engagement team members, reporting to both senior engagement team members and client leadership, and partnering with our key client contacts to complete the engagement work.

What You'll Do
- Design and implement solutions to data-related technical and compliance challenges such as DevSecOps, data strategy, data governance, data risks and relevant controls, data testing, data architecture, data platforms, data solution implementation, data quality, and data security, to manage and mitigate risk.
- Leverage data analytics tools to build robust, scalable solutions through data analysis and data visualization using SQL, Python, and visualization tools.
- Design and implement comprehensive data analytics strategies to support business decision-making.
- Collect, clean, and interpret large datasets from multiple sources, ensuring completeness, accuracy, and integrity of data.
- Integrate and/or pilot next-generation technologies such as cloud platforms, machine learning, and Generative AI (GenAI).
- Develop custom scripts and algorithms to automate data processing and analysis and generate insights.
- Apply business and domain knowledge, including regulatory requirements and industry standards, to solve complex data-related challenges.
- Analyze data to uncover trends and generate insights that inform business decisions.
- Build and maintain relationships across Engineering, Product, Operations, Internal Audit, external audit, and other external stakeholders to drive effective financial risk management.
- Work with DevSecOps, Security Assurance, Engineering, and Product teams to improve the efficiency of control environments and provide risk management through automation and process improvement.
- Bridge gaps between IT controls and business controls, including ITGCs and automated business controls, and work with Internal Audit to ensure the complete control environment is managed.
- Work with emerging products to understand their risk profile and ensure an appropriate control environment is established.
- Implement new processes and controls in response to changes in the business environment, such as new product introductions, changes in accounting standards, internal process changes, or reorganizations.
What You'll Need
- Experience in data architecture, data management, data engineering, data science, or data analytics.
- Experience building analytical queries and dashboards using SQL, NoSQL, Python, etc.
- Proficiency in SQL and quantitative analysis: the ability to deep-dive into large amounts of data, draw meaningful insights, dissect business issues, and draw actionable conclusions.
- Knowledge of tools in the following areas: scripting and programming (e.g., Python, SQL, R, Java, Scala); big data tools (e.g., Hadoop, Hive, Pig, Impala, Mahout); data management (e.g., Informatica, Collibra, SAP, Oracle, IBM); predictive analytics (e.g., Python, IBM SPSS, SAS Enterprise Miner, RPL, Matlab); data visualization (e.g., Tableau, Power BI, TIBCO Spotfire, QlikView, SPSS); data mining (e.g., Microsoft SQL Server); cloud platforms (e.g., AWS, Azure, or Google Cloud).
- Ability to analyze complex processes to identify potential financial, operational, systems, and compliance risks across major finance cycles.
- Ability to assist management with the integration of security practices in the product development lifecycle (DevSecOps).
- Experience with homegrown applications in a microservices/DevOps environment.
- Experience identifying potential security risks in platform environments and developing strategies to mitigate them.
- Experience with SOX readiness assessments and control implementation.
- Knowledge of DevOps practices, CI/CD pipelines, code management, and automation tools (e.g., Jenkins, Git, Phabricator, Artifactory, SonarQube, Selenium, Fortify, Acunetix, Prisma Cloud).

Preferred: experience in:
- Managing technical data projects
- Leveraging data analytics tools to develop solutions and scripts
- Developing statistical modeling tools and techniques
- Developing and executing data governance frameworks or operating models
- Identifying data risks and designing and/or implementing appropriate controls
- Implementing data quality processes
- Developing data services and solutions in a cloud environment
- Designing data architecture
- Analyzing complex data sets and communicating findings effectively
- Process management, including process redesign and optimization
- Scripting languages (e.g., Python, Bash)
- Cloud platforms (e.g., AWS, Azure, GCP) and securing cloud-based applications/services

To qualify for the role, you must have:
- A bachelor's or master's degree.
- A minimum of 3 years of experience as an IT risk consultant or in data analytics.
- Experience applying relevant technical knowledge in at least one of the following engagements: (a) risk consulting, (b) financial statement audits, (c) internal or operational audits, (d) IT compliance, and/or (e) Service Organization Controls Reporting.
- Availability to travel outside of your assigned office location at least 50% of the time, plus commute within the region (where public transportation often is not available); successful candidates must work in excess of standard hours when necessary.
- A valid passport.

Ideally, you’ll also have:
- A bachelor's or master's degree in business, computer science, information systems, informatics, computer engineering, accounting, or a related discipline.
- CISA, CISSP, CISM, CPA, or CA certification (non-certified hires are required to become certified to be eligible for promotion to Manager).
EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 1 week ago
0.0 - 6.0 years
0 Lacs
Saidapet, Chennai, Tamil Nadu
On-site
Job Information
Date Opened: 09/08/2025
City: Saidapet
Country: India
Job Role: Data Engineering
State/Province: Tamil Nadu
Industry: IT Services
Job Type: Full time
Zip/Postal Code: 600096

Job Description
Introduction to the Role:
Are you passionate about unlocking the power of data to drive innovation and transform business outcomes? Join our cutting-edge Data Engineering team and be a key player in delivering scalable, secure, and high-performing data solutions across the enterprise. As a Data Engineer, you will play a central role in designing and developing modern data pipelines and platforms that support data-driven decision-making and AI-powered products. With a focus on Python, SQL, AWS, PySpark, and Databricks, you'll enable the transformation of raw data into valuable insights by applying engineering best practices in a cloud-first environment. We are looking for a highly motivated professional who can work across teams to build and manage robust, efficient, and secure data ecosystems that support both analytical and operational workloads.

Accountabilities:
- Design, build, and optimize scalable data pipelines using PySpark, Databricks, and SQL on AWS cloud platforms.
- Collaborate with data analysts, data scientists, and business users to understand data requirements and ensure reliable, high-quality data delivery.
- Implement batch and streaming data ingestion frameworks from a variety of sources (structured, semi-structured, and unstructured data).
- Develop reusable, parameterized ETL/ELT components and data ingestion frameworks.
- Perform data transformation, cleansing, validation, and enrichment using Python and PySpark.
- Build and maintain data models, data marts, and logical/physical data structures that support BI, analytics, and AI initiatives.
- Apply best practices in software engineering, version control (Git), code reviews, and agile development processes.
- Ensure data pipelines are well tested, monitored, and robust, with proper logging and alerting mechanisms.
- Optimize performance of distributed data processing workflows and large datasets.
- Leverage AWS services (such as S3, Glue, Lambda, EMR, Redshift, Athena) for data orchestration and lakehouse architecture design.
- Participate in data governance practices and ensure compliance with data privacy, security, and quality standards.
- Contribute to documentation of processes, workflows, metadata, and lineage using tools such as data catalogs or Collibra (if applicable).
- Drive continuous improvement in engineering practices, tools, and automation to increase productivity and delivery quality.

Essential Skills / Experience:
- 4 to 6 years of professional experience in data engineering or a related field.
- Strong programming experience with Python, including data wrangling, pipeline automation, and scripting.
- Deep expertise in writing complex, optimized SQL queries on large-scale datasets.
- Solid hands-on experience with PySpark and distributed data processing frameworks.
- Expertise in developing and orchestrating data pipelines with Databricks.
- Experience with AWS cloud services such as S3, Glue, EMR, Athena, Redshift, and Lambda.
- Practical understanding of ETL/ELT development patterns and data modeling principles (star/snowflake schemas).
- Experience with job orchestration tools like Airflow, Databricks Jobs, or AWS Step Functions.
- Understanding of data lake, lakehouse, and data warehouse architectures.
- Familiarity with DevOps and CI/CD tools for code deployment (e.g., Git, Jenkins, GitHub Actions).
- Strong troubleshooting and performance optimization skills in large-scale data processing environments.
- Excellent communication and collaboration skills, with the ability to work in cross-functional agile teams.

Desirable Skills / Experience:
- AWS or Databricks certifications (e.g., AWS Certified Data Analytics, Databricks Certified Data Engineer Associate/Professional).
- Exposure to data observability, monitoring, and alerting frameworks (e.g., Monte Carlo, Datadog, CloudWatch).
- Experience in healthcare, life sciences, finance, or another regulated industry.
- Familiarity with data governance and compliance standards (GDPR, HIPAA, etc.).
- Knowledge of modern data architectures (Data Mesh, Data Fabric).
- Exposure to streaming data tools like Kafka, Kinesis, or Spark Structured Streaming.
- Experience with data visualization tools such as Power BI, Tableau, or QuickSight.

Work Environment & Collaboration:
We value a hybrid, collaborative environment that encourages shared learning and innovation. You will work closely with product owners, architects, analysts, and data scientists across geographies to solve real-world business problems using cutting-edge technologies and methodologies. We encourage flexibility while maintaining a strong in-office presence for better team synergy and innovation.

About Agilisium:
Agilisium is an AWS Advanced Consulting Partner that enables companies to accelerate their "Data-to-Insights" leap. With $50+ million in annual revenue and over 30% year-over-year growth, Agilisium is one of the fastest-growing IT solution providers in Southern California. Our most important asset? People. Talent management plays a vital role in our business strategy. We’re looking for "drivers": big thinkers with a growth and strategic mindset, people who are committed to customer obsession and aren’t afraid to experiment with new ideas. We are all about finding and nurturing individuals who are ready to do great work. At Agilisium, you’ll collaborate with great minds while being challenged to meet and exceed your potential.
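As a hedged illustration of the batch pipeline work this posting describes, here is a minimal PySpark sketch: read raw records from S3, cleanse and enrich them, and write a partitioned curated table. The bucket paths, schema, and column names are hypothetical, not a specific Agilisium implementation.

```python
# Minimal PySpark batch-transformation sketch: land raw JSON orders,
# de-duplicate and validate them, then write partitioned Parquet.
# Paths and columns are illustrative only.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_pipeline").getOrCreate()

raw = spark.read.json("s3://example-raw-zone/orders/")  # assumed landing path

cleansed = (
    raw.dropDuplicates(["order_id"])                     # remove duplicates
       .filter(F.col("order_total") >= 0)                # basic validation rule
       .withColumn("order_date", F.to_date("order_ts"))  # enrichment for partitioning
)

(cleansed.write
    .mode("overwrite")
    .partitionBy("order_date")
    .parquet("s3://example-curated-zone/orders/"))
```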
Posted 1 week ago
0.0 - 6.0 years
0 Lacs
Saidapet, Chennai, Tamil Nadu
On-site
Job Information
Date Opened: 09/08/2025
City: Saidapet
Country: India
Job Role: Application Engineering
State/Province: Tamil Nadu
Industry: IT Services
Job Type: Full time
Zip/Postal Code: 600096

Job Description
Introduction to the Role:
Are you passionate about unlocking the power of data to drive innovation and transform business outcomes? Join our cutting-edge Data Engineering team and be a key player in delivering scalable, secure, and high-performing data solutions across the enterprise. As a Data Engineer, you will play a central role in designing and developing modern data pipelines and platforms that support data-driven decision-making and AI-powered products. With a focus on Python, SQL, AWS, PySpark, and Databricks, you'll enable the transformation of raw data into valuable insights by applying engineering best practices in a cloud-first environment. We are looking for a highly motivated professional who can work across teams to build and manage robust, efficient, and secure data ecosystems that support both analytical and operational workloads.

Accountabilities:
- Design, build, and optimize scalable data pipelines using PySpark, Databricks, and SQL on AWS cloud platforms.
- Collaborate with data analysts, data scientists, and business users to understand data requirements and ensure reliable, high-quality data delivery.
- Implement batch and streaming data ingestion frameworks from a variety of sources (structured, semi-structured, and unstructured data).
- Develop reusable, parameterized ETL/ELT components and data ingestion frameworks.
- Perform data transformation, cleansing, validation, and enrichment using Python and PySpark.
- Build and maintain data models, data marts, and logical/physical data structures that support BI, analytics, and AI initiatives.
- Apply best practices in software engineering, version control (Git), code reviews, and agile development processes.
- Ensure data pipelines are well tested, monitored, and robust, with proper logging and alerting mechanisms.
- Optimize performance of distributed data processing workflows and large datasets.
- Leverage AWS services (such as S3, Glue, Lambda, EMR, Redshift, Athena) for data orchestration and lakehouse architecture design.
- Participate in data governance practices and ensure compliance with data privacy, security, and quality standards.
- Contribute to documentation of processes, workflows, metadata, and lineage using tools such as data catalogs or Collibra (if applicable).
- Drive continuous improvement in engineering practices, tools, and automation to increase productivity and delivery quality.

Essential Skills / Experience:
- 4 to 6 years of professional experience in data engineering or a related field.
- Strong programming experience with Python, including data wrangling, pipeline automation, and scripting.
- Deep expertise in writing complex, optimized SQL queries on large-scale datasets.
- Solid hands-on experience with PySpark and distributed data processing frameworks.
- Expertise in developing and orchestrating data pipelines with Databricks.
- Experience with AWS cloud services such as S3, Glue, EMR, Athena, Redshift, and Lambda.
- Practical understanding of ETL/ELT development patterns and data modeling principles (star/snowflake schemas).
- Experience with job orchestration tools like Airflow, Databricks Jobs, or AWS Step Functions.
- Understanding of data lake, lakehouse, and data warehouse architectures.
- Familiarity with DevOps and CI/CD tools for code deployment (e.g., Git, Jenkins, GitHub Actions).
- Strong troubleshooting and performance optimization skills in large-scale data processing environments.
- Excellent communication and collaboration skills, with the ability to work in cross-functional agile teams.

Desirable Skills / Experience:
- AWS or Databricks certifications (e.g., AWS Certified Data Analytics, Databricks Certified Data Engineer Associate/Professional).
- Exposure to data observability, monitoring, and alerting frameworks (e.g., Monte Carlo, Datadog, CloudWatch).
- Experience in healthcare, life sciences, finance, or another regulated industry.
- Familiarity with data governance and compliance standards (GDPR, HIPAA, etc.).
- Knowledge of modern data architectures (Data Mesh, Data Fabric).
- Exposure to streaming data tools like Kafka, Kinesis, or Spark Structured Streaming.
- Experience with data visualization tools such as Power BI, Tableau, or QuickSight.

Work Environment & Collaboration:
We value a hybrid, collaborative environment that encourages shared learning and innovation. You will work closely with product owners, architects, analysts, and data scientists across geographies to solve real-world business problems using cutting-edge technologies and methodologies. We encourage flexibility while maintaining a strong in-office presence for better team synergy and innovation.

About Agilisium:
Agilisium is an AWS Advanced Consulting Partner that enables companies to accelerate their "Data-to-Insights" leap. With $50+ million in annual revenue and over 30% year-over-year growth, Agilisium is one of the fastest-growing IT solution providers in Southern California. Our most important asset? People. Talent management plays a vital role in our business strategy. We’re looking for "drivers": big thinkers with a growth and strategic mindset, people who are committed to customer obsession and aren’t afraid to experiment with new ideas. We are all about finding and nurturing individuals who are ready to do great work. At Agilisium, you’ll collaborate with great minds while being challenged to meet and exceed your potential.
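The posting also lists streaming ingestion (Kafka, Kinesis, Spark Structured Streaming). A minimal Structured Streaming sketch follows; the broker address, topic, and sink paths are invented for illustration, and it assumes the spark-sql-kafka connector package is available on the cluster.

```python
# Minimal Spark Structured Streaming sketch: ingest Kafka events and append
# them to a Parquet sink with checkpointing. Requires the spark-sql-kafka
# connector on the classpath; broker, topic, and paths are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("events_stream").getOrCreate()

events = (
    spark.readStream.format("kafka")
         .option("kafka.bootstrap.servers", "broker.example.com:9092")
         .option("subscribe", "clickstream-events")
         .load()
         .select(F.col("value").cast("string").alias("payload"),
                 F.col("timestamp"))
)

query = (
    events.writeStream
          .format("parquet")
          .option("path", "s3://example-curated-zone/clickstream/")
          .option("checkpointLocation", "s3://example-checkpoints/clickstream/")
          .trigger(processingTime="1 minute")  # micro-batch cadence
          .start()
)
query.awaitTermination()
```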
Posted 1 week ago
5.0 years
0 Lacs
Telangana
On-site
Your Key Responsibilities:
Your responsibilities include, but are not limited to:
- Develop architectural solutions for data and integration platforms.
- Deliver architectural services and governance activities across projects.
- Apply architecture patterns to optimize platform utilization.
- Lead teams in architecture modeling and documentation (diagrams, blueprints).
- Ensure compliance with GxP, FDA, EMA, and other regulatory frameworks.
- Maintain and evolve architecture tools, principles, policies, and standards.
- Collaborate with internal and vendor teams to ensure technical governance.
- Support operational and cost efficiency through architectural best practices.

Key Performance Indicator: Consistent domain/platform architecture and documented patterns.

What You’ll Bring to the Role:
Essential Requirements:
- Bachelor’s degree in computer science, information systems, engineering, or a related field (master’s preferred).
- 5+ years of IT experience, with 2+ years in solution architecture.
- Experience in the pharmaceutical, biotech, or life sciences industries.
- Proficiency in architecture tools and modeling notations (ArchiMate, UML, BPMN, SAP LeanIX).
- Understanding of conceptual, logical, and physical data models.
- Knowledge of hyperscaler architectures for data lakes/warehouses.
- Experience with platforms like Databricks, Snowflake, and SAP Business Data Cloud.
- Integration experience with on-prem, SaaS, and business platforms (SAP, Salesforce, Veeva).
- Familiarity with data privacy, ethics, and regulatory frameworks (GDPR, HIPAA, GxP).
- Strong consulting and enterprise architecture skills.
- Fluent in English.

Desirable Requirements:
- Experience with master data management platforms (TIBCO EBX, SAP MDG).
- Knowledge of data masking/cataloging tools (Collibra, Informatica IDMC).

You’ll Receive:
- Flexible working arrangements, including hybrid options.
- Access to global learning and development programs.
- Opportunities to work on impactful, cross-functional projects.
- A collaborative and inclusive work culture that supports personal and professional growth.

Why Sandoz?
Generic and biosimilar medicines are the backbone of the global medicines industry. Sandoz, a leader in this sector, provided more than 900 million patient treatments across 100+ countries in 2024, and while we are proud of this achievement, we have an ambition to do more! With investments in new development capabilities, production sites, acquisitions, and partnerships, we have the opportunity to shape the future of Sandoz and help more patients gain access to low-cost, high-quality medicines, sustainably. Our momentum is powered by an open, collaborative culture driven by our talented and ambitious colleagues, who, in return for applying their skills, experience an agile and collegiate environment with impactful, flexible-hybrid careers, where diversity is welcomed and where personal growth is supported! Join us!
Posted 1 week ago
5.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Job Summary:
We are seeking an experienced Senior Data Governance Executive to join the Chief Data Office. The individual will work closely with the Data Governance Lead to help with the development, implementation, and maintenance of our data governance framework, policies, and procedures across the organization. The Senior Data Governance Executive will work closely with data owners, stewards, technology teams, and business stakeholders to establish and enforce data standards, policies, and best practices that align with industry best practices and regulatory requirements.

Key Responsibilities:
- Develop and Implement Data Governance Framework: Develop, maintain, or contribute to a data governance framework that includes data standards, data quality rules, metadata management, and data stewardship practices and guidelines. Act as a bridge between the enterprise Data Office and divisional/business data governance stakeholders. Partner with business functions to drive data maturity assessments and remediation plans.
- Data Quality and Integrity: Develop and implement data quality metrics and monitoring processes to ensure data accuracy, completeness, and consistency. Collaborate with data owners and stewards to identify and address data quality issues. Ensure that data governance practices align with regulatory requirements. Collaborate with the information security team to ensure that data security controls are in place and effective.
- Data Stewardship: Identify and appoint data stewards to oversee specific data assets and ensure that they understand their roles and responsibilities. Provide training and support to data stewards so that they can effectively manage and maintain data assets.
- Metadata and Lineage Management: Guide adoption and optimization of data governance tools like OpenMetadata. Implement processes for capturing metadata, data lineage, and usage across key data assets. Identify and document business data definitions (business glossary) and ownership, including classification of data.
- Reporting and Monitoring: Define and track KPIs for data governance performance and communicate progress to executive leadership.
- Collaboration and Communication: Collaborate with business leaders, data owners, and IT teams to ensure that data governance requirements are integrated into business processes and systems. Communicate data governance policies, procedures, and issues to stakeholders through various channels (e.g., training, newsletters, meetings).

Requirements:
- Bachelor’s degree in computer science, information technology, business administration, or a related field.
- Minimum 5 years of experience in data governance or data management, preferably in the financial services industry.
- Strong understanding of data governance principles, frameworks (e.g., DAMA-DMBOK, DCAM), and best practices.
- Hands-on knowledge of data cataloguing tools (e.g., Collibra, Informatica, Alation), data lineage, and data quality.
- Experience with data maturity assessments and familiarity with data modelling and metadata management.

Collaboration & Communication:
- Strong documentation and requirement-gathering skills to align technical governance with business objectives.
- Skilled in stakeholder management and articulating data governance value to non-technical audiences.
- Experience driving adoption of governance practices across enterprise functions.

Personality Traits & Leadership:
- Self-driven, with the ability to take ownership and lead governance initiatives independently.
- Process-oriented thinker with a structured approach to problem-solving.
- Proven ability to collaborate across business and technical teams, influence without authority, and drive enterprise-wide initiatives.
- Strong analytical and problem-solving skills.

Nice to Have:
- Certification in data governance, data management, or a related field (e.g., CDMP, DCAM).
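As a hedged illustration of the data quality metrics this role would define, here is a minimal pandas sketch that scores completeness and email validity for a hypothetical customer extract; the file name and columns are invented for the example.

```python
# Minimal data-quality metric sketch: completeness and validity scores for
# a hypothetical customer extract (file and column names are illustrative).
import pandas as pd

df = pd.read_csv("customers.csv")  # assumed governance-scoped extract

completeness = df.notna().mean()  # share of non-null values per column
valid_email = df["email"].str.contains(r"^[^@\s]+@[^@\s]+\.[^@\s]+$",
                                       regex=True, na=False).mean()

report = pd.DataFrame({
    "metric": ["completeness_avg", "email_validity"],
    "score": [completeness.mean(), valid_email],
})
print(report.to_string(index=False))  # feeds a governance KPI dashboard
```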
Posted 1 week ago
8.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Requirements:
- Bachelor’s degree in computer science, information technology, data management, business administration, or a related field.
- Minimum 8 years of hands-on experience in data governance or data management, preferably in the financial services industry.
- Strong understanding of data governance principles, frameworks (e.g., DAMA-DMBOK, DCAM), and best practices.
- Hands-on knowledge of data cataloguing tools (e.g., Collibra, Informatica, Alation), data lineage, and data quality.
- Experience with data maturity assessments and familiarity with data modelling and metadata management.

Collaboration & Communication:
- Strong documentation and requirement-gathering skills to align technical governance with business objectives.
- Skilled in stakeholder management and articulating data governance value to non-technical audiences.
- Experience driving adoption of governance practices across enterprise functions.

Personality Traits & Leadership:
- Self-driven, with the ability to take ownership and lead governance initiatives independently.
- Process-oriented thinker with a structured approach to problem-solving.
- Proven ability to collaborate across business and technical teams, influence without authority, and drive enterprise-wide initiatives.
- Strong analytical and problem-solving skills.

Nice to Have:
- Certification in data governance, data management, or a related field (e.g., CDMP, DCAM).
Posted 1 week ago
4.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Key Accountabilities
- Investigate, troubleshoot, and resolve data-related production issues.
- Provide timely reporting on data quality metrics and trends.
- Document and maintain support procedures for data quality processes.
- Collaborate with IT and business teams to implement data quality improvements.
- Ensure data validation and reconciliation processes are followed.
- Engage with stakeholders to establish procedures for data validation and quality metrics.
- Track data issues using incident tickets and ensure timely resolution, escalating issues for immediate attention if not resolved.
- Maintain and update production support dashboards (Microsoft Power BI) to ensure accuracy and meet monitoring requirements.
- Develop data quality health reports for stakeholders to monitor and observe data reliability across the platform.
- Create and maintain documentation, procedures, and best practices for data governance and related processes.
- Provide training to users on tools to promote awareness and adherence.
- Collaborate with data owners and data stewards to ensure data governance is implemented and followed.
- Work with vendors, as technical platform issues will require coordination and resolution.
- Deliver consistent, accurate, and high-quality work while communicating findings and insights clearly.

Experience / Qualifications
- At least 4 years of hands-on experience with a data quality tool (Collibra preferred), Databricks, and Microsoft Power BI.
- Strong technical skills in data and database management, with proficiency in data wrangling, analytics, and transformation using Python and SQL.
- Asset management experience is beneficial for understanding and recommending the required data quality rules and remediation plans to stakeholders.

Other Attributes
- Curious, analytical, and able to think critically to solve problems.
- Detail-oriented and comfortable dealing with complex structured and unstructured datasets.
- Customer-centric, striving to deliver value by effectively and proactively engaging stakeholders.
- Clear and effective communication skills, with an ability to communicate complex ideas and manage stakeholder expectations.
- Strong organisational and prioritisation skills; adaptable and able to work independently as required.
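A minimal sketch of the kind of validation-and-reconciliation check this support role runs daily, assuming two hypothetical extracts (a source system and the reporting platform); the file and column names are invented for illustration.

```python
# Minimal reconciliation sketch: compare record counts, summed quantities,
# and missing keys between a source extract and the reporting platform.
# File and column names are hypothetical.
import pandas as pd

source = pd.read_csv("source_positions.csv")
target = pd.read_csv("platform_positions.csv")

checks = {
    "row_count_match": len(source) == len(target),
    "total_qty_diff": float(source["quantity"].sum() - target["quantity"].sum()),
    "missing_ids": sorted(set(source["position_id"]) - set(target["position_id"])),
}

for name, result in checks.items():
    print(f"{name}: {result}")  # failures would be logged as incident tickets
```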
Posted 1 week ago
5.0 years
0 Lacs
Telangana, India
On-site
Job Description
Your Key Responsibilities:
Your responsibilities include, but are not limited to:
- Develop architectural solutions for data and integration platforms.
- Deliver architectural services and governance activities across projects.
- Apply architecture patterns to optimize platform utilization.
- Lead teams in architecture modeling and documentation (diagrams, blueprints).
- Ensure compliance with GxP, FDA, EMA, and other regulatory frameworks.
- Maintain and evolve architecture tools, principles, policies, and standards.
- Collaborate with internal and vendor teams to ensure technical governance.
- Support operational and cost efficiency through architectural best practices.

Key Performance Indicator: Consistent domain/platform architecture and documented patterns.

What You’ll Bring to the Role:
Essential Requirements:
- Bachelor’s degree in computer science, information systems, engineering, or a related field (master’s preferred).
- 5+ years of IT experience, with 2+ years in solution architecture.
- Experience in the pharmaceutical, biotech, or life sciences industries.
- Proficiency in architecture tools and modeling notations (ArchiMate, UML, BPMN, SAP LeanIX).
- Understanding of conceptual, logical, and physical data models.
- Knowledge of hyperscaler architectures for data lakes/warehouses.
- Experience with platforms like Databricks, Snowflake, and SAP Business Data Cloud.
- Integration experience with on-prem, SaaS, and business platforms (SAP, Salesforce, Veeva).
- Familiarity with data privacy, ethics, and regulatory frameworks (GDPR, HIPAA, GxP).
- Strong consulting and enterprise architecture skills.
- Fluent in English.

Desirable Requirements:
- Experience with master data management platforms (TIBCO EBX, SAP MDG).
- Knowledge of data masking/cataloging tools (Collibra, Informatica IDMC).

You’ll Receive:
- Flexible working arrangements, including hybrid options.
- Access to global learning and development programs.
- Opportunities to work on impactful, cross-functional projects.
- A collaborative and inclusive work culture that supports personal and professional growth.

Why Sandoz?
Generic and biosimilar medicines are the backbone of the global medicines industry. Sandoz, a leader in this sector, provided more than 900 million patient treatments across 100+ countries in 2024, and while we are proud of this achievement, we have an ambition to do more! With investments in new development capabilities, production sites, acquisitions, and partnerships, we have the opportunity to shape the future of Sandoz and help more patients gain access to low-cost, high-quality medicines, sustainably. Our momentum is powered by an open, collaborative culture driven by our talented and ambitious colleagues, who, in return for applying their skills, experience an agile and collegiate environment with impactful, flexible-hybrid careers, where diversity is welcomed and where personal growth is supported! Join us!
Posted 1 week ago
6.0 - 10.0 years
9 - 19 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
- Must have worked on a Collibra Data Governance implementation/support project, with good knowledge of Collibra.
- Understand the functional and technical requirements and map business and technical requirements to Collibra functionality.
- Identify key data attributes and define data quality rules around them in accordance with client requirements.
- Configure Collibra DGC based on the technical and functional requirements pertaining to the business glossary/data taxonomy, data ownership, and workflows.
- Fix SIT and UAT bugs.
- Establish integration with interfacing systems through the connectors available in Collibra Connect models.
- Should be able to create custom adapters to integrate Collibra with other products such as ServiceNow and HP ALM.
- Support the production deployment and warranty support phases.
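As a hedged illustration of the integration work listed above, here is a minimal Python sketch that pulls glossary assets from Collibra over its REST API. It assumes the REST 2.0 /assets endpoint with basic authentication; the host, credentials, and query parameters are placeholders, and the endpoint shape should be verified against the documentation for the Collibra version actually deployed.

```python
# Minimal sketch: list Collibra assets matching a name over the REST API.
# Assumes the REST 2.0 /assets endpoint and basic auth; verify endpoint
# and auth scheme against your Collibra version's documentation.
import requests

BASE_URL = "https://your-instance.collibra.com/rest/2.0"  # hypothetical host
session = requests.Session()
session.auth = ("svc_governance", "********")             # placeholder credentials

resp = session.get(f"{BASE_URL}/assets",
                   params={"name": "Customer", "nameMatchMode": "ANYWHERE"})
resp.raise_for_status()

for asset in resp.json().get("results", []):
    print(asset.get("id"), asset.get("name"))  # ids feed downstream adapters
```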
Posted 1 week ago
10.0 years
0 Lacs
Pune, Maharashtra, India
On-site
The Data Analytics Senior Lead Analyst is a strategic professional who closely follows the latest trends in the field and adapts them for application within the role and the business. Typically one of a small number of people within the business who provide this level of expertise. Excellent communication skills are required to negotiate internally, often at a senior level. Developed communication and diplomacy skills are required to guide, influence, and convince others, particularly colleagues in other areas and occasional external customers. Accountable for significant direct business results or authoritative advice regarding the operations of the business. The role carries a degree of responsibility for technical strategy and primarily affects a sub-function. Responsible for handling staff management issues, including resource management and allocation of work within the team/project.

Responsibilities:
- Adapt the latest trends in the data science/data mining/big data field for application within the role and the business.
- Serve as the subject matter expert for strategic data analysis; identify insights and implications, make strategic recommendations, and develop data displays that clearly communicate complex analysis.
- Demonstrate excellent communication skills to negotiate internally, often at a senior level, leveraging diplomacy to guide, influence, and convince internal and external stakeholders.
- Apply ingenuity and creativity to problem analysis and resolution in both complicated and novel contexts.
- Employ sophisticated analytical thought to compare and select among complex alternatives and to make refined judgements and sound decisions.
- Consult on business operations, offering authoritative advice to influence significant business outcomes.
- Inform and implement the sub-function’s technical strategy.
- Handle staff management issues, including resource management and allocation of work within the team/project.
- Appropriately assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients, and assets, by driving compliance with applicable laws, rules, and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct, and business practices, and escalating, managing, and reporting control issues with transparency.

Qualifications:
- 10+ years of relevant experience.
- Proficient with Hadoop, Python, Tableau, Collibra, data quality, and SAS tools, with a strong understanding of data governance.
- Demonstrated leadership.
- Consistently demonstrates clear and concise written and verbal communication.
- Proven stakeholder management skills, with the ability to hold their own in discussion with senior stakeholders.
- Proven interpersonal skills, with the ability to partner and influence across organizational lines.
- Proven ability to use complex analytical, interpretive, and problem-solving techniques.

Education:
- Bachelor’s/University degree or equivalent experience; a Master’s degree is potentially preferred.

This job description provides a high-level review of the types of work performed. Other job-related duties may be assigned as required.
Job Family Group: Technology
Job Family: Data Analytics
Time Type: Full time
Most Relevant Skills: Please see the requirements listed above.
Other Relevant Skills: For complementary skills, please see above and/or contact the recruiter.
Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View Citi's EEO Policy Statement and the Know Your Rights poster.
Posted 1 week ago
3.0 - 15.0 years
7 - 25 Lacs
hyderabad, chennai, bengaluru
Work from Office
Roles and Responsibilities: Design, develop, and maintain metadata frameworks for data governance across various business units. Collaborate with stakeholders to identify and define data quality requirements, ensuring compliance with industry standards. Develop and implement effective data governance policies, procedures, and best practices to ensure high-quality data management. Provide training and support to end users on metadata management tools and processes. Job Requirements: 3-15 years of experience in metadata management or a related field (data governance). Strong understanding of data quality principles, including data validation, cleansing, and profiling techniques. Experience developing metadata frameworks using industry-standard tools such as Informatica IDQ or similar technologies. Location: PAN India.
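A minimal illustration of the profiling techniques the posting names (validation, cleansing, profiling), assuming pandas; a production framework would persist these metrics against the metadata catalog rather than print them.

```python
import pandas as pd

def profile(df: pd.DataFrame) -> pd.DataFrame:
    """Basic column-level profile: completeness, cardinality, and numeric range."""
    rows = []
    for col in df.columns:
        s = df[col]
        rows.append({
            "column": col,
            "dtype": str(s.dtype),
            "null_pct": round(100 * s.isna().mean(), 2),
            "distinct": s.nunique(),
            "min": s.min() if pd.api.types.is_numeric_dtype(s) else None,
            "max": s.max() if pd.api.types.is_numeric_dtype(s) else None,
        })
    return pd.DataFrame(rows)

if __name__ == "__main__":
    df = pd.DataFrame({"id": [1, 2, None], "city": ["Pune", "Pune", "Chennai"]})
    print(profile(df))
```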
Posted 1 week ago
5.0 years
0 Lacs
chennai, tamil nadu, india
On-site
Position: Collibra Developer. Location: Chennai. Interview Mode: Virtual. Max CTC: 15 LPA. Experience: Total 5+ years | Relevant 4.5+ years. Notice Period: Immediate joiners. Job Description: Hands-on experience in configuring and customizing the Collibra platform to meet business and data governance requirements. Must have experience in enterprise-level implementation. Experience in the finance domain is preferred.
Posted 1 week ago
5.0 years
0 Lacs
gurugram, haryana, india
On-site
Primary Responsibilities: Collaborate with other departments to identify and resolve data discrepancies and ensure data accuracy. Operationalize master data maintenance and support. Maintain and manage customers/accounts and products/SKUs in SAP, Oracle, Microsiga, ePIM, eDAM, and SFDC. Maintain master data to support interfaces such as the Titan Platform, SFDC, etc. Maintain and understand region-specific data settings and processes. Analyze and maintain accurate data within the organization's systems, ensuring data integrity and compliance with policies and procedures. Create and maintain data quality rules. Train and mentor junior teammates on processes. Validate that master data create/change requests are complete and adhere to global and regional compliance standards. Support data audit teams (internal and external). Incorporate SAP, ePIM, eDAM, and SFDC data management into the central organization. Support business initiatives by rationalizing and standardizing master data. Communicate with internal stakeholders to resolve any data-related issues. Data Quality: Manage Data Quality exception report actions (a minimal sketch of such a report follows this posting). Play a pivotal role in driving change-management initiatives and engage in negotiations with key stakeholders to ensure the consistent maintenance of data quality. Perform data profiling. Support ad hoc data requests and assist with data analysis projects. Remediate compliance data quality issues in master data accounts found via DQ exception reports coming from the Data OpEx team or reported ad hoc by business users. Create and maintain standard-work job aids for GBE/regional master data standards. Develop and implement strategies to improve data quality and ensure timely and accurate data entry. Qualifications: Bachelor's degree in Computer Science, Information Management, Applied Mathematics, or a related field. 5+ years of experience in data management or a related field. Proficient in advanced Microsoft Excel. Experience working with SAP, SFDC, ePIM, and eDAM, especially around the master data domain. Knowledge of quote-to-cash processes. Excellent communication skills, with the ability to explain technical concepts to non-technical stakeholders. Detail-oriented, with strong organizational skills and the ability to work independently. Strong problem-solving skills, with the ability to think creatively and find solutions to complex data-related issues. Strong interpersonal skills and the ability to influence others. Demonstrated ability to collaborate with various people and organizations to develop win/win results. Ability to work in a globally dispersed team. We Value: Experience with data analysis tools (e.g., SQL). Experience with any of the CRM systems, e.g., SFDC. Understanding of the master data model and its business impact. Experience working with Tableau and/or Power BI as visualization tools. Experience working with Collibra for data definitions. Understanding of best-in-class model and data configuration and development processes. Excellent collaboration and negotiation skills. Conveys specific, observable, and/or measurable expectations for each assignment and verifies understanding and agreement on deliverables and timeframes. Consistently makes timely decisions, even in the face of complexity, balancing systematic analysis with decisiveness. About Us: Honeywell helps organizations solve the world's most complex challenges in automation, the future of aviation, and energy transition.
As a trusted partner, we provide actionable solutions and innovation through our Aerospace Technologies, Building Automation, Energy and Sustainability Solutions, and Industrial Automation business segments – powered by our Honeywell Forge software – that help make the world smarter, safer and more sustainable.
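For illustration only: a minimal sketch of a master-data exception report in Python/pandas. The field names and rules are hypothetical placeholders, not actual SAP/SFDC schemas or compliance standards.

```python
import pandas as pd

# Hypothetical master-data extract; real records would come from SAP/SFDC exports.
accounts = pd.DataFrame({
    "account_id": ["A1", "A2", "A3", "A3"],
    "country":    ["IN", None, "US", "US"],
    "tax_id":     ["T-1", "T-2", None, None],
})

def exception_report(df: pd.DataFrame) -> pd.DataFrame:
    """Flag records breaking simple compliance rules so stewards can remediate them."""
    issues = []
    for rule, mask in {
        "missing_country":      df["country"].isna(),
        "missing_tax_id":       df["tax_id"].isna(),
        "duplicate_account_id": df["account_id"].duplicated(keep=False),
    }.items():
        for _, row in df[mask].iterrows():
            issues.append({"account_id": row["account_id"], "rule": rule})
    return pd.DataFrame(issues)

print(exception_report(accounts))
```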
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
pune, maharashtra, india
On-site
Job Title: Data Catalogue Specialist. Experience: 3-7 years. Required Skills & Experience: Hands-on experience with the Collibra Data Intelligence Platform, including metadata ingestion, data lineage stitching, and workflow configuration and customization. Strong understanding of metadata management and data governance principles. Experience working with the following data sources/tools: Teradata (BTEQ, MLOAD), Tableau, QlikView, IBM DataStage, Informatica. Ability to interpret and map technical metadata from ETL tools, BI platforms, and databases into Collibra. Familiarity with data lineage concepts, including horizontal lineage across systems. Proficiency in SQL and scripting for metadata extraction and transformation. Excellent communication skills to collaborate with data stewards, engineers, and business stakeholders. Preferred Qualifications: Experience with Collibra APIs or connectors. Knowledge of data governance frameworks (e.g., DAMA DMBOK). Prior experience in a regulated industry (e.g., finance, healthcare).
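A hedged sketch of the kind of scripted metadata extraction mentioned above, assuming the Collibra REST API 2.0 /assets endpoint; the instance URL, credentials, and filter are all placeholders, and a real integration would consult the API documentation for the exact endpoint and pagination behaviour.

```python
import requests

# Placeholder connection details; substitute a real Collibra instance and credentials.
BASE_URL = "https://your-instance.collibra.com/rest/2.0"
AUTH = ("svc_catalog_user", "change-me")  # hypothetical service account

def fetch_assets(name_filter: str, limit: int = 100) -> list[dict]:
    """Pull a page of assets whose names match a filter, e.g. for lineage-stitching audits."""
    resp = requests.get(
        f"{BASE_URL}/assets",
        params={"name": name_filter, "nameMatchMode": "ANYWHERE", "limit": limit},
        auth=AUTH,
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json().get("results", [])

if __name__ == "__main__":
    for asset in fetch_assets("customer"):
        print(asset.get("id"), asset.get("name"))
```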
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
kolkata, west bengal
On-site
We are hiring for a Data Governance Analyst position at a leading bank, located in Kolkata, working from the office. To be considered for this role, you should have a minimum of 5+ years of experience in enterprise data governance, along with experience working with data warehouse technologies and data governance solutions such as Data Catalog, MDM, and Data Quality. Additionally, you must possess at least 3+ years of practical experience configuring business glossaries, dashboards, policies, search, and data maps. In this role, you are expected to have 3+ years of experience in data standardization: cleansing, transforming, and parsing data. You will be responsible for developing data standardization mapplets and mappings. A working knowledge of data governance tools such as Informatica and Collibra is advantageous. Certifications in DAMA, EDM Council, or IQINT would be beneficial, and knowledge of AI/ML and its application in data governance is a plus. If you are interested in this opportunity, please share your resume with bhumika@peoplemint.in.
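A minimal sketch of the standardization work described above (cleanse, transform, parse), written in plain Python/pandas rather than Informatica mapplets; the phone and name formats are illustrative assumptions.

```python
import pandas as pd

raw = pd.DataFrame({"phone": [" +91-98765 43210 ", "09876543210", None],
                    "name":  ["DOE, JOHN", "jane doe", "  A. Kumar "]})

def standardize(df: pd.DataFrame) -> pd.DataFrame:
    out = df.copy()
    # Cleanse: keep digits only, drop leading zeros, retain the last 10 digits.
    out["phone"] = (out["phone"].str.replace(r"\D", "", regex=True)
                                .str.lstrip("0").str[-10:])
    # Parse + transform: split "LAST, FIRST" names and title-case the parts.
    parts = out["name"].str.strip().str.split(",", n=1, expand=True)
    out["last_name"] = parts[0].str.strip().str.title()
    out["first_name"] = parts[1].str.strip().str.title() if 1 in parts else None
    return out

print(standardize(raw))
```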
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
pune, maharashtra
On-site
We are seeking an Ataccama Admin to join our team and help manage and maintain our Ataccama data quality and data governance platform. Your responsibilities will include the installation, configuration, and maintenance of Ataccama on AWS/Azure, and you will play a key role in developing and executing data quality rules and policies. A strong grasp of Ataccama architecture and best practices is essential, along with prior experience in data management and data governance. Familiarity with Collibra and Immuta administration would be advantageous, and proficiency in overseeing VMs, as well as both Windows and Linux systems, is required. Previous experience in the pharmaceutical sector is preferred. In this role, your duties will encompass developing and enforcing data quality rules and policies, monitoring and reporting on data quality metrics, troubleshooting and resolving Ataccama-related issues, and staying informed on the latest Ataccama features and best practices. Collaborating with cross-functional teams to implement data governance policies and procedures will be a key aspect of your responsibilities. You will also manage and maintain VMs and Windows- and Linux-based systems, oversee redundancy, backup, and recovery plans and processes, and demonstrate robust knowledge of AWS/Azure. The ideal candidate has a minimum of 4 years of experience working with Ataccama, along with a background in data management and data governance. Prior experience administering Collibra and Immuta would be advantageous. Proficiency in managing VMs, Windows, and Linux systems, as well as experience in performance tuning, is essential. Strong analytical and problem-solving skills, excellent communication, and the ability to work effectively in a team are all qualities we are looking for. A crucial requirement for this role is proficiency in Ataccama ONE administration, demonstrating your ability to manage the Ataccama environment effectively.
Posted 1 week ago
4.0 - 9.0 years
10 - 17 Lacs
bangalore rural, bengaluru
Work from Office
Job Summary: We are seeking a skilled Data Governance Specialist with a strong focus on Data Quality to lead and support enterprise-wide data governance initiatives. The ideal candidate will have hands-on experience with leading DG tools, a solid understanding of industry frameworks, and expertise in managing master data and ERP systems. Roles & Responsibilities: Implement and manage Data Governance and Data Quality solutions using tools such as Collibra, Alation, Informatica, or Microsoft Purview. Apply industry-standard frameworks like DCAM and DAMA DMBOK to design and enhance governance processes. Drive data quality initiatives including matching, merging, and creation of golden records for master data entities. Collaborate with cross-functional teams to integrate DG practices into ERP systems and ETL pipelines. Maintain metadata repositories and ensure data lineage and stewardship practices are followed. Required Skills & Qualifications: Proficiency in at least three DG tools: Collibra, Alation, Informatica, Purview. Strong understanding of Data Governance frameworks (DCAM, DAMA DMBOK). Certifications such as CDMP, DCAM, Collibra Ranger, or IDGC are highly desirable. Hands-on experience in MDM, ETL, and DG implementations. Familiarity with ERP systems and enterprise data architecture.
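A toy sketch of the matching/merging step behind golden records, assuming simple exact-key matching on a normalised email and a most-recently-updated survivorship rule; production MDM tools add fuzzy matching and configurable survivorship.

```python
import pandas as pd

# Hypothetical duplicate customer records from two source systems.
records = pd.DataFrame({
    "source":  ["CRM", "ERP", "CRM"],
    "email":   ["John.Doe@x.com ", "john.doe@x.com", "jane@y.org"],
    "name":    ["John Doe", "J. Doe", "Jane Roe"],
    "updated": pd.to_datetime(["2024-01-01", "2024-06-01", "2024-03-01"]),
})

def golden_records(df: pd.DataFrame) -> pd.DataFrame:
    """Match on a normalised key, then let the newest record's attributes survive."""
    df = df.assign(match_key=df["email"].str.strip().str.lower())
    return (df.sort_values("updated")
              .groupby("match_key", as_index=False)
              .last())

print(golden_records(records))
```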
Posted 1 week ago
4.0 - 5.0 years
5 - 6 Lacs
gurgaon
On-site
About Our Team: Our global team supports education products: electronic health records that introduce students to digital charting and prepare them to document care in today's modern clinical environment. We have a very stable product that we have worked hard to reach and strive to maintain. Our team values trust, respect, collaboration, agility, and quality. The Consumption Domain is a newly established domain, offering an exciting opportunity to play a crucial role in structuring and shaping its foundation. Our team is responsible for ensuring seamless data processing, validation, and operational efficiency, while continuously improving workflow optimization and incident management. We work closely with various stakeholders to drive accuracy, speed, and reliability in delivering high-quality data. With a problem-solving mindset and a data-driven approach, we aim to build scalable solutions that enhance business processes and improve overall user experience. About the Role: Elsevier is looking for a Senior Analyst to join the Consumption Domain team, where you will play a crucial role in analyzing and interpreting user engagement and content consumption trends. The ideal candidate will possess strong technical expertise in data analytics, Databricks, ETL processes, and cloud storage, coupled with a passion for using data to drive meaningful business decisions. Responsibilities: Analyze and interpret large datasets to provide actionable business insights. Leverage Databricks, working with RDDs, DataFrames, and Datasets to optimize data workflows (a brief PySpark sketch follows this posting). Design and implement ETL processes, job automation, and data optimization strategies. Work with structured, unstructured, and semi-structured data types, including JSON, XML, and RDF. Manage various file formats (Parquet, Delta files, ZIP files) and handle data storage within DBFS, FileStore, and cloud storage solutions (AWS S3, Google Cloud Storage, etc.). Write efficient SQL, Python, or Scala scripts to extract and manipulate data. Develop insightful dashboards using Tableau or Power BI to visualize key trends and performance metrics. Collaborate with cross-functional teams to drive data-backed decision-making. Maintain best practices in data governance, utilizing platforms such as Snowflake and Collibra. Participate in Agile development methodologies, using tools like Jira and Confluence. Ensure proper version control using GitHub. Requirements: Bachelor's degree in Computer Science, Data Science, or Statistics. Minimum of 4-5 years of experience in data analytics, preferably in a publishing environment. Proven expertise in Databricks, including knowledge of RDDs, DataFrames, and Datasets. Strong understanding of ETL processes, data optimization, job automation, and Delta Lake. Proficiency in handling structured, unstructured, and semi-structured data. Experience with various file types, including Parquet, Delta, and ZIP files. Familiarity with DBFS, FileStore, and cloud storage solutions such as AWS S3 and Google Cloud Storage. Strong programming skills in SQL, Python, or Scala. Experience creating dashboards using Tableau or Power BI is a plus. Knowledge of Snowflake, Collibra, JSON-LD, SHACL, and SPARQL is an advantage. Familiarity with Agile development methodologies, including Jira and Confluence. Experience with GitLab for version control is beneficial. Skills and Competencies: Ability to handle large datasets efficiently. Strong analytical and problem-solving skills. Passion for data-driven decision-making and solving business challenges.
Eagerness to learn new technologies and continuously improve processes. Effective communication and data storytelling abilities. Experience collaborating with cross-functional teams. Project management experience in software systems is a plus. Work in a Way That Works for You: We promote a healthy work/life balance across the organization and offer an appealing working prospect for our people. With numerous wellbeing initiatives, shared parental leave, study assistance, and sabbaticals, we will help you meet your immediate responsibilities and your long-term goals. Working for You: We understand that your well-being and happiness are essential to a successful career. Here are some benefits we offer: Comprehensive Health Insurance: covers you, your immediate family, and parents. Enhanced Health Insurance Options: competitive rates negotiated by the company. Group Life Insurance: ensuring financial security for your loved ones. Group Accident Insurance: extra protection for accidental death and permanent disablement. Flexible Working Arrangement: achieve a harmonious work-life balance. Employee Assistance Program: access support for personal and work-related challenges. Medical Screening: your well-being is a top priority. Modern Family Benefits: maternity, paternity, and adoption support. Long-Service Awards: recognizing dedication and commitment. New Baby Gift: celebrating the joy of parenthood. Subsidized Meals in Chennai: enjoy delicious meals at discounted rates. Various Paid Time Off: take time off with Casual Leave, Sick Leave, Privilege Leave, Compassionate Leave, Special Sick Leave, and Gazetted Public Holidays. Free Transport: pick-up and drop between home and office (applies in Chennai). About the Business: A global leader in information and analytics, we help researchers and healthcare professionals advance science and improve health outcomes for the benefit of society. Building on our publishing heritage, we combine quality information and vast data sets with analytics to support visionary science and research, health education and interactive learning, as well as exceptional healthcare and clinical practice. What you do every day will help advance science and healthcare to advance human progress. We are committed to providing a fair and accessible hiring process. If you have a disability or other need that requires accommodation or adjustment, please let us know by completing our Applicant Request Support Form or contact 1-855-833-5120. Criminals may pose as recruiters asking for money or personal information. We never request money or banking details from job applicants. Learn more about spotting and avoiding scams here. Please read our Candidate Privacy Policy. We are an equal opportunity employer: qualified applicants are considered for and treated during employment without regard to race, color, creed, religion, sex, national origin, citizenship status, disability status, protected veteran status, age, marital status, sexual orientation, gender identity, genetic information, or any other characteristic protected by law. USA Job Seekers: EEO Know Your Rights.
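The PySpark sketch referenced in the responsibilities above: a compact illustration of the Parquet-in, DataFrame-transform, Delta-out workflow. Paths, column names, and the app name are placeholders, and writing Delta assumes a Databricks or delta-enabled Spark runtime.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("consumption-etl").getOrCreate()

# Placeholder DBFS paths; real jobs would read from mounted S3/GCS locations.
events = spark.read.parquet("dbfs:/FileStore/raw/consumption_events/")

# Aggregate raw events into daily per-content consumption metrics.
daily = (events
         .withColumn("event_date", F.to_date("event_ts"))
         .groupBy("event_date", "content_id")
         .agg(F.countDistinct("user_id").alias("unique_users"),
              F.count("*").alias("views")))

# Delta output assumes a delta-enabled runtime (e.g., Databricks).
(daily.write
      .format("delta")
      .mode("overwrite")
      .partitionBy("event_date")
      .save("dbfs:/FileStore/curated/daily_consumption/"))
```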
Posted 1 week ago
0 years
0 Lacs
india
On-site
Job Purpose: The Data Governance & Quality (DGQ) Engineer plays a pivotal role in ensuring the accuracy, integrity, security, and availability of data across the organization. This role is responsible for automating and scaling data quality checks, integrating metadata and lineage systems, enforcing governance policies through code, and driving engineering solutions to improve observability, compliance, and trust in data assets. The engineer works closely with data platform, security, analytics, and business teams to ensure governance and quality principles are built into data pipelines, cloud infrastructure, and tooling ecosystems. Your Impact: What You'll Do. Implement & Integrate: Design and develop automated data quality frameworks and validation rules across ETL/ELT pipelines. Build and integrate metadata ingestion, lineage tracking, and data classification tools into data platforms. Leverage modern tools like dbt, Great Expectations, Apache Airflow, or Dagster. Maintain robust system and infrastructure documentation for governance components. Security & Strategies: Enforce RBAC and ABAC through automated IAM configurations and policies. Assist in implementing data masking, tokenization, and encryption strategies. Collaborate on cloud-native security practices, including audit logging and permission audits. Quality & Reliability: Perform statistical profiling, anomaly detection, and automated outlier identification. Build observability dashboards for data quality metrics, lineage, and freshness. Implement and monitor SLAs/SLOs for data quality and reliability. Report & Document: Create and maintain governance policies, data dictionaries, and SLA documents. Generate data quality and compliance reports. Provide engineering input on audits and issue resolutions. Improve & Iterate: Evaluate and adopt modern governance platforms. Contribute to internal governance tooling libraries. Champion DevOps and DataOps best practices. Support initiatives led by the Head of Data Management & Engineering. What We're Looking For: Skills & Experience: Bachelor's degree in Computer Science, Data Engineering, Information Systems, or a related field. Strong hands-on experience with SQL, Python, and ETL tools (e.g., dbt, Airflow). Familiarity with data governance frameworks and tools like Collibra, Alation, or Apache Atlas. Deep understanding of data warehousing, metadata management, and cloud-native services. Knowledge of security and compliance protocols (GDPR, HIPAA, SOC2). Excellent analytical and documentation skills. Strong communication and collaboration abilities.
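A minimal sketch of the freshness/SLO monitoring named above, in plain Python; the table names and the 24-hour SLO are assumptions, and in practice a tool like Great Expectations or an Airflow sensor would formalise the check against warehouse metadata.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical "last loaded" timestamps per table; real values would come
# from warehouse metadata (e.g., information_schema or pipeline run logs).
LAST_LOADED = {
    "curated.daily_consumption": datetime(2024, 6, 1, 3, 0, tzinfo=timezone.utc),
    "curated.customer_golden":   datetime(2024, 5, 29, 3, 0, tzinfo=timezone.utc),
}

FRESHNESS_SLO = timedelta(hours=24)  # assumed SLO, not taken from the posting

def check_freshness(now: datetime) -> list[str]:
    """Return SLO breaches, suitable for alerting or an observability dashboard."""
    breaches = []
    for table, loaded_at in LAST_LOADED.items():
        lag = now - loaded_at
        if lag > FRESHNESS_SLO:
            breaches.append(f"{table}: stale by {lag - FRESHNESS_SLO}")
    return breaches

if __name__ == "__main__":
    for breach in check_freshness(datetime.now(timezone.utc)):
        print("SLA breach:", breach)
```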
Posted 1 week ago
3.0 - 8.0 years
25 - 35 Lacs
pune, gurugram, bengaluru
Hybrid
Job Qualifications: Data Catalog Specialist. Required Skills & Experience: Hands-on experience with the Collibra Data Intelligence Platform, including metadata ingestion, data lineage stitching, and workflow configuration and customization. Strong understanding of metadata management and data governance principles. Experience working with the following data sources/tools: Teradata (BTEQ, MLOAD), Tableau, QlikView, IBM DataStage, Informatica. Ability to interpret and map technical metadata from ETL tools, BI platforms, and databases into Collibra. Familiarity with data lineage concepts, including horizontal lineage across systems. Proficiency in SQL and scripting for metadata extraction and transformation. Excellent communication skills to collaborate with data stewards, engineers, and business stakeholders. Preferred Qualifications: Experience with Collibra APIs or connectors. Knowledge of data governance frameworks (e.g., DAMA DMBOK). Prior experience in a regulated industry (e.g., finance, healthcare).
Posted 1 week ago