
4985 Data Governance Jobs - Page 28

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

5.0 - 9.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As an ETL Automation Test Engineer with expertise in Python development, ETL testing, and CI/CD implementation, you will play a crucial role in automating manual ETL processes using Python scripting. Your primary focus will be on designing, developing, and executing ETL test cases and scripts to validate data pipelines. You will perform data validation, reconciliation, and integrity checks across source and target systems, implement Python-based test automation frameworks for ETL workflows, and integrate automated ETL tests into CI/CD pipelines for seamless deployments. You will collaborate closely with Data Engineers and DevOps teams to optimize ETL automation, debug and troubleshoot issues, and enhance performance, accuracy, and scalability.

To excel in this role, you must have a minimum of 5 years of hands-on experience in Python development and scripting. A strong background in ETL testing, data validation, and data transformation testing, along with proficiency in SQL for data validation, is essential. Hands-on experience with CI/CD tools such as Jenkins, GitHub Actions, GitLab CI, and Azure DevOps is required. Familiarity with ETL tools like Informatica, Talend, AWS Glue, or dbt, and experience with data warehouses (e.g., Snowflake, Redshift, BigQuery), is highly desirable. A solid understanding of data pipelines, data governance, and data quality best practices will be key to delivering successful ETL automation projects.

Preferred qualifications include experience with cloud platforms (AWS, Azure, GCP) for ETL automation, familiarity with Apache Airflow, Kafka, or similar orchestration tools, and exposure to containerization technologies like Docker and Kubernetes. By joining our team, you will work on cutting-edge ETL automation projects for a top-tier client, with a competitive salary, ample career growth opportunities, and a remote-friendly, flexible work environment.
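To give a concrete flavor of the automation this listing describes, here is a minimal sketch of a pytest-based ETL reconciliation test. The table, sample data, and use of in-memory SQLite are illustrative assumptions, not details from the posting; a real suite would point at the actual source and target systems.

```python
# Minimal ETL reconciliation test sketch: compare row counts and a sum
# "checksum" between a source and a target table after an ETL run.
import sqlite3
import pytest

@pytest.fixture
def connections():
    src = sqlite3.connect(":memory:")
    tgt = sqlite3.connect(":memory:")
    for conn in (src, tgt):
        conn.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
    rows = [(1, 10.0), (2, 25.5), (3, 7.25)]
    src.executemany("INSERT INTO orders VALUES (?, ?)", rows)
    tgt.executemany("INSERT INTO orders VALUES (?, ?)", rows)  # pretend ETL already ran
    yield src, tgt
    src.close()
    tgt.close()

def test_row_counts_match(connections):
    src, tgt = connections
    src_count = src.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
    tgt_count = tgt.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
    assert src_count == tgt_count

def test_amount_totals_reconcile(connections):
    # Checksum-style reconciliation: aggregate a key measure on both sides.
    src, tgt = connections
    src_sum = src.execute("SELECT ROUND(SUM(amount), 2) FROM orders").fetchone()[0]
    tgt_sum = tgt.execute("SELECT ROUND(SUM(amount), 2) FROM orders").fetchone()[0]
    assert src_sum == tgt_sum
```

Tests like these are the kind that can be wired into a Jenkins or GitHub Actions stage so every pipeline change is validated before deployment.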

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

As a Data Engineer at Maersk, you will play a crucial role in leading the design and implementation of SAP BI universe solutions. Your responsibilities will include coordinating architectural decisions, developing best practices for universe development, optimizing universe performance, mentoring junior developers, and troubleshooting complex issues. You will collaborate with data architects to ensure alignment with the enterprise data strategy, contribute across solution delivery, and identify and implement internal process improvements to enhance efficiency.

To excel in this role, you should hold a Bachelor's degree in Computer Science or a related field, along with 5-8 years of experience in SAP BI universe development. Advanced knowledge of the SAP BusinessObjects Enterprise platform, proficiency in SQL and database optimization, and familiarity with data warehousing concepts are essential, as are strong communication skills, an understanding of the software life cycle, and exceptional data analysis abilities.

Preferred skills include additional SAP BI/BusinessObjects certifications, experience with SAP HANA optimization, knowledge of modern data architecture patterns, and familiarity with cloud data platforms. Understanding of data governance, agile development methodologies, and experience enabling connectivity at different levels are also desirable. Valuable technical competencies include the SAP BusinessObjects Enterprise platform, Universe Design Tool (UDT), Information Design Tool (IDT), SQL, data modeling, performance optimization, security implementation, version control systems, and database/data warehousing concepts.

Join Maersk, a global integrator of container logistics, and be part of a dynamic environment where you can contribute to industry-leading solutions and work with a diverse team across the globe. This opportunity offers continuous professional and personal development, a challenging yet rewarding atmosphere, and the chance to grow your skills and network within a respected organization. If you require any adjustments during the application process or need special assistance, please contact us at accommodationrequests@maersk.com.

Posted 1 week ago

Apply

10.0 - 14.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Data Quality Engineering Lead (Global Markets) in the Markets Division of a leading global bank, you will be responsible for the accuracy, consistency, and reliability of data across systems used in trading, risk management, regulatory reporting, and financial analytics worldwide. Your expertise in capital markets, trading systems, risk analytics, and regulatory compliance will be crucial in establishing enterprise-wide data quality frameworks that enhance trading operations, support risk mitigation, and ensure compliance with regional and global financial regulations.

Working with cross-functional teams across APAC, EMEA, and NAM, you will develop and enforce global data quality frameworks for front-office, middle-office, and back-office systems supporting multi-asset-class trading. You will ensure data integrity across global trading platforms, pricing models, risk engines, and regulatory compliance systems, and implement automated data validation for real-time market data, trade transactions, and post-trade analytics. Collaboration with trading desks, quants, risk, and compliance teams across time zones will be essential in resolving data anomalies and embedding data quality checks in CI/CD pipelines for front-office trading applications. You will also define and monitor key data quality metrics across global financial systems, establish data lineage and governance frameworks for regulatory reporting, and ensure compliance with global and regional regulatory frameworks such as Basel III, MiFID II, and the Dodd-Frank Act. The role further involves leading initiatives on AI/ML-based data anomaly detection and providing thought leadership in data observability, automated testing, and cloud-based data quality solutions.

With over 10 years of experience in Data Quality Engineering, Data Governance, or Data Testing within a global banking or capital markets environment, you will bring strong domain knowledge of global markets trading infrastructure, multi-asset classes, and financial risk modeling, along with technical expertise in data quality tools, SQL, Python, big data platforms, and cloud services. In this position, you will work at global scale, collaborate with teams across regions, lead data quality initiatives that drive regulatory compliance and market efficiency, engage with cutting-edge trading technology, and shape the future of data governance in a high-frequency trading and real-time analytics environment. Join us to make a significant impact on the bank's ability to manage financial risk, optimize trading strategies, and meet regulatory requirements in the dynamic world of global capital markets.
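As an illustration of the rule-based validation work this role describes, here is a minimal sketch of data quality checks over trade records. The fields, thresholds, and rules are invented for demonstration and are not taken from the listing.

```python
# Sketch of rule-based data quality checks on trade records. Field names
# and rules are illustrative assumptions only.
from dataclasses import dataclass
from datetime import date

@dataclass
class Trade:
    trade_id: str
    isin: str
    quantity: int
    price: float
    trade_date: date

def validate(trade: Trade) -> list[str]:
    """Return a list of data quality violations for one trade record."""
    issues = []
    if not trade.trade_id:
        issues.append("missing trade_id")
    if len(trade.isin) != 12:
        issues.append(f"ISIN '{trade.isin}' is not 12 characters")
    if trade.quantity <= 0:
        issues.append("non-positive quantity")
    if trade.price <= 0:
        issues.append("non-positive price")
    if trade.trade_date > date.today():
        issues.append("trade_date in the future")
    return issues

if __name__ == "__main__":
    good = Trade("T-1001", "US0378331005", 250, 189.95, date(2024, 5, 3))
    bad = Trade("", "XX123", -5, 0.0, date(2999, 1, 1))
    for t in (good, bad):
        print(t.trade_id or "<no id>", "->", validate(t) or "OK")
```

In a trading context, pass rates from checks like these would feed the "key data quality metrics" the posting mentions, and the checks themselves would run inside CI/CD and streaming validation layers.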

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

You are invited to apply for the position of Principal Consultant - Data Engineer, Reporting (Insurance Domain) at Genpact. As a motivated Data Engineer, you will build and maintain data pipelines, transform complex insurance data, and deliver high-quality datasets for analytics and reporting. The role requires technical expertise, a deep understanding of insurance processes such as underwriting, claims, and premiums, and hands-on experience with data visualization tools like Power BI.

Key Responsibilities:
- Data Integration & Transformation: Design and develop ETL/ELT pipelines for processing insurance data, ensuring data integrity, accuracy, and timeliness for reporting.
- Cloud & Infrastructure Management: Utilize Azure services to ingest and process large datasets, implementing secure and scalable data solutions aligned with cloud best practices.
- Data Modeling for Reporting: Create optimized data models tailored for Power BI and other reporting platforms, collaborating with stakeholders to define key metrics and KPIs relevant to the insurance domain.
- Collaboration with Business Teams: Partner with business analysts, actuaries, and data architects to translate reporting requirements into technical deliverables, ensuring data pipelines align with business needs.
- Data Governance & Quality Assurance: Implement data validation checks, support data security initiatives, and comply with insurance regulations and standards.
- Visualization & Reporting Support: Provide clean datasets for reporting tools like Power BI, and create sample dashboards and reports to validate data accuracy and usability.
- Performance Optimization: Optimize data pipelines for performance and cost-effectiveness in cloud environments, regularly reviewing infrastructure for scaling reporting solutions.

Minimum Qualifications: Bachelor's degree in Computer Science, Mathematics, Data Science, or a related field; proficiency in Azure services, ETL/ELT pipeline development, SQL, Python, and data visualization tools like Power BI; a strong understanding of insurance processes; and excellent communication and collaboration skills.

Preferred Qualifications: Certifications in Azure Data Engineering, Power BI, or equivalent; experience in reporting and analytics in the insurance domain; and familiarity with Agile methodologies and CI/CD pipelines using Azure Repos or GitHub.

If you possess the required qualifications and skills and are passionate about data engineering in the insurance domain, we encourage you to apply for this exciting opportunity at Genpact.
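As a small illustration of the "data validation checks" responsibility above, here is a hedged sketch over a toy insurance claims dataset. The column names, statuses, and rules are assumptions for demonstration only.

```python
# Minimal data validation sketch for insurance claims before they reach
# a Power BI dataset. Columns and thresholds are illustrative assumptions.
import pandas as pd

claims = pd.DataFrame({
    "claim_id":     ["C1", "C2", "C3", "C4"],
    "policy_id":    ["P10", "P11", None, "P12"],
    "claim_amount": [1200.0, -50.0, 800.0, 3400.0],
    "status":       ["OPEN", "PAID", "PAID", "UNKNOWN"],
})

def run_checks(df: pd.DataFrame) -> pd.DataFrame:
    """Flag rows that violate basic integrity rules."""
    checks = pd.DataFrame(index=df.index)
    checks["missing_policy"] = df["policy_id"].isna()
    checks["negative_amount"] = df["claim_amount"] < 0
    checks["bad_status"] = ~df["status"].isin(["OPEN", "PAID", "REJECTED"])
    out = df.copy()
    out["is_valid"] = ~checks.any(axis=1)
    return out

print(run_checks(claims)[["claim_id", "is_valid"]])
```

Only rows flagged `is_valid` would flow to the reporting layer; the rest would be routed to an exception queue for remediation.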

Posted 1 week ago

Apply

5.0 - 10.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

As a Data Analytics professional with over 5 years of experience, you will design and implement healthcare data architectures, working with structured and unstructured data sources including public healthcare datasets such as CMS Medicare and Medicaid claims data, commercial claims, and electronic health records (EHRs).

You must hold a Bachelor's degree in computer science, information systems, data engineering, or a related quantitative discipline (a Master's degree is preferred), and bring extensive technical expertise in cloud platforms such as AWS, Azure, and GCP and big data technologies like Databricks, Snowflake, and Hadoop. Proficiency in SQL, Python, and ETL pipeline development is essential for seamless data ingestion, transformation, and integration, along with a deep understanding of data modeling, warehousing, and governance best practices. Experience implementing data security and compliance frameworks, including HIPAA, HITRUST, and other healthcare regulations, is crucial, as is the ability to optimize large-scale data processing and analytics workflows for performance and reliability.

In terms of professional experience, a minimum of 10 years in data architecture, data engineering, or a related field is required, preferably within the healthcare or consulting industry, with a proven track record of designing and implementing enterprise-scale data solutions that support analytics and business intelligence. Strong project management skills are essential, as you will lead cross-functional teams and oversee complex data initiatives.
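One common step in HIPAA-aware healthcare pipelines like those described here is pseudonymizing patient identifiers before data lands in an analytics layer. The salted-hash approach and field names below are illustrative assumptions, not a complete de-identification strategy or anything specified by the listing.

```python
# Sketch: derive stable, non-reversible surrogate keys for patient IDs.
# The salt handling here is simplified; use a managed secret store in practice.
import hashlib
import os

SALT = os.environ.get("PSEUDONYM_SALT", "dev-only-salt")  # placeholder salt source

def pseudonymize(patient_id: str) -> str:
    """Return a deterministic surrogate key for a patient identifier."""
    digest = hashlib.sha256((SALT + patient_id).encode("utf-8")).hexdigest()
    return f"PT-{digest[:16]}"

record = {"patient_id": "123-45-6789", "dx_code": "E11.9"}
record["patient_id"] = pseudonymize(record["patient_id"])
print(record)
```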

Posted 1 week ago

Apply

8.0 - 12.0 years

0 Lacs

Maharashtra

On-site

Whether you're at the start of your career or looking to discover your next adventure, your story begins here. At Citi, you'll have the opportunity to expand your skills and make a difference at one of the world's most global banks. We're fully committed to supporting your growth and development from the start, with extensive on-the-job training and exposure to senior leaders as well as more traditional learning. You'll also have the chance to give back and make a positive impact where we live and work through volunteerism.

Citi Finance is responsible for the firm's financial management and related controls. We manage and partner on key Citi initiatives and deliverables, such as our quarterly earnings process, and ensure Citi's compliance with financial rules and regulations. The team comprises chief financial officers who partner with each of our businesses, and disciplines including controllers, financial planning and analysis, strategy, investor relations, tax, and treasury.

We're currently looking for a high-caliber professional to join our team as Assistant Vice President, Strategic Ledger Financial Solutions - C12 - Hybrid (internal job title: Assistant Vice President, Strategic Ledger Financial Solutions - C12), based in India. Being part of our team means that we'll provide you with the resources to meet your unique needs, empower you to make healthy decisions, and manage your financial well-being to help plan for your future.

In this role, you're expected to:
- Manage the requirement gathering, design, and testing activities of the future Financial Reporting solution.
- Support the definition and execution of the FB&R Reporting and Analytics strategy.
- Understand the existing ledger and financial reporting platforms, support the identification of key use cases for integration into the Strategic Financial Reporting platform, and advise on the roadmap to decommission legacy capabilities and systems.
- Partner with the Finance Data Initiative to ensure alignment of FB&R Ledger and Data transformation priorities.
- Partner with the General Ledger workstream to ensure alignment of Ledger and Chart of Accounts design and enablement of reconciliations.
- Partner with the Solution Architecture workstream to ensure alignment with Citi's technology architecture design principles.
- Partner with other horizontal workstreams, including Operating Model, Process, and Controls, to ensure consistency of reporting design.
- Build strong relationships with Program Office Leaders, Program Sponsors, and Technology Leaders, and provide input to defining the future-state operating model.

As a successful candidate, you'd ideally have the following skills and exposure:
- Bachelor's degree in Finance, Accounting, Business, Management, or a related field; CPA or equivalent will be a differentiator.
- 8+ years of relevant work experience with global corporations, with financial services experience gained in banking or in an associated consulting role.
- Proven track record in finance organizational and technology transformations enabled by SaaS cloud technologies (Oracle, SAP, Workday, Axiom, Workiva).
- Significant experience implementing complex reporting solutions to support group-wide and local financial, regulatory, and management reporting processes.
- Understanding of General Ledger and Chart of Accounts design concepts, data governance concepts, ICOFR and SOX, and US GAAP, IFRS, and local GAAP standards.
- Proactive problem-solver; highly motivated, detail-oriented, team-oriented, and organized, with a good understanding of project and program management principles, methods, and techniques.

Working at Citi is far more than just a job. A career with us means joining a family of more than 230,000 dedicated people from around the globe. At Citi, you'll have the opportunity to grow your career, give back to your community, and make a real impact. Take the next step in your career: apply for this role at Citi today. This job description provides a high-level review of the types of work performed; other job-related duties may be assigned as required.

Posted 1 week ago

Apply

12.0 - 16.0 years

0 Lacs

Kerala

On-site

At EY, you'll have the opportunity to shape and develop the data analytics and visualization function, making business intelligence a key driver of objective decision-making across functions. With over 12 years of experience in information analysis and various analytics streams, you will guide technical BI teams, develop data warehouse solutions, and manage multi-disciplinary analytics teams. Your expertise will be crucial in understanding and articulating business intelligence problems, leading multiple projects independently, and collaborating with stakeholders to translate business requirements into data solutions.

In this role, you will define data quality standards, manage data governance processes, and provide expert knowledge to support solution design and integration with existing business processes. You will lead a team of data stewards to maintain and support data governance processes, hierarchies, and master data definitions, and you will evaluate processes and procedures to uphold quality standards and drive adherence to organizational policies.

As a thought leader in digital data management disciplines, you will bring deep experience in analytics best practices, business communication, consulting, and quality processes. Your ability to manage complex workloads, analyze data trends, and present creative solutions will be critical in influencing business decision-making and driving business process improvements. You will also mentor and develop a high-performing team of data scientists and engineers, creating a positive working environment with strong learning opportunities and exposure for team members.

Your responsibilities will include stakeholder management, client interactions, and business development to drive new prospects for the function. Your multi-domain industry expertise, client-facing skills, and prior onsite experience will help you manage relationships with business owners, functional leads, and project teams to ensure their requirements are met. By collaborating with IT peers, you will showcase key functionalities, drive adoption, and assure operational excellence, contributing to EY's mission of building a better working world.

Join us at EY and leverage your expertise to shape the future of data analytics and visualization, making a meaningful impact on our clients, people, and society.

Posted 1 week ago

Apply

7.0 - 11.0 years

0 Lacs

Haryana

On-site

As a Project Manager, you will deliver all assigned projects within the agreed time frame, allocated budget, resources, and quality criteria. This includes managing project risks, developing contingency plans, and handling communication with stakeholders. You will define project baselines to measure progress, evaluate deviations, and take corrective actions, and you will interpret data analyses, develop action plans, and assist in making strategic data-related decisions.

You will follow Agile principles and frameworks, ensuring all agile disciplines are met, and collaborate with functions across Orange Business to design solutions leveraging complex Data Analytics, AI, and Intelligent Automation technologies. Leading the design and delivery of Data Analytics, Governance, AI, and automation solutions for the business is a key aspect of the role. You will advise stakeholders on Data & AI strategy, detailed use cases, and industry-leading practices; formulate and deploy Data Analytics & AI strategies; create blueprints and roadmaps; and oversee solution delivery projects, either as an individual contributor or by overseeing a small team, while developing tools and methodologies for the practical application of Data & AI technologies.

To be successful in this role, you should hold a bachelor's degree in science, computers/IT, engineering, or commerce. Professional certifications in Data Analytics, Governance, Artificial Intelligence, or Project Management (SCRUM, PRINCE2, PMP, Six Sigma, ITIL) will be advantageous. The ideal candidate has advanced expertise in Data Analytics or Data Science and extensive experience designing and delivering Analytics, Data Quality & Governance, or AI solutions for business. Strong knowledge of database structures, Data Quality & Governance, and the AI application landscape is required, along with the ability to coordinate multiple projects simultaneously, drive initiatives, communicate effectively with stakeholders, and work well in a team.

Posted 1 week ago

Apply

9.0 - 14.0 years

15 - 22 Lacs

Pune

Hybrid

Hiring for a Data Governance Senior Analyst role for one of our clients on fixed-term employment (12 months).

Employment Type: 12-month fixed-term employment
Experience: 9+ years, predominantly in data-related disciplines such as data governance, SAP master data, and data quality in the oil and gas or financial services domain
Location: Pune
Notice Period: immediate joiners to 45 days

Roles & Responsibilities:
- Apply deep knowledge of SAP ERP and its associated data structures.
- Coordinate with Data Stewards and Data Owners to identify critical data elements for SAP master data (Supplier, Finance, and Bank master).
- Develop and maintain a business-facing data glossary and data catalog for SAP master data (Supplier, Customer, and Finance: GL, Cost Center, Profit Center, etc.), capturing data definitions, lineage, and usage.
- Develop and implement data governance policies, standards, and processes to ensure data quality, data management, and compliance for relevant SAP master data (Finance, Supplier, and Customer).
- Develop both end-state and interim-state architecture for master data, ensuring alignment with business requirements and industry best practices.
- Define and implement data models that align with business needs, gather requirements for master data structures, and design scalable, maintainable data models that ensure data creation through a single source of truth.
- Conduct data quality assessments and implement corrective actions to address data quality issues; an example of such a profiling check appears after this list.
- Collaborate with cross-functional teams to integrate data governance practices into all relevant SAP business processes.
- Manage data cataloging and lineage to provide visibility into data assets, their origins, and their transformations in the SAP environment.
- Facilitate governance forums, data domain councils, and change advisory boards to review data issues, standards, and continuous improvements.

Key Accountabilities:
- Collaborate with the Data Governance Manager to advance the data governance agenda.
- Prepare data documentation, including data models, process flows, governance policies, and stewardship responsibilities.
- Collaborate with IT, data management, and business units to implement data governance best practices and migrate from ECC to S/4 MDG.
- Monitor data governance activities, measure progress, and report key metrics to senior management.
- Conduct training sessions and awareness programs to promote data governance within the organization.
- Demonstrate deep understanding of master data structures in SAP (and other ERP systems such as JD Edwards), including Vendor, Customer, Cost Center, Profit Center, and GL Account.

Interested candidates can share their CVs at: Nimrah.fatima1@in.ey.com
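The sketch below illustrates one typical data quality assessment for supplier master data: profiling likely duplicate vendors by normalized name. The schema, data, and use of SQLite are assumptions for demonstration; a real check would run against extracted vendor master data (in SAP ECC, the LFA1 table).

```python
# Duplicate-vendor profiling sketch for SAP-style supplier master data.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE vendor_master (vendor_id TEXT, name TEXT, city TEXT)")
conn.executemany(
    "INSERT INTO vendor_master VALUES (?, ?, ?)",
    [("V001", "Acme Industries", "Pune"),
     ("V002", "ACME INDUSTRIES ", "Pune"),
     ("V003", "Globex Ltd", "Mumbai")],
)

# Normalize names, then surface groups with more than one vendor ID.
duplicates = conn.execute("""
    SELECT LOWER(TRIM(name)) AS norm_name,
           COUNT(*) AS n,
           GROUP_CONCAT(vendor_id) AS vendor_ids
    FROM vendor_master
    GROUP BY norm_name
    HAVING COUNT(*) > 1
""").fetchall()

for norm_name, n, ids in duplicates:
    print(f"possible duplicate vendor '{norm_name}' ({n} records): {ids}")
```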

Posted 1 week ago

Apply

5.0 - 9.0 years

12 - 16 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

- Robust understanding of data governance principles, practices, and frameworks.
- Collaborate with business and IT stakeholders and Data Stewards to define and implement data governance policies, standards, and procedures.
- Strong experience with the Collibra or Informatica EDC + Axon data governance platforms, with hands-on experience in configuration, customization, deployment, and administration.
- Experience with SQL, data warehousing, ETL concepts, Python, and integration with other systems.
- Manage and maintain metadata, including data lineage, data profiling, and data cataloging.
- Identify, document, and resolve data quality issues, and establish data quality rules and metrics (see the sketch after this list).
- Develop and maintain workflows, dashboards, and reports within Collibra or IDMC.
- Ensure compliance with data privacy regulations and internal data governance policies.
- Provide training and support to end users on the Collibra platform.
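One way to read "establishing data quality rules and metrics" is rules declared as data and evaluated into pass-rate metrics. The sketch below shows that pattern in plain Python; the rule names and record fields are invented, and this is not Collibra or IDMC configuration.

```python
# Declarative data quality rules evaluated over records, yielding pass rates.
from typing import Callable

RULES: dict[str, Callable[[dict], bool]] = {
    "customer_id_present": lambda r: bool(r.get("customer_id")),
    "email_has_at_sign":   lambda r: "@" in (r.get("email") or ""),
    "country_iso2":        lambda r: len(r.get("country", "")) == 2,
}

records = [
    {"customer_id": "C1", "email": "a@x.com", "country": "IN"},
    {"customer_id": "",   "email": "bad",     "country": "IND"},
]

for name, rule in RULES.items():
    passed = sum(rule(r) for r in records)
    print(f"{name}: {passed}/{len(records)} passed ({passed / len(records):.0%})")
```

Pass rates like these are the raw material for the dashboards and metrics a governance platform would track over time.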

Posted 1 week ago

Apply

10.0 - 15.0 years

30 - 35 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

Design and optimize cloud-native data architectures on platforms like Databricks and Snowflake, enabling scalable data engineering, advanced analytics, and AI/ML solutions aligned with business needs.

Location: Nagpur, Pune, Chennai, Bangalore
Type of Employment: Full-time

Key Result Areas and Activities:
- Design and implement Lakehouse architectures using Databricks, Delta Lake, and Apache Spark.
- Lead the development of data pipelines, ETL/ELT processes, and data integration strategies.
- Collaborate with business and technical teams to define data architecture standards, governance, and security models.
- Optimize the performance and cost-efficiency of Databricks clusters and jobs.
- Provide technical leadership and mentorship to data engineers and developers.
- Integrate Databricks with cloud platforms (Azure, AWS, or GCP) and enterprise systems.
- Evaluate and recommend tools and technologies to enhance the data ecosystem.
- Ensure compliance with data privacy and regulatory requirements.
- Contribute to proposal and pre-sales activities.

Must-have Skills:
- Expertise in data engineering, data architecture, or analytics.
- Hands-on experience with Databricks and Apache Spark.
- Hands-on experience with Snowflake.
- Strong proficiency in Python, SQL, and PySpark.
- Deep understanding of Delta Lake, Lakehouse architecture, and data mesh principles.
- Deep understanding of data governance and Unity Catalog.
- Experience with cloud platforms (Azure preferred; AWS or GCP acceptable).

Good-to-have Skills:
- Good understanding of CI/CD pipelines.
- Working experience with GitHub.
- Experience delivering data engineering solutions that balance architecture requirements, required effort, and customer-specific needs across tools.

Qualification:
- Bachelor's degree in computer science, engineering, or a related field.
- Demonstrated continued learning through technical certifications or related methods.
- 10+ years of relevant experience in ETL tools.

Qualities:
- Proven problem-solving and troubleshooting abilities with a high degree of adaptability; well-versed in the latest trends in data engineering.
- Ability to handle multiple tasks effectively, maintain a professional attitude, and work well in a team.
- Excellent interpersonal and communication skills, with a customer-focused approach and keen attention to detail.
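For context on the Lakehouse work described above, here is a minimal sketch of a Delta Lake write and read. It assumes a local environment with pyspark and the delta-spark package installed; on Databricks itself the session and Delta support come preconfigured, and the path and schema here are illustrative.

```python
# Minimal Delta Lake sketch: write a toy dataset as a Delta table, read it back.
from delta import configure_spark_with_delta_pip
from pyspark.sql import SparkSession

builder = (
    SparkSession.builder.appName("lakehouse-sketch")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
)
spark = configure_spark_with_delta_pip(builder).getOrCreate()

df = spark.createDataFrame([(1, "ingested"), (2, "ingested")], ["id", "status"])
df.write.format("delta").mode("overwrite").save("/tmp/demo_delta")  # placeholder path

spark.read.format("delta").load("/tmp/demo_delta").show()
```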

Posted 1 week ago

Apply

3.0 - 8.0 years

4 - 8 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

We are seeking a skilled and experienced Data Engineer to join our team and help build, optimize, and maintain data pipelines and architectures. The ideal candidate will have deep expertise in the Microsoft data engineering ecosystem, particularly Azure Data Factory, Databricks, Alteryx, Synapse Analytics, and Microsoft Fabric, along with a strong command of SQL, Python, and Apache Spark.

Key Responsibilities:
- Design, develop, and optimize scalable data pipelines and workflows using Azure Data Factory, Synapse Pipelines, and Microsoft Fabric.
- Transform and migrate existing Alteryx workflows into scalable Azure Databricks pipelines, ensuring optimized performance and maintainability (a sketch of this pattern follows this listing).
- Collaborate with data engineering and analytics teams to redesign Alteryx pipelines as Spark-based solutions on Azure Databricks, leveraging Delta Lake and Azure services for automation and efficiency.
- Build and maintain ETL/ELT processes for ingesting structured and unstructured data from various sources.
- Develop and manage data transformation logic using Databricks (PySpark/Spark SQL) and Python.
- Collaborate with data analysts, architects, and business stakeholders to understand requirements and deliver high-quality data solutions.
- Ensure data quality, integrity, and governance across the data lifecycle.
- Implement monitoring and alerting for data pipelines to ensure reliability and performance.
- Work with Azure Synapse Analytics to build data models and enable analytics and reporting.
- Use SQL to query and manage large datasets efficiently.
- Participate in data architecture discussions and contribute to technical design decisions.

Required Skills and Qualifications:
- 3+ years of experience in data engineering or a related field.
- Strong experience with Alteryx workflows and data preparation/ETL processes.
- Strong proficiency in the Microsoft Azure data ecosystem, including:
  - Azure Data Factory (ADF)
  - Azure Synapse Analytics
  - Microsoft Fabric
  - Azure Databricks
- Solid experience with Python and Apache Spark (including PySpark).
- Advanced SQL skills for data manipulation and transformation.
- Experience designing and implementing data lakes and data warehouses.
- Familiarity with data governance, security, and compliance standards.
- Strong analytical and problem-solving skills.
- Excellent communication and collaboration abilities.

Preferred Qualifications:
- Microsoft Azure certifications (e.g., Azure Data Engineer Associate).
- Experience with DevOps tools and CI/CD practices in data workflows.
- Knowledge of REST APIs and integration techniques.
- Background in agile methodologies and working in cross-functional teams.
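The Alteryx-to-Databricks migration mentioned above usually means re-expressing a visual tool chain (filter, join, summarize) as a Spark pipeline. Here is a hedged sketch of that pattern; the datasets and column names are invented for illustration.

```python
# Sketch: an Alteryx-style filter -> join -> summarize flow as one PySpark pipeline.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("alteryx-to-spark-sketch").getOrCreate()

orders = spark.createDataFrame(
    [(1, "N1", 120.0), (2, "N2", 80.0), (3, "N1", 200.0)],
    ["order_id", "customer_id", "amount"],
)
customers = spark.createDataFrame(
    [("N1", "Retail"), ("N2", "Wholesale")],
    ["customer_id", "segment"],
)

result = (
    orders.filter(F.col("amount") > 100)          # Alteryx Filter tool
    .join(customers, "customer_id")               # Alteryx Join tool
    .groupBy("segment")                           # Alteryx Summarize tool
    .agg(F.sum("amount").alias("total_amount"),
         F.count("*").alias("orders"))
)
result.show()
```

Expressed this way, the workflow scales with the cluster and can write straight into Delta Lake rather than intermediate files.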

Posted 1 week ago

Apply

10.0 - 15.0 years

25 - 30 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

The Director of AI & Data will lead the strategic vision, delivery oversight, and governance of Artificial Intelligence (AI) and AI-associated data initiatives across MUFG Pension and Market Services (MPMS). This role is responsible for embedding AI as a trusted, value-generating capability while ensuring responsible data use and compliance with regulatory, ethical, and organisational standards. Working in partnership with key vendors and with business, technology, operations, legal, information security, and risk teams, the Director will champion scalable AI-driven solutions that enhance operational efficiency, unlock personalised services, and support innovation. The role also leads the enterprise data agenda, overseeing data governance, platform strategy, and advanced analytics maturity.

Key Accountabilities and Main Responsibilities

Strategic Focus:
- Define and evolve MPMS's group AI strategy, ensuring alignment with business goals, regulatory obligations, and ethical principles.
- Drive enterprise-wide prioritisation and execution of AI initiatives that deliver measurable value to operations, customer experience, and risk management.
- Establish a scalable and compliant AI operating model, including demand intake, capability build, governance, risk management, and change enablement.

Operational Excellence:
- Oversee enterprise data management, including data platform architecture, master data governance, and data lifecycle controls.
- Drive the development of advanced analytics, data science models, and business intelligence capabilities in alignment with strategic business needs.
- Partner with business leaders to translate AI and data opportunities into tangible commercial and service outcomes.
- Lead the enterprise data agenda, overseeing data governance, platform strategy, and advanced analytics maturity.

People Leadership and Stakeholder Engagement:
- Serve as a trusted advisor to the Executive Leadership Team (ELT) on AI and data matters.
- Foster a strong community of practice across technology, data, and business domains to build awareness, capability, and adoption of AI.
- Represent MPMS externally in regulatory, vendor, and industry forums related to AI and data innovation.
- Develop and retain key people, proactively identifying key people risks through robust succession planning.
- Lead and inspire teams, role-modelling MPMS values while contributing to the Business Enablement function.

Governance & Risk Management:
- Lead the implementation of MPMS's AI principles: transparency, fairness, trust, privacy, safety, explainability, accountability, and human oversight.
- Establish governance forums and control mechanisms to ensure AI and data use complies with global and local regulatory frameworks (e.g., APRA, GDPR, SEBI).
- Oversee AI risk assessments and model validation processes to ensure the robustness and integrity of algorithms.

The above list of key accountabilities is not exhaustive and may change from time to time based on business needs.

Experience & Personal Attributes:
- Over 10 years of experience in enterprise data, analytics, or AI leadership roles within financial services or regulated industries.
- Proven expertise in leading strategic initiatives, managing operations, and implementing governance frameworks.
- Demonstrated track record of delivering scalable AI/ML solutions with business impact.
- Strong understanding of data governance frameworks, privacy law, and AI ethics.
- Familiarity with cloud-based data platforms (e.g., Azure, AWS) and modern data tooling.
- Experience managing cross-functional teams and influencing senior stakeholders.
- Demonstrated understanding of AI/ML technologies, data architectures, and modern analytics platforms.
- Demonstrated ability to set a long-term vision and roadmap for AI and data in line with business strategy.
- Exceptional ability to influence stakeholders, facilitate consensus, and align teams toward a common vision.
- Effective communicator, able to tailor messaging to diverse audiences at all levels.
- Proven capability to lead teams, influence executives, and drive organisational change.
- Demonstrated understanding of regulatory obligations and ethical considerations in AI and data use.

Posted 1 week ago

Apply

8.0 - 13.0 years

25 - 30 Lacs

Bengaluru

Work from Office

As the Director of Data Program Management at Honeywell, you will oversee and manage a portfolio of data projects and initiatives across the organization. Your role will involve strategic planning, program management, and ensuring the successful delivery of data-driven projects. You will work closely with cross-functional teams to define program objectives, establish timelines, allocate resources, and monitor progress. You will report directly to the Vice President of Commercial Data and Analytics and work out of our Bangalore location on an afternoon/evening schedule.

In this role, you will have a significant impact on our organization by driving the successful execution of data programs and initiatives. You will work with stakeholders to define program requirements, develop program management strategies, and ensure alignment with business objectives. Your guidance and expertise in program management will be instrumental in delivering data-driven projects that enable us to leverage data as a strategic asset and drive business growth.

You must have:
- A minimum of 8 years of experience in program management or related roles.
- A proven track record of successfully managing and delivering complex data programs.
- Strong leadership and people management skills.
- Excellent strategic thinking and problem-solving abilities.
- The ability to influence and negotiate with stakeholders at all levels.
- Excellent communication and interpersonal skills.

We value:
- A Bachelor's degree in Business, Engineering, or a related field.
- A Master's degree in Business, Engineering, or a related field (preferred).
- Experience in a global organization.
- Experience guiding large-scale program management teams.
- Strong, thoughtful, data-driven decision-making skills.
- The ability to adapt to a fast-paced and changing environment.

Key Responsibilities:
- Lead the planning, execution, and delivery of data programs and initiatives.
- Collaborate with cross-functional teams to define data strategies and roadmaps.
- Establish and maintain data governance frameworks and policies.
- Drive data quality and accessibility improvements through data management best practices.
- Develop and implement data analytics and reporting capabilities.
- Monitor and track key performance indicators to measure the effectiveness of data programs.
- Provide regular updates and reports to senior management on the progress of data initiatives.
- Lead and develop a high-performing team of data program managers.

Posted 1 week ago

Apply

5.0 - 7.0 years

6 - 16 Lacs

Hyderabad, Pune

Hybrid

Job Summary: We are seeking a Data Steward to support enterprise data initiatives by ensuring the accuracy, completeness, and quality of data pipelines across data platforms. The ideal candidate will bring strong expertise in data quality practices and data management principles to help establish trusted data foundations that drive business intelligence, analytics, and operational reporting. This role is a critical part of our data team, supporting data governance efforts and enhancing our data architecture.

Key Responsibilities:
- Serve as business and technical steward for assigned data domains.
- Create and maintain metadata, business glossary, and data catalog entries in data governance tools such as Collibra, Alation, or Informatica (see the profiling sketch after this list).
- Work with data owners to enforce data definitions, classifications, and data quality rules.
- Perform data profiling, identify issues, and coordinate remediation.
- Support the onboarding of new data assets (cataloging, lineage, and stewardship assignments).
- Assist in governance tool configuration, workflow management, and integration automation.
- Collaborate with engineering teams to support PII tagging, data masking, and access governance.
- Provide training and support to business users in using governance tools effectively.
- Deploy the master data governance framework and use the supporting data management tools.
- Define data quality metrics and monitor data quality performance against established targets.
- Conduct regular data governance reviews and recommend process improvements.
- Ensure data accuracy, consistency, and integrity.
- Implement and monitor data quality rules, validation checks, and exception handling.
- Collaborate with teams to align data delivery with business and reporting requirements.
- Document data processes, standards, and lineage to support governance and compliance.

Qualifications:
- 5+ years of experience in data governance and data governance operations.
- Hands-on experience with Collibra (catalog, lineage, workflows, API integrations).
- Experience with data management tools (e.g., data catalogs, MDM systems, data quality platforms).
- Strong understanding of data quality, metadata management, and data classification practices.
- Technical skills in SQL, Python/Java, and REST APIs for tool integration and automation.
- Familiarity with Google Cloud Platform (GCP) is a plus.
- Excellent communication skills and the ability to liaise between business users and technical teams.
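The profiling-to-catalog step a Data Steward might automate could look like the sketch below: compute per-column statistics and shape them as catalog-ready metadata. The output format is an assumption; real catalog targets (Collibra, Alation, etc.) would be populated through their own APIs.

```python
# Per-column profiling sketch: completeness and distinct counts as metadata.
import pandas as pd

df = pd.DataFrame({
    "customer_id": ["C1", "C2", None, "C4"],
    "country": ["IN", "IN", "US", None],
})

def profile(df: pd.DataFrame, dataset: str) -> list[dict]:
    """One catalog-ready metadata entry per column."""
    entries = []
    for col in df.columns:
        entries.append({
            "dataset": dataset,
            "column": col,
            "completeness": round(1 - df[col].isna().mean(), 3),
            "distinct_values": int(df[col].nunique(dropna=True)),
        })
    return entries

for entry in profile(df, "crm.customers"):  # dataset name is a placeholder
    print(entry)
```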

Posted 1 week ago

Apply

3.0 - 8.0 years

5 - 10 Lacs

Bengaluru

Work from Office

Experience: 3+ years of experience with Microsoft Purview.

- Excellent communication and interpersonal skills, with the ability to liaise effectively with both technical and non-technical stakeholders.
- Capable of producing accurate, comprehensive as-built documentation representing the total output of work delivered to the client.
- Strong ability to make a positive impression on clients and maintain their confidence while guiding client IT teams through enterprise deployments of Purview, navigating client challenges, attitudes, concerns, and expectations while achieving technical success.
- Strong analytical, problem-solving, and troubleshooting skills.
- Mastery of Microsoft Purview and its integration with other Microsoft services.
- Deep understanding of data security principles and best practices.
- Familiarity with global data privacy regulations such as GDPR, DPDPA, CCPA, and HIPAA.
- Strong understanding of sensitive information types, custom and trainable classifiers, records management, and data lifecycle management concepts.
- Experience with Microsoft Purview, including data governance, compliance management, and Information Protection capabilities; this experience is strongly recommended and will be thoroughly assessed in technical interviews.
- Experience conducting risk assessments, audits, and data flow mapping in Microsoft Purview.
- Experience with incident response and threat detection related to data breaches.
- Experience in scripting and automation using PowerShell, Azure CLI, or similar technologies.
- Background in IT compliance, risk management, or related fields.
- Experience deploying other Microsoft 365 and Azure tools, including Microsoft Defender, Microsoft Entra, and Microsoft Intune.

Posted 1 week ago

Apply

4.0 - 9.0 years

6 - 11 Lacs

Chennai

Work from Office

Hiring for Google Data Engineering - Chennai

We are looking for a skilled and motivated Data Engineer with 4+ years of experience, including strong hands-on expertise in the Google Cloud Platform (GCP). As a Data Engineer, you will design, develop, and manage scalable data pipelines, transform raw data into actionable insights, and support the organization's data infrastructure.

Key Responsibilities:
- Design and implement data pipelines using GCP services (e.g., BigQuery, Dataflow, Pub/Sub, Cloud Storage, Composer, Dataproc).
- Build ETL/ELT pipelines to ingest data from structured and unstructured data sources.
- Optimize data workflows for performance, scalability, and reliability.
- Work closely with data analysts, data scientists, and other engineers to ensure data quality and consistency.
- Implement data governance, security, and monitoring best practices.
- Automate workflows using Cloud Composer (Airflow) or similar orchestration tools.
- Develop CI/CD pipelines for deploying data solutions.
- Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions.
- Troubleshoot and resolve data issues, ensuring data integrity and availability.
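A minimal sketch of one GCP ingestion step like those described above: load a CSV from Cloud Storage into BigQuery, then run a post-load row-count check. The project, dataset, and bucket names are placeholders, and the snippet assumes the google-cloud-bigquery client library is installed with credentials configured.

```python
# Load a CSV from Cloud Storage into BigQuery and verify the row count.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # placeholder project

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    autodetect=True,
)
load_job = client.load_table_from_uri(
    "gs://my-bucket/events/2024-05-01.csv",   # placeholder URI
    "my-project.analytics.events",            # placeholder table
    job_config=job_config,
)
load_job.result()  # block until the load finishes

rows = client.query(
    "SELECT COUNT(*) AS n FROM `my-project.analytics.events`"
).result()
print(f"loaded rows: {next(iter(rows)).n}")
```

In production this step would typically run as a task inside a Cloud Composer (Airflow) DAG rather than a standalone script.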

Posted 1 week ago

Apply

4.0 - 8.0 years

6 - 10 Lacs

Bengaluru

Work from Office

Job Role: Scrum Master Lead Analyst
Location: Bangalore, India

Job Objective: We are looking for an experienced Scrum Master to join our International Health portfolio in Bangalore, India. The Scrum Master's primary focus will be to create an environment where teams can deliver high-quality, valuable software with a one-team approach. The Scrum Master will typically focus on upholding the values of Scrum, facilitating meetings and discussions, and removing blockers so that the team can focus on product delivery. The role serves as the servant leader of the Scrum team, removing impediments and helping the team remain successful and on schedule. You must have a sufficient understanding of technology to lead team members and help them overcome development roadblocks, and you must understand business strategy and objectives so that development work is prioritized by business value and results align with objectives. The role exercises considerable creativity, foresight, and judgment in conceiving, planning, and delivering initiatives; uses deep professional knowledge and acumen to advise functional leaders; and is recognized internally as a subject matter expert on Scrum.

Essential Duties:
- Guide the team and organization on how to use Agile/Scrum practices and values to delight customers.
- Collaborate with team members across functions (e.g., data scientists, data analysts, data governance, technology, engineering, quality analysts, and product owners) and bring them together as one team to drive outcomes.
- Guide the team on how to get the most out of self-organization.
- Assess the Agile maturity of the team and coach it to higher levels of maturity, at a pace that is sustainable and comfortable for the team and organization.
- Remove impediments, or guide the team to remove impediments by finding the right personnel.
- Build a trusting and safe environment where problems can be raised without fear of blame or judgment, with an emphasis on problem solving, openness, honesty, and respect.
- Facilitate getting the work done without coercion, assigning, or dictating the work.
- Facilitate discussion, decision making, and conflict resolution.
- Assist with internal and external communication, improving transparency and radiating information.
- Support and educate the Product Owner, especially with respect to grooming and maintaining the product backlog.
- Provide all support to the team using a servant-leadership style whenever possible, and lead by example.
- Facilitate daily scrums, sprint planning, demos, retrospectives, PI planning breakouts, and all essential SAFe events.
- Help the team identify their capacity, risks, and dependencies.

Preferred Skills/Experience:
- Experience working in the health insurance domain.
- At least 4 years of experience in a Scrum Master/team coach role.
- Any of the following certifications: CSM (Certified Scrum Master), SSM (Certified SAFe Scrum Master), PSM (Professional Scrum Master), or a second-level Scrum Master certification (CSP, PSM II).
- Knowledge of agile frameworks: Scrum, SAFe, Kanban.
- Knowledge and/or experience with widely successful Agile techniques: user story creation including acceptance criteria, estimation, DoD, DoR, TDD, continuous integration, continuous deployment, pair programming, automated testing, and agile games.
- Experience applying a wide variety of well-documented patterns and techniques, for example burndown techniques, retrospective formats, and handling bugs.
- Excellent communication and mentoring skills.
- Hands-on experience with tools like Jira, Confluence, SharePoint, and Mural.
- Ability to prepare and track team dashboards and plans in Jira.

Posted 1 week ago

Apply

4.0 - 6.0 years

6 - 8 Lacs

Gurugram

Work from Office

To succeed, organizations must blend digital and human capabilities. Our diverse, global teams bring deep industry and functional expertise and a range of perspectives to spark change. BCG delivers solutions through leading-edge management consulting along with technology and design, corporate and digital ventures, and business purpose. We work in a uniquely collaborative model across the firm and throughout all levels of the client organization, generating results that allow our clients to thrive.

What You'll Do

Join the Data Layer Team, a global portfolio transforming our organization into a data-driven enterprise. The Data Layer Team is a portfolio of 30 people who build essential data platforms, products, and capabilities to empower our clients and colleagues with high-quality, actionable insights. Our focus is on creating scalable data solutions and advancing our data infrastructure to drive informed decision-making across the company.

As a Use Case Enablement Product Analyst within BCG's Data Layer Team, you will collaborate with the Use Case Enablement Product Owner and cross-functional teams to gather and analyze business and data requirements. Your role is critical to bridging the gap between business stakeholders and technical teams, ensuring that new GenAI use cases are well-scoped, feasible, and aligned with user needs. You will work with various GenAI use cases and applications, including:

- Consultant Journey: internal GenAI assistants that change the way consultants work to provide value to our clients.
- Practice Area: GenAI applications developed by functional practice areas to support various capabilities (e.g., outside-in rapid cost diagnostics or Accelerated Cost Analysis).
- Data catalog: a centralized library that gives consulting teams access to critical tools and data assets across BCG.

These tools require the ingestion of multiple data sources, and your role will be to support the selection of eligible datasets and identify the best sources for each GenAI use case. You will ensure that these use cases and applications are equipped with the necessary data pipelines to maximize their impact on business and users. You will play a key role in use case discovery and requirements refinement, while also managing the continuous maintenance and enhancement of data asset quality, accuracy, and stability to support evolving use cases.

Detailed responsibilities include:

Deliver business results and customer value
- Support the development of GenAI-enabled data products by helping translate business needs into actionable data requirements.
- Help define requirements for user stories and structure the backlog with a focus on measurable outcomes.
- Help shape GenAI-enabled use cases that contribute to real business impact through thoughtful prioritization and attention to detail.
- Participate in evaluating use case success metrics and learn from what works (and what doesn't).

Serve as the voice of the customer or end user
- Translate business needs into user stories, engaging end users for continuous feedback.
- Engage in continuous data discovery exercises to understand the most valuable data assets that satisfy customer needs.
- Balance customer value, technical feasibility, and business impact when making prioritization decisions.
- Work with product teams to integrate GenAI-enhanced offerings into BCG systems and workflows.

Deliver high-quality outcomes
- Collaborate with engineers, architects, and product teammates to test and validate data pipelines, ensuring solutions are robust, accurate, and useful.
- Contribute to documentation that helps others understand the why and how behind what has been built, supporting long-term scalability and reuse.
- Work with stakeholders across BCG (e.g., Practice Areas, Knowledge Teams) to ensure data products are grounded in real needs and enable meaningful use.
- Share observations, risks, or open questions early; your input helps the team avoid missteps and refine solutions before they reach users.

What You'll Bring
- 4-6+ years of experience in a product analyst, business analyst, or data analyst role, ideally supporting data or AI-related projects.
- Project management skills, with the ability to build project plans, track progress, drive alignment, and manage risks.
- Proven experience in AI, GenAI, or data product development, preferably with a focus on GenAI-powered user-facing applications.
- Experience in enterprise software development, data engineering, or AI-driven transformation initiatives.
- Experience working with structured and unstructured data, and familiarity with modern data platforms (e.g., Snowflake, AWS, SharePoint).
- A working knowledge of agile ways of working, and openness to learning through iteration and feedback.
- Understanding of enterprise data governance, AI model integration, and scalable data architecture.
- Familiarity with AI/ML technologies, including GenAI models (e.g., OpenAI GPT, RAG, fine-tuning, or machine learning frameworks).
- Good communication skills, especially when collaborating across functions or surfacing potential risks or questions.
- Familiarity with tools like JIRA, Confluence, Excel, or lightweight data catalog platforms is a plus.
- Experience in a consulting or client-service environment is helpful but not required.

Who You'll Work With
- BCG Global Consulting Practice Areas (Functional & Industry) and Data Teams: partnering with business leaders to transform prioritized offerings into GenAI-enabled solutions, collaborating with teams such as the Data Product Portfolio, Data Governance CoE, Master Data Management, Enterprise Architecture, and Data Product Development.
- Data Layer Offer Enablement Product Owner Lead (PO): aligning on strategic priorities, roadmap development, and execution.
- Data Layer Offer Enablement Team: collaborating with Data Product Analysts and working alongside data engineers, lead architects, data stewards, and QA engineers.
- Data Layer Data Governance Team: partnering to ensure that data assets meet quality, metadata, and compliance standards, and are appropriately catalogued for reuse.
- Product Teams: collaborating with BCG's product team members to integrate required data sources into GenAI-enhanced offerings.
- Agile Coaches: embedding agile principles into daily work, leveraging coaching support to drive an iterative and user-focused approach to GenAI use case development.
- Data Product Consumers (Internal Customers): translating their voice and needs into user stories, ensuring their requirements are reflected in the backlog, and actively engaging them for feedback and validation.

You're Good At
- Being user-focused: deeply understanding and translating business needs into GenAI-enabled solutions, ensuring offerings address real user challenges.
- Communicating with transparency: clearly and openly engaging with stakeholders at all levels, ensuring alignment, visibility, and trust across teams.
- Bringing a data-driven approach to decision-making: leveraging qualitative and quantitative insights to prioritize initiatives, measure impact, and refine solutions.
- Facilitating data discovery sessions: engaging business stakeholders to capture business context, user intent, and data solution objectives.
- Breaking down complex challenges: applying critical reasoning and creative problem solving to analyze problem statements and design effective, scalable solutions.
- Collaborating with product and technical teams: working closely with POs, engineers, and data stewards to ensure solutions meet expectations and constraints.
- Collaborating with development teams: ensuring prioritized data sources align with GenAI solution requirements, business objectives, and technical feasibility.
- Defining and tracking KPIs: establishing measurable success metrics to drive squad performance and ensure data products align with OKRs.
- Documenting thoughtfully: creating simple, clear artifacts (e.g., data definitions, flow diagrams, test plans) that others can build from.
- Contributing to continuous improvement: bringing curiosity and a mindset of learning, always looking for ways to improve how the team works or delivers.

Posted 1 week ago

Apply

6.0 - 9.0 years

8 - 11 Lacs

Pune

Work from Office

KPMG India is looking for a Manager - Data Governance to join our dynamic team and embark on a rewarding career journey.

- Develop and implement data governance frameworks and policies.
- Lead data governance initiatives and ensure compliance with regulatory requirements.
- Collaborate with data owners and stakeholders to establish data stewardship programs.
- Monitor and maintain data quality across the organization.
- Conduct data governance training and awareness sessions.
- Identify and address data-related issues and risks.

Posted 1 week ago

Apply

9.0 - 14.0 years

35 - 45 Lacs

Bengaluru

Work from Office

Join our Team

About this opportunity: Join Ericsson as a Data Scientist. This position plays a crucial role in developing Python-based solutions, deploying them within a Kubernetes-based environment, and ensuring smooth data flow for our machine learning and data science initiatives. The ideal candidate has a strong foundation in Python programming, hands-on experience with ElasticSearch, Logstash, and Kibana (ELK), a solid grasp of fundamental Spark concepts, and familiarity with visualization tools such as Grafana and Kibana. A background in MLOps and expertise in both machine learning model development and deployment will be highly advantageous.

What you will do:
- Python Development: Write clean, efficient, and maintainable Python code to support data engineering tasks, including collection, transformation, and integration with ML models.
- Data Pipeline Development: Design, build, and maintain robust data pipelines to gather, process, and transform data from multiple sources into formats suitable for ML and analytics, leveraging ELK, Python, and other leading technologies.
- Spark: Apply core Spark concepts for distributed data processing where required, and optimize workflows for performance and scalability.
- ELK Integration: Implement ElasticSearch, Logstash, and Kibana for data ingestion, indexing, search, and real-time visualization. Knowledge of OpenSearch and related tooling is beneficial.
- Dashboards and Visualization: Create and manage Grafana and Kibana dashboards to deliver real-time insights into application and data performance.
- Model Deployment and Monitoring: Deploy machine learning models and implement monitoring solutions to track model performance, drift, and health.
- Data Quality and Governance: Implement data quality checks and data governance practices to ensure data accuracy, consistency, and compliance with data privacy regulations.
- MLOps (added advantage): Contribute to MLOps practices, including model deployment, monitoring, and automation of machine learning workflows.
- Documentation: Maintain clear, comprehensive documentation for data engineering processes, ELK configurations, machine learning models, visualizations, and deployments.

The skills you bring:
- Core skills: strong Python programming, experience building data pipelines, and knowledge of the ELK stack (ElasticSearch, Logstash, Kibana).
- Distributed processing: familiarity with Spark fundamentals and with when to leverage distributed processing for large datasets.
- Cloud and containerization: practical experience deploying applications and services on Kubernetes; familiarity with Docker and container best practices.
- Monitoring and visualization: hands-on experience creating dashboards and alerts with Grafana and Kibana.
- ML and MLOps: experience collaborating on ML model development and deploying and monitoring ML models in production; knowledge of model monitoring, drift detection, and CI/CD for ML is a plus.

Experience: 9 to 14 years.

Primary country and city: India (IN) || Bangalore
Req ID: 772044
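A hedged sketch of the ELK ingestion work described above: bulk-index pipeline metrics into Elasticsearch so they can be charted in Kibana or Grafana. The host, index name, and document shape are illustrative assumptions, and the client API shown follows the elasticsearch-py 8.x style.

```python
# Bulk-index pipeline metrics into Elasticsearch for Kibana dashboards.
from datetime import datetime, timezone
from elasticsearch import Elasticsearch, helpers

es = Elasticsearch("http://localhost:9200")  # placeholder endpoint

def make_actions(metrics):
    """Yield one bulk action per metric document."""
    for m in metrics:
        yield {
            "_index": "pipeline-metrics",
            "_source": {**m, "@timestamp": datetime.now(timezone.utc).isoformat()},
        }

metrics = [
    {"pipeline": "ingest-orders", "rows": 12840, "status": "success"},
    {"pipeline": "ingest-orders", "rows": 0, "status": "failed"},
]
ok, errors = helpers.bulk(es, make_actions(metrics))
print(f"indexed {ok} documents, errors: {errors}")
```

Once indexed with an `@timestamp` field, these documents can back a Kibana data view or a Grafana Elasticsearch data source without further transformation.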

Posted 1 week ago

Apply

3.0 - 6.0 years

25 - 30 Lacs

kolkata, mumbai, new delhi

Work from Office

Requirements:
- Strong understanding of AWS services and cloud-native architectures.
- Familiarity with CI/CD pipelines and scripting (Python, Shell, Groovy).
- Hands-on experience with Terraform for infrastructure automation.
- Expertise in at least one configuration-management tool such as Ansible, Puppet, or Chef.
- Knowledge of Docker, Kubernetes, and CI/CD tools (Jenkins, ArgoCD, GitHub Actions, GitLab CI/CD).
- Understanding of Crossplane.
- Strong troubleshooting and performance optimisation skills.
- Knowledge of data governance, security, and compliance standards.
- Excellent collaboration and communication skills.

Responsibilities:
- Ensure solutions are designed and developed using a scalable, highly resilient cloud-native architecture.
- Collaborate with and consult other development, product, and architecture teams.

Experience & Skills Fitment:
- 3-6 years of experience in DevOps.
- Experience in any one of the BI technologies such as IBM Cognos, Planning Analytics, or Tableau is an added advantage.
- Experience integrating applications with observability tools such as Splunk or New Relic is good to have.
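As a rough illustration of the scripting side of this role (not part of the posting itself), a minimal Python sketch follows of the kind of post-deploy health check a Jenkins or ArgoCD pipeline might run; the namespace name is a hypothetical placeholder, and it assumes the official kubernetes Python client and a reachable kubeconfig:

```python
# Minimal sketch: verify that all Deployments in a namespace have their
# desired replica count ready, exiting non-zero so a CI/CD stage can fail.
# The "data-platform" namespace is an assumed example, not a real system.
from kubernetes import client, config

def deployments_ready(namespace: str = "default") -> bool:
    config.load_kube_config()  # use load_incluster_config() when run in a pod
    apps = client.AppsV1Api()
    ok = True
    for dep in apps.list_namespaced_deployment(namespace).items:
        desired = dep.spec.replicas or 0
        ready = dep.status.ready_replicas or 0
        if ready < desired:
            print(f"{dep.metadata.name}: {ready}/{desired} replicas ready")
            ok = False
    return ok

if __name__ == "__main__":
    raise SystemExit(0 if deployments_ready("data-platform") else 1)
```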

Posted 1 week ago

Apply

3.0 - 10.0 years

13 - 16 Lacs

mumbai

Work from Office

- Act as the BU subject matter expert in Demand Planning and IBP/S&OE to facilitate process design with key users.
- Lead the BU transformation in ways of working toward the future demand planning vision: a unified view of demand and low-touch planning.
- Provide stakeholder analysis and change impact assessments, and drive the change with cross-functional partners and key users.
- Solve problems with sustainable, standardized solutions that represent not only the BU but also AMEA.
- Identify and validate master data where required, support the establishment of data governance in the BU, and support identification of driver data owners.
- Be responsible for the planning functional build and implementation, working on testing and process validation.
- Document functional requirements and participate in fit-gap analyses with the regional functional stream lead.
- Create functional/integration test scenarios during SIT with regional SME lead support.
- Test and validate unit and integrated processes based on the test scenarios.
- Conduct BU end-user trainings and drive change management and adoption governance.
- Undergo deep training on the o9 platform in preparation for functional design documentation.
- Participate in various workshops, committees, and cross-functional design sessions.

Posted 1 week ago

Apply

0.0 - 4.0 years

8 - 9 Lacs

mumbai

Work from Office

Stakeholder Involvement
- Act as the primary point of contact for data governance-related inquiries from business functions and senior management.
- Build and maintain strong relationships with stakeholders across departments to ensure alignment and support for data governance initiatives.
- Support regular meetings with stakeholders to discuss data governance issues, requirements, and updates.
- Address stakeholder concerns and provide guidance on data management policy, standards, and the operating model.

Communication
- Carry out a comprehensive communication strategy to promote data governance awareness and compliance.
- Create and deliver presentations and reports to senior management and business functions on data governance activities and progress.
- Prepare and distribute regular updates and newsletters on data governance initiatives, policies, and best practices.
- Translate complex data governance concepts into clear, actionable information for non-technical stakeholders.

Data Governance
- Support the development, implementation, and maintenance of data governance policies, standards, and procedures.
- Assist in identifying and resolving data governance issues, ensuring compliance with regulatory requirements and industry best practices.
- Propose data quality improvements and support the implementation and maintenance of the data quality framework.
- Collaborate with data management roles in business and technology functions to embed data management best practices into business processes and systems.
- Monitor and report on the effectiveness of data governance initiatives and provide recommendations for improvement.

Training and Education
- Support training programs and workshops to educate business functions on the data governance framework, tooling, and best practices.
- Promote a culture of data governance awareness and accountability across the organization.

Posted 1 week ago

Apply

3.0 - 5.0 years

6 - 10 Lacs

hyderabad, pune, bengaluru

Work from Office

KPI Partners is seeking a skilled Databricks Specialist to join our team. The ideal candidate will possess a strong background in data engineering, analytics, and machine learning, with substantial experience on the Databricks platform.

Key Responsibilities:
- Design, develop, and maintain scalable data pipelines and ETL processes on Databricks.
- Collaborate with data scientists and analysts to support analytics initiatives using Databricks and Apache Spark.
- Optimize data engineering workflows for performance and cost efficiency.
- Monitor and troubleshoot data processing jobs and workflows to ensure high availability and reliability.
- Implement and maintain data governance and security measures on Databricks.
- Provide technical guidance and support to team members on Databricks best practices and performance tuning.
- Stay updated with the latest trends in data engineering, big data, and cloud technologies.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- Proven experience working with Databricks, Apache Spark, and big data technologies.
- Strong programming skills in languages such as Python, Scala, or SQL.
- Familiarity with cloud platforms such as AWS, Azure, or Google Cloud.
- Experience with data visualization tools and frameworks.
- Excellent problem-solving skills and the ability to work independently as well as part of a team.

Preferred Qualifications:
- Databricks certification or relevant big data certifications.
- Experience with machine learning libraries and frameworks.
- Knowledge of data warehousing solutions and methodologies.

If you are passionate about data and possess a deep understanding of Databricks and its capabilities, we encourage you to apply for this exciting opportunity with KPI Partners. Join us in our mission to leverage data for impactful decision-making.
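As a rough illustration of the pipeline work described above (not part of the posting itself), a minimal PySpark ETL sketch follows; the storage paths and column names are hypothetical placeholders, and on Databricks the SparkSession is provided by the runtime:

```python
# Minimal sketch of an extract-transform-load step on Spark. Paths, table
# names, and columns are assumed examples, not a real KPI Partners pipeline.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Extract: read raw CSV landed in cloud storage (placeholder path).
raw = spark.read.option("header", True).csv("/mnt/raw/orders/")

# Transform: cast types, drop bad rows, derive a daily aggregate.
clean = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("amount").isNotNull())
)
daily = clean.groupBy("order_date").agg(F.sum("amount").alias("revenue"))

# Load: write as Delta, the native table format on Databricks.
daily.write.format("delta").mode("overwrite").save("/mnt/curated/daily_revenue")
```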

Posted 1 week ago

Apply