6.0 - 11.0 years
8 - 13 Lacs
Hyderabad
Work from Office
As a Data Engineer, you will spearhead the data engineering team and elevate it to the next level. You will be responsible for laying out the architecture of the new project and selecting the tech stack that goes with it. You will plan the development cycles, applying Agile where possible, and create the foundations for good data stewardship with our new data products. You will also set up a solid code framework that is built to purpose yet flexible enough to adapt to new business use cases, a tough but rewarding challenge! Responsibilities: Collaborate with stakeholders to deeply understand the needs of data practitioners and deliver at scale. Lead Data Engineers to define, build, and maintain the Data Platform. Build a Data Lake in Azure Fabric that processes data from multiple sources. Migrate the existing data store from Azure Synapse to Azure Fabric. Implement data governance and access control. Drive development end to end for on-time delivery of high-quality solutions that conform to requirements and the architectural vision and comply with all applicable standards. Present technical solutions, capabilities, considerations, and features in business terms. Communicate status, issues, and risks precisely and in a timely manner. Further develop critical initiatives such as Data Discovery, Data Lineage, and Data Quality. Lead the team and mentor junior team members; help them grow in their roles and achieve their career aspirations. Build data systems, pipelines, analytical tools, and programs. Conduct complex data analysis and report on results. Requirements: 3+ years of experience as a data engineer or in a similar role with Azure Synapse, ADF, or relevant experience in Azure Fabric. Degree in Computer Science, Data Science, Mathematics, IT, or a similar field. Must have experience e
Posted 2 weeks ago
7.0 - 12.0 years
9 - 14 Lacs
Hyderabad
Work from Office
As a Sr Data Engineer, you will spearhead the data engineering team and elevate it to the next level. You will be responsible for laying out the architecture of the new project and selecting the tech stack that goes with it. You will plan the development cycles, applying Agile where possible, and create the foundations for good data stewardship with our new data products. You will also set up a solid code framework that is built to purpose yet flexible enough to adapt to new business use cases, a tough but rewarding challenge! Responsibilities: Collaborate with stakeholders to deeply understand the needs of data practitioners and deliver at scale. Lead Data Engineers to define, build, and maintain the Data Platform. Work on building a Data Lake in Azure Fabric that processes data from multiple sources. Migrate the existing data store from Azure Synapse to Azure Fabric. Implement data governance and access control. Drive development end to end for on-time delivery of high-quality solutions that conform to requirements and the architectural vision and comply with all applicable standards. Present technical solutions, capabilities, considerations, and features in business terms. Communicate status, issues, and risks precisely and in a timely manner. Further develop critical initiatives such as Data Discovery, Data Lineage, and Data Quality. Lead the team and mentor junior team members; help them grow in their roles and achieve their career aspirations. Build data systems, pipelines, analytical tools, and programs. Conduct complex data analysis and report on results. Requirements: 7+ years of experience as a data engineer or in a similar role with Azure Synapse, ADF, or relevant experience in Azure Fabric. Degree in Computer Science, Data Science, Mathematics, IT, or a similar field. Must have experience e
Posted 2 weeks ago
15.0 - 19.0 years
40 - 45 Lacs
Pune
Work from Office
Skill Name - Data Architect with Azure & Databricks + Power BI. Experience: 15-19 years. Responsibilities: Architect and design end-to-end data solutions on a cloud platform, focusing on data warehousing and big data platforms. Collaborate with clients, developers, and architecture teams to understand requirements and translate them into effective data solutions. Develop high-level and detailed data architecture and design documentation. Implement data management and data governance strategies, ensuring compliance with industry standards. Architect both batch and real-time data solutions, leveraging cloud-native services and technologies. Design and manage data pipeline processes for historic data migration and data integration. Collaborate with business analysts to understand domain data requirements and incorporate them into the design deliverables. Drive innovation in data analytics by leveraging cutting-edge technologies and methodologies. Demonstrate excellent verbal and written communication skills to communicate complex ideas and concepts effectively. Stay updated on the latest advancements in data analytics, data architecture, and data management techniques. Requirements: Minimum of 5 years of experience in a Data Architect role, supporting warehouse and cloud data platforms/environments. Extensive experience with common Azure services such as ADLS, Synapse, Databricks, Azure SQL, etc. Experience with Azure services such as ADF, PolyBase, and Azure Stream Analytics. Proven expertise in Databricks architecture, Delta Lake, Delta Sharing, Unity Catalog, data pipelines, and Spark tuning. Strong knowledge of Power BI architecture, DAX, and dashboard optimization. In-depth experience with SQL, Python, and/or PySpark. Hands-on knowledge of data governance, lineage, and cataloging tools such as Azure Purview and Unity Catalog. Experience in implementing CI/CD pipelines for data and BI components (e.g., using DevOps or GitHub). Experience building semantic models in Power BI. Strong expertise in data exploration using SQL and a deep understanding of data relationships. Extensive knowledge and implementation experience in data management, governance, and security frameworks. Proven experience in creating high-level and detailed data architecture and design documentation. Strong aptitude for business analysis to understand domain data requirements. Proficiency in data modelling using any modelling tool for conceptual, logical, and physical models is preferred. Hands-on experience with architecting end-to-end data solutions for both batch and real-time designs. Ability to collaborate effectively with clients, developers, and architecture teams to implement enterprise-level data solutions. Familiarity with Data Fabric and Data Mesh architecture is a plus. Excellent verbal and written communication skills.
Posted 2 weeks ago
2.0 - 6.0 years
7 - 17 Lacs
Hyderabad
Work from Office
In this role, you will: Consult with business lines and enterprise functions on less complex research. Use functional knowledge to assist in non-model quantitative tools that support strategic decision making. Perform analysis of findings and trends using statistical analysis and document the process. Present recommendations to increase revenue, reduce expense, and maximize operational efficiency, quality, and compliance. Identify and define business requirements and translate data and business needs into research and recommendations to improve efficiency. Participate in all group technology efforts, including design and implementation of database structures, analytics software, storage, and processing. Develop customized reports and ad hoc analyses to make recommendations and provide guidance to less experienced staff. Understand compliance and risk management requirements for the supported area. Ensure adherence to data management or data governance regulations and policies. Participate in company initiatives or processes to assist in meeting risk and capital objectives and other strategic goals. Collaborate and consult with more experienced consultants and with partners in technology and other business groups. Required Qualifications: 2+ years of Analytics experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education. Desired Qualifications: Experience in Analytics, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education. Excellent verbal, written, and interpersonal communication skills. Strong knowledge of Enterprise Risk programs and applicability of the risk management framework (three lines of defense). Experience identifying internal and external data sources from multiple sources across the business. Experience with SQL, Teradata, or SAS and database management systems like Teradata and MS SQL Server. Experience in risk (including compliance, financial crimes, operational, audit, legal, credit risk, market risk). Experience in data visualization and business intelligence tools. Advanced Microsoft Office (Word, Excel, Outlook, and PowerPoint) skills. Demonstrated strong analytical skills with high attention to detail and accuracy. Strong presentation skills and the ability to translate and present data in a manner that educates, enhances understanding, and influences decisions, with a bias for simplicity. Strong writing skills with a proven ability to translate data sets and conclusions drawn from analysis into business/executive format and language. Ability to support multiple projects with tight timelines. Metadata management, Data Lineage, Data Element Mapping, and Data Documentation experience. Experience researching and resolving data problems and working with technology teams on remediation of data issues. Hands-on proficiency with Python, Power BI (Power Query, DAX, Power Apps), Tableau, or SAS. Knowledge of defect management tools like HP ALM. Knowledge of Data Governance. Job Expectations: Ensure adherence to data management or data governance regulations and policies. Extract and analyze data from multiple technology systems/platforms and related data sources to identify factors that pose a risk to the firm.
Consult with business lines and enterprise functions on less complex research. Understand compliance and risk management requirements for sanctions compliance and data management. Perform analysis of findings and trends using statistical analysis and document the process. You will need a solid background in reporting, understanding and utilizing relational databases and data warehouses, and must be effective in querying and reporting on large and complex data sets. You will excel at telling stories with data, presenting information in visually compelling ways that appeal to executive audiences, and will be well versed in the development and delivery of reporting solutions. You will be responsible for building easy-to-use visualizations and performing data analysis to generate meaningful business insights from complex datasets for global stakeholders, and for testing key reports and producing process documentation. Present recommendations to maximize operational efficiency, quality, and compliance. Identify and define business requirements and translate data and business needs into research and recommendations to improve efficiency. Develop customized reports and ad hoc analyses to make recommendations and provide guidance to less experienced staff.
Posted 2 weeks ago
5.0 - 9.0 years
25 - 40 Lacs
Bengaluru
Remote
Key Responsibilities: Analyze end-to-end business processes using Celonis Process Mining to identify inefficiencies, root causes, and data quality issues. Collaborate with Data Stewards, IT, and business units to ensure data governance policies are aligned with process insights. Develop and maintain dashboards and reports in Celonis to track key performance indicators (KPIs), data lineage, and governance metrics. Work with cross-functional teams to define and implement data governance controls and remediation strategies based on process analytics. Support the development of data dictionaries, metadata management, and data cataloging in alignment with enterprise data standards. Assist in the rollout of enterprise-wide data governance programs and compliance initiatives (e.g., GDPR, CCPA). Continuously monitor and assess data quality metrics and suggest corrective actions to improve data accuracy and reliability. Provide training and documentation on Celonis best practices and data governance processes to business stakeholders. Required Qualifications: Bachelor's degree in Information Systems, Data Analytics, Business Administration, or a related field. 5+ years of experience in data governance, business process analysis, or data analytics roles. 2+ years of hands-on experience with Celonis EMS (Execution Management System) or comparable process mining tools. Strong understanding of data governance frameworks, data quality principles, and data lifecycle management. Proficiency in SQL and working knowledge of data visualization tools (e.g., Power BI, Tableau). Excellent analytical and problem-solving skills, with keen attention to detail. Strong communication and stakeholder engagement skills across technical and non-technical teams.
Posted 2 weeks ago
2.0 - 5.0 years
7 - 11 Lacs
Chennai
Work from Office
Overview: The Senior Data Science Engineer will leverage advanced data science techniques to solve complex business problems, guide decision-making processes, and mentor junior team members. This role requires a combination of technical expertise in data analysis and machine learning with project management skills. Responsibilities: Data Analysis and Modeling: Analyze large-scale telecom datasets to extract actionable insights and build predictive models for network optimization and customer retention. Conduct statistical analyses to validate models and ensure their effectiveness. Machine Learning Development: Design and implement machine learning algorithms for fraud detection, churn prediction, and network failure analysis. Telecom-Specific Analytics: Apply domain knowledge to improve customer experience by analyzing usage patterns, optimizing services, and predicting customer lifetime value. ETL Processes: Develop robust pipelines for extracting, transforming, and loading telecom data from diverse sources. Collaboration: Work closely with data scientists, software engineers, and telecom experts to deploy solutions that enhance operational efficiency. Data Governance: Ensure data integrity, privacy, security, and compliance with industry standards. Qualifications: Advanced degree in Data Science, Statistics, Computer Science, or a related field. Extensive experience in data science roles with a strong focus on machine learning and statistical modeling. Proficiency in programming languages such as Python or R and strong SQL skills. Familiarity with big data technologies (e.g., Hadoop, Spark) is advantageous. Expertise in cloud platforms such as AWS or Azure.
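To make the churn-prediction responsibility above concrete, here is a minimal, illustrative scikit-learn sketch; the CSV file, feature columns, and the churned label are hypothetical placeholders, not details from this listing.

```python
# Minimal churn-prediction sketch (illustrative only); dataset and columns are assumed.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score

df = pd.read_csv("telecom_usage.csv")  # hypothetical extract of subscriber usage patterns
features = ["monthly_minutes", "data_gb", "support_tickets", "tenure_months"]  # assumed columns

X_train, X_test, y_train, y_test = train_test_split(
    df[features], df["churned"], test_size=0.2, random_state=42, stratify=df["churned"]
)

model = GradientBoostingClassifier(random_state=42)
model.fit(X_train, y_train)

# AUC is a reasonable first metric when the churn label is imbalanced.
print("AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))
```

In practice the feature engineering (usage aggregation, recency windows) and model choice would depend on the operator's data; the sketch only shows the overall shape of the task.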
Posted 2 weeks ago
6.0 - 10.0 years
13 - 17 Lacs
Chennai
Work from Office
Overview: Prodapt is looking for a Data Model Architect. The candidate should be strong in design and data architecture in the Telecom domain. Responsibilities and Deliverables: Design and document the data model for the CMDB and the P-S-R Catalogue (Product, Service and Resource management layer). Design, document, and build the interface specification for data integration. Activities: Data Architecture and Modeling: Design and maintain conceptual, logical, and physical data models. Ensure scalability and adaptability of data models for future organizational needs. Model P-S-R catalogs in the existing Catalogs, SOM, and COM systems. CMDB Design and Management: Architect and optimize the CMDB to accurately reflect infrastructure components, telecom assets, and their relationships. Define data governance standards and enforce data consistency across the CMDB. Design data integrations across systems (e.g., OSS/BSS, network monitoring tools, billing systems). Good communication skills. Bachelor's Degree.
Posted 2 weeks ago
12.0 - 17.0 years
20 - 25 Lacs
Noida
Work from Office
Position Summary: Overall 12+ years of quality engineering experience with DWH/ETL for enterprise-grade applications. Hands-on experience with functional, non-functional, and automation testing of products. Hands-on experience leveraging LLMs/GenAI to improve the efficiency and effectiveness of the overall delivery process. Job Responsibilities: Lead end-to-end QE for the product suite. Author the QE test strategy for a release and execute it. Drive quality releases by working closely with development, PM, DevOps, support, and business teams. Achieve automation coverage for the product suite with good line coverage. Manage risks and resolve issues that affect release scope, schedule, and quality. Work with product teams to understand the impact of branches, code merges, etc. Lead and coordinate release activities, including overall execution. Lead a team of SDETs and help them address their issues. Mentor and coach team members. Education: BE/B.Tech, Master of Computer Application. Work Experience: Overall 12+ years of strong hands-on experience with DWH/ETL for enterprise-grade applications. Behavioural Competencies: Teamwork & Leadership, Motivation to Learn and Grow, Ownership, Cultural Fit, Talent Management. Technical Competencies: Life Sciences Knowledge, AWS Data Pipeline, Azure Data Factory, Data Governance, Data Modelling, Data Privacy, Data Security, Data Validation, Testing Tools, Data Visualisation, Databricks, Snowflake, Amazon Redshift, MS SQL Server, Performance Testing
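As a rough illustration of the data-validation testing this role covers, below is a minimal pytest-style reconciliation check between a staging table and a warehouse table. The in-memory SQLite connection and table names are stand-ins for a real DWH (Redshift, Snowflake, SQL Server, etc.) and are not part of the original posting.

```python
# Illustrative ETL reconciliation checks in pytest style; all names are hypothetical.
import sqlite3
import pytest

@pytest.fixture
def conn():
    # Stand-in for a real warehouse connection.
    c = sqlite3.connect(":memory:")
    c.executescript("""
        CREATE TABLE stg_orders (order_id INT, amount REAL);
        CREATE TABLE dw_orders  (order_id INT, amount REAL);
        INSERT INTO stg_orders VALUES (1, 10.0), (2, 20.0);
        INSERT INTO dw_orders  VALUES (1, 10.0), (2, 20.0);
    """)
    yield c
    c.close()

def test_row_counts_match(conn):
    src = conn.execute("SELECT COUNT(*) FROM stg_orders").fetchone()[0]
    tgt = conn.execute("SELECT COUNT(*) FROM dw_orders").fetchone()[0]
    assert src == tgt, f"Row count mismatch: staging={src}, warehouse={tgt}"

def test_no_missing_rows(conn):
    # Value-level reconciliation: every (order_id, amount) in staging must land in the warehouse.
    missing = conn.execute("""
        SELECT COUNT(*) FROM stg_orders s
        LEFT JOIN dw_orders d ON d.order_id = s.order_id AND d.amount = s.amount
        WHERE d.order_id IS NULL
    """).fetchone()[0]
    assert missing == 0
```

The same pattern extends to column-level checksums and threshold-based data quality checks, which can then be wired into a CI/CD pipeline for automated coverage.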
Posted 2 weeks ago
1.0 - 6.0 years
3 - 5 Lacs
Hyderabad
Work from Office
Role Description: We are seeking an MDM Associate Analyst with 2-5 years of development experience to support and enhance our enterprise MDM (Master Data Management) platforms using Informatica/Reltio. This role is critical in delivering high-quality master data solutions across the organization, utilizing modern tools like Databricks and AWS to drive insights and ensure data reliability. The ideal candidate will have strong SQL and data profiling skills, and experience working with cross-functional teams in a pharma environment. To succeed in this role, the candidate must have strong MDM (Master Data Management) experience in configuration (L3 configuration, asset creation, data modeling, etc.), ETL and data mappings (CAI, CDI), data mastering (match/merge and survivorship rules; a minimal survivorship sketch follows below), and source and target integrations (REST API, batch integration, integration with Databricks tables, etc.). Roles & Responsibilities: Analyze and manage customer master data using Reltio or Informatica MDM solutions. Perform advanced SQL queries and data analysis to validate and ensure master data integrity. Leverage Python, PySpark, and Databricks for scalable data processing and automation. Collaborate with business and data engineering teams for continuous improvement in MDM solutions. Implement data stewardship processes and workflows, including approval and DCR mechanisms. Utilize AWS cloud services for data storage and compute processes related to MDM. Contribute to metadata and data modeling activities. Track and manage data issues using tools such as JIRA and document processes in Confluence. Apply Life Sciences/Pharma industry context to ensure data standards and compliance. Basic Qualifications and Experience: Master's degree with 1-3 years of experience in Business, Engineering, IT or related field OR Bachelor's degree with 2-5 years of experience in Business, Engineering, IT or related field OR Diploma with 6-8 years of experience in Business, Engineering, IT or related field. Functional Skills: Must-Have Skills: Strong experience with Informatica or Reltio MDM platforms in building configurations from scratch (L3 configuration, data modeling, asset creation, setting up API integrations, orchestration). Strong experience in building data mappings, data profiling, and creating and implementing business rules for data quality and data transformation. Strong experience in implementing match and merge rules and survivorship of golden records. Expertise in integrating master data records with downstream systems. Very good understanding of DWH basics and good knowledge of data modeling. Experience with IDQ, data modeling, and approval workflow/DCR. Advanced SQL expertise and data wrangling. Exposure to Python and PySpark for data transformation workflows. Knowledge of MDM, data governance, stewardship, and profiling practices. Good-to-Have Skills: Familiarity with Databricks and AWS architecture. Background in Life Sciences/Pharma industries. Familiarity with project tools like JIRA and Confluence. Basics of data engineering concepts. Professional Certifications: Any ETL certification (e.g., Informatica). Any Data Analysis certification (SQL, Python, Databricks). Any cloud certification (AWS or Azure). Soft Skills: Strong analytical abilities to assess and improve master data processes and solutions. Excellent verbal and written communication skills, with the ability to convey complex data concepts clearly to technical and non-technical stakeholders. Effective problem-solving skills to address data-related issues and implement scalable solutions. Ability to work effectively with global, virtual teams.
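The survivorship sketch referenced above: a hand-rolled PySpark illustration of how a golden record might be selected per match group by source priority and recency. Column names, source ranks, and the match_id grouping are assumptions for illustration only; Informatica and Reltio configure these rules declaratively in the platform rather than in custom code.

```python
# Illustrative survivorship rule: pick one "golden" record per match group
# using source priority first, then recency. All data and rankings are assumed.
from pyspark.sql import SparkSession, functions as F, Window

spark = SparkSession.builder.appName("survivorship-sketch").getOrCreate()

records = spark.createDataFrame(
    [
        (1, "CRM", "2024-06-15", "A. Rao",   "Hyderabad"),
        (1, "ERP", "2024-05-01", "A Rao",    "Hyderabad"),
        (2, "CRM", "2023-11-20", "S. Mehta", "Pune"),
    ],
    ["match_id", "source", "updated_on", "name", "city"],
)

# Assumed trust ranking: lower number wins; recency breaks ties.
source_rank = (
    F.when(F.col("source") == "CRM", 1)
     .when(F.col("source") == "ERP", 2)
     .otherwise(9)
)

w = Window.partitionBy("match_id").orderBy(source_rank.asc(), F.col("updated_on").desc())
golden = (
    records.withColumn("rn", F.row_number().over(w))
           .filter(F.col("rn") == 1)
           .drop("rn")
)
golden.show()
```

A real rule set would also merge attributes field by field (different survivorship per attribute), which the platform's match/merge engine handles; this sketch only shows the record-level idea.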
Posted 2 weeks ago
1.0 - 3.0 years
3 - 6 Lacs
Hyderabad
Work from Office
Role Description: We are looking for a skilled MDM Testing Associate Analyst who will be responsible for ensuring the quality and integrity of Master Data Management (MDM) applications through rigorous testing processes. This role involves collaborating with cross-functional teams to define testing objectives, scope, and deliverables, ensuring that master data is accurate, consistent, and reliable, and that it complies with Amgen's standard operating procedures, policies, and guidelines. Your expertise will be instrumental in ensuring quality and adherence to required standards so that the engineering teams can build and deploy products that are compliant. Roles & Responsibilities: Test Planning: Develop and implement comprehensive testing strategies for MDM applications, including defining test objectives, scope, and deliverables. This includes creating detailed test plans, test cases, and test scripts. Test Execution: Execute test cases, report defects, and ensure that all issues are resolved before deployment. This involves performing functional, integration, regression, and performance testing. Data Analysis: Analyze data to identify trends, patterns, and insights that can be used to improve business processes and decision-making. This includes validating data accuracy, completeness, and consistency. Collaboration: Work closely with the MDM, RefData, and DQDG teams and other departments to ensure that the organization's data needs are met. This includes coordinating with data stewards, data architects, and business analysts. Documentation: Maintain detailed documentation of test cases, test results, and any issues encountered during testing. This includes creating test summary reports and defect logs. Quality Assurance: Develop and implement data quality metrics to ensure the accuracy and consistency of master data. This includes conducting regular data audits and implementing data cleansing processes. Compliance: Ensure that all master data is compliant with data privacy and protection regulations. This includes adhering to industry standards and best practices for data management. Training and Support: Provide training and support to end users to ensure proper use of MDM systems. This includes creating user manuals and conducting training sessions. Stay current on new technologies, validation trends, and industry best practices to improve validation efficiencies. Collaborate and communicate effectively with the product teams. Basic Qualifications and Experience: Master's degree with 1-3 years of experience in Business, Engineering, IT or related field OR Bachelor's degree with 2-5 years of experience in Business, Engineering, IT or related field OR Diploma with 6-8 years of experience in Business, Engineering, IT or related field. Functional Skills: Must-Have Skills: 2+ years of experience in MDM implementations, primarily with testing (pharmaceutical, biotech, medical devices, etc.). Extensive experience in ETL/ELT and MDM testing (creating test plans and test scripts, executing test scripts, and tracking/reporting bugs in JIRA). Informatica MDM: Proficiency in the Informatica MDM Hub console, configuration, IDD (Informatica Data Director), IDQ, and data modeling; or Reltio MDM: experience with Reltio components, including data modeling, integration, validation, cleansing, and unification. Advanced SQL: Ability to write and optimize complex SQL queries, including subqueries, joins, and window functions (an illustrative validation query follows below). Data Manipulation: Skills in data transformation techniques like pivoting and unpivoting. Stored Procedures and Triggers: Proficiency in creating and managing stored procedures and triggers for automation. Python: Strong skills in using Python for data analysis, including libraries like Pandas and NumPy. Automation: Experience in automating tasks using Python scripts. Machine Learning: Basic understanding of machine learning concepts and libraries like scikit-learn. Strong problem-solving and analytical skills. Excellent communication and teamwork skills. Good-to-Have Skills: ETL Processes: Knowledge of ETL processes for extracting, transforming, and loading data from various sources. Data Quality Management: Skills in data profiling and cleansing using tools like Informatica. Data Governance: Understanding of data governance frameworks and implementation. Data Stewardship: Ability to work with data stewards to enforce data policies and standards. Selenium: Experience with Selenium for automated testing of web applications. JIRA: Familiarity with JIRA for issue tracking and test case management. Postman: Skills in using Postman for API testing. Understanding of compliance and regulatory considerations in master data. In-depth knowledge of GDPR and HIPAA guidelines. Professional Certifications: MDM certification (Informatica or Reltio). SQL certification. Agile or SAFe certification. Soft Skills: Strong analytical abilities to assess and improve master data processes and solutions. Excellent verbal and written communication skills, with the ability to convey complex data concepts clearly to technical and non-technical stakeholders. Effective problem-solving skills to address data-related issues and implement scalable solutions. Ability to work effectively with global, virtual teams.
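The illustrative validation query mentioned above: a small window-function check that flags records still sharing the same business key after a match/merge run. The table, columns, and SQLite backend are hypothetical stand-ins; a real test would target the MDM hub's base objects and be tracked as a test script in JIRA.

```python
# Illustrative MDM duplicate check using ROW_NUMBER (requires SQLite 3.25+ for window functions).
# All table and column names are assumed for the example.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE party_master (party_id INT, src_system TEXT, full_name TEXT, email TEXT);
    INSERT INTO party_master VALUES
        (101, 'CRM', 'Asha Verma', 'asha.v@example.com'),
        (102, 'ERP', 'Asha Verma', 'asha.v@example.com'),  -- suspected unmerged duplicate
        (103, 'CRM', 'R. Iyer',    'r.iyer@example.com');
""")

# Anything with rn > 1 within a business-key partition should have been merged upstream.
suspects = conn.execute("""
    SELECT party_id, src_system, full_name, email
    FROM (
        SELECT *,
               ROW_NUMBER() OVER (PARTITION BY full_name, email ORDER BY party_id) AS rn
        FROM party_master
    ) AS t
    WHERE rn > 1
""").fetchall()

assert len(suspects) == 1, f"Unexpected duplicate set: {suspects}"
print("Duplicate candidates:", suspects)
```

In a test suite, the assertion would expect zero survivors and raise a defect when any appear.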
Posted 2 weeks ago
8.0 - 12.0 years
25 - 40 Lacs
Bengaluru
Work from Office
Job Summary We are seeking a highly skilled and experienced Senior Business Data Analyst to join our Entitlement and Install Base Master team. You will play a crucial role in driving the Install Base (IB) data strategy and vision. Your deep understanding of Install Base and Renewals business processes will be instrumental in ensuring accurate and efficient management of our Install Base Data. Job Requirements Drive the Install Base data strategy and vision, collaborating with cross-functional teams to define and implement data management processes and standards. Develop a comprehensive understanding of the Install Base and Renewals business, including key metrics, processes, and customer lifecycles. Drive Enterprise projects ensuring alignment with organizational goals and objectives. Collaborate with stakeholders to gather requirements and translate business needs into technical solutions for Install Base data management. Collaborate with cross-functional teams to define and implement data governance policies and procedures. Perform in-depth data analysis and validation to identify trends, patterns, and insights that drive business decision-making. Collaborate with IT teams to enhance data systems and tools supporting Install Base data management, ensuring data quality and accessibility. Provide guidance and support to cross-functional teams on Install Base data-related matters, acting as a subject matter expert. Identify opportunities for process improvements and automation to streamline Install Base data management and enhance operational efficiency. Stay up-to-date with industry trends and best practices in Install Base and Renewals business processes and data management. Coach and mentor team members to foster their professional growth and ensure smooth operations, promoting a collaborative and high-performing environment. Education 8+ years of experience as a Business Analyst, with a strong focus on Install Base and Renewals business processes. Proven track record of driving data strategy and vision for Install Base. Expertise in SQL querying and experience in working with complex relational databases. Proficiency in data analysis and manipulation techniques, including data cleansing, transformation, and validation. In-depth knowledge of Install Base and Renewals business processes, including customer lifecycles, product entitlements, and renewals workflows. Strong problem-solving and analytical skills, with the ability to translate business requirements into technical solutions. Excellent communication and collaboration skills, with the ability to work effectively with stakeholders at all levels. Detail-oriented with a focus on data accuracy and quality. Self-motivated and able to work independently with minimal supervision
Posted 2 weeks ago
4.0 - 9.0 years
14 - 17 Lacs
Pune, Gurugram
Work from Office
Define, implement, & enforce data governance policies & standards to ensure data quality, consistency, & compliance across the organization Collaborate with data stewards, business users, & IT teams to maintain metadata, lineage, & data catalog tools
Posted 2 weeks ago
5.0 - 9.0 years
7 - 17 Lacs
Pune
Work from Office
Job Overview: Diacto is seeking an experienced and highly skilled Data Architect to lead the design and development of scalable and efficient data solutions. The ideal candidate will have strong expertise in Azure Databricks, Snowflake (with DBT, GitHub, Airflow), and Google BigQuery. This is a full-time, on-site role based out of our Baner, Pune office. Qualifications: B.E./B.Tech in Computer Science, IT, or related discipline; MCS/MCA or equivalent preferred. Key Responsibilities: Design, build, and optimize robust data architecture frameworks for large-scale enterprise solutions. Architect and manage cloud-based data platforms using Azure Databricks, Snowflake, and BigQuery. Define and implement best practices for data modeling, integration, governance, and security. Collaborate with engineering and analytics teams to ensure data solutions meet business needs. Lead development using tools such as DBT, Airflow, and GitHub for orchestration and version control. Troubleshoot data issues and ensure system performance, reliability, and scalability. Guide and mentor junior data engineers and developers. Experience and Skills Required: 5 to 12 years of experience in data architecture, engineering, or analytics roles. Hands-on expertise in Databricks, especially Azure Databricks. Proficient in Snowflake, with working knowledge of DBT, Airflow, and GitHub. Experience with Google BigQuery and cloud-native data processing workflows. Strong knowledge of modern data architecture, data lakes, warehousing, and ETL pipelines. Excellent problem-solving, communication, and analytical skills. Nice to Have: Certifications in Azure, Snowflake, or GCP. Experience with containerization (Docker/Kubernetes). Exposure to real-time data streaming and event-driven architecture. Why Join Diacto Technologies? Collaborate with experienced data professionals and work on high-impact projects. Exposure to a variety of industries and enterprise data ecosystems. Competitive compensation, learning opportunities, and an innovation-driven culture. Work from our collaborative office space in Baner, Pune. How to Apply: Option 1 (Preferred): Copy and paste the following link into your browser and submit your application for the automated interview process: https://app.candidhr.ai/app/candidate/gAAAAABoRrTQoMsfqaoNwTxsE_qwWYcpcRyYJk7NzSUmO3LKb6rM-8FcU58CUPYQKc65n66feHor-TGdCEfyouj0NmKdgYcNbA==/ Option 2: 1. Please visit our website's career section at https://www.diacto.com/careers/ 2. Scroll down to the "Who are we looking for?" section. 3. Find the listing for "Data Architect (Data Bricks)". 4. Proceed with the virtual interview by clicking on "Apply Now".
Posted 2 weeks ago
5.0 - 8.0 years
20 - 25 Lacs
Bengaluru
Hybrid
About the Company: Greetings from Teamware Solutions, a division of Quantum Leap Consulting Pvt. Ltd. About the Role: We are hiring a Data Architect. Location: Bangalore. Work Model: Hybrid. Experience: 5-9 Years. Notice Period: Immediate to 15 Days. Job Description: Data Architecture, Data Governance, Data Modeling. Additional Information: Mandatory Skills: Data Architecture, Data Governance, Data Modeling. Nice-to-have skills: Certification in Data Engineering. Interview Mode: Virtual. A minimum of 5 years and a maximum of 9 years of relevant experience is required for this position. We prefer someone with strong experience building PySpark data streaming jobs on Azure Databricks who has delivered real projects and has hands-on expertise. Data governance and data modeling experience of at least 4 years is mandatory. Communication should be excellent. Please let me know if you are interested in this position and send your resume to netra.s@twsol.com
Posted 2 weeks ago
4.0 - 6.0 years
6 - 10 Lacs
Hyderabad
Work from Office
Role Description: The Manager Global Procurement Master Data/Data Governance will help lead Global Procurement on our data transformation journey to enhance our reporting & analytics while maintaining data quality and governance best practices. You will work across Global Procurement and the broader organization to execute the data strategy, establish better governance of the group's procurement master data throughout its lifecycle, and ensure that high-quality, actionable data is provided to the business. Roles & Responsibilities: Master Data/Data Governance: Develop and secure implementation of Master Data Governance across Global Procurement. Develop and continuously improve lifecycle processes for efficient master data across Source to Pay. Ensure effective master data governance disciplines across Global Procurement to establish data quality rules and an effective data quality program. Maintain and update governance documents, ensuring alignment across data owners and the center of excellence forum. Provide governance subject matter expertise across the integration process for projects and ensure data is being governed properly. Facilitate forums to communicate governance practices and principles to improve data literacy and compliance. Serve as a data-driven change agent and advocate, persuading and influencing the organization at the grassroots level on the importance of governance. Continuous Improvement & Compliance: Support continuous improvement efforts to ensure processes minimize the manual effort associated with collecting and refreshing data. Ensure that data and analytics practices within procurement adhere to relevant regulatory requirements. Key Stakeholder Management: Build strong working relationships within Global Procurement and other functions to develop a shared understanding of Master Data Management requirements, business processes, and functional interdependencies. Closely collaborate with the Data & Analytics team to understand the analytics roadmap and use cases. Manage the interconnection between data, analytics, process, and technology to enable the digital roadmap. Functional Skills: Must-Have Skills: Familiarity with industry best practices for data governance. Knowledge and understanding of driving organizational change for data governance. Experienced in establishing and enforcing standards, processes, policies, and procedures to support data quality and management. Demonstrated ability to understand complex information product constructs, databases, and data structures. Highly motivated self-starter with the ability to flexibly multi-task between strategy-related work and tactical, in-the-trenches work. Ability to produce measurable business results while working under time constraints. Procurement expertise with experience in a broad range of systems and data platforms: SAP, SAP S/4HANA, SAP Master Data Management, Tableau, Power BI, Alteryx, Celonis, etc. Good-to-Have Skills: Prior experience in managing teams. Soft Skills: Strong verbal and written communication skills. Ability to work effectively with global, virtual teams. Ability to navigate ambiguity. High degree of initiative and self-motivation. Ability to manage multiple priorities successfully. Team-oriented, with a focus on achieving team goals. Influencing and change management skills. Basic Qualifications: B.S./B.A. Minimum of 4-6 years of relevant business experience. Minimum of 2-3 years of multi-disciplined procurement experience. Minimum of 1 year of pharmaceutical procurement experience.
Posted 2 weeks ago
1.0 - 3.0 years
3 - 6 Lacs
Hyderabad
Work from Office
What you will do: In this vital role, you will be responsible for the end-to-end development of an enterprise analytics and data mastering solution using Databricks and Power BI. This role requires expertise in both data architecture and analytics, with the ability to create scalable, reliable, and impactful enterprise solutions that support research cohort building and the advanced research pipeline. The ideal candidate will have experience creating and surfacing large unified repositories of human data, based on integrations from multiple repositories and solutions, and be extraordinarily skilled in data analysis and profiling. You will collaborate closely with key customers, product team members, and related IT teams to design and implement data models, integrate data from various sources, and ensure best practices for data governance and security. The ideal candidate will have a good background in data warehousing, ETL, Databricks, Power BI, and enterprise data mastering. Design and build scalable enterprise analytics solutions using Databricks, Power BI, and other modern data tools. Leverage data virtualization, ETL, and semantic layers to balance the need for unification, performance, and data transformation with the goal of reducing data proliferation. Break down features into work that aligns with the architectural direction runway. Participate hands-on in pilots and proofs of concept for new patterns. Create robust documentation from data analysis and profiling, proposed designs, and data logic. Develop advanced SQL queries to profile and unify data (an illustrative profiling sketch follows below). Develop data processing code in SQL, along with semantic views, to prepare data for reporting. Develop Power BI models and reporting packages. Design robust data models and processing layers that support both analytical processing and operational reporting needs. Design and develop solutions based on best practices for data governance, security, and compliance within Databricks and Power BI environments. Ensure the integration of data systems with other enterprise applications, creating seamless data flows across platforms. Develop and maintain Power BI solutions, ensuring data models and reports are optimized for performance and scalability. Collaborate with key customers to define data requirements, functional specifications, and project goals. Continuously evaluate and adopt new technologies and methodologies to enhance the architecture and performance of data solutions. What we expect of you: We are all different, yet we all use our unique contributions to serve patients. The R&D Data Catalyst Team is responsible for building Data Searching, Cohort Building, and Knowledge Management tools that provide the Amgen scientific community with visibility into Amgen's wealth of human datasets, projects and study histories, and knowledge of various scientific findings. These solutions are pivotal tools in Amgen's goal to accelerate the speed of discovery and the speed to market of advanced precision medications. Basic Qualifications: Master's degree and 1 to 3 years of Data Engineering experience OR Bachelor's degree and 3 to 5 years of Data Engineering experience OR Diploma and 7 to 9 years of Data Engineering experience. Must-Have Skills: Minimum of 3 years of hands-on experience with BI solutions (preferably Power BI or Business Objects), including report development, dashboard creation, and optimization. Minimum of 3 years of hands-on experience building change-data-capture (CDC) ETL pipelines, data warehouse design and build, and enterprise-level data management. Hands-on experience with Databricks, including data engineering, optimization, and analytics workloads. Deep understanding of Power BI, including model design, DAX, and Power Query. Proven experience designing and implementing data mastering solutions and data governance frameworks. Expertise in cloud platforms (AWS), data lakes, and data warehouses. Strong knowledge of ETL processes, data pipelines, and integration technologies. Good communication and collaboration skills to work with cross-functional teams and senior leadership. Ability to assess business needs and design solutions that align with organizational goals. Exceptional hands-on capabilities with data profiling, data transformation, and data mastering. Success in mentoring and training team members. Good-to-Have Skills: ITIL Foundation or other relevant certifications (preferred). SAFe Agile Practitioner (6.0). Microsoft Certified: Data Analyst Associate (Power BI) or related certification. Databricks Certified Professional or similar certification. Soft Skills: Excellent analytical and troubleshooting skills. Deep intellectual curiosity. The highest degree of initiative and self-motivation. Strong verbal and written communication skills, including presentation of complex technical/business topics to varied audiences. Confident technical leader. Ability to work effectively with global, remote teams, including the use of tools and artifacts to assure clear and efficient collaboration across time zones. Ability to handle multiple priorities successfully. Team-oriented, with a focus on achieving team goals. Strong problem-solving and analytical skills; ability to learn quickly and to retain and synthesize complex information from diverse sources.
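The profiling sketch referenced above: a minimal PySpark example of the kind of profiling done before unifying data, computing per-column null rates and checking for duplicate business keys. The dataset and column names are invented for illustration and are not part of this listing.

```python
# Minimal data-profiling sketch in PySpark (illustrative only); data and columns are assumed.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("profiling-sketch").getOrCreate()

subjects = spark.createDataFrame(
    [("S-001", "F", 52, None), ("S-002", None, 47, "site-12"), ("S-001", "F", 52, "site-07")],
    ["subject_id", "sex", "age", "site"],
)

# Per-column null rates: the average of an is-null flag is the null fraction.
null_rates = subjects.agg(
    *[F.round(F.avg(F.col(c).isNull().cast("int")), 2).alias(f"{c}_null_rate")
      for c in subjects.columns]
)
null_rates.show()

# Duplicate check on the assumed business key before unification/mastering.
subjects.groupBy("subject_id").count().filter(F.col("count") > 1).show()
```

The same checks can be written as SQL views so that Power BI reports and downstream mastering logic share one profiled, validated layer.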
Posted 2 weeks ago
1.0 - 6.0 years
3 - 8 Lacs
Hyderabad
Work from Office
ABOUT THE ROLE Role Description: We are seeking an MDM Associate Analyst with 2-5 years of development experience to support and enhance our enterprise MDM (Master Data Management) platforms using Informatica/Reltio. This role is critical in delivering high-quality master data solutions across the organization, utilizing modern tools like Databricks and AWS to drive insights and ensure data reliability. The ideal candidate will have strong SQL and data profiling skills, and experience working with cross-functional teams in a pharma environment. To succeed in this role, the candidate must have strong MDM (Master Data Management) experience in configuration (L3 configuration, asset creation, data modeling, etc.), ETL and data mappings (CAI, CDI), data mastering (match/merge and survivorship rules), and source and target integrations (REST API, batch integration, integration with Databricks tables, etc.). Roles & Responsibilities: Analyze and manage customer master data using Reltio or Informatica MDM solutions. Perform advanced SQL queries and data analysis to validate and ensure master data integrity. Leverage Python, PySpark, and Databricks for scalable data processing and automation. Collaborate with business and data engineering teams for continuous improvement in MDM solutions. Implement data stewardship processes and workflows, including approval and DCR mechanisms. Utilize AWS cloud services for data storage and compute processes related to MDM. Contribute to metadata and data modeling activities. Track and manage data issues using tools such as JIRA and document processes in Confluence. Apply Life Sciences/Pharma industry context to ensure data standards and compliance. Basic Qualifications and Experience: Master's degree with 1-3 years of experience in Business, Engineering, IT or related field OR Bachelor's degree with 2-5 years of experience in Business, Engineering, IT or related field OR Diploma with 6-8 years of experience in Business, Engineering, IT or related field. Functional Skills: Must-Have Skills: Strong experience with Informatica or Reltio MDM platforms in building configurations from scratch (L3 configuration, data modeling, asset creation, setting up API integrations, orchestration). Strong experience in building data mappings, data profiling, and creating and implementing business rules for data quality and data transformation. Strong experience in implementing match and merge rules and survivorship of golden records. Expertise in integrating master data records with downstream systems. Very good understanding of DWH basics and good knowledge of data modeling. Experience with IDQ, data modeling, and approval workflow/DCR. Advanced SQL expertise and data wrangling. Exposure to Python and PySpark for data transformation workflows. Knowledge of MDM, data governance, stewardship, and profiling practices. Good-to-Have Skills: Familiarity with Databricks and AWS architecture. Background in Life Sciences/Pharma industries. Familiarity with project tools like JIRA and Confluence. Basics of data engineering concepts. Professional Certifications: Any ETL certification (e.g., Informatica). Any Data Analysis certification (SQL, Python, Databricks). Any cloud certification (AWS or Azure). Soft Skills: Strong analytical abilities to assess and improve master data processes and solutions. Excellent verbal and written communication skills, with the ability to convey complex data concepts clearly to technical and non-technical stakeholders. Effective problem-solving skills to address data-related issues and implement scalable solutions. Ability to work effectively with global, virtual teams.
Posted 2 weeks ago
7.0 - 8.0 years
9 - 10 Lacs
Hyderabad
Work from Office
What you will do: In this vital role you will take ownership of Transportation Master Data processes, ensuring the accuracy, consistency, and governance of critical data across the organization. This role will lead data validation, cleansing, and enrichment efforts, and collaborate with cross-functional teams to resolve complex data issues and drive process improvements. The Transportation Master Team Lead will also oversee key performance metrics, ensure compliance with data governance standards, and lead data migration and integration initiatives. Roles & Responsibilities: Lead and manage day-to-day MDM operations, including data validation, cleansing, and enrichment processes. Oversee data governance practices, ensuring compliance with internal standards and regulatory requirements. Collaborate with cross-functional teams, including IT and business units, to resolve complex data issues and improve data workflows. Implement and drive continuous improvements in MDM processes to enhance data accuracy, quality, and operational efficiency. Lead data migration, integration projects, and system upgrades to ensure seamless data consistency across platforms. Monitor and report on key performance indicators related to master data quality and operational success. What we expect of you: We are all different, yet we all use our unique contributions to serve patients. Basic Qualifications: Bachelor's degree in a STEM discipline and 7-8 years of experience in enterprise applications like SAP and Oracle, with proven experience in Transportation Master Data management and data governance. Industry experience, preferably in a healthcare or biotech supply chain. Proficient in MS Office and visualization tools like Spotfire, Tableau, and Power BI. Strong analytical skills with the ability to collaborate cross-functionally and resolve data issues. Experience in leading and managing successful teams. Preferred Qualifications: Must-Have Skills: Expertise in master data management processes and data governance. Solid understanding of SAP ECC and proven experience in data implementation/integration projects. Master data knowledge in the Transportation domain; other domains such as Material, Customer, and Production Master are a plus. Ability to lead and collaborate with cross-functional teams to set the data strategy for a master domain(s) and drive prioritization across different projects and day-to-day operations. Strong understanding of data governance frameworks and regulatory compliance standards and regulations (e.g., GDPR, HIPAA, GxP). Excellent problem-solving and analytical skills, with a focus on driving continuous improvement in data accuracy and quality. Good-to-Have Skills: SAP S/4, SAP MDG, SAP TM. Soft Skills: Excellent analytical and troubleshooting skills. Strong verbal and written communication skills. Ability to work effectively with global, virtual teams. High degree of initiative and self-motivation. Ability to manage multiple priorities successfully. Team-oriented, with a focus on achieving team goals. Strong presentation and public speaking skills.
Posted 2 weeks ago
2.0 - 5.0 years
4 - 7 Lacs
Hyderabad
Work from Office
ABOUT THE ROLE Role Description: You will play a key role in the implementation and adoption of the data governance framework that will modernize Amgen's data ecosystem, positioning Amgen as a leader in biopharma innovation. This role leverages state-of-the-art technologies, including Generative AI, Machine Learning, and integrated data. It involves working closely with business stakeholders and data analysts to ensure implementation and adoption of the data governance framework. You will collaborate with Data Product Owners, Data Stewards, and technology teams to increase the trust and reuse of data across Amgen. Roles & Responsibilities: Responsible for the execution of the data governance framework for a given domain of expertise (Research, Development, Supply Chain, etc.). Contribute to the operationalization of the enterprise data governance framework and align the broader stakeholder community with their data governance needs, including data quality, data access controls, compliance with privacy and security regulations, foundational master data management, data sharing, communication, and change management. Work with Enterprise MDM and Reference Data to enforce standards and data reusability. Contribute to cross-functional alignment in your domain(s) of expertise to ensure adherence to data governance principles. Maintain documentation on data definitions, data standards, data flows, legacy data structures/hierarchies, common data models, data harmonization, etc. for assigned domains. Partner with business teams to identify compliance requirements with data privacy, security, and regulatory policies for the assigned domains. Jointly with technology teams, business functions, and enterprise teams (e.g., MDM, Enterprise Data Fabric, etc.), deliver data foundations. Build strong relationships with key business leads and partners to ensure their needs are met. Functional Skills: Must-Have Functional Skills: Technical skills (advanced SQL, Python, etc.) with knowledge of pharma processes and specialization in a domain (e.g., Research, Clinical Trials, Commercial, etc.). Experience working with or supporting systems used for the data governance framework, e.g., Collibra, Alation. General knowledge of data management, common data models, metadata management, data quality, master data management, data stewardship, data protection, etc. Experience with the data products development life cycle, including the enablement of data dictionaries and a business glossary to increase data products' reusability and data literacy. Customer focused with excellent written and verbal communication skills; able to confidently work with internal Amgen business stakeholders and external service partners on business process and technology topics. Excellent problem-solving skills and strong attention to detail in finding solutions. Good-to-Have Functional Skills: Experience with Agile software development methodologies (Scrum). Soft Skills: Excellent analytical skills. Ability to work effectively with global, virtual teams. Ability to manage multiple priorities successfully. Team-oriented, with a focus on achieving team goals. Ability to build business relationships and understand end-to-end data use and needs. Strong verbal and written communication skills. Basic Qualifications: 5-9 years of experience in Business, Engineering, IT or related field.
Posted 2 weeks ago
9.0 - 12.0 years
11 - 14 Lacs
Hyderabad
Work from Office
ABOUT THE ROLE Role Description: We are seeking a Data Solutions Architect with deep expertise in Biotech/Pharma to design, implement, and optimize scalable and high-performance data solutions that support enterprise analytics, AI-driven insights, and digital transformation initiatives. This role will focus on data strategy, architecture, governance, security, and operational efficiency, ensuring seamless data integration across modern cloud platforms. The ideal candidate will work closely with engineering teams, business stakeholders, and leadership to establish a future-ready data ecosystem, balancing performance, cost-efficiency, security, and usability. This position requires expertise in modern cloud-based data architectures, data engineering best practices, and Scaled Agile methodologies. Roles & Responsibilities: Design and implement scalable, modular, and future-proof data architectures that support enterprise initiatives. Develop enterprise-wide data frameworks that enable governed, secure, and accessible data across various business domains. Define data modeling strategies to support structured and unstructured data, ensuring efficiency, consistency, and usability across analytical platforms. Lead the development of high-performance data pipelines for batch and real-time data processing, integrating APIs, streaming sources, transactional systems, and external data platforms. Optimize query performance, indexing, caching, and storage strategies to enhance scalability, cost efficiency, and analytical capabilities. Establish data interoperability frameworks that enable seamless integration across multiple data sources and platforms. Drive data governance strategies, ensuring security, compliance, access controls, and lineage tracking are embedded into enterprise data solutions. Implement DataOps best practices, including CI/CD for data pipelines, automated monitoring, and proactive issue resolution, to improve operational efficiency. Lead Scaled Agile (SAFe) practices, facilitating Program Increment (PI) Planning, Sprint Planning, and Agile ceremonies, ensuring iterative delivery of enterprise data capabilities. Collaborate with business stakeholders, product teams, and technology leaders to align data architecture strategies with organizational goals. Act as a trusted advisor on emerging data technologies and trends, ensuring that the enterprise adopts cutting-edge data solutions that provide competitive advantage and long-term scalability. Must-Have Skills: Experience in data architecture, enterprise data management, and cloud-based analytics solutions. Well versed in the Biotech/Pharma domain, with a track record of solving complex problems for it using data strategy. Expertise in Databricks, cloud-native data platforms, and distributed computing frameworks. Strong proficiency in modern data modeling techniques, including dimensional modeling, NoSQL, and data virtualization. Experience designing high-performance ETL/ELT pipelines and real-time data processing solutions. Deep understanding of data governance, security, metadata management, and access control frameworks. Hands-on experience with CI/CD for data solutions, DataOps automation, and infrastructure as code (IaC). Proven ability to collaborate with cross-functional teams, including business executives, data engineers, and analytics teams, to drive successful data initiatives. Strong problem-solving, strategic thinking, and technical leadership skills. Experienced with SQL/NoSQL databases and vector databases for large language models. Experienced with data modeling and performance tuning for both OLAP and OLTP databases. Experienced with Apache Spark and Apache Airflow (a minimal orchestration sketch follows below). Experienced with software engineering best practices, including but not limited to version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven, etc.), automated unit testing, and DevOps. Good-to-Have Skills: Experience with Data Mesh architectures and federated data governance models. Certification in cloud data platforms or enterprise architecture frameworks. Knowledge of AI/ML pipeline integration within enterprise data architectures. Familiarity with BI & analytics platforms for enabling self-service analytics and enterprise reporting. Education and Professional Certifications: 9 to 12 years of experience in Computer Science, IT or related field. AWS Certified Data Engineer preferred. Databricks certification preferred. Soft Skills: Excellent analytical and troubleshooting skills. Strong verbal and written communication skills. Ability to work effectively with global, virtual teams. High degree of initiative and self-motivation. Ability to manage multiple priorities successfully. Team-oriented, with a focus on achieving team goals. Ability to learn quickly; organized and detail oriented. Strong presentation and public speaking skills.
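The orchestration sketch referenced above: a minimal Apache Airflow 2.x DAG showing the ingest-then-validate shape of a batch pipeline. The dag_id, schedule, and task callables are placeholders, and the `schedule=` argument assumes Airflow 2.4+; this is illustrative only, not an implementation from this listing.

```python
# Minimal Airflow 2.x DAG sketch for a batch ingest-then-validate pipeline (names assumed).
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def ingest():
    print("pull batch files / API increments into the landing zone")

def validate():
    print("run row-count and schema checks before publishing to curated tables")

with DAG(
    dag_id="batch_pipeline_sketch",   # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                # Airflow 2.4+; earlier versions use schedule_interval
    catchup=False,
) as dag:
    ingest_task = PythonOperator(task_id="ingest", python_callable=ingest)
    validate_task = PythonOperator(task_id="validate", python_callable=validate)
    ingest_task >> validate_task
```

In a production setting each task would typically submit Spark or Databricks jobs rather than run locally, with failure alerts and lineage capture wired into the same DAG.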
Posted 2 weeks ago
3.0 - 8.0 years
20 - 30 Lacs
Ahmedabad
Hybrid
Compare and match data between systems; investigate and fix mismatches. Help build dashboards, support audits, and maintain clear documentation. Manage new data entries, ensure accuracy, and oversee smooth data processes.
Required Candidate Profile:
3 to 5 years of experience.
Experience of data governance and related DG systems.
Confidence in using applications, with some systems experience (SAP, HFM, Oracle, Snowflake).
Autonomy to review and research.
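Comparing and matching data between systems is typically a key-based reconciliation. A minimal sketch in Python/pandas, assuming hypothetical extracts from two systems keyed on an account identifier (file and column names are assumptions, not taken from the posting), might look like this:

```python
# Illustrative only: reconcile two hypothetical system extracts on a shared key
# and flag amount mismatches. File and column names are assumptions.
import pandas as pd

source = pd.read_csv("system_a_extract.csv")  # e.g. an ERP extract
target = pd.read_csv("system_b_extract.csv")  # e.g. a consolidation-system extract

merged = source.merge(
    target,
    on="account_id",
    how="outer",
    suffixes=("_src", "_tgt"),
    indicator=True,
)

# Records present in only one system.
missing = merged[merged["_merge"] != "both"]

# Records present in both systems whose amounts disagree beyond a small tolerance.
both = merged[merged["_merge"] == "both"]
mismatched = both[(both["amount_src"] - both["amount_tgt"]).abs() > 0.01]

print(f"{len(missing)} unmatched records, {len(mismatched)} amount mismatches")
mismatched.to_csv("reconciliation_breaks.csv", index=False)
```

The break file produced at the end is the kind of artifact that feeds the dashboards, audit support, and documentation mentioned in the posting.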
Posted 2 weeks ago
3.0 - 7.0 years
8 - 14 Lacs
Ahmedabad
Work from Office
3–5 years' experience in data reconciliations (Catalyst/Keystone to GFIN, GFIN vs HFM), dashboard support (Power BI), audit support, and data governance (MDG). Proactive, Excel-savvy, and system-fluent (SAP, HFM, Oracle), with strong analytical and communication skills.
Posted 2 weeks ago
10.0 - 20.0 years
10 - 20 Lacs
Hyderabad
Work from Office
Job Title: SAP DATA STEWARD (CONTRACT)
Location: HYDERABAD, INDIA
The SAP Data Steward is responsible for creating, maintaining, and deactivating master data and data attributes in SAP, with a focus on data migration. The Data Steward plays an essential role in monitoring existing and new data and in ensuring timely, high-quality creation of new data in the system.
Bachelor's degree, or Associate's degree with an additional 9+ years of work experience, or an equivalent combination of education and experience required.
Requires SAP functional knowledge of SAP Routings from a migration perspective.
Should have complete knowledge of the SAP Routings tables, including basic knowledge of how the tables link and the joins needed to prepare the extract template in the client's format.
Excellent attention to detail and exceptional interest in creating order and consistency required.
10+ years of experience in data management and/or data governance activities and responsibilities.
Experience working with SAP ECC required.
Demonstrated expert-level experience and capability with MS Excel required.
High degree of initiative and ownership, as well as a proven history of delivering results while working with several different departments in a fast-paced environment, required.
Experience creating and running business reports and data queries preferred.
Confident user of Microsoft Office (Word, Excel, Outlook, PowerPoint, Teams).
Experience working with teams across multiple functions.
Ability to multi-task and work under tight timelines required.
Excellent communication skills, both verbal and written.
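The posting calls out knowledge of the SAP Routings tables and of the joins needed to build a migration extract template. A minimal sketch, assuming routing headers (PLKO) and operations (PLPO) have already been exported to CSV, with a hypothetical output layout (in a real migration the PLAS table is typically also needed to tie operations to the correct group counter), might look like this:

```python
# Illustrative only: join exported SAP routing header (PLKO) and operation (PLPO)
# extracts into a flat migration template. File names, the selected fields, and
# the output layout are assumptions, not the client's actual format.
import pandas as pd

# Keep only the fields needed for this sketch; real extracts carry many more.
plko = pd.read_csv("PLKO_extract.csv")[["PLNTY", "PLNNR", "PLNAL", "WERKS"]]
plpo = pd.read_csv("PLPO_extract.csv")[["PLNTY", "PLNNR", "VORNR", "STEUS", "LTXA1"]]

# Simplified join on task list type and group; a full migration extract would
# normally also use PLAS to resolve which operations belong to each group
# counter (PLNAL).
routing = plko.merge(plpo, on=["PLNTY", "PLNNR"], how="inner")

# Hypothetical extract-template column names.
template = routing.rename(
    columns={
        "PLNNR": "RoutingGroup",
        "PLNAL": "GroupCounter",
        "WERKS": "Plant",
        "VORNR": "Operation",
        "STEUS": "ControlKey",
        "LTXA1": "OperationText",
    }
)[["RoutingGroup", "GroupCounter", "Plant", "Operation", "ControlKey", "OperationText"]]

template.sort_values(["RoutingGroup", "GroupCounter", "Operation"]).to_csv(
    "routing_extract_template.csv", index=False
)
```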
Posted 2 weeks ago
8.0 - 13.0 years
11 - 21 Lacs
Bengaluru
Work from Office
Key Responsibilities:
1. Solution Design & Implementation: Configure and implement SAP HCM modules, including Personnel Administration (PA), Organizational Management (OM), Time Management, Payroll, and Employee Self-Service (ESS)/Manager Self-Service (MSS). Develop customized solutions for complex HR and payroll requirements, including statutory compliance and reporting. Integrate SAP HCM with third-party systems for payroll, benefits, and time tracking.
2. Support & Optimization: Provide end-to-end support for SAP HCM modules, addressing user queries, system issues, and enhancements. Optimize existing configurations and processes to improve system performance and user experience.
3. Cross-Module Integration: Ensure seamless integration of HCM modules with other SAP modules such as FI/CO, SuccessFactors, and SAP Fiori. Collaborate with technical teams to implement interfaces, reports, and workflows.
4. Emerging Technology Adoption: Support and configure SAP SuccessFactors Employee Central and its integration with SAP HCM. Leverage SAP Fiori apps to enhance the user experience for HR and payroll processes.
5. Stakeholder Collaboration: Collaborate with HR business teams to gather requirements, translate them into technical specifications, and deliver effective solutions. Act as a bridge between the technical and functional teams, ensuring smooth project execution.
6. Data Governance & Reporting: Ensure accurate and secure management of employee data in SAP systems. Develop and maintain reports using tools like SAP Query, Ad Hoc Reporting, or ABAP reports.
Core Must-Have Skills:
• Expertise in SAP HCM modules, including Personnel Administration (PA), Organizational Management (OM), Time Management, Payroll (local and global compliance), and Employee Self-Service (ESS)/Manager Self-Service (MSS).
• Strong configuration and customization experience for statutory payroll and time evaluation.
• Knowledge of integration with SAP FI/CO for payroll posting and reconciliations.
• Experience implementing and supporting SAP SuccessFactors Employee Central and Recruiting/Onboarding modules.
• Hands-on experience with HR Renewal functionality and SAP Fiori for HR processes.
Good-to-Have Skills:
• Familiarity with SAP BTP for extending HR functionality.
• Understanding of the Talent Management suite (Learning, Performance, Succession Planning).
• Experience implementing global payroll solutions for multi-geography operations.
• Proficiency in developing custom HR reports using ABAP HR or SAP Analytics Cloud (SAC).
Market Standard Expectations:
1. Certifications: SAP HCM or SAP SuccessFactors certifications; payroll certification specific to regional compliance (e.g., Nordic).
2. Project Experience: Exposure to end-to-end SAP HCM implementation and upgrade projects; hands-on experience with SAP ECC to S/4HANA migration projects.
3. Emerging Technologies: Knowledge of AI/ML-driven HR solutions integrated with SAP systems; experience leveraging robotic process automation (RPA) for HR workflows.
Posted 2 weeks ago
9.0 - 14.0 years
10 - 20 Lacs
Hyderabad, Bengaluru
Hybrid
Job Title: Atlan Data Governance Implementation Engineer (Hands-On Role)
Key Responsibilities:
Atlan Deployment & Connect
Required Skills and Qualifications:
Minimum 8+ years of relevant experience in data governance, data analytics, or related fields.
Strong understanding of data governance principles and best practices.
Experience with data profiling, validation, and cleansing tools and techniques.
Ability to analyze data, identify patterns, and recommend solutions.
Excellent communication and interpersonal skills.
Ability to work independently and collaboratively in a team environment.
Hands-on experience with relational databases and data warehouses.
Knowledge of data quality metrics, dashboards, and reporting.
Experience with data governance tools and technologies.
Strong analytical and problem-solving skills.
Proficient in Python, REST API development, Pandas, and NumPy.
Experience with cloud platforms such as Azure, GCP, or AWS.
Proficient in SQL and RDBMS.
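Data profiling and data-quality metrics of the kind listed above are commonly prototyped in Python with pandas before being wired into a governance platform such as Atlan. A minimal sketch, with an illustrative input file and a 95% completeness threshold chosen purely as an assumption, might look like this:

```python
# Illustrative only: basic column-level profiling of a hypothetical dataset,
# producing the completeness/uniqueness metrics a governance dashboard might
# report. File name, column names, and the threshold are assumptions.
import pandas as pd

df = pd.read_csv("customer_master.csv")

profile = pd.DataFrame(
    {
        "dtype": df.dtypes.astype(str),
        "non_null_count": df.notna().sum(),
        "completeness_pct": (df.notna().mean() * 100).round(2),
        "distinct_values": df.nunique(),
        "duplicate_rows": [df.duplicated().sum()] * df.shape[1],
    }
)

# Simple rule-based check that could feed a data-quality dashboard or alert.
issues = profile[profile["completeness_pct"] < 95.0]

print(profile)
print(f"columns below 95% completeness: {list(issues.index)}")
profile.to_csv("profiling_report.csv")
```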
Posted 2 weeks ago