
317 Data Profiling Jobs - Page 12

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

7.0 - 12.0 years

6 - 15 Lacs

Pune, Gurugram, Bengaluru

Work from Office

Job Title: MDM Developer (Any MDM Tool)
Location: Bangalore / Pune / Chennai / Gurgaon / Hyderabad
Experience: 7-10 Years
Salary: 6-14 LPA
Employment Type: Full-Time
Client: Brillio
Notice Period: Immediate joiners preferred
On Payroll of: Nyxtech
Hiring Contact: Yash Sharma (LinkedIn): linkedin.com/in/yashsharma1608

Job Description: Brillio is seeking an experienced MDM Developer to join its data engineering team. This role demands a strong background in Master Data Management (MDM), SQL, and Python, with practical experience in data profiling, standardization, and API integrations. Candidates with hands-on exposure to Oracle MDM will be given preference. This is a high-impact role for professionals who can design and implement enterprise-wide master data solutions, ensuring data quality, consistency, and governance across platforms.

Key Responsibilities: Design and implement MDM solutions using any enterprise MDM tool (Oracle, Informatica, SAP MDG, TIBCO, etc.). Develop data profiling, cleansing, and standardization processes. Integrate master data solutions with external systems via REST/SOAP APIs. Collaborate with business and technical teams to define master data entities and workflows. Build and maintain data models for customer, product, supplier, and other core domains. Write optimized SQL queries for data transformation and validation. Support data governance and stewardship activities to ensure master data integrity.

Required Skills: 7-10 years of experience in enterprise data management or data engineering roles. Strong expertise in any leading MDM platform (Oracle MDM preferred). Proficiency in SQL for data analysis and transformation. Solid programming skills in Python for automation and custom scripting. Experience with API development and integration (REST, SOAP). Deep understanding of data quality principles, profiling tools, and standardization techniques. Excellent problem-solving and communication skills.

Nice to Have: Hands-on experience with Oracle Customer Hub or Oracle Product Hub. Familiarity with data governance frameworks and data stewardship tools. Knowledge of cloud-based MDM solutions or modern data platforms.

Why Join Brillio? Work on cutting-edge enterprise data solutions. Be part of a high-performing team driving data transformation across industries. Opportunity to grow in MDM, data governance, and advanced data engineering.
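The data profiling, cleansing, and standardization work this role describes can be sketched with a few lines of standard-library Python. This is an illustrative example only — the sample column values and metric names are invented, and a real MDM pipeline would run checks like these inside the chosen platform:

```python
from collections import Counter

def profile_column(values):
    """Compute basic data-profiling metrics for one column:
    null rate, distinct count, and most common value."""
    total = len(values)
    # Treat both None and empty strings as nulls.
    nulls = sum(1 for v in values if v is None or v == "")
    non_null = [v for v in values if v is not None and v != ""]
    most_common = Counter(non_null).most_common(1)
    return {
        "null_rate": nulls / total if total else 0.0,
        "distinct_count": len(set(non_null)),
        "most_common": most_common[0][0] if most_common else None,
    }

# Profile a hypothetical customer "country" column with two null-ish entries.
countries = ["IN", "IN", "US", "", None, "IN", "UK"]
stats = profile_column(countries)
print(stats)  # null_rate ~0.29, distinct_count 3, most_common 'IN'
```

Metrics like these feed standardization rules (e.g. flagging a column for cleansing once its null rate crosses a threshold).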

Posted 2 months ago

Apply

5.0 - 7.0 years

3 - 7 Lacs

Mumbai, Delhi / NCR, Bengaluru

Work from Office

About the Role: A NASDAQ-listed company that has effectively maintained its position as a front-runner in the technology sector is looking to onboard a skilled Python developer keen on helping them expand the power of AI in the mobile coding ecosystem, with the ultimate goal of creating new AI-powered assistive tools for the development domain. The company is developing the next generation of coding-assistant agents, which will have a wide range of uses. For those who are keen to learn in a fast-paced setting, this is an exciting opportunity.

Responsibilities: Navigate and modify complex Rust codebases using CLI tools like grep and ripgrep. Implement new features with a focus on memory safety, ownership rules, and type correctness. Write and execute tests using cargo test, including property-based testing (proptest or quickcheck). Refactor existing Rust code while maintaining functionality and performance. Debug and fix memory-safety, ownership, and concurrency-related issues. Set up and manage Rust development environments using cargo, including handling dependencies and feature flags. Ensure best practices in Rust development, including proper error handling, concurrency safety, and efficient memory usage.

Requirements: Strong experience with Rust programming language concepts, including ownership, borrowing, and lifetimes. Familiarity with Rust frameworks like Tokio, Actix, and Rocket, and libraries such as Serde and Rayon. Experience with Rust's testing ecosystem, including unit, integration, and property-based testing. Knowledge of multi-threading and asynchronous programming in Rust. Ability to work with complex architectural patterns and refactor code without introducing regressions. Strong debugging skills, including fixing memory and concurrency issues. Experience with performance profiling and benchmarking in Rust (cargo bench). 4+ years of work experience.

This role provides an opportunity to work on challenging Rust engineering problems while improving AI-assisted programming workflows. If you're passionate about Rust and eager to push the boundaries of AI-driven software development, we'd love to hear from you!

Nice to Have: Experience contributing to open-source Rust projects. Familiarity with writing Rust documentation and designing APIs with doc-tests.

Search Guidance — Mandatory Skills: Rust - 3 yrs; Rust frameworks (Tokio, Actix, Rocket) - 3 yrs; libraries (Serde and Rayon) - 3 yrs. Nice to Have: Rust documentation and designing APIs with doc-tests. Total Years of Experience: 5+ years. Overlap Required: 4 hours PST. Locations: Mumbai, Delhi / NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune, Remote.
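The property-based testing the listing mentions (proptest/quickcheck in Rust) works by generating many random inputs and asserting invariants that must hold for all of them, rather than hand-picking example cases. A minimal hand-rolled sketch of the idea, shown in Python as a language-neutral illustration (the role itself targets Rust and cargo test):

```python
import random

def dedupe_preserving_order(items):
    """Remove duplicates while keeping first-seen order."""
    seen = set()
    out = []
    for x in items:
        if x not in seen:
            seen.add(x)
            out.append(x)
    return out

# Property-based check: random inputs, invariants that must hold for ALL of them.
random.seed(0)
for _ in range(200):
    xs = [random.randint(0, 9) for _ in range(random.randint(0, 20))]
    result = dedupe_preserving_order(xs)
    assert len(result) == len(set(xs))                 # exactly one copy of each value
    assert dedupe_preserving_order(result) == result   # idempotence
print("all properties held")
```

Libraries like proptest add the key extra feature this sketch lacks: automatic shrinking of a failing input down to a minimal counterexample.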

Posted 2 months ago

Apply

1.0 - 5.0 years

11 - 15 Lacs

Pune

Work from Office

Role: Data QA Lead
Experience Required: 8+ Years
Location: India/Remote

Company Overview: At Codvo, software and people transformations go hand-in-hand. We are a global empathy-led technology services company. Product innovation and mature software engineering are part of our core DNA. Respect, Fairness, Growth, Agility, and Inclusiveness are the core values that we aspire to live by each day. We continue to expand our digital strategy, design, architecture, and product management capabilities to offer expertise, outside-the-box thinking, and measurable results.

The Data Quality Analyst is responsible for ensuring the quality, accuracy, and consistency of data within the Customer and Loan Master Data API solution. This role will work closely with data owners, data modelers, and developers to identify and resolve data quality issues.

Responsibilities: Profile data to identify data quality issues. Define and implement data quality rules and standards. Perform data cleansing and data validation, and track and report on data quality metrics and issues. Work with data owners to resolve data quality issues. Monitor data quality trends and identify areas for improvement. Collaborate with data modelers to ensure data models support data quality requirements. Collaborate with developers to implement data quality solutions. Participate in data governance activities to ensure data quality and compliance. Stay up to date with emerging data quality technologies and trends.

Skills & Qualifications: Bachelor's degree in Computer Science, Information Systems, or a related field. 5+ years' experience in Data Quality Assurance. SharePoint testing / FTP or file data migration testing (minimum experience of testing 5 to 10 lakh files). Strong understanding of data quality principles and techniques. Experience with data profiling tools. Proficiency in SQL and data analysis. Knowledge of data governance principles and practices. Excellent communication, collaboration, and analytical skills. Ability to work independently and as part of a team.
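Data quality rules of the kind this role defines and implements can be expressed as simple per-field predicate checks. A minimal sketch — the field names and formats (customer_id, loan_amount, email) are hypothetical, not taken from the actual Customer and Loan Master Data API solution:

```python
import re

# Hypothetical data-quality rules for a loan-master record.
RULES = {
    "customer_id": lambda v: bool(re.fullmatch(r"C\d{6}", str(v))),
    "loan_amount": lambda v: isinstance(v, (int, float)) and v > 0,
    "email":       lambda v: "@" in str(v),
}

def validate(record):
    """Return the names of the fields that violate their rule."""
    return [field for field, rule in RULES.items()
            if field in record and not rule(record[field])]

bad = {"customer_id": "X123", "loan_amount": -50, "email": "a@b.com"}
print(validate(bad))  # ['customer_id', 'loan_amount']
```

In practice each violation would be logged with the record key and aggregated into the data quality metrics the role reports on.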

Posted 2 months ago

Apply

1.0 - 5.0 years

7 - 11 Lacs

Pune

Work from Office

Role: Data QA
Experience Required: 5+ Years
Location: India/Remote

Company Overview: At Codvo, software and people transformations go hand-in-hand. We are a global empathy-led technology services company. Product innovation and mature software engineering are part of our core DNA. Respect, Fairness, Growth, Agility, and Inclusiveness are the core values that we aspire to live by each day. We continue to expand our digital strategy, design, architecture, and product management capabilities to offer expertise, outside-the-box thinking, and measurable results.

The Data Quality Analyst is responsible for ensuring the quality, accuracy, and consistency of data within the Customer and Loan Master Data API solution. This role will work closely with data owners, data modelers, and developers to identify and resolve data quality issues.

Responsibilities: Profile data to identify data quality issues. Define and implement data quality rules and standards. Perform data cleansing and data validation, and track and report on data quality metrics and issues. Work with data owners to resolve data quality issues. Monitor data quality trends and identify areas for improvement. Collaborate with data modelers to ensure data models support data quality requirements. Collaborate with developers to implement data quality solutions. Participate in data governance activities to ensure data quality and compliance. Stay up to date with emerging data quality technologies and trends.

Skills & Qualifications: Bachelor's degree in Computer Science, Information Systems, or a related field. 5+ years' experience in Data Quality Assurance. SharePoint testing / FTP or file data migration testing. Strong understanding of data quality principles and techniques. Experience with data profiling tools. Proficiency in SQL and data analysis. Knowledge of data governance principles and practices. Excellent communication, collaboration, and analytical skills. Ability to work independently and as part of a team.

Posted 2 months ago

Apply

4.0 - 7.0 years

10 - 14 Lacs

Hyderabad

Work from Office

Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by diversity and inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health equity on a global scale. Join us to start Caring. Connecting. Growing together.

Primary Responsibilities: Subject matter expert (SME) in one or more healthcare domains. Analyzes and documents the client's business requirements and processes, and communicates these requirements by constructing conceptual data and process models, including data dictionaries and functional design documents. Collaborates with data teams, departments, other IT groups, and technology vendors to define data needs and facilitate analytic solutions. Provides input into developing and modifying data warehouse and analytic systems to meet client needs, and develops business specifications to support these modifications. Ability to communicate complex technical and functional concepts verbally and in writing. Ability to lead socialization and consensus-building efforts for innovative data and analytic solutions. Identifies opportunities for reuse of data across the enterprise; profiles and validates data sources. Creates test scenarios and develops test plans to be used in testing the data products to verify that client requirements are incorporated into the system design. Assists in analyzing testing results throughout the project. Participates in architecture and technical reviews to verify that the 'intent of change' is carried out through the entire project. Performs root cause analysis and application resolution. Assigns work to and develops less experienced team members. Ensures proper documentation and on-time delivery of all functional artifacts and deliverables. Documents and communicates architectural vision, technical strategies, and trade-offs to gain broad buy-in. Reduces inefficiencies through rationalization and standards adherence. Responsible for identifying, updating, and curating the data standardization and data quality rules. Responsible for leading and providing direction for data management, data profiling, and source-to-target mapping. Responsible for optimizing and troubleshooting data engineering processes. Works independently, but partners effectively with a broader team, to design and develop enterprise data solutions. Ability to creatively take on new challenges and work outside the comfort zone. Comfortable communicating alternative ideas with clinicians related to information options and solutions. Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regard to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.

Required Qualifications: Bachelor's degree (preferably in information technology, engineering, math, computer science, analytics, or another related field). Revenue Cycle Management domain experience. 7+ years in a healthcare data warehouse setting and experience in profiling and analyzing disparate healthcare datasets (financial, clinical quality, value-based care, population health, revenue cycle analytics, health system operations, etc.), with the ability to convert this data into insights. 7+ years working with healthcare datasets and the ability to convert business requirements into functional designs that are scalable and maintainable. 7+ years of experience with Oracle/SQL Server databases, including T-SQL, PL/SQL, indexing, partitioning, and performance tuning. 7+ years of experience in creating source-to-target mappings and ETL designs (using SSIS / Informatica / DataStage) for integration of new/modified data streams into the data warehouse/data marts. 5+ years of experience in designing and implementing data models to support analytics, with solid knowledge of dimensional modeling concepts. Experience with Epic Clarity and/or Caboodle data models. Healthcare domain knowledge.

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone - of every race, gender, sexuality, age, location and income - deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes.
We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.

Posted 2 months ago

Apply

2.0 - 6.0 years

6 - 10 Lacs

Noida

Work from Office

Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by diversity and inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health equity on a global scale. Join us to start Caring. Connecting. Growing together.

We are looking for an Associate Manager, Data Analyst in NICE Reporting. You will work on BI & Claims reporting, be responsible for Power BI dashboard development, and help us drive automation using Python and Azure, while providing delivery and operational support in BI & Claims reporting. The selected individual will produce reports and develop dashboards in Power BI for NICE/BI and Claims reporting, and will migrate existing manual reports to Power BI as part of a modernization exercise. You will work closely with external and internal stakeholders, develop and build data processing solutions, solve problems, and provide ad hoc support to resolve issues.

Primary Responsibilities: Develop end-to-end dashboards by gathering business requirements in an iterative/agile model. Collaborate with cross-functional teams and project managers from design to implementation, monitoring, and maintenance. Design business analysis and data migration for departmental use. Maintain and update dashboards to ensure accuracy. Regularly review data reports to identify and resolve errors. Analyze and collect data for various business reports, and build data models in Azure and pipelines using Azure Data Factory. Create insightful business reports and communicate results. Coordinate with supervisors/clients and manage daily activities of business support and technical teams. Build business intelligence tools, conduct analysis to identify patterns and trends, and ensure data quality using tools like Power BI, Alteryx, and Python. Analyze large datasets to extract meaningful insights and trends using Python and data science techniques. Partner with stakeholders to understand data requirements and develop tools/models such as segmentation, dashboards, data visualizations, decision aids, and business case analysis. Analyze and document requirements, and evaluate and build effective solutions. Query data sources for analyses, detailed data profiling, and reporting. Research complex functional data/analytical issues. Perform source system analysis and gap analysis between source and target systems. Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regard to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.

Required Qualifications: Bachelor's or master's degree in engineering, with experience in data visualization, data engineering, and data science. 8+ years of relevant industry experience (healthcare preferred). Experience developing Power BI dashboards and transitioning work. Experience in requirements gathering and data warehousing systems analysis. Proficient in designing and developing scalable data pipelines, optimizing data storage solutions, and ensuring data security for cloud applications and services using Azure technologies. Proficient in SQL, Power BI, Python, and machine learning. Solid understanding of AI and machine learning, and solid knowledge of Python programming, including libraries for data analysis (e.g., Pandas, NumPy) and machine learning (e.g., scikit-learn, TensorFlow, Keras). Solid verbal and written communication skills to effectively collaborate with team members and stakeholders. Solid skills in process support, automation, and business insights. Ability to organize data, recognize trends, and develop innovative approaches. Analytical skills for data-driven report development, and excellent critical thinking and attention to detail.

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone - of every race, gender, sexuality, age, location and income - deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes.
We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.
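The "analyze large datasets to extract trends" responsibility above can be illustrated with a small standard-library sketch. The monthly claim counts here are invented for illustration, and production work would typically use pandas/NumPy as the listing notes:

```python
from statistics import mean

# Hypothetical monthly claim counts; detect a simple upward trend by
# comparing the mean of the second half of the series against the first.
monthly_claims = [120, 125, 118, 130, 142, 151, 149, 160]

half = len(monthly_claims) // 2
first, second = monthly_claims[:half], monthly_claims[half:]
change = (mean(second) - mean(first)) / mean(first)

print(f"first-half mean:  {mean(first):.1f}")   # 123.2
print(f"second-half mean: {mean(second):.1f}")  # 150.5
print(f"trend: {'up' if change > 0.05 else 'flat'} ({change:+.1%})")
```

The 5% threshold is an arbitrary illustrative cutoff; a real report would pick it from business requirements or use a proper time-series test.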

Posted 2 months ago

Apply

8.0 - 11.0 years

35 - 37 Lacs

Kolkata, Ahmedabad, Bengaluru

Work from Office

Dear Candidate,

We are hiring a Data Platform Engineer to build scalable infrastructure for data ingestion, processing, and analysis.

Key Responsibilities: Architect distributed data systems. Enable data discoverability and quality. Develop data tooling and platform APIs.

Required Skills & Qualifications: Experience with Spark, Kafka, and Delta Lake. Proficiency in Python, Scala, or Java. Familiarity with cloud-based data platforms.

Soft Skills: Strong troubleshooting and problem-solving skills. Ability to work independently and in a team. Excellent communication and documentation skills.

Note: If interested, please share your updated resume and preferred time for a discussion. If shortlisted, our HR team will contact you.

Kandi Srinivasa Reddy, Delivery Manager, Integra Technologies

Posted 2 months ago

Apply

2.0 - 6.0 years

6 - 11 Lacs

Hyderabad

Work from Office

As a managing consultant, you will serve as a client-facing practitioner working collaboratively with clients to deliver high-quality solutions, and be a trusted business advisor with a deep understanding of the SAP Accelerate delivery methodology or equivalent and associated work products. You will lead design workshops, support business development activities, and mentor and coach team members to develop their skills and knowledge. There are opportunities for you to acquire new skills, work across different disciplines, take on new challenges, and develop a comprehensive understanding of various industries.

Your primary responsibilities include:

Strategic SAP Solution Leadership: Leading the technical design, development, and implementation of SAP solutions for simplicity, amplification, and maintainability that meet client needs.

Team Delivery Leadership: Leading and managing a high-performing team of SAP consultants to deliver work products on time, on budget, and with quality.

Comprehensive Solution Delivery: Involvement in strategy development and solution implementation, leveraging your functional expertise in SAP with clients and team members and working with the latest technologies.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise: Overall 5-12 years of relevant experience in SAP BODS/BOIS/SDI/SDQ and 3+ years of SAP functional experience specializing in the design and configuration of SAP BODS/HANA SDI modules. Experience in gathering business requirements; should be able to create requirement specifications based on architecture/design/detailing of processes. Should be able to prepare mapping sheets combining functional and technical expertise. All BODS consultants should primarily have data migration experience from different legacy systems to SAP or non-SAP systems. Data migration experience from SAP ECC to S/4HANA using Migration Cockpit or other methods. In addition to data migration experience, the consultant should have experience with or strong knowledge of BOIS (BO Information Steward) for data profiling and data governance.

Preferred technical and professional experience: BODS admin experience/knowledge. Working or strong knowledge of SAP Data Hub. Experience with or strong knowledge of HANA SDI (Smart Data Integration) as an ETL tool, with the ability to develop flowgraphs to validate/transform data. The consultant should develop workflows and data flows based on specifications using various stages in BODS.

Posted 2 months ago

Apply

2.0 - 6.0 years

6 - 11 Lacs

Bengaluru

Work from Office

As a senior SAP consultant, you will serve as a client-facing practitioner working collaboratively with clients to deliver high-quality solutions, and be a trusted business advisor with a deep understanding of the SAP Accelerate delivery methodology or equivalent and associated work products. You will work on projects that assist clients in integrating strategy, process, technology, and information to enhance effectiveness, reduce costs, and improve profit and shareholder value. There are opportunities for you to acquire new skills, work across different disciplines, take on new challenges, and develop a comprehensive understanding of various industries.

Your primary responsibilities include:

Strategic SAP Solution Focus: Working across technical design, development, and implementation of SAP solutions for simplicity, amplification, and maintainability that meet client needs.

Comprehensive Solution Delivery: Involvement in strategy development and solution implementation, leveraging your knowledge of SAP and working with the latest technologies.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise: Overall 5-12 years of relevant experience in SAP BODS/BOIS/SDI/SDQ and 3+ years of SAP functional experience specializing in the design and configuration of SAP BODS/HANA SDI modules. Experience in gathering business requirements; should be able to create requirement specifications based on architecture/design/detailing of processes. Should be able to prepare mapping sheets combining functional and technical expertise. All BODS consultants should primarily have data migration experience from different legacy systems to SAP or non-SAP systems. Data migration experience from SAP ECC to S/4HANA using Migration Cockpit or other methods. In addition to data migration experience, the consultant should have experience with or strong knowledge of BOIS (BO Information Steward) for data profiling and data governance.

Preferred technical and professional experience: BODS admin experience/knowledge. Working or strong knowledge of SAP Data Hub. Experience with or strong knowledge of HANA SDI (Smart Data Integration) as an ETL tool, with the ability to develop flowgraphs to validate/transform data. The consultant should develop workflows and data flows based on specifications using various stages in BODS.

Posted 2 months ago

Apply

6.0 - 8.0 years

1 - 5 Lacs

Bengaluru

Work from Office

Job Title: Informatica Admin – PowerCenter, IDQ, IICS
Experience: 6-8 Years
Location: Bangalore

Technical Skills:

Informatica PowerCenter Administration: Install, configure, and maintain Informatica PowerCenter components (Repository Server, Integration Service, Domain Configuration) on Windows servers in AWS. Monitor and optimize PowerCenter performance, including troubleshooting and resolving issues.

Informatica Data Quality (IDQ) Administration: Install, configure, and manage Informatica Data Quality (IDQ) components, including the IDQ Server and Data Quality Services. Ensure effective data profiling, cleansing, and enrichment processes.

Informatica Intelligent Cloud Services (IICS) Migration: Plan and execute migration strategies for moving from on-premises Informatica PowerCenter and IDQ to Informatica Intelligent Cloud Services (IICS). Manage and facilitate the migration of ETL processes, data quality rules, and integrations to IICS. Ensure a smooth transition with minimal disruption to ongoing data processes.

AWS Cloud Management: Manage Informatica PowerCenter, IDQ, and IICS environments within AWS, using services such as EC2 and S3. Implement AWS security and compliance measures to protect data and applications.

Performance Optimization: Optimize the performance of Informatica PowerCenter, IDQ, IICS, and Oracle databases to ensure efficient data processing and high availability. Conduct regular performance tuning and system health checks.

Backup & Recovery: Develop and manage backup and recovery processes for Informatica PowerCenter, IDQ, and Oracle databases. Ensure data integrity and implement effective disaster recovery plans.

Security & Compliance: Configure and manage security policies, user roles, and permissions for Informatica and Oracle environments. Monitor and enforce data security and compliance standards within AWS and Informatica platforms.

Troubleshooting & Support: Diagnose and resolve issues related to Informatica PowerCenter, IDQ, IICS, and Oracle databases. Provide technical support and guidance to development and operational teams.

Documentation & Reporting: Create and maintain detailed documentation for Informatica PowerCenter, IDQ, and IICS configurations and Oracle database settings. Generate and review performance and incident reports.

Non-Technical Skills: Well-developed analytical and problem-solving skills. Strong oral and written communication skills. Excellent team player, able to work with virtual teams. Ability to learn quickly in a dynamic start-up environment. Able to talk to clients directly and report to client/onsite teams. Flexibility to work different shifts and stretch when required.

Posted 2 months ago

Apply

3.0 - 4.0 years

1 - 5 Lacs

Bengaluru

Work from Office

Job Title: Data Quality Engineer
Experience: 3-4 Years
Location: Bangalore

We are seeking a detail-oriented and highly motivated Data Quality Engineer to join our growing data team. In this role, you will be responsible for designing, implementing, and maintaining data quality frameworks to ensure the accuracy, completeness, consistency, and reliability of enterprise data. You will work closely with business stakeholders, data stewards, and data engineers to enforce data governance policies and use tools like Ataccama to support enterprise data quality initiatives. We only need immediate joiners.

Key Responsibilities: Design and implement robust data quality frameworks and rules using Ataccama ONE or similar data quality tools. Develop automated data quality checks and validation routines to proactively detect and remediate data issues. Collaborate with business and technical teams to define data quality metrics, thresholds, and standards. Support the data governance strategy by identifying critical data elements and ensuring alignment with organizational policies. Monitor, analyze, and report on data quality trends, providing insights and recommendations for continuous improvement. Work with data stewards to resolve data issues and ensure adherence to data quality best practices. Support metadata management, data lineage, and data profiling activities. Document processes, data flows, and data quality rules to facilitate transparency and reproducibility. Conduct root cause analysis on data issues and implement corrective actions to prevent recurrence.

Required Skills and Qualifications: Bachelor's degree in Computer Science, Information Systems, Data Science, or a related field. 3+ years of experience in a Data Quality, Data Governance, or Data Engineering role. Hands-on experience with Ataccama ONE or similar data quality tools, including rule creation, data profiling, and issue management. Strong knowledge of data governance frameworks, principles, and best practices. Proficiency in SQL and data analysis, with the ability to query complex datasets. Experience with data management platforms and enterprise data ecosystems. Excellent problem-solving skills and attention to detail. Strong communication and stakeholder engagement skills.

Preferred Qualifications: Experience with cloud data platforms (e.g., Snowflake, AWS, Azure). Familiarity with data catalog tools (e.g., Collibra, Alation). Knowledge of industry data standards and regulatory requirements (e.g., GDPR, HIPAA).

Posted 2 months ago

Apply

2.0 - 6.0 years

6 - 11 Lacs

Hyderabad

Work from Office

As a senior SAP Consultant, you will serve as a client-facing practitioner working collaboratively with clients to deliver high-quality solutions and be a trusted business advisor with deep understanding of SAP Accelerate delivery methodology or equivalent and associated work products. You will work on projects that assist clients in integrating strategy, process, technology, and information to enhance effectiveness, reduce costs, and improve profit and shareholder value. There are opportunities for you to acquire new skills, work across different disciplines, take on new challenges, and develop a comprehensive understanding of various industries. Your primary responsibilities include Strategic SAP Solution FocusWorking across technical design, development, and implementation of SAP solutions for simplicity, amplification, and maintainability that meet client needs. Comprehensive Solution DeliveryInvolvement in strategy development and solution implementation, leveraging your knowledge of SAP and working with the latest technologies Required education Bachelor's Degree Preferred education Master's Degree Required technical and professional expertise Overall 5 - 12 years of relevant experience in SAP BODS/BOIS/SDI/SDQ and 3+ Years of SAP functional experience specializing in design and configuration of SAP BODS/HANA SDI modules. Experience in gathering business requirements and Should be able to create requirement specifications based on Architecture/Design/Detailing of Processes. Should be able to prepare mapping sheet combining his/her Functional and technical expertise. All BODS Consultant should primarily have Data migration experience from Different Legacy Systems to SAP or Non SAP systems. Data Migration experience from SAP ECC to S/4HANA using Migration Cockpit or any other methods. 
In addition to data migration experience, the consultant should have experience with or strong knowledge of BOIS (BO Information Steward) for data profiling and data governance. Preferred technical and professional experience: BODS administration experience/knowledge. Working or strong knowledge of SAP Data Hub. Experience with or strong knowledge of HANA SDI (Smart Data Integration) to use it as an ETL tool, with the ability to develop flowgraphs to validate/transform data. The consultant should develop workflows and data flows based on the specifications, using various stages in BODS.

Posted 2 months ago

Apply

2.0 - 6.0 years

4 - 8 Lacs

Mumbai

Work from Office

We are hiring Informatica CDQ Professionals with 4 to 8 years of experience for a contract position (6 months to 1 year). Type: Contract (6 months to 1 year) Start Date: Immediate joiners preferred Skills Required: Strong hands-on experience with Informatica Cloud Data Quality (CDQ) Expertise in data profiling, data cleansing, and implementing data quality rules Solid knowledge of data governance and data management Strong troubleshooting and performance optimization skills To Apply: Please share your updated resume along with: Current CTC Expected CTC Current Location Email to: navaneetha@suzva.com

Posted 2 months ago

Apply

3.0 - 6.0 years

5 - 9 Lacs

Noida

Work from Office

Design and execute test cases for ETL processes to validate data accuracy and integrity. Collaborate with data engineers and developers to understand ETL workflows and data transformations. Hands-on expertise in writing complex SQL using multiple JOINs, conditions, and aggregate clauses. Strong hands-on experience in data warehouse testing and manual testing. Identify, document, and track defects and issues in the ETL process. Perform data profiling and data quality assessments. Create and maintain test documentation, including test plans, test scripts, and test results. The ideal candidate will have a strong background in ETL processes, data validation, and database SQL. You will be responsible for ensuring the quality and accuracy of data as it moves through the ETL pipeline. Mandatory Competencies: ETL - Tester; QA Manual - Manual Testing. At Iris Software, we offer world-class benefits designed to support the financial, health and well-being needs of our associates to help achieve harmony between their professional and personal growth. From comprehensive health insurance and competitive salaries to flexible work arrangements and ongoing learning opportunities, we're committed to providing a supportive and rewarding work environment. Join us and experience the difference of working at a company that values its employees' success and happiness.
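As an illustration of the source-to-target validation this role describes, the sketch below reconciles two tables with SQL joins and aggregate clauses. It runs against SQLite with made-up table names (`orders_src`, `orders_tgt`); in a real engagement the same queries would run against the warehouse.

```python
# Hypothetical ETL reconciliation checks; table/column names are illustrative.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders_src (id INTEGER, region TEXT, amount REAL);
CREATE TABLE orders_tgt (id INTEGER, region TEXT, amount REAL);
INSERT INTO orders_src VALUES (1,'N',100),(2,'S',250),(3,'N',50);
INSERT INTO orders_tgt VALUES (1,'N',100),(2,'S',250),(3,'N',50);
""")

# Aggregate-level check: per-region totals must match between source and target.
mismatches = conn.execute("""
    SELECT s.region, s.total AS src_total, t.total AS tgt_total
    FROM (SELECT region, SUM(amount) AS total FROM orders_src GROUP BY region) s
    JOIN (SELECT region, SUM(amount) AS total FROM orders_tgt GROUP BY region) t
      ON s.region = t.region
    WHERE s.total <> t.total
""").fetchall()

# Row-level check: ids present in source but missing from target.
missing = conn.execute("""
    SELECT s.id FROM orders_src s
    LEFT JOIN orders_tgt t ON s.id = t.id
    WHERE t.id IS NULL
""").fetchall()
```

Both result sets being empty is the pass condition a tester would log; any returned rows become defects to document and track.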

Posted 2 months ago

Apply

4.0 - 8.0 years

9 - 14 Lacs

Chennai, Bengaluru

Work from Office

Role: BI Data Engineer - Tableau. Visit our website bmwtechworks.in to know more. Follow us on LinkedIn | Instagram | Facebook | X for exciting updates. What awaits you / Job Profile: Design and implement data solutions and maintain the BI solution. Create visual analytics and insights from data in BI/visualization. Provide business users with BI visualization solutions. Good analytical/problem-solving skills, algorithms, logical thinking. Collaborate with team members on data quality and security compliance. What should you bring along: Experience in development of visualization data and SQL coding. Proficiency in Tableau and/or QuickSight. Experience in maintaining custom SQL code changes in the respective tool's history. Knowledge of different data visualization tools, dashboards, graphs, charts, and complex views. Building measures, lookups, and transformations of data in Tableau and/or QuickSight. Must-have technical skills: Good SQL, data analysis, and data profiling skills. Data transformation skills using measures and data lookups. Data modelling and data mart creation skills in Tableau. Good UI/UX skills. Good-to-have skills: Tableau certification is a plus. Knowledge of other BI tools, such as QuickSight.

Posted 2 months ago

Apply

7.0 - 12.0 years

27 - 42 Lacs

Chennai

Work from Office

Informatica MDM Responsibilities: We work with customers who need to improve their ability to respond in a digital world: enabling adaptability, removing performance barriers, enabling innovation, and modernizing core systems to rethink their business with Stibo MDM. Ensure effective project management, project proposal preparation, design, documentation, development, validation, solution architecture, and support activities in line with client needs and architectural requirements. Ensure continual knowledge management. Adhere to organizational guidelines and processes. Mandatory Experience: Minimum 1 year. Technical/Functional Skills: STIBO MDM development experience. Ability to configure/develop STEP solution components such as business rules, workflows, imports, exports, Web UIs, endpoints, etc. Strong experience in data modeling and data migration. Work experience in data profiling/data quality. Good technical knowledge of Core Java and XML. Experience leading/working with teams in an agile setting. STIBO certification is an added advantage. Benefits: Exposure to new processes and technologies. Competitive salary at par with the best in the industry. Flexible and employee-friendly environment.

Posted 2 months ago

Apply

7.0 - 12.0 years

27 - 42 Lacs

Chennai

Work from Office

Ab Initio: Ab Initio skills: graph development, Ab Initio standard environment parameters, GDE (PDL, MFS concepts), EME basics, SDLC, data analysis. Database: SQL proficient; DB load/unload utilities expert; relevant experience in Oracle, DB2, Teradata (preferred). UNIX: shell scripting (must); Unix utilities like sed, awk, perl, python. Informatica IICS: Good experience in designing and developing ETL mappings using IICS. Should be familiar with bulk loading concepts, Change Data Capture (CDC), data profiling, and data validation concepts. Should have prior experience working with different types of data sources/targets. Understanding of configuration, migration, and deployment of ETL mappings. Teradata: Assist in the design, development, and testing of Teradata databases and ETL processes to support data integration and reporting. Collaborate with data analysts and other team members to understand data requirements and provide solutions. DataStage: Overall experience of 5 years in DW/BI technologies and minimum 5 years' development experience with the ETL DataStage 8.x/9.x tool. Worked extensively on parallel jobs, sequences, and preferably routines. Good conceptual knowledge of data warehousing and various methodologies.

Posted 2 months ago

Apply

2 - 5 years

2 - 5 Lacs

Bengaluru

Work from Office

Databricks Engineer Full-time Department: Digital, Data and Cloud Company Description: Version 1 has celebrated over 26 years in Technology Services and continues to be trusted by global brands to deliver solutions that drive customer success. Version 1 has several strategic technology partners including Microsoft, AWS, Oracle, Red Hat, OutSystems and Snowflake. We're also an award-winning employer, reflecting how employees are at the heart of Version 1. We've been awarded: Innovation Partner of the Year Winner 2023 Oracle EMEA Partner Awards, Global Microsoft Modernising Applications Partner of the Year Award 2023, AWS Collaboration Partner of the Year - EMEA 2023, and Best Workplaces for Women by Great Place To Work in UK and Ireland 2023. As a consultancy and service provider, Version 1 is a digital-first environment and we do things differently. We're focused on our core values; using these we've seen significant growth across our practices and our Digital, Data and Cloud team is preparing for the next phase of expansion. This creates new opportunities for driven and skilled individuals to join one of the fastest-growing consultancies globally. About The Role: This is an exciting opportunity for an experienced developer of large-scale data solutions. You will join a team delivering a transformative cloud-hosted data platform for a key Version 1 customer. The ideal candidate will have a proven track record as a senior/self-starting data engineer in implementing data ingestion and transformation pipelines for large-scale organisations. We are seeking someone with deep technical skills in a variety of technologies, specifically Spark performance tuning/optimisation and Databricks, to play an important role in developing and delivering early proofs of concept and production implementations. 
You will ideally have experience in building solutions using a variety of open source tools and Microsoft Azure services, and a proven track record in delivering high quality work to tight deadlines. Your main responsibilities will be: Designing and implementing highly performant, metadata-driven data ingestion and transformation pipelines from multiple sources using Databricks. Spark streaming and batch processes in Databricks. Spark performance tuning/optimisation. Providing technical guidance for complex geospatial problems and Spark dataframes. Developing scalable and re-usable frameworks for ingestion and transformation of large data sets. Data quality system and process design and implementation. Integrating the end-to-end data pipeline to take data from source systems to target data repositories, ensuring the quality and consistency of data is maintained at all times. Working with other members of the project team to support delivery of additional project components (reporting tools, API interfaces, search). Evaluating the performance and applicability of multiple tools against customer requirements. Working within an Agile delivery / DevOps methodology to deliver proof of concept and production implementation in iterative sprints. Qualifications: Direct experience of building data pipelines using Azure Data Factory and Databricks. Experience required is 6 to 8 years. Building data integrations with Python. Databricks Engineer certification. Microsoft Azure Data Engineer certification. Hands-on experience designing and delivering solutions using the Azure Data Analytics platform. Experience building data warehouse solutions using ETL/ELT tools like Informatica and Talend. Comprehensive understanding of data management best practices, including demonstrated experience with data profiling, sourcing, and cleansing routines utilizing typical data quality functions involving standardization, transformation, rationalization, linking and matching. 
Nice to have: Experience working in a DevOps environment with tools such as Microsoft Visual Studio Team Services, Chef, Puppet or Terraform. Experience working with structured and unstructured data, including imaging and geospatial data. Experience with open-source non-relational / NoSQL data repositories (incl. MongoDB, Cassandra, Neo4j). Experience with Azure Event Hub, IoT Hub, Apache Kafka, or NiFi for use with streaming / event-based data. Additional Information: At Version 1, we believe in providing our employees with a comprehensive benefits package that prioritises their well-being, professional growth, and financial stability. One of our standout advantages is the ability to work with a hybrid schedule along with business travel, allowing our employees to strike a balance between work and life. We also offer a range of tech-related benefits, including an innovative Tech Scheme to help keep our team members up-to-date with the latest technology. We prioritise the health and safety of our employees, providing private medical and life insurance coverage, as well as free eye tests and contributions towards glasses. Our team members can also stay ahead of the curve with incentivized certifications and accreditations, including AWS, Microsoft, Oracle, and Red Hat. Our employee-designed Profit Share scheme divides a portion of our company's profits each quarter amongst employees. We are dedicated to helping our employees reach their full potential, offering Pathways Career Development Quarterly, a programme designed to support professional growth.

Posted 2 months ago

Apply

4 - 9 years

22 - 32 Lacs

Chennai

Work from Office

Key Responsibilities: Should be currently working as a Product/Project Manager in data product management. Develop and implement data governance policies, standards, and processes to ensure data integrity, quality, and security. Collaborate with cross-functional teams to embed governance best practices into data pipelines, analytics, and BI reporting. Define and monitor data quality metrics and KPIs. Set up data catalogs, classification, and metadata management for enhanced discoverability and compliance. Partner with IT, Security, and Compliance teams to ensure regulatory and policy adherence (e.g., GDPR, HIPAA). Leverage tools and technologies like SQL, Pandas Profiling, and Python to enhance data quality and governance workflows. Act as a subject matter expert on data governance strategies and tools. Skills and Experience: Bachelor's/Master's degree in Data Science, Information Management, Computer Science, or a related field. 8+ years of experience in data governance, data quality management, or BI reporting roles. Knowledge of data governance tools such as Collibra, OpenMetadata, DataHub, and Informatica. Proficiency in SQL, with hands-on experience in data profiling tools (e.g., Pandas Profiling). Strong understanding of data lifecycle management, privacy laws, and compliance frameworks. Excellent leadership, communication, and stakeholder management skills. Analytical mindset with experience in measuring and reporting on data quality KPIs.
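The data quality KPIs this role monitors can be as simple as completeness and uniqueness ratios computed with pandas. A minimal sketch, with the column names (`customer_id`, `email`) being illustrative assumptions:

```python
# Illustrative data quality KPI computation; column names are assumptions.
import pandas as pd

df = pd.DataFrame({
    "customer_id": [1, 2, 2, 4, None],
    "email": ["a@x.com", None, "b@x.com", "c@x.com", "d@x.com"],
})

profile = {
    # Completeness: share of non-null values in a column.
    "completeness_email": df["email"].notna().mean(),
    # Uniqueness: whether the business key is duplicate-free (ignoring nulls).
    "uniqueness_customer_id": bool(df["customer_id"].dropna().is_unique),
    "row_count": len(df),
}
```

Metrics like these, tracked per dataset over time, are what typically feed the KPI dashboards and governance reports described above.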

Posted 2 months ago

Apply

4 - 6 years

30 - 34 Lacs

Bengaluru

Work from Office

Overview Annalect is seeking a hands-on Data QA Manager to lead and elevate data quality assurance practices across our growing suite of software and data products. This is a technical leadership role embedded within our Technology teams, focused on establishing best-in-class data quality processes that enable trusted, scalable, and high-performance data solutions. As a Data QA Manager, you will drive the design, implementation, and continuous improvement of end-to-end data quality frameworks, with a strong focus on automation, validation, and governance. You will work closely with data engineering, product, and analytics teams to ensure data integrity, accuracy, and compliance across complex data pipelines, platforms, and architectures, including Data Mesh and modern cloud-based ecosystems. This role requires deep technical expertise in SQL, Python, data testing frameworks like Great Expectations, data orchestration tools (Airbyte, DbT, Trino, Starburst), and cloud platforms (AWS, Azure, GCP). You will lead a team of Data QA Engineers while remaining actively involved in solution design, tool selection, and hands-on QA execution. Responsibilities Key Responsibilities: Develop and implement a comprehensive data quality strategy aligned with organizational goals and product development initiatives. Define and enforce data quality standards, frameworks, and best practices, including data validation, profiling, cleansing, and monitoring processes. Establish data quality checks and automated controls to ensure the accuracy, completeness, consistency, and timeliness of data across systems. Collaborate with Data Engineering, Product, and other teams to design and implement scalable data quality solutions integrated within data pipelines and platforms. Define and track key performance indicators (KPIs) to measure data quality and effectiveness of QA processes, enabling actionable insights for continuous improvement. 
Generate and communicate regular reports on data quality metrics, issues, and trends to stakeholders, highlighting opportunities for improvement and mitigation plans. Maintain comprehensive documentation of data quality processes, procedures, standards, issues, resolutions, and improvements to support organizational knowledge-sharing. Provide training and guidance to cross-functional teams on data quality best practices, fostering a strong data quality mindset across the organization. Lead, mentor, and develop a team of Data QA Analysts/Engineers, promoting a high-performance, collaborative, and innovative culture. Provide thought leadership and subject matter expertise on data quality, influencing technical and business stakeholders toward quality-focused solutions. Continuously evaluate and adopt emerging tools, technologies, and methodologies to advance data quality assurance capabilities and automation. Stay current with industry trends, innovations, and evolving best practices in data quality, data engineering, and analytics to ensure cutting-edge solutions. Qualifications Required Skills 11+ years of hands-on experience in Data Quality Assurance, Data Test Automation, Data Comparison, and Validation across large-scale datasets and platforms. Strong proficiency in SQL for complex data querying, data validation, and data quality investigations across relational and distributed databases. Deep knowledge of data structures, relational and non-relational databases, stored procedures, packages, functions, and advanced data manipulation techniques. Practical experience with leading data quality tools such as Great Expectations, DbT tests, and data profiling and monitoring solutions. Experience with data mesh and distributed data architecture principles for enabling decentralized data quality frameworks. Hands-on experience with modern query engines and data platforms, including Trino/Presto, Starburst, and Snowflake. 
Experience working with data integration and ETL/ELT tools such as Airbyte, AWS Glue, and DbT for managing and validating data pipelines. Strong working knowledge of Python and related data libraries (e.g., Pandas, NumPy, SQLAlchemy) for building data quality tests and automation scripts.
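The automated data quality controls this role owns can be sketched in a framework-agnostic way, in the spirit of tools like Great Expectations or dbt tests; the rule functions and column names below are illustrative, not any specific tool's API:

```python
# Minimal sketch of reusable data quality rules; not a real framework's API.
import pandas as pd

def check_not_null(df: pd.DataFrame, col: str) -> dict:
    """Completeness rule: every value in `col` must be non-null."""
    return {"rule": f"{col} not null", "passed": bool(df[col].notna().all())}

def check_in_set(df: pd.DataFrame, col: str, allowed: set) -> dict:
    """Consistency rule: every value in `col` must come from an allowed set."""
    return {"rule": f"{col} in allowed set", "passed": bool(df[col].isin(allowed).all())}

df = pd.DataFrame({
    "status": ["active", "inactive", "active"],
    "amount": [10.0, 20.0, 30.0],
})

results = [
    check_not_null(df, "amount"),
    check_in_set(df, "status", {"active", "inactive"}),
]
all_passed = all(r["passed"] for r in results)
```

Running such rule suites inside the pipeline, and reporting `results` to stakeholders, is the kind of automated control and KPI reporting the responsibilities above call for.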

Posted 2 months ago

Apply

6 - 8 years

3 - 7 Lacs

Gurugram

Work from Office

Skills: Bachelor's degree / Master's degree with high rankings from reputed colleges. Preferably 6-8 years of ETL / data analysis experience with a reputed firm. Expertise in a Big Data managed platform environment like Databricks using Python / PySpark / Spark SQL. Experience in handling large data volumes and orchestrating automated ETL / data pipelines using CI/CD and cloud technologies. Experience deploying ETL / data pipelines and workflows in cloud technologies and architectures such as Azure and Amazon Web Services will be valued. Experience in data modelling (e.g., database structure, entity relationships, UIDs, etc.), data profiling, and data quality validation. Experience adopting software development best practices (e.g., modularization, testing, refactoring, etc.). Conduct data assessments, perform data quality checks, and transform data using SQL and ETL tools. Excellent written and verbal communication skills in English. Self-motivated with a strong sense of problem-solving, ownership, and an action-oriented mindset. Able to cope with pressure and demonstrate a reasonable level of flexibility/adaptability. Track record of strong problem-solving, requirement gathering, and leading by example. Able to work well within teams across continents/time zones with a collaborative mindset.

Posted 2 months ago

Apply

3 - 5 years

10 - 11 Lacs

Bengaluru

Work from Office

Your expertise in Avaloq technologies, database management, and software development, combined with a solid understanding of the Avaloq Enterprise-Wide Object Model, will be critical in shaping and enhancing our data capabilities. This is a fantastic opportunity to bring your passion for innovation into a fast-paced, dynamic environment where your impact will be tangible. Skills Must have: You will have between 3-5 years' experience working as an Avaloq Developer. Mapping physical data from Avaloq and other platforms to the Enterprise Data Model, ensuring adherence to reference data standards and clear conceptual definitions. Performing data quality assessments, developing strategies to enhance data integrity, and creating Avaloq system functionalities to mitigate the risk of poor data at source. Supporting option analysis, data profiling, and interfacing with external rule/result repositories. Reviewing and optimising Avaloq APDM outcomes, defining treatment strategies, and improving system performance. Integrating the Avaloq APDM with external data minimisation orchestration tools. Designing and delivering data quality improvement solutions, including mass-data manipulation scripts and associated testing and reconciliation processes. Analysing business requirements and developing tailored software solutions. Supporting technical analysis and enhancements for Avaloq change requests and incidents. Collaborating with stakeholders to build alignment around technology change initiatives. Building and maintaining synthetic data delivery routines for test environments. Managing market data ingestion into the Avaloq Core Platform (ACP) via third-party tools. Nice to have: ACCP certification

Posted 2 months ago

Apply

3 - 7 years

6 - 9 Lacs

Hyderabad

Work from Office

ABOUT AMGEN Amgen harnesses the best of biology and technology to fight the world’s toughest diseases, and make people’s lives easier, fuller and longer. We discover, develop, manufacture and deliver innovative medicines to help millions of patients. Amgen helped establish the biotechnology industry more than 40 years ago and remains on the cutting-edge of innovation, using technology and human genetic data to push beyond what’s known today. ABOUT THE ROLE Role Description: You will play a key role in the implementation and adoption of the data governance framework which will modernize Amgen's data ecosystem, positioning Amgen as a leader in biopharma innovation. This role leverages state-of-the-art technologies, including Generative AI, Machine Learning, and integrated data. This role involves working closely with business stakeholders and data analysts to ensure implementation and adoption of the data governance framework. You will collaborate with Data Product Owners, Data Stewards and technology teams to increase the trust and reuse of data across Amgen. Roles & Responsibilities: Responsible for the execution of the data governance framework for a given domain of expertise (Research, Development, Supply Chain, etc.). Contribute to the operationalization of the enterprise data governance framework and aligning the broader stakeholder community with their data governance needs, including data quality, data access controls, compliance with privacy and security regulations, foundational master data management, data sharing, communication and change management. Works with Enterprise MDM and Reference Data to enforce standards and data reusability. Contribute to cross-functional alignment in his/her domain(s) of expertise to ensure adherence to data governance principles. Maintain documentation on data definitions, data standards, data flows, legacy data structures/hierarchies, common data models, data harmonization, etc. for assigned domains. 
Partner with business teams to identify compliance requirements with data privacy, security, and regulatory policies for the assigned domains. Jointly with technology teams, business functions, and enterprise teams (e.g., MDM, Enterprise Data Fabric, etc.), deliver data foundations. Build strong relationships with key business leads and partners to ensure their needs are met. Functional Skills: Must-Have Functional Skills: Technical skills (advanced SQL, Python, etc.) with knowledge of pharma processes, with specialization in a domain (e.g., Research, Clinical Trials, Commercial, etc.). Experience working with or supporting systems used for the data governance framework, e.g., Collibra, Alation. General knowledge of data management, common data models, metadata management, data quality, master data management, data stewardship, data protection, etc. Experience with the data products development life cycle, including the enablement of data dictionaries and business glossaries to increase data product reusability and data literacy. Customer-focused with excellent written and verbal communication skills; can confidently work with internal Amgen business stakeholders and external service partners on business process and technology topics. Excellent problem-solving skills and a committed attention to detail in finding solutions. Good-to-Have Functional Skills: Experience with Agile software development methodologies (Scrum). Soft Skills: Excellent analytical skills. Ability to work effectively with global, virtual teams. Ability to manage multiple priorities successfully. Team-oriented, with a focus on achieving team goals. Ability to build business relationships and understand end-to-end data use and needs. 
Strong verbal and written communication skills. Basic Qualifications: Minimum 5-8 years of experience in Business, Engineering, IT or a related field. EQUAL OPPORTUNITY STATEMENT Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request an accommodation.

Posted 2 months ago

Apply

2 - 5 years

4 - 8 Lacs

Hyderabad

Work from Office

ABOUT AMGEN Amgen harnesses the best of biology and technology to fight the world’s toughest diseases, and make people’s lives easier, fuller and longer. We discover, develop, manufacture and deliver innovative medicines to help millions of patients. Amgen helped establish the biotechnology industry more than 40 years ago and remains on the cutting-edge of innovation, using technology and human genetic data to push beyond what’s known today. ABOUT THE ROLE Role Description: You will play a key role in the implementation and adoption of the data governance framework which will modernize Amgen's data ecosystem, positioning Amgen as a leader in biopharma innovation. This role leverages state-of-the-art technologies, including Generative AI, Machine Learning, and integrated data. This role involves working closely with business stakeholders and data analysts to ensure implementation and adoption of the data governance framework. You will collaborate with Data Stewards to increase the trust and reuse of data across Amgen. Roles & Responsibilities: Contribute to the data governance and data management framework implementation for a given domain of expertise (Research, Development, Supply Chain, etc.). Assess and document with the stakeholder community their data governance needs, including data quality, data access controls, compliance with privacy and security regulations, foundational master data management, data sharing, communication and change management. Works with Enterprise MDM and Reference Data to enforce standards and data reusability. Maintain documentation on data definitions, data standards, data flows, legacy data structures/hierarchies, common data models, data harmonization, etc. for assigned domains. Partner with business teams to identify compliance requirements with data privacy, security, and regulatory policies for the assigned domains. 
Jointly with technology teams, business functions, and enterprise teams (e.g., MDM, Enterprise Data Fabric, etc.), deliver data foundations. Functional Skills: Must-Have Functional Skills: General knowledge of data management, common data models, metadata management, data quality, master data management, data stewardship, data protection, etc. General knowledge of the data products development life cycle, including the enablement of data dictionaries and business glossaries to increase data product reusability and data literacy. Customer-focused with excellent written and verbal communication skills; can confidently work with internal Amgen business stakeholders and external service partners on business process and technology topics. Excellent problem-solving skills and a committed attention to detail in finding solutions. Good-to-Have Functional Skills: Knowledge of pharma processes with specialization in a domain (e.g., Research, Clinical Trials, Commercial, etc.). Experience working with or supporting systems used for data management, e.g., Collibra, Ataccama Data Quality platform. Experience with Agile software development methodologies (Scrum). Soft Skills: Good analytical skills. Ability to work effectively with global, virtual teams. Team-oriented, with a focus on achieving team goals. Ability to build business relationships and understand end-to-end data use and needs. Good verbal and written communication skills. Basic Qualifications: Minimum 2-5 years of experience in Business, Engineering, IT or a related field. EQUAL OPPORTUNITY STATEMENT Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. 
We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request an accommodation.

Posted 2 months ago

Apply

3 - 8 years

4 - 7 Lacs

Hyderabad

Work from Office

ABOUT AMGEN Amgen harnesses the best of biology and technology to fight the world’s toughest diseases, and make people’s lives easier, fuller and longer. We discover, develop, manufacture and deliver innovative medicines to help millions of patients. Amgen helped establish the biotechnology industry more than 40 years ago and remains on the cutting-edge of innovation, using technology and human genetic data to push beyond what’s known today. ABOUT THE ROLE Role Description: We are seeking an MDM Associate Data Engineer with 2-5 years of experience to support and enhance our enterprise MDM (Master Data Management) platforms using Informatica/Reltio. This role is critical in delivering high-quality master data solutions across the organization, utilizing modern tools like Databricks and AWS to drive insights and ensure data reliability. The ideal candidate will have strong SQL and data profiling skills, and experience working with cross-functional teams in a pharma environment. To succeed in this role, the candidate must have strong data engineering experience along with MDM knowledge; hence candidates with only MDM experience are not eligible for this role. The candidate must have data engineering experience with technologies like SQL, Python, PySpark, Databricks, and AWS, along with knowledge of MDM (Master Data Management). Roles & Responsibilities: Analyze and manage customer master data using Reltio or Informatica MDM solutions. Perform advanced SQL queries and data analysis to validate and ensure master data integrity. Leverage Python, PySpark, and Databricks for scalable data processing and automation. Collaborate with business and data engineering teams for continuous improvement in MDM solutions. Implement data stewardship processes and workflows, including approval and DCR mechanisms. Utilize AWS cloud services for data storage and compute processes related to MDM. Contribute to metadata and data modeling activities. 
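The standardization and merge steps at the heart of master data management can be sketched with pandas. This is a simplified illustration, not any MDM platform's actual behavior: the field names, the exact-match key, and the first-record-wins survivorship rule are all assumptions.

```python
# Hypothetical master-data standardization and dedup sketch; fields are illustrative.
import pandas as pd

records = pd.DataFrame({
    "name": [" Acme Corp ", "ACME CORP", "Globex"],
    "country": ["us", "US", "DE"],
})

# Standardize: trim whitespace and normalize case to build a match key.
records["name_std"] = records["name"].str.strip().str.upper()
records["country_std"] = records["country"].str.upper()

# Survivorship: keep the first record per standardized match key as the golden record.
golden = records.drop_duplicates(subset=["name_std", "country_std"], keep="first")
```

Real platforms like Reltio or Informatica MDM add fuzzy matching, configurable survivorship, and steward review (DCR) on top of this basic standardize-match-merge pattern.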
Track and manage data issues using tools such as JIRA, and document processes in Confluence. Apply Life Sciences/Pharma industry context to ensure data standards and compliance. Basic Qualifications and Experience: Master’s degree with 1-3 years of experience in Business, Engineering, IT or a related field; OR Bachelor’s degree with 2-5 years of experience in Business, Engineering, IT or a related field; OR Diploma with 6-8 years of experience in Business, Engineering, IT or a related field. Functional Skills: Must-Have Skills: Advanced SQL expertise and data wrangling. Strong experience in Python and PySpark for data transformation workflows. Strong experience with Databricks and AWS architecture. Must have knowledge of MDM, data governance, stewardship, and profiling practices. In addition to the above, candidates with experience on the Informatica or Reltio MDM platforms will be preferred. Good-to-Have Skills: Experience with IDQ, data modeling, and approval workflow/DCR. Background in Life Sciences/Pharma industries. Familiarity with project tools like JIRA and Confluence. Strong grasp of data engineering concepts. Professional Certifications: Any ETL certification (e.g., Informatica). Any data analysis certification (SQL, Python, Databricks). Any cloud certification (AWS or Azure). Soft Skills: Strong analytical abilities to assess and improve master data processes and solutions. Excellent verbal and written communication skills, with the ability to convey complex data concepts clearly to technical and non-technical stakeholders. Effective problem-solving skills to address data-related issues and implement scalable solutions. Ability to work effectively with global, virtual teams. EQUAL OPPORTUNITY STATEMENT Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. 
We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.

Posted 2 months ago

Apply
cta

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies