4.0 - 6.0 years
7 - 11 Lacs
Hyderabad
Work from Office
Let's do this. Let's change the world. In this vital role, you will be a Senior Portfolio Analyst (Visual Storytelling Specialist) with deep expertise in biotechnology, pharmaceuticals, and life sciences, driving data-driven decision-making through strategic portfolio analysis, advanced data visualization, and executive storytelling. This role is responsible for transforming complex R&D, clinical, and commercial portfolio data into visually compelling, insight-driven narratives that enable senior leadership to make informed investment and pipeline decisions. The ideal candidate is a data-driven strategist and visual communicator with expertise in business intelligence (BI), portfolio management, analytics, and visualization tools such as Power BI, Tableau, or Looker. This individual will play a key role in shaping the drug development pipeline, investment prioritization, and market access strategies by presenting clear, actionable insights through interactive dashboards, executive presentations, and data storytelling techniques.

Roles & Responsibilities:
Develop and lead portfolio analytics strategies, transforming R&D, clinical, regulatory, and commercial data into compelling, insight-rich visualizations for decision-makers.
Design and build interactive presentations and reports using Microsoft PowerPoint, Power BI, Tableau, or similar BI tools, ensuring data is intuitive, engaging, and business-relevant.
Translate complex portfolio data (clinical trial progress, regulatory milestones, pipeline prioritization, market trends) into concise visual narratives that facilitate executive decision-making.
Collaborate with cross-functional teams (R&D, Finance, Commercial, Regulatory, Market Access) to synthesize data from multiple sources, aligning insights with business strategy.
Track and monitor key portfolio performance indicators, including pipeline investments, resource allocation, clinical success rates, and commercialization forecasts.
Establish and maintain portfolio data governance, ensuring accuracy, consistency, and integrity of information used for strategic decision-making.
Drive scenario planning and predictive modeling, leveraging AI/ML-powered BI tools to assess portfolio risks, opportunities, and trade-offs.
Develop executive-ready presentations, infographics, and business cases, ensuring leadership has clear, data-backed insights to guide portfolio investment and resource allocation.
Analyze competitive intelligence, industry trends, and regulatory updates, integrating insights into portfolio planning and lifecycle management strategies.
Continuously refine visualization frameworks, adopting the latest data storytelling techniques, design principles, and BI automation to improve stakeholder engagement.

What we expect of you
Master's degree and 4 to 6 years of experience in Management Analytics consulting OR Bachelor's degree and 6 to 8 years of experience in Management Analytics consulting OR Diploma and 10 to 12 years of experience in Management Analytics consulting.

Basic Qualifications:
Experience in portfolio analysis, business intelligence (BI), or data visualization, with a strong background in the biotech/pharmaceutical industry.
Expertise in Microsoft PowerPoint, Excel, Power BI, Tableau, Looker, Qlik Sense, or other BI visualization tools for executive reporting and data storytelling.
Strong understanding of the drug development lifecycle, including clinical trials (Phase I-IV), regulatory milestones, market access, and commercialization strategies.
Proficiency in data modeling, SQL, Excel, and analytical scripting (DAX, Power Query M, or Python/R for analytics).
Experience working with R&D, Commercial, and Financial teams in a biotech/pharma setting, translating scientific and business data into actionable insights.
Strong ability to synthesize complex datasets into executive-level dashboards, visual reports, and storytelling presentations.
Knowledge of portfolio management frameworks, risk analysis, and scenario modeling within pharmaceutical pipeline planning.
Experience integrating industry-standard data sources such as ClinicalTrials.gov, EvaluatePharma, IQVIA, FDA databases, and commercial market research.
Exceptional communication and stakeholder management skills, with experience engaging C-suite executives, board members, and scientific leadership.
Ability to manage multiple high-priority projects, ensuring on-time delivery in a fast-paced, highly regulated environment.
Certification in BI & Data Analytics (Microsoft Certified: Power BI Data Analyst, Tableau Desktop Certified, Looker Certified, etc.).
Experience with AI/ML-driven BI solutions, including predictive analytics, anomaly detection, and natural language processing (NLP) for BI.
Familiarity with Lean Portfolio Management (LPM), Agile SAFe methodologies, and enterprise BI governance strategies.

Preferred Qualifications:
Expertise in Power BI, Tableau, or Looker for developing interactive dashboards, executive reports, and data storytelling for decision-making.
Strong understanding of clinical trials (Phase I-IV), regulatory achievements (FDA, EMA), R&D investments, and drug commercialization strategies.
Proficiency in SQL, DAX, Power Query (M), and Excel, with experience in data modeling, financial forecasting, and scenario analysis.
Experience in pipeline prioritization, resource allocation, probability of success (PoS) modeling, and competitive intelligence analysis.
Ability to translate complex portfolio data into executive-ready insights, using data visualization, storytelling techniques, and clear communication.

Soft Skills:
Excellent analytical and troubleshooting skills.
Strong verbal and written communication skills.
Ability to work effectively with global, virtual teams.
High degree of initiative and self-motivation.
Ability to manage multiple priorities successfully.
Team-oriented, with a focus on achieving team goals.
Ability to learn quickly, be organized and diligent.
Strong presentation and public speaking skills.
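Illustrative sketch (not part of the posting's requirements): one example of the kind of analytical scripting this role references, here in Python with pandas, computing a probability-of-success (PoS) risk-adjusted pipeline value. The asset names, forecasts, and probabilities below are hypothetical placeholders.

```python
# Minimal sketch: risk-adjusted pipeline value with pandas.
# All asset names, peak-sales figures, and PoS values are illustrative assumptions.
import pandas as pd

pipeline = pd.DataFrame({
    "asset":        ["ASSET-A", "ASSET-B", "ASSET-C"],
    "phase":        ["Phase I", "Phase II", "Phase III"],
    "peak_sales_m": [850, 1200, 600],    # forecast peak sales, $M (hypothetical)
    "pos":          [0.10, 0.28, 0.55],  # assumed probability of success by phase
})

# Risk-adjust the forecast and rank assets for an executive summary view.
pipeline["risk_adjusted_m"] = pipeline["peak_sales_m"] * pipeline["pos"]
summary = pipeline.sort_values("risk_adjusted_m", ascending=False)

print(summary[["asset", "phase", "risk_adjusted_m"]].to_string(index=False))
```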
Posted 1 week ago
8.0 - 10.0 years
13 - 17 Lacs
Hyderabad
Work from Office
We are seeking an experienced Senior Manager, Data Engineering to lead and scale a strong team of data engineers. This role blends technical depth with strategic oversight and people leadership. The ideal candidate will oversee the execution of data engineering initiatives, collaborate with business analysts and multi-functional teams, manage resource capacity, and ensure delivery aligned to business priorities. In addition to technical competence, the candidate will be adept at managing agile operations and driving continuous improvement.

Roles & Responsibilities:
Possesses strong rapid prototyping skills and can quickly translate concepts into working code.
Provide expert guidance and mentorship to the data engineering team, fostering a culture of innovation and standard methodologies.
Design, develop, and implement robust data architectures and platforms to support business objectives.
Oversee the development and optimization of data pipelines and data integration solutions.
Establish and maintain data governance policies and standards to ensure data quality, security, and compliance.
Architect and manage cloud-based data solutions, leveraging AWS or other preferred platforms.
Lead and motivate a strong data engineering team to deliver exceptional results.
Identify, analyze, and resolve complex data-related challenges.
Collaborate closely with business collaborators to understand data requirements and translate them into technical solutions.
Stay abreast of emerging data technologies and explore opportunities for innovation.
Lead and manage a team of data engineers, ensuring appropriate workload distribution, goal alignment, and performance management.
Work closely with business analysts and product collaborators to prioritize and align engineering output with business objectives.

What we expect of you
We are all different, yet we all use our unique contributions to serve patients.

Basic Qualifications:
Master's degree and 8 to 10 years of experience in Computer Science and Engineering (other Engineering fields considered) OR Bachelor's degree and 10 to 14 years of experience in Computer Science and Engineering (other Engineering fields considered) OR Diploma and 14 to 18 years of experience in Computer Science and Engineering (other Engineering fields considered).
Demonstrated proficiency in using cloud platforms (AWS, Azure, GCP) for data engineering solutions.
Strong understanding of cloud architecture principles and cost optimization strategies.
Proficient in Python, PySpark, and SQL.
Hands-on experience with big data ETL performance tuning.
Proven ability to lead and develop strong data engineering teams.
Strong problem-solving, analytical, and critical thinking skills to address complex data challenges.
Strong communication skills for collaborating with business and technical teams alike.

Preferred Qualifications:
Experienced with data modeling and performance tuning for both OLAP and OLTP databases.
Experienced with Apache Spark and Apache Airflow.
Experienced with software engineering best practices, including but not limited to version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven, etc.), automated unit testing, and DevOps.
Experienced with AWS, GCP, or Azure cloud services.

Professional Certifications:
AWS Certified Data Engineer preferred.
Databricks Certificate preferred.

Soft Skills:
Excellent analytical and troubleshooting skills.
Strong verbal and written communication skills.
Ability to work effectively with global, virtual teams.
High degree of initiative and self-motivation.
Ability to manage multiple priorities successfully.
Team-oriented, with a focus on achieving team goals.
Strong presentation and public speaking skills.

What you can expect of us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.
Posted 1 week ago
8.0 - 10.0 years
7 - 11 Lacs
Hyderabad
Work from Office
Let's do this. Let's change the world. In this vital role you will lead an Agile product squad and be responsible for defining the vision, strategy, and implementation for a range of Clinical Data products supporting Amgen Clinical Trial Design & Analytics. You will collaborate closely with statisticians, data scientists, data engineers, and AI/ML engineering teams to understand business needs, identify system enhancements, and drive system implementation projects. Your extensive experience in business analysis, system design, and project management will enable you to deliver innovative and effective technology products.

Roles & Responsibilities:
Define and communicate the product feature vision, including both technical/architectural features and enablement, and end-user features, ensuring alignment with business objectives across multiple solution collaborator groups.
Create, prioritize, and maintain the feature backlog, ensuring that it reflects the needs of the business and collaborators.
Collaborate with collaborators to gather and document product requirements, user stories, and acceptance criteria.
Work closely with the business teams, Scrum Master, and development team to plan and implement sprints, ensuring that the highest-priority features are delivered.
Oversee the day-to-day management of technology platforms, ensuring that they meet performance, security, and availability requirements.
Ensure that platforms comply with security standards, regulatory requirements, and organizational policies.
Ensure the AIN team successfully creates robust written materials, including product documentation, the product backlog, user stories, and other needed artifacts to support efficient and effective coordination across time zones.
Oversee the resolution of service-related incidents and problems, ensuring minimal impact on business operations.
Maintain in-depth knowledge of clinical development business domains, with an emphasis on data assets and data pipelines, as well as an understanding of multi-functional dependencies.
Analyze customer feedback and support data to identify pain points and opportunities for product improvement.

What we expect of you
We are all different, yet we all use our unique contributions to serve patients.

Basic Qualifications:
Master's degree and 8 to 10 years of Information Systems experience OR Bachelor's degree and 10 to 14 years of Information Systems experience OR Diploma and 14 to 18 years of Information Systems experience.
A solid foundation in modern software design and engineering practices and business analysis.
Proven experience in understanding and gathering business requirements, delivering insights, and achieving concrete business outcomes.
Technical proficiency: good understanding of the following technologies: Python, R, AI/ML frameworks, relational databases/data modeling, AWS services (EC2, S3, Lambda, ECS, IAM), Docker, CI/CD/GitLab, and Apache Spark/Databricks.
Expert understanding and experience of the clinical development process within Life Sciences (global clinical trial data sources, SDTM & ADaM, end-to-end clinical data design and analysis pipelines, clinical data security and governance).
Experience in Agile product development as a participating member of a scrum team and related ceremonies and processes.
Ability to collaborate with data scientists and data engineers to deliver functional business requirements as well as to define the product roadmap.
High learning agility; demonstrated ability to quickly grasp ever-changing technology and clinical development domain knowledge and apply it to project work.
Strong written, verbal, and presentation skills, along with strong time management skills.

Preferred Qualifications:
Training or a degree in Computer Science, Biology, or Chemistry.
Experience with Clinical Data and CDISC (SDTM and ADaM) standards.

Soft Skills:
Excellent analytical and troubleshooting skills.
Deep intellectual curiosity, particularly about data patterns, and learning about business processes and the life of the user.
Highest degree of initiative and self-motivation.
Strong verbal and written communication skills, including presenting complex technical/business topics to varied audiences.
Confidence in leading teams through prioritization and sequencing discussions, including managing collaborator expectations.
Ability to work effectively with global, virtual teams, specifically including leveraging tools and artifacts to ensure clear and efficient collaboration across time zones.
Ability to manage multiple priorities successfully.
Team-oriented, with a focus on achieving team goals.
Strong problem-solving and analytical skills; ability to learn quickly and to retain and synthesize complex information from diverse sources.

What you can expect of us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.
Posted 1 week ago
4.0 - 9.0 years
40 - 45 Lacs
Hyderabad
Work from Office
We are seeking a Senior Data Engineer with expertise in Graph Data technologies to join our data engineering team and contribute to the development of scalable, high-performance data pipelines and advanced data models that power next-generation applications and analytics. This role combines core data engineering skills with specialized knowledge in graph data structures, graph databases, and relationship-centric data modeling, enabling the organization to leverage connected data for deep insights, pattern detection, and advanced analytics use cases. The ideal candidate will have a strong background in data architecture, big data processing, and graph technologies and will work closely with data scientists, analysts, architects, and business stakeholders to design and deliver graph-based data engineering solutions.

Roles & Responsibilities:
Design, build, and maintain robust data pipelines using Databricks (Spark, Delta Lake, PySpark) for complex graph data processing workflows.
Own the implementation of graph-based data models, capturing complex relationships and hierarchies across domains.
Build and optimize graph databases such as Stardog, Neo4j, MarkLogic, or similar to support query performance, scalability, and reliability.
Implement graph query logic using SPARQL, Cypher, Gremlin, or GSQL, depending on platform requirements.
Collaborate with data architects to integrate graph data with existing data lakes, warehouses, and lakehouse architectures.
Work closely with data scientists and analysts to enable graph analytics, link analysis, recommendation systems, and fraud detection use cases.
Develop metadata-driven pipelines and lineage tracking for graph and relational data processing.
Ensure data quality, governance, and security standards are met across all graph data initiatives.
Mentor junior engineers and contribute to data engineering best practices, especially around graph-centric patterns and technologies.
Stay up to date with the latest developments in graph technology, graph ML, and network analytics.

What we expect of you
Must-Have Skills:
Hands-on experience in Databricks, including PySpark, Delta Lake, and notebook-based development.
Hands-on experience with graph database platforms such as Stardog, Neo4j, or MarkLogic.
Strong understanding of graph theory, graph modeling, and traversal algorithms.
Proficiency in workflow orchestration and performance tuning for big data processing.
Strong understanding of AWS services.
Ability to quickly learn, adapt, and apply new technologies, with strong problem-solving and analytical skills.
Excellent collaboration and communication skills, with experience working with the Scaled Agile Framework (SAFe), Agile delivery practices, and DevOps practices.

Good-to-Have Skills:
Deep expertise in the biotech and pharma industries.
Experience in writing APIs to make data available to consumers.
Experienced with SQL/NoSQL databases and vector databases for large language models.
Experienced with data modeling and performance tuning for both OLAP and OLTP databases.
Experienced with software engineering best practices, including but not limited to version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven, etc.), automated unit testing, and DevOps.

Education and Professional Certifications:
Master's degree and 3 to 4+ years of Computer Science, IT, or related field experience OR Bachelor's degree and 5 to 8+ years of Computer Science, IT, or related field experience.
AWS Certified Data Engineer preferred.
Databricks Certificate preferred.
Scaled Agile SAFe certification preferred.

Soft Skills:
Excellent analytical and troubleshooting skills.
Strong verbal and written communication skills.
Ability to work effectively with global, virtual teams.
High degree of initiative and self-motivation.
Ability to manage multiple priorities successfully.
Team-oriented, with a focus on achieving team goals.
Ability to learn quickly, be organized and detail-oriented.
Strong presentation and public speaking skills.

What you can expect of us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way.
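Illustrative sketch (not part of the posting's requirements): how graph query logic of the kind described above might be exercised from Python using the official Neo4j driver and a Cypher traversal. The graph schema (Compound, Trial, Gene labels and their relationships), connection details, and credentials are assumptions for illustration only.

```python
# Minimal sketch: run a Cypher traversal through a hypothetical graph schema
# with the neo4j Python driver. URI, credentials, labels, and relationships
# are placeholders, not a real deployment.
from neo4j import GraphDatabase

URI = "bolt://localhost:7687"   # assumed local instance
AUTH = ("neo4j", "password")    # placeholder credentials

query = """
MATCH (c:Compound)-[:STUDIED_IN]->(t:Trial)-[:TARGETS]->(g:Gene {symbol: $gene})
RETURN c.name AS compound, t.phase AS phase
ORDER BY t.phase DESC
"""

with GraphDatabase.driver(URI, auth=AUTH) as driver:
    with driver.session() as session:
        # Parameterized query keeps the traversal reusable across targets.
        for record in session.run(query, gene="EGFR"):
            print(record["compound"], record["phase"])
```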
Posted 1 week ago
4.0 - 6.0 years
40 - 45 Lacs
Hyderabad
Work from Office
Let's do this. Let's change the world. In this vital role, we are looking for a highly motivated, expert Senior Data Engineer who can own the design and development of complex data pipelines, solutions, and frameworks. The ideal candidate will be responsible for designing, developing, and optimizing data pipelines, data integration frameworks, and metadata-driven architectures that enable seamless data access and analytics. This role requires deep expertise in big data processing, distributed computing, data modeling, and governance frameworks to support self-service analytics, AI-driven insights, and enterprise-wide data management.

Roles & Responsibilities:
Design, develop, and maintain scalable ETL/ELT pipelines to support structured, semi-structured, and unstructured data processing across the Enterprise Data Fabric.
Implement real-time and batch data processing solutions, integrating data from multiple sources into a unified, governed data fabric architecture.
Optimize big data processing frameworks using Apache Spark, Hadoop, or similar distributed computing technologies to ensure high availability and cost efficiency.
Work with metadata management and data lineage tracking tools to enable enterprise-wide data discovery and governance.
Ensure data security, compliance, and role-based access control (RBAC) across data environments.
Optimize query performance, indexing strategies, partitioning, and caching for large-scale data sets.
Develop CI/CD pipelines for automated data pipeline deployments, version control, and monitoring.
Implement data virtualization techniques to provide seamless access to data across multiple storage systems.
Collaborate with cross-functional teams, including data architects, business analysts, and DevOps teams, to align data engineering strategies with enterprise goals.
Stay up to date with emerging data technologies and best practices, ensuring continuous improvement of the Enterprise Data Fabric architecture.

What we expect of you
We are all different, yet we all use our unique contributions to serve patients.

Basic Qualifications:
Master's degree and 4 to 6 years of Computer Science, IT, or related field experience OR Bachelor's degree and 6 to 8 years of Computer Science, IT, or related field experience.
AWS Certified Data Engineer preferred.
Databricks Certificate preferred.
Scaled Agile SAFe certification preferred.

Preferred Qualifications:
Must-Have Skills:
Hands-on experience in data engineering technologies such as Databricks, PySpark, SparkSQL, Apache Spark, AWS, Python, SQL, and Scaled Agile methodologies.
Proficiency in workflow orchestration and performance tuning for big data processing.
Strong understanding of AWS services.
Experience with Data Fabric, Data Mesh, or similar enterprise-wide data architectures.
Ability to quickly learn, adapt, and apply new technologies.
Strong problem-solving and analytical skills.
Excellent communication and collaboration skills.
Experience with the Scaled Agile Framework (SAFe), Agile delivery practices, and DevOps practices.

Good-to-Have Skills:
Deep expertise in the biotech and pharma industries.
Experience in writing APIs to make data available to consumers.
Experienced with SQL/NoSQL databases and vector databases for large language models.
Experienced with data modeling and performance tuning for both OLAP and OLTP databases.
Experienced with software engineering best practices, including but not limited to version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven, etc.), automated unit testing, and DevOps.

Soft Skills:
Excellent analytical and troubleshooting skills.
Strong verbal and written communication skills.
Ability to work effectively with global, virtual teams.
High degree of initiative and self-motivation.
Ability to manage multiple priorities successfully.
Team-oriented, with a focus on achieving team goals.
Ability to learn quickly, be organized and detail-oriented.
Strong presentation and public speaking skills.

What you can expect of us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.
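Illustrative sketch (not part of the posting's requirements): a minimal batch ETL step of the kind described above, written in PySpark against a Delta-enabled environment such as Databricks. Paths, column names, and the partitioning choice are assumptions.

```python
# Minimal sketch: ingest raw CSV, apply light conformance, and append to a
# governed Delta table. Landing/curated paths and columns are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("edf_batch_ingest").getOrCreate()

raw = (spark.read.format("csv")
       .option("header", "true")
       .load("/mnt/raw/clinical_events/"))        # hypothetical landing zone

conformed = (raw
             .withColumn("event_date", F.to_date("event_date", "yyyy-MM-dd"))
             .withColumn("ingest_ts", F.current_timestamp())
             .dropDuplicates(["event_id"]))       # basic de-duplication step

(conformed.write.format("delta")
 .mode("append")
 .partitionBy("event_date")
 .save("/mnt/curated/clinical_events/"))          # hypothetical curated zone
```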
Posted 1 week ago
2.0 - 7.0 years
40 - 45 Lacs
Hyderabad
Work from Office
In this vital role you will be responsible for designing, building, maintaining, analyzing, and interpreting data to provide actionable insights that drive business decisions. This role involves working with large datasets, developing reports, supporting and executing data initiatives, and visualizing data to ensure data is accessible, reliable, and efficiently managed. The ideal candidate has strong technical skills, experience with big data technologies, and a deep understanding of data architecture and ETL processes.

Roles & Responsibilities:
Design, build, and support data ingestion, transformation, and delivery pipelines across structured and unstructured sources within the enterprise data engineering environment.
Manage and monitor day-to-day operations of the data engineering environment, ensuring high availability, performance, and data integrity.
Collaborate with data architects, data governance, platform engineering, and business teams to support data integration use cases across R&D, Clinical, Regulatory, and Commercial functions.
Integrate data from laboratory systems, clinical platforms, regulatory systems, and third-party data sources into enterprise data repositories.
Implement and maintain metadata capture, data lineage, and data quality checks across pipelines to meet governance and compliance requirements.
Support real-time and batch data flows using technologies such as Databricks, Kafka, Delta Lake, or similar.
Work within GxP-aligned environments, ensuring compliance with data privacy, audit, and quality control standards.
Partner with data stewards and business analysts to support self-service data access, reporting, and analytics enablement.
Maintain operational documentation, runbooks, and process automation scripts for continuous improvement of data fabric operations.
Participate in incident resolution and root cause analysis, ensuring timely and effective remediation of data pipeline issues.
Create documentation, playbooks, and best practices for metadata ingestion, data lineage, and catalog usage.
Work in an Agile and Scaled Agile (SAFe) environment, collaborating with cross-functional teams, product owners, and Scrum Masters to deliver incremental value.
Use JIRA, Confluence, and Agile DevOps tools to manage sprints, backlogs, and user stories.
Support continuous improvement, test automation, and DevOps practices in the data engineering lifecycle.
Collaborate and communicate effectively with product teams and cross-functional teams to understand business requirements and translate them into technical solutions.

Must-Have Skills:
Experience building and maintaining data pipelines to ingest and update metadata into enterprise data catalog platforms in biotech, life sciences, or pharma.
Hands-on experience in data engineering technologies such as Databricks, PySpark, SparkSQL, Apache Spark, AWS, Python, SQL, and Scaled Agile methodologies.
Proficiency in workflow orchestration and performance tuning for big data processing.
Experience in data engineering, data operations, or related roles, with at least 2+ years in life sciences, biotech, or pharmaceutical environments.
Experience with cloud platforms (e.g., AWS, Azure, or GCP) for data pipeline and storage solutions.
Understanding of data governance frameworks, metadata management, and data lineage tracking.
Strong problem-solving skills, attention to detail, and ability to manage multiple priorities in a dynamic environment.
Effective communication and collaboration skills to work across technical and business stakeholders.
Strong problem-solving and analytical skills.
Excellent communication and teamwork skills.
Experience with the Scaled Agile Framework (SAFe), Agile delivery practices, and DevOps practices.

Preferred Qualifications:
Data engineering experience in the biotechnology or pharma industry.
Experience in writing APIs to make data available to consumers.
Experienced with SQL/NoSQL databases and vector databases for large language models.
Experienced with data modeling and performance tuning for both OLAP and OLTP databases.
Experienced with software engineering best practices, including but not limited to version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven, etc.), automated unit testing, and DevOps.

Basic Qualifications:
Master's degree and 3 to 4+ years of Computer Science, IT, or related field experience OR Bachelor's degree and 5 to 8+ years of Computer Science, IT, or related field experience OR Diploma and 7 to 9 years of Computer Science, IT, or related field experience.

Professional Certifications:
AWS Certified Data Engineer preferred.
Databricks Certificate preferred.
Scaled Agile SAFe certification preferred.

Soft Skills:
Excellent verbal and written communication skills.
High degree of professionalism and interpersonal skills.
Excellent critical-thinking and problem-solving skills.
Strong communication and collaboration skills.
Demonstrated awareness of how to function in a team setting.
Demonstrated presentation skills.

What you can expect of us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.
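Illustrative sketch (not part of the posting's requirements): a lightweight data quality check of the sort a pipeline might run before publishing, recording a snapshot to an audit table for governance reporting. Table names, columns, and the audit schema are assumptions.

```python
# Minimal sketch: compute simple data quality metrics with PySpark and append
# them to an audit table. "curated.clinical_events" and "governance.dq_audit"
# are hypothetical table names.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dq_checks").getOrCreate()

df = spark.read.table("curated.clinical_events")

checks = df.agg(
    F.count("*").alias("row_count"),
    F.sum(F.col("subject_id").isNull().cast("int")).alias("null_subject_ids"),
    F.countDistinct("event_id").alias("distinct_event_ids"),
).withColumn("check_ts", F.current_timestamp())

# Append the snapshot so quality trends stay queryable over time.
checks.write.mode("append").saveAsTable("governance.dq_audit")
```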
Posted 1 week ago
9.0 - 12.0 years
20 - 25 Lacs
Hyderabad
Work from Office
We are looking for an experienced SAP Master Data Governance (MDG) Product Owner to co-design and drive the implementation of SAP Master Data Governance (MDG-Finance) solutions. In this role, you will architect scalable, innovative systems and provide expert technical guidance, configuration, development, and maintenance that align with Amgen's strategic objectives. You will collaborate closely with the MDG Product Owner, MDM Technical Lead, other SAP S/4 functional and technical architects, and other functional MDM teams to implement, enhance, and optimize MDG master data replications and integrations, ensuring SAP MDG delivers maximum value across the organization.

Roles & Responsibilities:
Collaborate with business collaborators to understand data governance requirements and translate them into effective MDG solutions.
Design, configure, and implement SAP MDG solutions for Finance data domains (MDG-GL Account, MDG-Cost Center, MDG-Profit Center, MDG-CC Hierarchies, FRS (Financial Reporting Structure & Item), Cost Element, and Internal Orders).
Provide technical leadership and guidance to development teams, ensuring alignment with best practices and standards.
Configure and customize SAP MDG on SAP S/4HANA in accordance with the MDM strategy.
Develop and maintain data models, workflows, and business rules within the MDG framework.
Collaborate with multi-functional teams to integrate MDG with other SAP modules and external systems.
Ensure compliance with data governance policies and standards.
Participate in project planning, estimation, and risk assessment.
Mentor junior team members and contribute to knowledge sharing.
Create comprehensive technical documentation, including design specifications, architecture diagrams, and user guides.
Conduct training sessions for key partners and end-users as needed.
Follow Agile software development methods to design, build, implement, and deploy.

What we expect of you
We are all different, yet we all use our unique contributions to serve patients.

Basic Qualifications:
9 to 12 years of Business, Engineering, IT, or related field experience.
Expertise in the implementation of SAP MDG solutions (configuration, design, build, test, and deploy) for Finance domains such as MDG-GL Account, MDG-Cost Center, MDG-Profit Center, and MDG-CC Hierarchies.
Experience working with SAP FI/CO, MM, SD, or PM.
Deep understanding of key SAP MDG concepts: data modeling, UI modeling, process modeling, governance process, mass processing, DRF, DIF, BRF+, and consolidation features plus DQM.
Experience in configuring rule-based workflows (serial, parallel, and combination) and user interface modeling.

Preferred Qualifications:
Expertise in the implementation of SAP MDG solutions for master data objects such as Material, BP (Vendor and Customer), custom data models, etc.
Experience with SAP UI technologies: FIORI, WebDynpro ABAP, WEBUI, FPM, and UI5.
Experience working in a SAFe development environment.
Understanding of enterprise data strategy, data governance, and data infrastructure.
Experience in integration techniques such as ALE and SOA.
Experience with collaborator management, ensuring seamless coordination across teams and driving the successful delivery of technical projects.
Experience with SAP Business Technology Platform (BTP), DI (Data Intelligence), CPI (Cloud Platform Integration), and SAP Datasphere.
Familiarity with AWS, Azure, or Google Cloud.

Professional Certifications:
SAP MDG certification (preferred).
SAP MM or SD Certification (preferred).
SAP ABAP Certification (preferred).
Agile Certified Practitioner (preferred).

Soft Skills:
Strong analytical abilities to assess and improve master data processes and solutions.
Excellent verbal and written communication skills, with the ability to convey complex data concepts clearly to technical and non-technical collaborators.
Effective problem-solving skills to address data-related issues and implement scalable solutions.
Ability to work effectively with global, virtual teams.

What you can expect of us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.
Posted 1 week ago
12.0 - 17.0 years
20 - 25 Lacs
Hyderabad
Work from Office
Let's do this. Let's change the world. In this vital role you will be responsible for developing and maintaining the overall IT architecture of the organization. This role involves defining the architecture vision, creating roadmaps, and ensuring that IT strategies align with business goals. You will work closely with collaborators to understand requirements, develop architectural blueprints, and ensure that solutions are scalable, secure, and aligned with enterprise standards. Architects will be involved in defining the enterprise architecture strategy, guiding technology decisions, and ensuring that all IT projects adhere to established architectural principles.

Roles & Responsibilities:
Develop and maintain the enterprise architecture vision and strategy, ensuring alignment with business objectives.
Create and maintain architectural roadmaps that guide the evolution of IT systems and capabilities.
Establish and enforce architectural standards, policies, and governance frameworks.
Evaluate emerging technologies and assess their potential impact on the enterprise/domain/solution architecture.
Identify and mitigate architectural risks, ensuring that IT systems are scalable, secure, and resilient.
Maintain comprehensive documentation of the architecture, including principles, standards, and models.
Drive continuous improvement in the architecture by identifying opportunities for innovation and efficiency.
Work with collaborators to gather and analyze requirements, ensuring that solutions meet both business and technical needs.
Evaluate and recommend technologies and tools that best fit the solution requirements.
Ensure seamless integration between systems and platforms, both within the organization and with external partners.
Design systems that can scale to meet growing business needs and performance demands.
Develop and maintain logical, physical, and conceptual data models to support business needs.
Establish and enforce data standards, governance policies, and best practices.
Design and manage metadata structures to enhance information retrieval and usability.
Contribute to a program vision while advising and articulating program/project strategies on enabling technologies.
Provide guidance on application and integration development best practices, Enterprise Architecture standards, functional and technical solution architecture and design, environment management, testing, and platform education.
Drive the creation of application and technical design standards that leverage best practices and effectively integrate Salesforce into Amgen's infrastructure.
Troubleshoot key product team implementation issues and demonstrate the ability to drive them to successful resolution.
Lead the evaluation of business and technical requirements from a senior level.
Review releases and roadmaps from Salesforce and evaluate the impacts to current applications, orgs, and solutions.
Proactively identify and manage risk areas, with a commitment to seeing an issue through to complete resolution.
Negotiate solutions to complex problems with both the product teams and third-party service providers.
Build relationships and work with product teams; contribute to broader goals and growth beyond the scope of your current project.

What we expect of you
We are all different, yet we all use our unique contributions to serve patients.

Basic Qualifications:
Doctorate, Master's, or Bachelor's degree and 12 to 17 years of experience.
Experience with SFDC Service Cloud/Health Cloud in a call center environment.
Strong architectural design and modeling skills.
Extensive knowledge of enterprise architecture frameworks and methodologies.
Experience with system integration and IT infrastructure.
Experience directing solution design, business process redesign, and aligning business requirements to technical solutions in a regulated environment.
Experience working in agile methodology, including Product Teams and Product Development models.
Extensive hands-on technical and solution implementation experience with the Salesforce Lightning Platform, Sales Cloud, and Service Cloud, demonstrating positions of increasing responsibility and management/mentoring of more junior technical resources.
Demonstrable experience and ability to develop custom-configured, Visualforce, and Lightning applications on the platform.
Demonstrable knowledge of the capabilities and features of Service Cloud and Sales Cloud.
Demonstrable ability to analyze, design, and optimize business processes via technology and integration, including leadership in guiding customers and colleagues in rationalizing and deploying emerging technology for business use cases.
A thorough understanding of web services, data modeling, and enterprise application integration concepts, including experience with enterprise integration tools (ESBs and/or ETL tools) and common integration design patterns with enterprise systems (e.g., CMS, ERP, HRIS, DWH/DM).
Demonstrably excellent, context-specific, and adaptive communication and presentation skills across a variety of audiences and situations; an established habit of proactive thinking and behavior and the desire and ability to self-start, learn, and apply new technologies.

Preferred Qualifications:
Strong solution design and problem-solving skills.
Solid understanding of technology, function, or platform.
Experience in developing differentiated and deliverable solutions.
Ability to analyze client requirements and translate them into solutions.

Professional Certifications:
Salesforce Admin, Advanced Admin, Platform Builder, Salesforce Application Architect (mandatory).

Soft Skills:
Excellent critical-thinking and problem-solving skills.
Good communication and collaboration skills.
Demonstrated awareness of how to function in a team setting.
Demonstrated presentation skills.

What you can expect of us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.
Posted 1 week ago
8.0 - 10.0 years
20 - 25 Lacs
Hyderabad
Work from Office
Let's do this. Let's change the world. In this vital role you will be responsible for developing and maintaining the overall IT architecture of the organization. This role involves defining the architecture vision, creating roadmaps, and ensuring that IT strategies align with business goals. You will work closely with collaborators to understand requirements, develop architectural blueprints, and ensure that solutions are scalable, secure, and aligned with enterprise standards. Architects will be involved in defining the enterprise architecture strategy, guiding technology decisions, and ensuring that all IT projects adhere to established architectural principles.

Roles & Responsibilities:
Develop and maintain the enterprise architecture vision and strategy, ensuring alignment with business objectives.
Create and maintain architectural roadmaps that guide the evolution of IT systems and capabilities.
Establish and enforce architectural standards, policies, and governance frameworks.
Evaluate emerging technologies and assess their potential impact on the enterprise/domain/solution architecture.
Identify and mitigate architectural risks, ensuring that IT systems are scalable, secure, and resilient.
Maintain comprehensive documentation of the architecture, including principles, standards, and models.
Drive continuous improvement in the architecture by identifying opportunities for innovation and efficiency.
Work with collaborators to gather and analyze requirements, ensuring that solutions meet both business and technical needs.
Evaluate and recommend technologies and tools that best fit the solution requirements.
Ensure seamless integration between systems and platforms, both within the organization and with external partners.
Design systems that can scale to meet growing business needs and performance demands.
Develop and maintain logical, physical, and conceptual data models to support business needs.
Establish and enforce data standards, governance policies, and best practices.
Design and manage metadata structures to enhance information retrieval and usability.
Contribute to a program vision while advising and articulating program/project strategies on enabling technologies.
Provide guidance on application and integration development best practices, Enterprise Architecture standards, functional and technical solution architecture and design, environment management, testing, and platform education.
Drive the creation of application and technical design standards that leverage best practices and effectively integrate Salesforce into Amgen's infrastructure.
Troubleshoot key product team implementation issues and demonstrate the ability to drive them to successful resolution.
Lead the evaluation of business and technical requirements from a senior level.
Review releases and roadmaps from Salesforce and evaluate the impacts to current applications, orgs, and solutions.
Proactively identify and manage risk areas, with a commitment to seeing an issue through to complete resolution.
Negotiate solutions to complex problems with both the product teams and third-party service providers.
Build relationships and work with product teams; contribute to broader goals and growth beyond the scope of your current project.

What we expect of you
We are all different, yet we all use our unique contributions to serve patients.

Basic Qualifications:
Master's degree with 8 to 10 years of experience in Computer Science, IT, or a related field OR Bachelor's degree with 10 to 14 years of experience in Computer Science, IT, or a related field OR Diploma with 14 to 18 years of experience in Computer Science, IT, or a related field.
Experience with SFDC Service Cloud/Health Cloud in a call center environment.
Strong architectural design and modeling skills.
Extensive knowledge of enterprise architecture frameworks and methodologies.
Experience with system integration and IT infrastructure.
Experience directing solution design, business process redesign, and aligning business requirements to technical solutions in a regulated environment.
Experience working in agile methodology, including Product Teams and Product Development models.
Extensive hands-on technical and solution implementation experience with the Salesforce Lightning Platform, Sales Cloud, and Service Cloud, demonstrating positions of increasing responsibility and management/mentoring of more junior technical resources.
Demonstrable experience and ability to develop custom-configured, Visualforce, and Lightning applications on the platform.
Demonstrable knowledge of the capabilities and features of Service Cloud and Sales Cloud.
Demonstrable ability to analyze, design, and optimize business processes via technology and integration, including leadership in guiding customers and colleagues in rationalizing and deploying emerging technology for business use cases.
A thorough understanding of web services, data modeling, and enterprise application integration concepts, including experience with enterprise integration tools (ESBs and/or ETL tools) and common integration design patterns with enterprise systems (e.g., CMS, ERP, HRIS, DWH/DM).
Demonstrably excellent, context-specific, and adaptive communication and presentation skills across a variety of audiences and situations; an established habit of proactive thinking and behavior and the desire and ability to self-start, learn, and apply new technologies.

Preferred Qualifications:
Strong solution design and problem-solving skills.
Solid understanding of technology, function, or platform.
Experience in developing differentiated and deliverable solutions.
Ability to analyze client requirements and translate them into solutions.

Professional Certifications:
Salesforce Admin, Advanced Admin, Platform Builder, Salesforce Application Architect (mandatory).

Soft Skills:
Excellent critical-thinking and problem-solving skills.
Good communication and collaboration skills.
Demonstrated awareness of how to function in a team setting.
Demonstrated presentation skills.

What you can expect of us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.
Posted 1 week ago
12.0 - 17.0 years
13 - 18 Lacs
Hyderabad
Work from Office
Let's do this. Let's change the world. We are looking for a highly motivated, expert Principal Data Engineer who can own the design and development of complex data pipelines, solutions, and frameworks. The ideal candidate will be responsible for designing, developing, and optimizing data pipelines, data integration frameworks, and metadata-driven architectures that enable seamless data access and analytics. This role requires deep expertise in big data processing, distributed computing, data modeling, and governance frameworks to support self-service analytics, AI-driven insights, and enterprise-wide data management.

Roles & Responsibilities:
Architect and maintain robust, scalable data pipelines using Databricks, Spark, and Delta Lake, enabling efficient batch and real-time processing.
Lead efforts to evaluate, adopt, and integrate emerging technologies and tools that enhance productivity, scalability, and data delivery capabilities.
Drive performance optimization efforts, including Spark tuning, resource utilization, job scheduling, and query improvements.
Identify and implement innovative solutions that streamline data ingestion, transformation, lineage tracking, and platform observability.
Build frameworks for metadata-driven data engineering, enabling reusability and consistency across pipelines.
Foster a culture of technical excellence, experimentation, and continuous improvement within the data engineering team.
Collaborate with platform, architecture, analytics, and governance teams to align platform enhancements with the enterprise data strategy.
Define and uphold SLOs, monitoring standards, and data quality KPIs for production pipelines and infrastructure.
Partner with cross-functional teams to translate business needs into scalable, governed data products.
Mentor engineers across the team, promoting knowledge sharing and adoption of modern engineering patterns and tools.
Collaborate with cross-functional teams, including data architects, business analysts, and DevOps teams, to align data engineering strategies with enterprise goals.
Stay up to date with emerging data technologies and best practices, ensuring continuous improvement of Enterprise Data Fabric architectures.

Must-Have Skills:
Hands-on experience in data engineering technologies such as Databricks, PySpark, SparkSQL, Apache Spark, AWS, Python, SQL, and Scaled Agile methodologies.
Proficiency in workflow orchestration and performance tuning for big data processing.
Strong understanding of AWS services.
Experience with Data Fabric, Data Mesh, or similar enterprise-wide data architectures.
Ability to quickly learn, adapt, and apply new technologies.
Strong problem-solving and analytical skills.
Excellent communication and teamwork skills.
Experience with the Scaled Agile Framework (SAFe), Agile delivery practices, and DevOps practices.

Good-to-Have Skills:
Deep expertise in the biotech and pharma industries.
Experience in writing APIs to make data available to consumers.
Experienced with SQL/NoSQL databases and vector databases for large language models.
Experienced with data modeling and performance tuning for both OLAP and OLTP databases.
Experienced with software engineering best practices, including but not limited to version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven, etc.), automated unit testing, and DevOps.

Education and Professional Certifications:
12 to 17 years of experience in Computer Science, IT, or a related field.
AWS Certified Data Engineer preferred.
Databricks Certificate preferred.
Scaled Agile SAFe certification preferred.

Soft Skills:
Excellent analytical and troubleshooting skills.
Strong verbal and written communication skills.
Ability to work effectively with global, virtual teams.
High degree of initiative and self-motivation.
Ability to manage multiple priorities successfully.
Team-oriented, with a focus on achieving team goals.
Ability to learn quickly, be organized and detail-oriented.
Strong presentation and public speaking skills.
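Illustrative sketch (not part of the posting's requirements): a few common Spark performance tuning levers mentioned above, namely adaptive query execution, broadcasting a small dimension to avoid shuffling a large fact table, and repartitioning before a wide write. Table names and the partition count are assumptions.

```python
# Minimal sketch: typical Spark tuning levers on hypothetical tables.
from pyspark.sql import SparkSession
from pyspark.sql.functions import broadcast

spark = SparkSession.builder.appName("tuning_sketch").getOrCreate()

# Let adaptive query execution coalesce shuffle partitions at runtime.
spark.conf.set("spark.sql.adaptive.enabled", "true")
spark.conf.set("spark.sql.adaptive.coalescePartitions.enabled", "true")

facts = spark.read.table("curated.sales_facts")   # large fact table (assumed)
sites = spark.read.table("reference.sites")       # small dimension (assumed)

# Broadcasting the small dimension avoids shuffling the large fact table.
joined = facts.join(broadcast(sites), "site_id")

# Repartition on the write key to keep output files balanced.
(joined.repartition(200, "region")
 .write.mode("overwrite")
 .partitionBy("region")
 .saveAsTable("analytics.sales_by_region"))
```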
Posted 1 week ago
8.0 - 10.0 years
12 - 17 Lacs
Hyderabad
Work from Office
Let's do this. Let's change the world. We are seeking a seasoned Principal Data Engineer to lead the design, development, and implementation of our data strategy. The ideal candidate possesses a deep understanding of data engineering principles, coupled with strong leadership and problem-solving skills. As a Principal Data Engineer, you will architect and oversee the development of robust data platforms while mentoring and guiding a team of data engineers.

Roles & Responsibilities:
Possesses strong rapid prototyping skills and can quickly translate concepts into working code.
Provide expert guidance and mentorship to the data engineering team, fostering a culture of innovation and standard methodologies.
Design, develop, and implement robust data architectures and platforms to support business objectives.
Oversee the development and optimization of data pipelines and data integration solutions.
Establish and maintain data governance policies and standards to ensure data quality, security, and compliance.
Architect and manage cloud-based data solutions, using AWS or other preferred platforms.
Lead and motivate an impactful data engineering team to deliver exceptional results.
Identify, analyze, and resolve complex data-related challenges.
Collaborate closely with business collaborators to understand data requirements and translate them into technical solutions.
Stay abreast of emerging data technologies and explore opportunities for innovation.

What we expect of you
We are all different, yet we all use our unique contributions to serve patients.

Basic Qualifications:
Master's degree and 8 to 10 years of experience in Computer Science and Engineering (other Engineering fields considered) OR Bachelor's degree and 10 to 14 years of experience in Computer Science and Engineering (other Engineering fields considered) OR Diploma and 14 to 18 years of experience in Computer Science and Engineering (other Engineering fields considered).
Demonstrated proficiency in using cloud platforms (AWS, Azure, GCP) for data engineering solutions.
Strong understanding of cloud architecture principles and cost optimization strategies.
Proficient in Python, PySpark, and SQL.
Hands-on experience with big data ETL performance tuning.
Proven ability to lead and develop impactful data engineering teams.
Strong problem-solving, analytical, and critical thinking skills to address complex data challenges.

Preferred Qualifications:
Experienced with data modeling and performance tuning for both OLAP and OLTP databases.
Experienced with Apache Spark and Apache Airflow.
Experienced with software engineering best practices, including but not limited to version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven, etc.), automated unit testing, and DevOps.
Experienced with AWS, GCP, or Azure cloud services.

Professional Certifications:
AWS Certified Data Engineer preferred.
Databricks Certificate preferred.

Soft Skills:
Excellent analytical and troubleshooting skills.
Strong verbal and written communication skills.
Ability to work effectively with global, virtual teams.
High degree of initiative and self-motivation.
Ability to manage multiple priorities successfully.
Team-oriented, with a focus on achieving team goals.
Strong presentation and public speaking skills.

What you can expect of us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way.
In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.
Posted 1 week ago
2.0 - 7.0 years
11 - 15 Lacs
Hyderabad
Work from Office
Let s do this. Let s change the world. In this vital role you will drive the development and implementation of our data strategy. The ideal candidate possesses a strong blend of technical expertise and data-driven problem-solving skills. As a Senior Data Engineer, you will play a crucial role in designing, building, and optimizing our data pipelines and platforms while mentoring junior engineers. Roles & Responsibilities: Contribute to the design, development, and implementation of data pipelines, ETL/ELT processes, and data integration solutions. Take ownership of data pipeline projects from inception to deployment, managing scope, timelines, and risks. Ensure data quality and integrity through rigorous testing and monitoring. Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions. Work closely with data analysts, data scientists, and business collaborators to understand data requirements. Identify and resolve complex data-related challenges. Adhere to data engineering best practices and standards. Experience developing in an Agile development environment, and comfortable with Agile terminology and ceremonies. Familiarity with code versioning using GIT, Jenkins and code migration tools. Exposure to Jira or Rally. Identifying and implementing opportunities for automation and CI/CD. Stay up to date with the latest data technologies and trends. What we expect of you We are all different, yet we all use our unique contributions to serve patients. Basic Qualifications: Doctorate degree and 2 years of Computer Science, IT or related field experience OR Master s degree and 8 to 10 years of Computer Science, IT or related field experience OR Bachelor s degree and 10 to 14 years of Computer Science, IT or related field experience OR Diploma and 14 to 18 years of Computer Science, IT or related field experience Preferred Qualifications: Functional Skills: Must-Have Skills (Not more than 3 to 4): Demonstrated hands-on experience with cloud platforms (AWS, Azure, GCP) and the ability to architect cost-effective and scalable data solutions. Proficiency in Python, PySpark, SQL. Hands on experience with big data ETL performance tuning. Strong development knowledge in Databricks. Strong analytical and problem-solving skills to address complex data challenges. Good-to-Have Skills: Experienced with data modeling and performance tuning for both OLAP and OLTP databases Experienced working with Apache Spark, Apache Airflow Experienced with software engineering best-practices, including but not limited to version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven etc.), automated unit testing, and Dev Ops Experience in SQL/NOSQL database, vector database for large language models Experience with prompt engineering, model fine tuning Experience with DevOps/MLOps CICD build and deployment pipeline Professional Certifications (please mention if the certification is preferred or mandatory for the role): AWS Certified Data Engineer (preferred) Databricks Certification (preferred) Any SAFe Agile certification (preferred) Soft Skills: Initiative to explore alternate technology and approaches to solving problems. Skilled in breaking down problems, documenting problem statements, and estimating efforts. Effective communication and interpersonal skills to collaborate with multi-functional teams. Excellent analytical and solving skills. Strong verbal and written communication skills Ability to work effectively with global, virtual teams High degree of initiative and self-motivation. 
Ability to manage multiple priorities successfully. Team-oriented, with a focus on achieving team goals. Strong presentation and public speaking skills.

What you can expect of us: As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.
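For illustration only, a minimal PySpark sketch of the kind of ETL step and data-quality gate this role describes; the paths, column names, and null-rate threshold are hypothetical, not Amgen's actual pipelines.

```python
# Minimal ETL sketch: read raw data, standardize, apply a basic quality gate, write curated output.
# Paths, column names, and the null-rate threshold are hypothetical examples.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("curate_orders").getOrCreate()

raw = spark.read.parquet("s3://example-bucket/raw/orders/")  # hypothetical source
clean = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_date", F.to_date("order_date"))
       .filter(F.col("amount") > 0)
)

# Simple data-quality check: fail the job if too many order ids are null.
total = clean.count()
null_ids = clean.filter(F.col("order_id").isNull()).count()
if total == 0 or null_ids / total > 0.01:
    raise ValueError(f"Data-quality gate failed: {null_ids}/{total} null order_ids")

clean.write.mode("overwrite").parquet("s3://example-bucket/curated/orders/")  # hypothetical target
```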
Posted 1 week ago
1.0 - 3.0 years
5 - 9 Lacs
Hyderabad
Work from Office
Let's do this. Let's change the world. In this vital role, you will be responsible for designing, building, maintaining, analyzing, and interpreting data to deliver actionable insights that drive business decisions. This role involves working with large datasets, developing reports, supporting and implementing data governance initiatives, and visualizing data to ensure data is accessible, reliable, and efficiently managed. The ideal candidate has deep technical skills, experience with big data technologies, and a deep understanding of data architecture and ETL processes.

Roles & Responsibilities: Design, develop, and maintain data solutions for data generation, collection, and processing. Contribute to the design, development, and implementation of data pipelines, ETL/ELT processes, and data integration solutions. Develop and maintain front-end applications using HTML, CSS, and JavaScript frameworks (React, Angular). Build and maintain back-end services using languages like Python, Java, or Node.js. Collaborate with the design and product teams to understand user needs and translate them into technical requirements. Write clean, efficient, and well-tested code. Participate in code reviews and provide constructive feedback. Maintain system uptime and optimal performance. Learn and adapt to new technologies and industry trends. Collaborate and communicate effectively with product teams. Participate in sprint planning meetings and provide estimations on technical implementation.

What we expect of you: We are all different, yet we all use our unique contributions to serve patients.

Basic Qualifications: Master's degree and 1 to 3 years of Computer Science, IT or related field experience OR Bachelor's degree and 3 to 5 years of Computer Science, IT or related field experience OR Diploma and 7 to 9 years of Computer Science, IT or related field experience. Cloud Computing certificate preferred.

Preferred Qualifications: Hands-on experience with web development; proficient with HTML, CSS, and JavaScript. Hands-on experience with back-end development; proficient with SQL/NoSQL databases, and proficient in Python and SQL. Ability to learn new technologies quickly. Strong problem-solving and analytical skills. Good communication and teamwork skills.

Soft Skills: Excellent analytical and troubleshooting skills. Strong verbal and written communication skills. Ability to work effectively with global, virtual teams. High degree of initiative and self-motivation. Ability to manage multiple priorities successfully. Team-oriented, with a focus on achieving team goals. Strong presentation and public speaking skills.

Good-to-Have Skills: Good understanding of data modeling, data warehousing, and data integration concepts. Certified Data Engineer / Data Analyst (preferred on Databricks or cloud environments). Machine Learning Certification (preferred on Databricks or cloud environments).

What you can expect of us: As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.
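As a hedged illustration of the back-end plus SQL work described above, here is a minimal Python endpoint that serves rows from a relational table. The framework choice (FastAPI), database file, table, and route names are assumptions for the sketch, not requirements of the role.

```python
# Minimal back-end sketch: a FastAPI endpoint that serves rows from a relational table.
# Database file, table, and route names are hypothetical examples.
import sqlite3
from fastapi import FastAPI

app = FastAPI()
DB_PATH = "example.db"  # hypothetical SQLite database

@app.get("/metrics/{region}")
def get_metrics(region: str):
    # Parameterized query avoids SQL injection.
    with sqlite3.connect(DB_PATH) as conn:
        conn.row_factory = sqlite3.Row
        rows = conn.execute(
            "SELECT metric_name, metric_value FROM region_metrics WHERE region = ?",
            (region,),
        ).fetchall()
    return [dict(row) for row in rows]

# Run locally (assuming uvicorn is installed): uvicorn app:app --reload
```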
Posted 1 week ago
1.0 - 3.0 years
11 - 14 Lacs
Hyderabad
Work from Office
Let's do this. Let's change the world. In this vital role, you will be responsible for designing, building, maintaining, analyzing, and interpreting data to deliver actionable insights that drive business decisions. This role involves working with large datasets, developing reports, supporting and driving data governance initiatives, and visualizing data to ensure data is accessible, reliable, and efficiently managed. The ideal candidate has deep technical skills and provides administration support for the Master Data Management (MDM) and Data Quality platform, including solution architecture, inbound/outbound data integration (ETL), Data Quality (DQ), and maintenance/tuning of match rules.

Roles & Responsibilities: Design, develop, and maintain data solutions for data generation, collection, and processing. Collaborate and communicate with MDM Developers, Data Architects, Product teams, Business SMEs, and Data Scientists to design and develop end-to-end data pipelines that meet fast-paced business needs across geographic regions. Identify and resolve complex data-related challenges. Adhere to standard processes for coding, testing, and designing reusable code/components. Participate in sprint planning meetings and provide estimations on technical implementation. As an SME, work with the team on MDM-related product installation, configuration, customization, and optimization. Be responsible for the understanding, documentation, maintenance, and further creation of master-data-related data models (conceptual, logical, and physical) and database structures. Review technical model specifications and participate in data quality testing. Collaborate with the Data Quality & Governance Analyst and the Data Governance Organization to monitor and preserve master data quality. Create and maintain system-specific master data dictionaries for domains in scope. Architect MDM solutions, including data modeling and data source integrations, from proof of concept through development and delivery. Develop the architectural design for Master Data Management domain development, base object integration with other systems, and general solutions related to Master Data Management. Develop and deliver solutions individually or as part of a development team. Approve code reviews and technical work. Maintain compliance with change control, SDLC, and development standards. Contribute to the design, development, and implementation of data pipelines, ETL/ELT processes, and data integration solutions. Collaborate with multi-functional teams to understand data requirements and design solutions that meet business needs. Implement data security and privacy measures to protect sensitive data. Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions.

What we expect of you: We are all different, yet we all use our unique contributions to serve patients.

Basic Qualifications: Master's degree and 1 to 3 years of Computer Science, IT or related field experience OR Bachelor's degree and 3 to 5 years of Computer Science, IT or related field experience OR Diploma and 7 to 9 years of Computer Science, IT or related field experience.

Preferred Qualifications: Expertise in architecting and designing Master Data Management (MDM) solutions. Practical experience with AWS Cloud, Databricks, Apache Spark, workflow orchestration, and optimizing big data processing performance. Familiarity with enterprise source systems and consumer systems for master and reference data, such as CRM, ERP, and Data Warehouse/Business Intelligence.
At least 2 to 3 years of experience as an MDM developer using Informatica MDM or Reltio MDM, along with strong proficiency in SQL.

Good-to-Have Skills: Experience with ETL tools such as Apache Spark, and various Python packages for data processing and machine learning model development. Good understanding of data modeling, data warehousing, and data integration concepts. Experience with development using Python, React JS, and cloud data platforms. Certified Data Engineer / Data Analyst (preferred on Databricks or cloud environments).

Soft Skills: Excellent critical-thinking and problem-solving skills. Good communication and collaboration skills. Demonstrated awareness of how to function in a team setting. Team-oriented, with a focus on achieving team goals. Strong presentation and public speaking skills.

What you can expect of us: As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.
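To give a flavor of the match-rule work mentioned above, here is a simplified PySpark sketch that blocks candidate customer records and scores name similarity. This is a generic approximation with hypothetical column names and thresholds, not Informatica MDM or Reltio MDM configuration.

```python
# Simplified candidate-matching sketch for master data records.
# Column names and thresholds are hypothetical; real MDM tools use configurable match rules.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("mdm_match_sketch").getOrCreate()

customers = spark.read.parquet("s3://example-bucket/mdm/customers/")  # hypothetical source

a = customers.alias("a")
b = customers.alias("b")

# Block on postal code to limit comparisons, then score name similarity with edit distance.
pairs = (
    a.join(b, F.col("a.postal_code") == F.col("b.postal_code"))
     .filter(F.col("a.customer_id") < F.col("b.customer_id"))
     .withColumn("name_distance", F.levenshtein(F.col("a.full_name"), F.col("b.full_name")))
)

# Candidate duplicates: small edit distance between names within the same block.
candidate_matches = pairs.filter(F.col("name_distance") <= 2) \
    .select("a.customer_id", "b.customer_id", "name_distance")

candidate_matches.show(truncate=False)
```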
Posted 1 week ago
8.0 - 13.0 years
11 - 15 Lacs
Hyderabad
Work from Office
Let's do this. Let's change the world. In this vital role, you will lead and scale an impactful team of data engineers. This role blends technical depth with strategic oversight and people leadership. The ideal candidate will oversee the execution of data engineering initiatives, collaborate with business analysts and multi-functional teams, manage resource capacity, and ensure delivery aligned to business priorities. In addition to technical competence, the candidate will be adept at managing agile operations and driving continuous improvement.

Roles & Responsibilities: Possess strong rapid prototyping skills and quickly translate concepts into working code. Provide expert guidance and mentorship to the data engineering team, fostering a culture of innovation and standard methodologies. Design, develop, and implement robust data architectures and platforms to support business objectives. Oversee the development and optimization of data pipelines and data integration solutions. Establish and maintain data governance policies and standards to ensure data quality, security, and compliance. Architect and manage cloud-based data solutions, using AWS or other preferred platforms. Lead and motivate an impactful data engineering team to deliver exceptional results. Identify, analyze, and resolve complex data-related challenges. Collaborate closely with business collaborators to understand data requirements and translate them into technical solutions. Stay abreast of emerging data technologies and explore opportunities for innovation. Lead and manage a team of data engineers, ensuring appropriate workload distribution, goal alignment, and performance management. Work closely with business analysts and product collaborators to prioritize and align engineering output with business objectives.

What we expect of you: We are all different, yet we all use our unique contributions to serve patients.

Basic Qualifications: Doctorate, Master's, or Bachelor's degree and 8 to 13 years of experience; Computer Science and Engineering preferred, other engineering fields considered. Demonstrated proficiency in using cloud platforms (AWS, Azure, GCP) for data engineering solutions. Strong understanding of cloud architecture principles and cost optimization strategies. Proficiency in Python, PySpark, and SQL. Hands-on experience with big data ETL performance tuning. Proven ability to lead and develop impactful data engineering teams. Strong problem-solving, analytical, and critical thinking skills to address complex data challenges. Strong communication skills for collaborating with business and technical teams alike.

Preferred Qualifications: Experienced with data modeling and performance tuning for both OLAP and OLTP databases. Experienced with Apache Spark and Apache Airflow. Experienced with software engineering best practices, including but not limited to version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven, etc.), automated unit testing, and DevOps. Experienced with AWS, GCP, or Azure cloud services.

Professional Certification: AWS Certified Data Engineer preferred. Databricks Certificate preferred.

Soft Skills: Excellent analytical and troubleshooting skills. Strong verbal and written communication skills. Ability to work effectively with global, virtual teams. High degree of initiative and self-motivation. Ability to manage multiple priorities successfully. Team-oriented, with a focus on achieving team goals. Strong presentation and public speaking skills.
What you can expect of us: As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.
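As a small, hedged illustration of the big data ETL performance tuning called out above, here is a PySpark broadcast-join sketch that avoids shuffling a large fact table when joining to a small dimension table; the dataset paths, columns, and partition count are hypothetical.

```python
# Performance-tuning sketch: broadcast a small dimension table to avoid a shuffle join.
# Paths, column names, and the partition count are hypothetical examples.
from pyspark.sql import SparkSession
from pyspark.sql.functions import broadcast

spark = SparkSession.builder.appName("tuning_sketch").getOrCreate()

facts = spark.read.parquet("s3://example-bucket/facts/transactions/")  # large table (hypothetical)
dims = spark.read.parquet("s3://example-bucket/dims/products/")        # small table (hypothetical)

# Broadcasting the small side keeps the large table from being shuffled across the cluster.
joined = facts.join(broadcast(dims), on="product_id", how="left")

# Repartition by a write key so output files are reasonably sized before writing.
joined.repartition(200, "transaction_date") \
      .write.mode("overwrite") \
      .partitionBy("transaction_date") \
      .parquet("s3://example-bucket/curated/transactions_enriched/")
```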
Posted 1 week ago
4.0 - 9.0 years
35 - 40 Lacs
Hyderabad
Work from Office
The Director of Data Strategy and Governance will operationalize Amgen's data governance vision across the enterprise to accelerate innovative AI solutions that better serve patients. The Director will be responsible for translating direction from the Enterprise Data Council into operational-level deliverables, data governance policies, and standards. The Director will partner with senior leadership to align data initiatives with business goals, with overall accountability for the Enterprise Data Governance program. Coordinate with data and process owners to interpret Enterprise Data Council objectives and principles and drive data governance across Amgen. Manage the team of Data Strategy & Governance Leads, each specializing in specific domains. Lead multi-functional Data Governance Forums. Drive compliance and create tactical-level implementation guides as necessary (GDPR, CCPA, etc.). Coordinate with the Enterprise Data Council and data and process owners to define and monitor metrics. Serve as the escalation point of contact for operational-level data and process issues; resolve or escalate data asset, process, and governance issues through interpretation of Enterprise Data Council objectives. Be responsible for rolling out and increasing adoption of the Enterprise Data Governance Framework, aligning the broader partner community around their data governance needs, including data quality, data access controls, compliance with privacy and security regulations, foundational master data management, data sharing, communication, and change management. Maintain documentation and ensure the organization remains the expert on data definitions, data standards, data flows, legacy data structures/hierarchies, common data models, data harmonization, etc. Identify areas for data governance improvement and help resolve data quality problems through the appropriate choice of error detection and correction, process control and improvement, or process design changes. Develop metrics to measure effectiveness and drive adoption of Data Governance policies and standards that will be applied to mitigate identified risks across the data lifecycle (e.g., capture/production, aggregation/processing, reporting/consumption). Establish enterprise-level standards on the nomenclature, content, and structure of information (structured and unstructured data), metadata, glossaries, and taxonomies. Jointly with the Technology team, business functions, and enterprise teams (e.g., MDM, Enterprise Data Fabric, etc.), define the specifications shaping the development and implementation of data foundations.

What we expect of you: We are all different, yet we all use our unique contributions to serve patients.

Basic Qualifications: Doctorate degree and 4 years of Information Systems experience OR Master's degree and 14 to 16 years of Information Systems experience OR Bachelor's degree and 16 to 18 years of Information Systems experience. 6 years of managerial experience directly managing people, and leadership experience leading teams, projects, or programs. Demonstrated leadership experience and demeanor to spearhead the strategy and implementation of information standards. Technical skills with in-depth knowledge of Pharma processes, with preferred specialization in a domain (e.g., Research, Clinical Trials, Commercial). Aware of industry trends and priorities and able to apply them to governance and policies.
In-depth knowledge of and experience with data governance principles and technology; able to design and implement Data Governance operating models that drive Amgen's transformation into a data-driven organization. In-depth knowledge of data management, common data models, metadata management, data quality, master data management, data stewardship, data protection, etc. Experience with the data product development life cycle, including the enablement of data dictionaries and business glossaries to increase data product reusability and data literacy.

Preferred Qualifications: Co-develop the data foundations and data products in collaboration with functions and Digital teams. Demonstrated willingness to make decisions and influence senior executives and multi-functional leaders. Ability to successfully implement complex projects in a fast-paced environment and to manage multiple priorities effectively. Ability to manage project or departmental budgets. Experience with modeling tools (e.g., Visio). Basic programming skills, and experience with data visualization and data modeling tools.

Soft Skills: Ability to build business relationships and understand end-to-end data use and needs. Good interpersonal skills (great teammate). People management skills in either a matrix or direct-line function. Strong verbal and written communication skills. Ability to work effectively with global, virtual teams. High degree of initiative and self-motivation. Ability to manage multiple priorities successfully. Team-oriented, with a focus on achieving team goals. Strong presentation and public speaking skills. Good attention to detail, quality, time management, and customer focus.

What you can expect of us: As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.
Posted 1 week ago
5.0 - 9.0 years
6 - 9 Lacs
Hyderabad
Work from Office
Let's do this. Let's change the world. In this vital role, you will be responsible for the development and maintenance of software in support of target/biomarker discovery at Amgen. Design, develop, and implement data pipelines, ETL/ELT processes, and data integration solutions. Contribute to data pipeline projects from inception to deployment, managing scope, timelines, and risks. Contribute to data models for biopharma scientific data, data dictionaries, and other documentation to ensure data accuracy and consistency. Optimize large datasets for query performance. Collaborate with global cross-functional teams, including research scientists, to understand data requirements and design solutions that meet business needs. Implement data security and privacy measures to protect sensitive data. Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions. Collaborate with Data Architects, Business SMEs, Software Engineers, and Data Scientists to design and develop end-to-end data pipelines that meet fast-paced business needs across geographic regions. Identify and resolve data-related challenges. Adhere to best practices for coding, testing, and designing reusable code/components. Explore new tools and technologies that will help improve ETL platform performance. Participate in sprint planning meetings and provide estimations on technical implementation. Maintain documentation of processes, systems, and solutions.

What we expect of you: We are all different, yet we all use our unique contributions to serve patients. The role requires proficiency in scientific software development (e.g., Python, R, R Shiny, Plotly Dash) and some knowledge of CI/CD processes and cloud computing technologies (e.g., AWS, Google Cloud).

Basic Qualifications: Master's degree/Bachelor's degree and 5 to 9 years of Computer Science, IT, Computational Chemistry, Computational Biology/Bioinformatics, or related field experience.

Preferred Qualifications: 5+ years of experience in designing and supporting biopharma scientific research data analytics software platforms.

Functional Skills: Must-Have Skills: Proficiency with SQL and Python for data engineering, test automation frameworks (pytest), and scripting tasks. Hands-on experience with big data technologies and platforms, such as Databricks (or equivalent), Apache Spark (PySpark, SparkSQL), workflow orchestration, and performance tuning on big data processing. Excellent problem-solving skills and the ability to work with large, complex datasets.

Good-to-Have Skills: Experience with Git, CI/CD, and the software development lifecycle. Experience with SQL and relational databases (e.g., PostgreSQL, MySQL, Oracle) or Databricks. Experience with cloud computing platforms and infrastructure (AWS preferred). Experience using and adopting Agile frameworks. A passion for tackling complex challenges in drug discovery with technology and data. Basic understanding of data modeling, data warehousing, and data integration concepts. Experience with data visualization tools (e.g., Dash, Plotly, Spotfire). Experience with diagramming and collaboration tools such as Miro, Lucidchart, or similar tools for process mapping and brainstorming. Experience writing and maintaining technical documentation in Confluence.

Professional Certifications: Databricks Certified Data Engineer Professional preferred.

Soft Skills: Excellent critical-thinking and problem-solving skills. Strong communication and collaboration skills. High degree of initiative and self-motivation.
Demonstrated presentation skills. Ability to manage multiple priorities successfully. Team-oriented, with a focus on achieving team goals.

What you can expect of us: As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.
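Because the role pairs Python data engineering with pytest-based test automation, here is a hedged sketch of a unit test for a small, hypothetical transformation function; the function name, columns, and expectations are illustrative assumptions only.

```python
# Hypothetical transformation plus a pytest unit test, illustrating test automation for data code.
import pandas as pd
import pytest


def normalize_assay_results(df: pd.DataFrame) -> pd.DataFrame:
    """Drop rows without a compound id and scale raw signal to [0, 1]."""
    out = df.dropna(subset=["compound_id"]).copy()
    max_signal = out["raw_signal"].max()
    out["normalized_signal"] = out["raw_signal"] / max_signal if max_signal else 0.0
    return out


def test_normalize_assay_results_drops_missing_ids_and_scales():
    df = pd.DataFrame(
        {"compound_id": ["CMP-1", None, "CMP-2"], "raw_signal": [50.0, 10.0, 100.0]}
    )
    result = normalize_assay_results(df)
    assert len(result) == 2                      # row with missing id removed
    assert result["normalized_signal"].max() == pytest.approx(1.0)
```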
Posted 1 week ago
1.0 - 3.0 years
6 - 10 Lacs
Hyderabad
Work from Office
Roles & Responsibilities: Work as a member of a Data Platform Engineering team that uses Cloud and Big Data technologies to design, develop, implement, and maintain solutions supporting functional areas such as Manufacturing, Commercial, and Research and Development. Work closely with the Enterprise Data Lake delivery and platform teams to ensure that applications are aligned with the overall architectural and development guidelines. Research and evaluate technical solutions, including Databricks and AWS services, NoSQL databases, Data Science packages, platforms, and tools, with a focus on enterprise deployment capabilities such as security, scalability, reliability, maintainability, and cost management. Assist in building and managing relationships with internal and external business stakeholders. Develop a basic understanding of core business problems and identify opportunities to use advanced analytics. Assist in reviewing third-party providers for new feature/function/technical fit with EEA's data management needs. Work closely with the Enterprise Data Lake ecosystem leads to identify and evaluate emerging providers of data management and processing components that could be incorporated into the data platform. Work with platform stakeholders to ensure effective cost observability and control mechanisms are in place for all aspects of data platform management. Experience developing in an Agile development environment, and comfortable with Agile terminology and ceremonies. Keen on adopting new responsibilities, facing challenges, and mastering new technologies.

What we expect of you

Basic Qualifications and Experience: Master's degree in a computer science or engineering field and 1 to 3 years of relevant experience OR Bachelor's degree in a computer science or engineering field and 3 to 5 years of relevant experience OR Diploma and a minimum of 8+ years of relevant work experience.

Must-Have Skills: Experience with Databricks (or Snowflake), including cluster setup, execution, and tuning. Experience with common data processing libraries: Pandas, PySpark, SQLAlchemy. Experience with UI frameworks (Angular.js or React.js). Experience with data lake, data fabric, and data mesh concepts. Experience with data modeling and performance tuning, and experience with relational databases. Experience building ETL or ELT pipelines; hands-on experience with SQL/NoSQL. Programming skills in one or more languages such as SQL, Python, or Java. Experienced with software engineering best practices, including but not limited to version control (Git, GitLab, etc.), CI/CD (GitLab, Jenkins, etc.), automated unit testing, and DevOps. Exposure to Jira or Jira Align.

Good-to-Have Skills: Knowledge of the R language will be considered an advantage. Experience with cloud technologies, AWS preferred. Cloud certifications: AWS, Databricks, Microsoft. Familiarity with the use of AI for development productivity, such as GitHub Copilot, Databricks Assistant, Amazon Q Developer, or equivalent. Knowledge of Agile and DevOps practices. Skills in disaster recovery planning. Familiarity with load testing tools (JMeter, Gatling). Basic understanding of AI/ML for monitoring. Knowledge of distributed systems and microservices. Data visualization skills (Tableau, Power BI). Strong communication and leadership skills. Understanding of compliance and auditing requirements.
Soft Skills: Excellent analytical and problem-solving skills. Excellent written and verbal communication skills (English), including translating technology content into business language at various levels. Ability to work effectively with global, virtual teams. High degree of initiative and self-motivation. Ability to manage multiple priorities successfully. Team-oriented, with a focus on achieving team goals. Strong problem-solving and analytical skills. Strong time and task management skills to estimate and successfully meet project timelines, with the ability to bring consistency and quality assurance across various projects.

What you can expect of us: As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.
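Since the role names Pandas and SQLAlchemy among its data processing libraries, here is a hedged sketch that pulls a table into a DataFrame, aggregates it, and writes the result back; the connection string, table, and column names are hypothetical examples.

```python
# Sketch: read from a relational source with SQLAlchemy, aggregate with pandas, write back.
# Connection string, table, and column names are hypothetical examples.
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("postgresql+psycopg2://user:password@localhost:5432/exampledb")

orders = pd.read_sql("SELECT region, amount, order_date FROM orders", engine)

daily_totals = (
    orders.assign(order_date=pd.to_datetime(orders["order_date"]).dt.date)
          .groupby(["region", "order_date"], as_index=False)["amount"]
          .sum()
)

daily_totals.to_sql("daily_order_totals", engine, if_exists="replace", index=False)
```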
Posted 1 week ago
1.0 - 3.0 years
50 - 55 Lacs
Hyderabad
Work from Office
Let's do this. Let's change the world. In this vital role, you will be responsible for designing, building, maintaining, analyzing, and interpreting data to provide actionable insights that drive business decisions. This role involves working with large datasets, developing reports, supporting and implementing data governance initiatives, and visualizing data to ensure data is accessible, reliable, and efficiently managed. The ideal candidate has strong technical skills, experience with big data technologies, and a deep understanding of data architecture and ETL processes.

Roles & Responsibilities: Design, develop, and maintain data solutions for data generation, collection, and processing. Be a key team member that assists in the design and development of the data pipeline. Create data pipelines and ensure data quality by implementing ETL processes to migrate and deploy data across systems. Contribute to the design, development, and implementation of data pipelines, ETL/ELT processes, and data integration solutions. Take ownership of data pipeline projects from inception to deployment, managing scope, timelines, and risks. Collaborate with multi-functional teams to understand data requirements and design solutions that meet business needs. Develop and maintain data models, data dictionaries, and other documentation to ensure data accuracy and consistency. Implement data security and privacy measures to protect sensitive data. Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions. Collaborate and communicate effectively with product teams. Collaborate with Data Architects, Business SMEs, and Data Scientists to design and develop end-to-end data pipelines that meet fast-paced business needs across geographic regions. Adhere to standard methodologies for coding, testing, and designing reusable code/components. Explore new tools and technologies that will help improve ETL platform performance. Participate in sprint planning meetings and provide estimations on technical implementation.

What we expect of you: We are all different, yet we all use our unique contributions to serve patients.

Basic Qualifications: Master's degree and 1 to 3 years of Computer Science, IT or related field experience OR Bachelor's degree and 3 to 5 years of Computer Science, IT or related field experience OR Diploma and 7 to 9 years of Computer Science, IT or related field experience.

Preferred Qualifications: Functional Skills: Must-Have Skills: Proficiency in Python, PySpark, and Scala for data processing and ETL (Extract, Transform, Load) workflows, with hands-on experience using Databricks for building ETL pipelines and handling big data processing. Experience with data warehousing platforms such as Amazon Redshift or Snowflake. Strong knowledge of SQL and experience with relational databases (e.g., PostgreSQL, MySQL). Familiarity with big data frameworks like Apache Hadoop, Spark, and Kafka for handling large datasets.
Experienced with software engineering best practices, including but not limited to version control (GitLab, Subversion, etc.), CI/CD (Jenkins, GitLab, etc.), automated unit testing, and DevOps.

Good-to-Have Skills: Experience with cloud platforms such as AWS, particularly data services (e.g., EKS, EC2, S3, EMR, RDS, Redshift/Spectrum, Lambda, Glue, Athena). Strong understanding of data modeling, data warehousing, and data integration concepts. Understanding of machine learning pipelines and frameworks for ML/AI models.

Professional Certifications: AWS Certified Data Engineer (preferred). Databricks Certified (preferred).

Soft Skills: Excellent critical-thinking and problem-solving skills. Strong communication and collaboration skills. Demonstrated awareness of how to function in a team setting. Demonstrated presentation skills.

What you can expect of us: As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.
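For the Kafka and Spark streaming familiarity listed above, a hedged Structured Streaming sketch that consumes a topic and persists it as parquet; the broker address, topic name, and paths are hypothetical, and the spark-sql-kafka connector is assumed to be available on the cluster.

```python
# Structured Streaming sketch: consume a Kafka topic and persist it as parquet.
# Broker address, topic name, and paths are hypothetical examples.
# Assumes the spark-sql-kafka connector package is available on the cluster.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("stream_sketch").getOrCreate()

events = (
    spark.readStream.format("kafka")
         .option("kafka.bootstrap.servers", "broker.example.com:9092")
         .option("subscribe", "order-events")
         .load()
)

# Kafka delivers key/value as binary; cast the value to string for downstream parsing.
parsed = events.select(
    F.col("value").cast("string").alias("payload"),
    F.col("timestamp"),
)

query = (
    parsed.writeStream.format("parquet")
          .option("path", "s3://example-bucket/streams/order_events/")
          .option("checkpointLocation", "s3://example-bucket/checkpoints/order_events/")
          .outputMode("append")
          .start()
)
query.awaitTermination()
```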
Posted 1 week ago
8.0 - 13.0 years
15 - 19 Lacs
Hyderabad
Work from Office
We are seeking a Data Solutions Architect to design, implement, and optimize scalable and high-performance data solutions that support enterprise analytics, AI-driven insights, and digital transformation initiatives. This role will focus on data strategy, architecture, governance, security, and operational efficiency, ensuring seamless data integration across modern cloud platforms. The ideal candidate will work closely with engineering teams, business stakeholders, and leadership to establish a future-ready data ecosystem, balancing performance, cost-efficiency, security, and usability. This position requires expertise in modern cloud-based data architectures, data engineering best practices, and Scaled Agile methodologies.

Roles & Responsibilities: Design and implement scalable, modular, and future-proof data architectures that support enterprise data lakes, data warehouses, and real-time analytics. Develop enterprise-wide data frameworks that enable governed, secure, and accessible data across various business domains. Define data modeling strategies to support structured and unstructured data, ensuring efficiency, consistency, and usability across analytical platforms. Lead the development of high-performance data pipelines for batch and real-time data processing, integrating APIs, streaming sources, transactional systems, and external data platforms. Optimize query performance, indexing, caching, and storage strategies to enhance scalability, cost efficiency, and analytical capabilities. Establish data interoperability frameworks that enable seamless integration across multiple data sources and platforms. Drive data governance strategies, ensuring security, compliance, access controls, and lineage tracking are embedded into enterprise data solutions. Implement DataOps best practices, including CI/CD for data pipelines, automated monitoring, and proactive issue resolution, to improve operational efficiency. Lead Scaled Agile (SAFe) practices, facilitating Program Increment (PI) Planning, Sprint Planning, and Agile ceremonies, ensuring iterative delivery of enterprise data capabilities. Collaborate with business stakeholders, product teams, and technology leaders to align data architecture strategies with organizational goals. Act as a trusted advisor on emerging data technologies and trends, ensuring that the enterprise adopts cutting-edge data solutions that provide competitive advantage and long-term scalability.

What we expect of you

Must-Have Skills: Experience in data architecture, enterprise data management, and cloud-based analytics solutions. Expertise in Databricks, cloud-native data platforms, and distributed computing frameworks. Strong proficiency in modern data modeling techniques, including dimensional modeling, NoSQL, and data virtualization. Experience designing high-performance ETL/ELT pipelines and real-time data processing solutions. Deep understanding of data governance, security, metadata management, and access control frameworks. Hands-on experience with CI/CD for data solutions, DataOps automation, and infrastructure as code (IaC). Proven ability to collaborate with cross-functional teams, including business executives, data engineers, and analytics teams, to drive successful data initiatives. Strong problem-solving, strategic thinking, and technical leadership skills.
Experienced with SQL/NoSQL databases and vector databases for large language models. Experienced with data modeling and performance tuning for both OLAP and OLTP databases. Experienced with Apache Spark. Experienced with software engineering best practices, including but not limited to version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven, etc.), automated unit testing, and DevOps.

Good-to-Have Skills: Deep expertise in the Biotech and Pharma industries. Experience with Data Mesh architectures and federated data governance models. Certification in cloud data platforms or enterprise architecture frameworks. Knowledge of AI/ML pipeline integration within enterprise data architectures. Familiarity with BI and analytics platforms for enabling self-service analytics and enterprise reporting.

Education and Professional Certifications: Doctorate degree with 6-8+ years of experience in Computer Science, IT or related field OR Master's degree with 8-10+ years of experience in Computer Science, IT or related field OR Bachelor's degree with 10-12+ years of experience in Computer Science, IT or related field. AWS Certified Data Engineer preferred. Databricks Certificate preferred.

Soft Skills: Excellent analytical and troubleshooting skills. Strong verbal and written communication skills. Ability to work effectively with global, virtual teams. High degree of initiative and self-motivation. Ability to manage multiple priorities successfully. Team-oriented, with a focus on achieving team goals. Ability to learn quickly, be organized, and be detail oriented. Strong presentation and public speaking skills.

What you can expect of us: As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way.
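As a small illustration of the caching and storage-layout strategies this architect role describes, here is a hedged PySpark sketch that caches a reused intermediate result and writes the curated output partitioned by common filter columns; paths, columns, and the reuse pattern are hypothetical assumptions.

```python
# Sketch: cache a frequently reused DataFrame and write it partitioned for pruned reads.
# Paths and column names are hypothetical examples.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("layout_sketch").getOrCreate()

claims = spark.read.parquet("s3://example-bucket/raw/claims/")  # hypothetical source

# Cache when the same intermediate result feeds several downstream aggregations.
recent = claims.filter(F.col("claim_year") >= 2023).cache()

by_region = recent.groupBy("region").count()
by_product = recent.groupBy("product_code").agg(F.sum("claim_amount").alias("total_amount"))

# Partition the curated output by common filter columns so queries can prune files.
recent.write.mode("overwrite").partitionBy("claim_year", "region") \
      .parquet("s3://example-bucket/curated/claims/")

by_region.show()
by_product.show()
```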
Posted 1 week ago
5.0 - 10.0 years
14 - 18 Lacs
Hyderabad
Work from Office
We are seeking a Data Solutions Architect with deep R&D expertise in Biotech/Pharma to design, implement, and optimize scalable and high-performance data solutions that support enterprise analytics, AI-driven insights, and digital transformation initiatives. This role will focus on data strategy, architecture, governance, security, and operational efficiency, ensuring seamless data integration across modern cloud platforms. The ideal candidate will work closely with R&D and engineering teams, business stakeholders, and leadership to establish a future-ready data ecosystem, balancing performance, cost-efficiency, security, and usability. This position requires expertise in modern cloud-based data architectures, data engineering best practices, and Scaled Agile methodologies.

Roles & Responsibilities: Design and implement scalable, modular, and future-proof data architectures that support R&D initiatives across the enterprise. Develop enterprise-wide data frameworks that enable governed, secure, and accessible data across various business domains. Define data modeling strategies to support structured and unstructured data, ensuring efficiency, consistency, and usability across analytical platforms. Lead the development of high-performance data pipelines for batch and real-time data processing, integrating APIs, streaming sources, transactional systems, and external data platforms. Optimize query performance, indexing, caching, and storage strategies to enhance scalability, cost efficiency, and analytical capabilities. Establish data interoperability frameworks that enable seamless integration across multiple data sources and platforms. Drive data governance strategies, ensuring security, compliance, access controls, and lineage tracking are embedded into enterprise data solutions. Implement DataOps best practices, including CI/CD for data pipelines, automated monitoring, and proactive issue resolution, to improve operational efficiency. Lead Scaled Agile (SAFe) practices, facilitating Program Increment (PI) Planning, Sprint Planning, and Agile ceremonies, ensuring iterative delivery of enterprise data capabilities. Collaborate with business stakeholders, product teams, and technology leaders to align data architecture strategies with organizational goals. Act as a trusted advisor on emerging data technologies and trends, ensuring that the enterprise adopts cutting-edge data solutions that provide competitive advantage and long-term scalability.

What we expect of you

Must-Have Skills: Experience in data architecture, enterprise data management, and cloud-based analytics solutions. Well versed in the R&D domain of the Biotech/Pharma industry, with a track record of solving complex problems for it through data strategy. Expertise in Databricks, cloud-native data platforms, and distributed computing frameworks. Strong proficiency in modern data modeling techniques, including dimensional modeling, NoSQL, and data virtualization. Experience designing high-performance ETL/ELT pipelines and real-time data processing solutions. Deep understanding of data governance, security, metadata management, and access control frameworks. Hands-on experience with CI/CD for data solutions, DataOps automation, and infrastructure as code (IaC). Proven ability to collaborate with cross-functional teams, including business executives, data engineers, and analytics teams, to drive successful data initiatives. Strong problem-solving, strategic thinking, and technical leadership skills.
Experienced with SQL/NoSQL databases and vector databases for large language models. Experienced with data modeling and performance tuning for both OLAP and OLTP databases. Experienced with Apache Spark and Apache Airflow. Experienced with software engineering best practices, including but not limited to version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven, etc.), automated unit testing, and DevOps.

Good-to-Have Skills: Experience with Data Mesh architectures and federated data governance models. Certification in cloud data platforms or enterprise architecture frameworks. Knowledge of AI/ML pipeline integration within enterprise data architectures. Familiarity with BI and analytics platforms for enabling self-service analytics and enterprise reporting.

Education and Professional Certifications: Doctorate degree with 3-5+ years of experience in Computer Science, IT or related field OR Master's degree with 6-8+ years of experience in Computer Science, IT or related field OR Bachelor's degree with 8-10+ years of experience in Computer Science, IT or related field. AWS Certified Data Engineer preferred. Databricks Certificate preferred.

Soft Skills: Excellent analytical and troubleshooting skills. Strong verbal and written communication skills. Ability to work effectively with global, virtual teams. High degree of initiative and self-motivation. Ability to manage multiple priorities successfully. Team-oriented, with a focus on achieving team goals. Ability to learn quickly, be organized, and be detail oriented. Strong presentation and public speaking skills.

What you can expect of us: As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way.
Posted 1 week ago
4.0 - 6.0 years
10 - 14 Lacs
Hyderabad
Work from Office
Let's do this. Let's change the world. In this vital role, you will lead a team of data engineers to build, optimize, and maintain scalable data architectures, data pipelines, and operational frameworks that support real-time analytics, AI-driven insights, and enterprise-wide data solutions. As a strategic leader, the ideal candidate will drive best practices in data engineering, cloud technologies, and Agile development, ensuring robust governance, data quality, and efficiency. The role requires technical expertise, team leadership, and a deep understanding of cloud data solutions to optimize data-driven decision-making. Lead and mentor a team of data engineers, fostering a culture of innovation, collaboration, and continuous learning for solving complex problems for the R&D division. Oversee the development of data extraction, validation, and transformation techniques, ensuring ingested data is of high quality and compatible with downstream systems. Guide the team in writing and validating high-quality code for data ingestion, processing, and transformation, ensuring resiliency and fault tolerance. Drive the development of data tools and frameworks for running and accessing data efficiently across the organization. Oversee the implementation of performance monitoring protocols across data pipelines, ensuring real-time visibility, alerts, and automated recovery mechanisms. Coach engineers in building dashboards and aggregations to monitor pipeline health and detect inefficiencies, ensuring optimal performance and cost-effectiveness. Lead the implementation of self-healing solutions, reducing failure points and improving pipeline stability and efficiency across multiple product features. Oversee data governance strategies, ensuring compliance with security policies, regulations, and data accessibility best practices. Guide engineers in data modeling, metadata management, and access control, ensuring structured data handling across various business use cases. Collaborate with business leaders, product owners, and cross-functional teams to ensure alignment of data architecture with product requirements and business objectives. Prepare team members for key partner discussions by helping assess data costs, access requirements, dependencies, and availability for business scenarios. Drive Agile and Scaled Agile (SAFe) methodologies, handling sprint backlogs, prioritization, and iterative improvements to enhance team velocity and project delivery. Stay up to date with emerging data technologies, industry trends, and best practices, ensuring the organization uses the latest innovations in data engineering and architecture.

What we expect of you: We are all different, yet we all use our unique contributions to serve patients. We are seeking a seasoned Engineering Manager (Data Engineering) with deep expertise in the R&D domain of Biotech or Pharma to drive the development and implementation of our data strategy.

Basic Qualifications: Doctorate degree OR Master's degree and 4 to 6 years of experience in Computer Science, IT or related field OR Bachelor's degree and 6 to 8 years of experience in Computer Science, IT or related field OR Diploma and 10 to 12 years of experience in Computer Science, IT or related field. Experience leading a team of data engineers in the R&D domain of biotech/pharma companies. Experience architecting and building data and analytics solutions that extract, transform, and load data from multiple source systems.
Data Engineering experience in R&D for the Biotechnology or Pharma industry. Demonstrated hands-on experience with cloud platforms (AWS) and the ability to architect cost-effective and scalable data solutions. Proficiency in Python, PySpark, and SQL. Experience with dimensional data modeling. Experience working with Apache Spark and Apache Airflow. Experienced with software engineering best practices, including but not limited to version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven, etc.), automated unit testing, and DevOps. Experienced with AWS, GCP, or Azure cloud services. Understanding of the end-to-end project/product life cycle. Well versed in full-stack development, DataOps automation, logging frameworks, and pipeline orchestration tools. Strong analytical and problem-solving skills to address complex data challenges. Effective communication and interpersonal skills to collaborate with cross-functional teams.

Preferred Qualifications: AWS Certified Data Engineer preferred. Databricks Certificate preferred. Scaled Agile SAFe certification preferred. Project Management certifications preferred. Data Engineering Management experience in Biotech/Pharma is a plus. Experience using graph databases such as Stardog, MarkLogic, Neo4j, or AllegroGraph.

Soft Skills: Excellent analytical and troubleshooting skills. Strong verbal and written communication skills. Ability to work effectively with global, virtual teams. High degree of initiative and self-motivation. Ability to handle multiple priorities successfully. Team-oriented, with a focus on achieving team goals. Strong presentation and public speaking skills.

What you can expect of us: As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.
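Because the role lists Apache Airflow for pipeline orchestration, here is a hedged, minimal DAG sketch with an extract step followed by a validation step; the DAG id, schedule, and task logic are hypothetical placeholders, not an actual production workflow.

```python
# Minimal Airflow DAG sketch: extract then validate on a daily schedule.
# DAG id, schedule, and task logic are hypothetical examples.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Placeholder: pull data from a source system into a staging area.
    print("extracting batch for", context["ds"])


def validate(**context):
    # Placeholder: run data-quality checks on the staged batch.
    print("validating batch for", context["ds"])


with DAG(
    dag_id="rd_pipeline_sketch",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    validate_task = PythonOperator(task_id="validate", python_callable=validate)
    extract_task >> validate_task
```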
Posted 1 week ago
1.0 - 3.0 years
16 - 18 Lacs
Hyderabad
Work from Office
The role is responsible for designing, building, maintaining, analyzing, and interpreting data to provide actionable insights that drive business decisions. This role involves working with large datasets, developing data pipelines, supporting and executing back-end web development, and visualizing data to ensure data is accessible, reliable, and efficiently managed. The ideal candidate has strong technical skills, experience with big data technologies, and a deep understanding of data architecture and ETL processes.

Roles & Responsibilities: Design, develop, and maintain data solutions for data generation, collection, and processing. Be a key team member that assists in the design and development of data pipelines used for reports and/or back-end web application development. Create data pipelines and ensure data quality by implementing ETL processes to migrate and deploy data across systems. Contribute to the design, development, and implementation of data pipelines, ETL/ELT processes, and data integration solutions. Take ownership of data pipeline projects from inception to deployment, managing scope, timelines, and risks. Collaborate with cross-functional teams to understand data requirements and design solutions that meet business needs. Develop and maintain data models, data dictionaries, and other documentation to ensure data accuracy and consistency. Implement data security and privacy measures to protect sensitive data. Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions. Collaborate and communicate effectively with product teams. Collaborate with Data Architects, Business SMEs, and Data Scientists to design and develop end-to-end data pipelines that meet fast-paced business needs across geographic regions. Identify and resolve complex data-related challenges. Adhere to best practices for coding, testing, and designing reusable code/components. Explore new tools and technologies that will help improve ETL platform performance. Participate in sprint planning meetings and provide estimations on technical implementation.

What we expect of you: We are all different, yet we all use our unique contributions to serve patients.

Basic Qualifications: Master's degree and 1 to 3 years of Computer Science, IT or related field experience OR Bachelor's degree and 3 to 5 years of Computer Science, IT or related field experience OR Diploma and 7 to 9 years of Computer Science, IT or related field experience.

Must-Have Skills: Hands-on experience with data engineering technologies such as Databricks, PySpark, SparkSQL, Apache Spark, AWS, Python, SQL, and Scaled Agile methodologies. Proficiency in workflow orchestration and performance tuning on big data processing. Strong understanding of data modeling, data warehousing, and data integration concepts. Strong understanding of AWS services. Excellent problem-solving skills and the ability to work with large, complex datasets. Strong understanding of data governance frameworks, tools, and best practices.
Preferred Qualifications: Data Engineering experience in the Biotechnology or Pharma industry. Experienced with SQL/NoSQL databases and vector databases for large language models. Experienced with software engineering best practices, including but not limited to version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven, etc.), automated unit testing, and DevOps.

Professional Certifications: Certified Data Engineer (preferred on Databricks or cloud environments). Certified SAFe Agilist (preferred).

Soft Skills: Excellent critical-thinking and problem-solving skills. Strong communication and collaboration skills. Demonstrated awareness of how to function in a team setting. Demonstrated presentation skills.

What you can expect of us: As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.
Posted 1 week ago
1.0 - 3.0 years
16 - 18 Lacs
Hyderabad
Work from Office
Let's do this. Let's change the world. In this vital role, you will be responsible for designing, building, maintaining, analyzing, and interpreting data to provide actionable insights that drive business decisions. This role involves working with large datasets, developing reports, supporting and performing data governance initiatives, and visualizing data to ensure data is accessible, reliable, and efficiently managed. The ideal candidate has deep technical skills, experience with big data technologies, and a deep understanding of data architecture and ETL processes.

Roles & Responsibilities: Design, develop, and maintain data solutions for data generation, collection, and processing. Be a crucial team member that assists in the design and development of the data pipeline. Build data pipelines and ensure data quality by implementing ETL processes to migrate and deploy data across systems. Contribute to the design, development, and implementation of data pipelines, ETL/ELT processes, and data integration solutions. Take ownership of data pipeline projects from inception to deployment, managing scope, timelines, and risks. Collaborate with cross-functional teams to understand data requirements and design solutions that meet business needs. Develop and maintain data models, data dictionaries, and other documentation to ensure data accuracy and consistency. Implement data security and privacy measures to protect sensitive data. Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions. Collaborate and communicate effectively with product teams. Collaborate with Data Architects, Business SMEs, and Data Scientists to design and develop end-to-end data pipelines that meet fast-paced business needs across geographic regions. Identify and resolve complex data-related challenges. Adhere to best practices for coding, testing, and designing reusable code/components. Explore new tools and technologies that will help improve ETL platform performance. Participate in sprint planning meetings and provide estimations on technical implementation.

What we expect of you: We are all different, yet we all use our unique contributions to serve patients.

Basic Qualifications: Master's degree and 1 to 3 years of Computer Science, IT or related field experience OR Bachelor's degree and 3 to 5 years of Computer Science, IT or related field experience OR Diploma and 7 to 9 years of Computer Science, IT or related field experience.

Preferred Qualifications: Must-Have Skills: Hands-on experience with big data technologies and platforms, such as Databricks, Apache Spark (PySpark, SparkSQL), workflow orchestration, and performance tuning on big data processing. Proficiency in data analysis tools (e.g., SQL) and experience with data visualization tools. Excellent problem-solving skills and the ability to work with large, complex datasets. Solid understanding of data governance frameworks, tools, and best practices.
Knowledge of data protection regulations and compliance requirements.

Good-to-Have Skills: Experience with ETL tools such as Apache Spark, and various Python packages for data processing and machine learning model development. Good understanding of data modeling, data warehousing, and data integration concepts. Knowledge of Python/R, Databricks, SageMaker, and cloud data platforms.

Professional Certifications: Certified Data Engineer / Data Analyst (preferred on Databricks or cloud environments).

Soft Skills: Excellent critical-thinking and problem-solving skills. Good communication and collaboration skills. Demonstrated awareness of how to function in a team setting. Demonstrated presentation skills.

What you can expect of us: As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.
Posted 1 week ago
1.0 - 3.0 years
50 - 55 Lacs
Hyderabad
Work from Office
Let's do this. Let's change the world. In this vital role, you will be responsible for designing, developing, and optimizing data pipelines, data integration frameworks, and metadata-driven architectures that enable seamless data access and analytics. This role requires deep expertise in big data processing, distributed computing, data modeling, and governance frameworks to support self-service analytics, AI-driven insights, and enterprise-wide data management. Design, develop, and maintain complex ETL/ELT data pipelines in Databricks using PySpark, Scala, and SQL to process large-scale datasets. Understand the biotech/pharma or related domains and build highly efficient data pipelines to migrate and deploy complex data across systems. Design and implement solutions that enable unified data access, governance, and interoperability across hybrid cloud environments. Ingest and transform structured and unstructured data from databases (PostgreSQL, MySQL, SQL Server, MongoDB, etc.), APIs, logs, event streams, images, PDFs, and third-party platforms. Ensure data integrity, accuracy, and consistency through rigorous quality checks and monitoring. Be an expert in data quality, data validation, and verification frameworks. Innovate, explore, and implement new tools and technologies to improve data processing efficiency. Proactively identify and implement opportunities to automate tasks and develop reusable frameworks. Work in an Agile and Scaled Agile (SAFe) environment, collaborating with cross-functional teams, product owners, and Scrum Masters to deliver incremental value. Use JIRA, Confluence, and Agile DevOps tools to manage sprints, backlogs, and user stories. Support continuous improvement, test automation, and DevOps practices in the data engineering lifecycle. Collaborate and communicate effectively with product teams and cross-functional teams to understand business requirements and translate them into technical solutions.

What we expect of you: We are all different, yet we all use our unique contributions to serve patients. We are looking for a highly motivated, expert Data Engineer who can own the design and development of complex data pipelines, solutions, and frameworks.
Basic Qualifications: Master's degree and 1 to 3 years of Computer Science, IT or related field experience OR Bachelor's degree and 3 to 5 years of Computer Science, IT or related field experience OR Diploma and 7 to 9 years of Computer Science, IT or related field experience. Hands-on experience with data engineering technologies such as Databricks, PySpark, SparkSQL, Apache Spark, AWS, Python, SQL, and Scaled Agile methodologies. Proficiency in workflow orchestration and performance tuning on big data processing. Strong understanding of AWS services. Ability to quickly learn, adapt, and apply new technologies. Strong problem-solving and analytical skills. Excellent communication and teamwork skills. Experience with the Scaled Agile Framework (SAFe), Agile delivery practices, and DevOps practices.

Preferred Qualifications: AWS Certified Data Engineer preferred. Databricks Certificate preferred. Scaled Agile SAFe certification preferred. Data Engineering experience in the Biotechnology or Pharma industry. Experience in writing APIs to make data available to consumers. Experienced with SQL/NoSQL databases and vector databases for large language models. Experienced with data modeling and performance tuning for both OLAP and OLTP databases. Experienced with software engineering best practices, including but not limited to version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven, etc.), automated unit testing, and DevOps.

Soft Skills: Excellent analytical and troubleshooting skills. Strong verbal and written communication skills. Ability to work effectively with global, virtual teams. High degree of initiative and self-motivation. Ability to manage multiple priorities successfully. Team-oriented, with a focus on achieving team goals. Ability to learn quickly, be organized, and be detail oriented. Strong presentation and public speaking skills.

What you can expect of us: As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.
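Since this role covers ingesting data from APIs and applying validation and verification checks, here is a hedged sketch that pulls records from a REST endpoint, runs a minimal required-field check, and stages the result; the URL, fields, and rule are hypothetical, and pyarrow is assumed to be available for parquet output.

```python
# Sketch: pull records from a REST API, apply a minimal validation step, and stage as parquet.
# URL, field names, and the required-field rule are hypothetical examples.
import pandas as pd
import requests

API_URL = "https://api.example.com/v1/studies"  # hypothetical endpoint

response = requests.get(API_URL, timeout=30)
response.raise_for_status()
records = response.json()

df = pd.DataFrame(records)

# Minimal validation/verification step: required fields must be present and non-null.
required = ["study_id", "phase", "status"]
missing = [c for c in required if c not in df.columns]
if missing:
    raise ValueError(f"API payload missing required fields: {missing}")
if df[required].isnull().any().any():
    raise ValueError("Null values found in required fields")

df.to_parquet("staging/studies.parquet", index=False)  # requires pyarrow or fastparquet
```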
Posted 1 week ago