Principal Data Engineer, Data & Analytics (Global Supply Chain)

Experience: 12 - 17 years

Salary: 22 - 25 Lacs

Work Mode: Work from Office

Job Type: Full Time

Job Description

Role Description:

This role acts as the technical architect and hands-on lead for Data Engineering practices across the Smart Supply Chain initiative within Amgen. It is also responsible for designing, building, maintaining, analyzing, and interpreting data to provide actionable insights that drive business decisions.
The role involves working with large datasets, developing reports, supporting and executing data governance initiatives, and visualizing data to ensure data is accessible, reliable, and efficiently managed. The ideal candidate has strong technical skills, experience with big data technologies, and a deep understanding of data architecture and ETL processes, and will architect, build, and optimize enterprise-grade data pipelines using Databricks and AWS-native services.

Roles & Responsibilities:

  • Design, develop, and maintain data solutions for data generation, collection, and processing
  • Be a key team member who assists in the design and development of the data pipeline
  • Create data pipelines and ensure data quality by implementing ETL processes to migrate and deploy data across systems
  • Contribute to the design, development, and implementation of data pipelines, ETL/ELT processes, and data integration solutions
  • Take ownership of data pipeline projects from inception to deployment, managing scope, timelines, and risks
  • Collaborate with cross-functional teams to understand data requirements and design solutions that meet business needs
  • Develop and maintain data models, data dictionaries, and other documentation to ensure data accuracy and consistency
  • Implement data security and privacy measures to protect sensitive data
  • Leverage cloud platforms (AWS and Databricks preferred) to build scalable and efficient data solutions
  • Collaborate and communicate effectively with product teams
  • Collaborate with Data Architects, Business SMEs, and Data Scientists to design and develop end-to-end data pipelines that meet fast-paced business needs across geographic regions
  • Identify and resolve complex data-related challenges
  • Adhere to best practices for coding, testing, and designing reusable code/components
  • Explore new tools and technologies that will help improve ETL platform performance
  • Participate in sprint planning meetings and provide estimations on technical implementation
  • Continuously monitor data governance activities and report on compliance, data quality issues, and the effectiveness of governance initiatives

Basic Qualifications and Experience:

  • 12 - 17 years of experience in Computer Science, IT, or a related field

Functional Skills:

Must-Have Skills:
  • Hands-on experience with big data technologies and platforms such as Databricks/Apache Spark (PySpark, SparkSQL, Delta Lake) and AWS services (S3, EMR, Lambda, Glue, UC, Athena, Redshift, EKS), workflow orchestration, performance tuning for big data processing, and the ability to work with large, complex datasets (see the illustrative sketch after this list)
  • Hands-on experience in orchestrating large-scale data pipelines, performance tuning, lineage tracking, and observability frameworks.
  • Proficiency in data analysis tools (e.g., SQL, Python) and experience with data visualization tools (e.g., Tableau, Power BI)
  • Excellent problem-solving skills
  • Experience with DevOps practices, version control (Git), CI/CD (Jenkins), and Infrastructure as Code
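
Illustrative only: a minimal PySpark + Delta Lake batch step sketching the kind of pipeline work referenced above. All paths, table names, and column names are hypothetical placeholders, and it assumes a Databricks/AWS environment where the Delta Lake libraries are available.

# Minimal sketch: read raw data from S3, apply a simple quality transform, write to Delta.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("supply-chain-etl-example").getOrCreate()

# Read raw shipment records from a hypothetical S3 landing zone.
raw = spark.read.json("s3://example-bucket/landing/shipments/")

# Deduplicate, drop invalid rows, and stamp a load date (basic ETL hygiene).
cleaned = (
    raw.dropDuplicates(["shipment_id"])
       .filter(F.col("quantity") > 0)
       .withColumn("load_date", F.current_date())
)

# Append to a Delta table partitioned by load date (hypothetical curated path).
(cleaned.write.format("delta")
        .mode("append")
        .partitionBy("load_date")
        .save("s3://example-bucket/curated/shipments_delta/"))
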
Good-to-Have Skills:
  • Experience with ETL tools such as Apache Spark, and with Python packages for data processing and machine learning model development
  • Strong understanding of data modeling, data warehousing, and data integration concepts
  • Working knowledge of unstructured data processing, vector stores, and AI-enablement for downstream analytics.
  • Strong understanding of SAP data models (ECC tables) and Supply Chain data domains.
  • Experience working in Agile/SAFe environments with distributed global teams.

Professional Certifications:

  • Certified Data Engineer (Databricks or cloud platform certification preferred)
  • Machine Learning Certification (preferred)

Soft Skills:

  • Excellent critical-thinking and problem-solving skills
  • Strong communication and collaboration skills
  • Demonstrated ability to work effectively in a team setting
  • Demonstrated presentation skills

Amgen Inc | Biotechnology | Thousand Oaks
