Work from Office
Full Time
What you will do

Role Description:
The role is responsible for designing, building, maintaining, analyzing, and interpreting data to provide actionable insights that drive business decisions. It involves working with large datasets, developing data pipelines, supporting and executing back-end web development, and visualizing data so that data is accessible, reliable, and efficiently managed. The ideal candidate has strong technical skills, experience with big data technologies, and a deep understanding of data architecture and ETL processes.

Roles & Responsibilities:
- Design, develop, and maintain data solutions for data generation, collection, and processing
- Be a key team member assisting in the design and development of data pipelines used for reports and/or back-end web application development
- Create data pipelines and ensure data quality by implementing ETL processes to migrate and deploy data across systems
- Contribute to the design, development, and implementation of data pipelines, ETL/ELT processes, and data integration solutions
- Take ownership of data pipeline projects from inception to deployment, managing scope, timelines, and risks
- Collaborate with cross-functional teams to understand data requirements and design solutions that meet business needs
- Develop and maintain data models, data dictionaries, and other documentation to ensure data accuracy and consistency
- Implement data security and privacy measures to protect sensitive data
- Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions
- Collaborate and communicate effectively with product teams
- Collaborate with Data Architects, Business SMEs, and Data Scientists to design and develop end-to-end data pipelines that meet fast-paced business needs across geographic regions
- Identify and resolve complex data-related challenges
- Adhere to best practices for coding, testing, and designing reusable code/components
- Explore new tools and technologies that can improve ETL platform performance
- Participate in sprint planning meetings and provide estimates on technical implementation

What we expect of you
We are all different, yet we all use our unique contributions to serve patients.

Basic Qualifications:
- Master's degree and 1 to 3 years of Computer Science, IT, or related field experience; OR
- Bachelor's degree and 3 to 5 years of Computer Science, IT, or related field experience; OR
- Diploma and 7 to 9 years of Computer Science, IT, or related field experience

Must-Have Skills:
- Hands-on experience with data engineering technologies such as Databricks, PySpark, SparkSQL, Apache Spark, AWS, Python, SQL, and Scaled Agile methodologies (a representative PySpark sketch follows this list)
- Proficiency in workflow orchestration and performance tuning for big data processing
- Strong understanding of data modeling, data warehousing, and data integration concepts
- Strong understanding of AWS services
- Excellent problem-solving skills and the ability to work with large, complex datasets
- Strong understanding of data governance frameworks, tools, and best practices
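For illustration only, the pipeline work described above typically resembles the minimal PySpark ETL sketch below. Every path, dataset name, and column in the sketch is a hypothetical placeholder, not a reference to any Amgen system.

# Minimal, illustrative PySpark ETL sketch; all paths and column names are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("illustrative-etl").getOrCreate()

# Extract: read raw source records (hypothetical S3 location)
raw = spark.read.json("s3://example-bucket/raw/events/")

# Transform: deduplicate, apply a simple data-quality filter, and derive a partition column
clean = (
    raw.dropDuplicates(["event_id"])
       .filter(F.col("event_ts").isNotNull())
       .withColumn("event_date", F.to_date("event_ts"))
)

# Load: write a partitioned, query-friendly dataset (hypothetical target location)
(clean.write
      .mode("overwrite")
      .partitionBy("event_date")
      .parquet("s3://example-bucket/curated/events/"))

spark.stop()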
Preferred Qualifications:
- Data engineering experience in the biotechnology or pharma industry
- Experience with SQL/NoSQL databases and vector databases for large language models
- Experience with software engineering best practices, including but not limited to version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven, etc.), automated unit testing, and DevOps

Professional Certifications:
- Certified Data Engineer (preferably on Databricks or cloud environments)
- Certified SAFe Agilist (preferred)

Soft Skills:
- Excellent critical-thinking and problem-solving skills
- Strong communication and collaboration skills
- Demonstrated ability to function in a team setting
- Demonstrated presentation skills
Amgen Inc