Spec, IT Professional

Experience: 5 - 10 years

Salary: 9 - 13 Lacs

Posted: 4 days ago | Platform: Naukri

Work Mode: Work from Office

Job Type: Full Time

Job Description

This is where your work makes a difference.
At Baxter, we believe every person, regardless of who they are or where they are from, deserves a chance to live a healthy life. It was our founding belief in 1931 and continues to be our guiding principle. We are redefining healthcare delivery to make a greater impact today, tomorrow, and beyond.
Our Baxter colleagues are united by our Mission to Save and Sustain Lives. Together, our community is driven by a culture of courage, trust, and collaboration. Every individual is empowered to take ownership and make a meaningful impact. We strive for efficient and effective operations, and we hold each other accountable for delivering exceptional results.
Here, you will find more than just a job; you will find purpose and pride.
Job Summary
Perform development work and technical support for our data transformation and ETL jobs in support of a global data warehouse. Communicate results to internal customers. The role requires the ability to work independently as well as in cooperation with a variety of customers and other technical professionals.
What you'll be doing
  • Development of new ETL/data transformation jobs using PySpark and IBM DataStage in AWS (an illustrative sketch follows this list).
  • Enhancement and support of existing ETL/data transformation jobs.
  • Explain technical solutions and resolutions to internal customers and communicate feedback to the ETL team.
  • Perform technical code reviews for peers moving code into production.
  • Perform and review integration testing before production migrations.
  • Provide a high level of technical support and perform root cause analysis for problems experienced within the area of functional responsibility.
  • Document technical specs from business communications.
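For illustration only, here is a minimal sketch of the kind of PySpark transformation job described above. It is not Baxter's actual code; the S3 paths, column names, and business rules are hypothetical placeholders.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_daily_load").getOrCreate()

# Extract: read raw source data (hypothetical S3 location).
orders = spark.read.parquet("s3://example-bucket/raw/orders/")

# Transform: apply simple business rules: drop cancelled orders,
# standardize currency codes, and stamp a load date.
cleaned = (
    orders.filter(F.col("status") != "CANCELLED")
          .withColumn("currency", F.upper(F.col("currency")))
          .withColumn("load_date", F.current_date())
)

# Load: write the result to a staging area, partitioned by load date.
cleaned.write.mode("overwrite").partitionBy("load_date").parquet(
    "s3://example-bucket/staging/orders/"
)
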
What you'll bring
  • 5+ years of ETL experience.
  • Experience with core Python programming for data transformation.
  • Intermediate-level PySpark skills: able to read, understand, and debug existing code and write simple PySpark code from scratch.
  • Strong knowledge of SQL fundamentals, including subqueries; able to tune queries with execution hints to improve performance.
  • IBM DataStage experience preferred.
  • Able to write SQL code sufficient for most business requirements: pulling data from sources, applying rules to the data, and loading target data (see the sketch after this list).
  • Proven track record in troubleshooting ETL jobs and addressing production issues like performance tuning, reject handling, and ad-hoc reloads.
  • Proficient in developing optimization strategies for ETL processes.
  • Basic AWS technical support skills: able to log in, find existing jobs, and check run status and logs.
  • Run and monitor jobs via Control-M.
  • Able to create clear and concise documentation and communications.
  • Ability to coordinate and aggressively follow up on incidents and problems, perform diagnosis, and provide resolution to minimize service interruption.
  • Ability to prioritize and work on multiple tasks simultaneously.
  • Effective in cross-functional and global environments, managing multiple tasks and assignments concurrently with strong communication skills.
  • A self-starter who can work well independently and on team projects.
  • Experienced in analyzing business requirements, defining data granularity, source-to-target mapping of data elements, and full technical specifications.
  • Understands data dependencies and how to schedule jobs in Control-M.
  • Experienced working at the command line in various flavors of UNIX, with a basic understanding of shell scripting in bash and Korn shell.
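For illustration only, a minimal sketch of the pull-apply-load SQL pattern mentioned above, expressed through Spark SQL so it matches the PySpark toolchain named in this posting. The src.customers source, dw.dim_customer target, and the deduplication rule are hypothetical placeholders, not the team's actual warehouse logic.

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("customer_dim_refresh").getOrCreate()

# Pull data from a source table and apply rules: keep only active
# customers and deduplicate on the business key (latest record wins).
result = spark.sql("""
    SELECT customer_id,
           customer_name,
           country,
           updated_at
    FROM (
        SELECT *,
               ROW_NUMBER() OVER (PARTITION BY customer_id
                                  ORDER BY updated_at DESC) AS rn
        FROM src.customers
        WHERE is_active = 1
    ) t
    WHERE rn = 1
""")

# Load the rule-applied result into the target table.
result.write.mode("overwrite").saveAsTable("dw.dim_customer")
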
Education and/or Experience
  • Bachelor of Science in Computer Science or equivalent
  • 5+ years of ETL and SQL experience
  • 3+ years of Python and PySpark experience
  • 3+ years of AWS and UNIX experience
  • 2+ years of IBM DataStage experience
  • Preferred certifications: AWS Certified Cloud Practitioner, Certified DataStage Professional, and Python/PySpark certifications

Baxter

Healthcare, Medical Devices

Deerfield
