IT Architect (Data Engineer)

4 - 6 years

5 - 15 Lacs

Posted: 3 weeks ago | Platform: Naukri


Work Mode: Work from Office

Job Type: Full Time

Job Description

We're a mission-driven leader in medical technology and solutions with a legacy of integrity and innovation. Join our new Minimed India Hub as a Digital Engineer. We are working to improve how healthcare addresses the needs of more people, in more ways and in more places around the world. As a PySpark Data Engineer, you will be responsible for designing, developing, and maintaining data pipelines using PySpark. You will work closely with data scientists, analysts, and other stakeholders to ensure the efficient processing and analysis of large datasets, while handling complex transformations and aggregations.

Responsibilities may include the following and other duties may be assigned:

  • Design, develop, and maintain scalable and efficient ETL pipelines using PySpark.
  • Work with structured and unstructured data from various sources.
  • Optimize and tune PySpark applications for performance and scalability.
  • Collaborate with data scientists and analysts to understand data requirements, review Business Requirement documents, and deliver high-quality datasets.
  • Implement data quality checks and ensure data integrity.
  • Monitor and troubleshoot data pipeline issues and ensure timely resolution.
  • Document technical specifications and maintain comprehensive documentation for data pipelines.
  • Stay up to date with the latest trends and technologies in big data and distributed computing.

Required Knowledge and Experience:

  • Bachelor's degree in Computer Science, Engineering, or a related field.
  • 4-5 years of experience in data engineering, with a focus on PySpark.
  • Proficiency in Python and Spark, with strong coding and debugging skills.
  • Strong knowledge of SQL and experience with relational databases (e.g., PostgreSQL, MySQL, SQL Server).
  • Hands-on experience with cloud platforms such as AWS, Azure, or Google Cloud Platform (GCP).
  • Experience with data warehousing solutions like Redshift, Snowflake, Databricks, or Google BigQuery.
  • Familiarity with data lake architectures and data storage solutions.
  • Experience with big data technologies such as Hadoop, Hive, and Kafka.
  • Excellent problem-solving skills and the ability to troubleshoot complex issues.
  • Strong communication and collaboration skills, with the ability to work effectively in a team environment.

Medtronic

Medical Equipment Manufacturing

Minneapolis MN
