ETL with AWS and Snowflake | Chennai / Hyderabad

Experience: 4 - 9 years

Salary: 0 Lacs

Posted: 1 day ago | Platform: Naukri


Work Mode: Hybrid

Job Type: Full Time

Job Description

  • Design, develop, and maintain data pipelines and ETL processes using AWS and Snowflake.
  • Implement data transformation workflows using DBT (Data Build Tool).
  • Write efficient, reusable, and reliable code in Python.
  • Optimize and tune data solutions for performance and scalability.
  • Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver solutions.
  • Ensure data quality and integrity through rigorous testing and validation.
  • Stay updated with the latest industry trends and technologies in data engineering.
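As a rough illustration of the kind of Python pipeline code the responsibilities above describe, here is a minimal transform-step sketch. All names (the `Order` record, field names, region normalization) are hypothetical examples, not part of this posting; a real pipeline would read from and write to AWS/Snowflake rather than in-memory lists.

```python
from dataclasses import dataclass
from typing import Iterable

# Hypothetical record shape for illustration only.
@dataclass
class Order:
    order_id: str
    amount_usd: float
    region: str

def transform(records: Iterable[dict]) -> list[Order]:
    """Clean raw rows: drop malformed entries and normalize region names."""
    cleaned = []
    for row in records:
        try:
            cleaned.append(Order(
                order_id=str(row["order_id"]),
                amount_usd=float(row["amount"]),
                region=str(row["region"]).strip().upper(),
            ))
        except (KeyError, ValueError, TypeError):
            continue  # skip rows that fail validation
    return cleaned

raw = [
    {"order_id": 1, "amount": "19.99", "region": " apac "},
    {"order_id": 2, "amount": "not-a-number", "region": "emea"},  # dropped
]
rows = transform(raw)
print(len(rows), rows[0].region)  # → 1 APAC
```

In practice this validation logic would typically live in a DBT model or a tested Python module invoked by the pipeline orchestrator.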

  • Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
  • Proven experience as a Data Engineer or in a similar role.
  • Strong proficiency in AWS and Snowflake.
  • Expertise in DBT and Python programming.
  • Experience with data modeling, ETL processes, and data warehousing.
  • Familiarity with cloud platforms and services.
  • Excellent problem-solving skills and attention to detail.
  • Strong communication and teamwork abilities.

Cygnus Professionals

Staffing and Recruitment

Anytown
