Posted: 2 months ago | Platform: LinkedIn

Work Mode: On-site

Job Type: Full Time

Job Description

Key Responsibilities

- Design, develop, and optimize robust data pipelines for ingestion, transformation, and storage.
- Build and maintain ETL/ELT workflows using Python and SQL.
- Implement and manage data warehouses, primarily using Snowflake.
- Collaborate with data analysts, data scientists, and stakeholders to understand data requirements.
- Monitor and troubleshoot data pipelines for performance and reliability.
- Ensure data quality and integrity across various data sources and destinations.
- Document technical processes, architecture, and workflows.

Required Skills & Qualifications

- Bachelor's degree in Computer Science, Engineering, or a related field.
- 4+ years of experience in data engineering or similar roles.
- Proficiency in Python for scripting and data pipeline development.
- Strong expertise in Snowflake, including data modeling, performance tuning, and security.
- Solid understanding of SQL and data warehousing concepts.
- Experience with data orchestration tools (e.g., Airflow, dbt).
- Familiarity with cloud platforms such as AWS, GCP, or Azure.
- Excellent problem-solving skills and attention to detail.
