Posted: 11 hours ago | Platform: LinkedIn


Work Mode

On-site

Job Type

Full Time

Job Description


We are seeking a highly skilled Data Engineer to join our growing team. The ideal candidate has strong experience building and maintaining robust, scalable, cloud-native data pipelines and data warehouses using tools such as Snowflake, Fivetran, Airflow, and dbt. You will work closely with data analysts, data scientists, and engineering teams to ensure reliable, timely, and secure data delivery.


Key Responsibilities

  • Design, develop, and maintain batch and streaming data pipelines to load data marts.
  • Implement scalable data transformations using Snowflake stored procedures and orchestrate workflows via Airflow or equivalent tools.
  • Integrate with data platforms such as Snowflake, ensuring efficient data storage and retrieval.
  • Write optimized SQL and Python scripts for data manipulation and ETL processes.
  • Maintain data quality, observability, and pipeline reliability through monitoring and alerting.
  • Collaborate with analytics and business teams to deliver high-impact data solutions.
  • Adhere to best practices for version control, documentation, and CI/CD in a collaborative environment.


Qualifications


  • Bachelor’s degree in Information Technology or a related field, or an equivalent combination of education and experience sufficient to perform the key accountabilities of the role.
  • Experience with data ingestion and orchestration tools such as Fivetran and Airflow, along with Python.
  • Exposure to and a good understanding of Microsoft Dynamics 365 (D365) ERP data.
  • Prior experience working in fast-paced product or analytics teams.


Experience

  • 5+ years of hands-on experience in data engineering.
  • Strong experience with:
      • Snowflake or similar cloud data warehouses.
      • Airflow or other orchestration tools.
      • SQL and Python.
  • Strong hands-on experience building transformation pipelines using Python, Airflow, and Snowflake stored procedures.
  • Hands-on experience with AWS, Azure, or GCP services.
  • Good understanding of data architecture, security, and performance tuning.
  • Familiarity with version control (e.g., Git), CI/CD tools, and agile workflows.

Sonata Software

Information Technology and Services

Bangalore
