Lead II - Software Engineering

Experience: 3 years
Salary: 0 Lacs
Posted: 15 hours ago | Platform: LinkedIn
Work Mode: On-site
Job Type: Full Time

Job Description

Role Description

Job Role: Data Engineer – Snowflake, SQL, Python
Job Location: Any UST location
Role Overview

We are seeking an experienced Data Engineer with strong hands-on expertise in Snowflake, SQL, and Python. The role involves designing, building, and optimizing data pipelines as we transition multiple data sources and workloads to the Snowflake Cloud Data Platform. You will work with modern data engineering frameworks, contribute to data modelling efforts, and ensure high-quality, scalable, and well-documented data solutions.

Key Responsibilities

  • Design, develop, and maintain ETL/ELT pipelines for large-scale data ingestion and transformation.
  • Work extensively with Snowflake Cloud Data Warehouse, including schema design, performance optimization, and data governance.
  • Develop data processing scripts and automation workflows using Python (pandas/dask/vaex) and Airflow or similar orchestration tools (see the illustrative sketch after this list).
  • Implement data modelling best practices (3NF, star schema, wide/tall tables) and metadata management processes.
  • Optimize SQL queries across different database engines and manage performance trade-offs.
  • Contribute to data quality, lineage, and governance integration (e.g., Collibra).
  • Collaborate with business stakeholders to gather requirements, translate them into technical specifications, and deliver end-to-end solutions.
  • Support agile ways of working, participate in ceremonies, and maintain relevant documentation and artifacts.
  • Work with source control (GitHub) and follow best practices for shared codebase and CI/CD workflows.
  • Contribute to building pipelines that are robust, reliable, and support RBAC-based data access controls.
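
For illustration only (not part of the role requirements): a minimal sketch of the kind of Airflow-orchestrated pandas-to-Snowflake pipeline described in the responsibilities above, assuming the Airflow 2.x TaskFlow API and the Snowflake Python connector's write_pandas helper. The DAG name, file paths, table name, and connection details are hypothetical placeholders.

from datetime import datetime

import pandas as pd
from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def orders_elt():
    @task
    def extract() -> str:
        # Hypothetical raw source; in practice this could be S3, an API, etc.
        df = pd.read_csv("/data/raw/orders.csv")
        staged_path = "/data/staged/orders.parquet"
        df.to_parquet(staged_path, index=False)
        return staged_path

    @task
    def transform(staged_path: str) -> str:
        df = pd.read_parquet(staged_path)
        # Basic cleanup: de-duplicate on the natural key, normalise column names.
        df = df.drop_duplicates(subset=["order_id"])
        df.columns = [c.strip().lower() for c in df.columns]
        curated_path = "/data/curated/orders.parquet"
        df.to_parquet(curated_path, index=False)
        return curated_path

    @task
    def load(curated_path: str) -> None:
        import snowflake.connector
        from snowflake.connector.pandas_tools import write_pandas

        df = pd.read_parquet(curated_path)
        # Placeholder credentials; a real pipeline would use an Airflow
        # connection or a secrets backend instead of literals.
        with snowflake.connector.connect(
            account="my_account",
            user="etl_user",
            password="***",
            warehouse="ETL_WH",
            database="ANALYTICS",
            schema="STAGING",
        ) as conn:
            write_pandas(conn, df, table_name="ORDERS", auto_create_table=True)

    load(transform(extract()))


orders_elt()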

Required Skills & Experience

  • 3+ years of experience in Data Engineering or a similar role.
  • Strong expertise with Snowflake, including schema design, warehouse configuration, and data product development.
  • Advanced SQL skills with experience writing optimized, high-performance queries.
  • Hands-on experience in Python for data processing, particularly with pandas or equivalent frameworks.
  • Experience with Airflow, DBT, or similar data orchestration/ELT frameworks.
  • Excellent understanding of ETL/ELT patterns, idempotency, and data engineering best practices (an idempotent-load sketch follows this list).
  • Strong data modelling experience (3NF, dimensional modelling, semantic layers).
  • Familiarity with data governance and metadata cataloguing best practices.
  • Experience integrating data pipelines with enterprise access control / RBAC.
  • Working experience with GitHub or similar version control tools.
  • Ability to work with business stakeholders, gather requirements, and deliver scalable solutions.
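
A hedged sketch of the idempotency point above: keying the load on a Snowflake MERGE means re-running the same batch updates existing rows instead of duplicating them. The table names and connection parameters are hypothetical placeholders.

import snowflake.connector

# Hypothetical staging and target tables.
MERGE_SQL = """
MERGE INTO analytics.core.orders AS tgt
USING analytics.staging.orders AS src
    ON tgt.order_id = src.order_id
WHEN MATCHED THEN UPDATE SET
    tgt.status = src.status,
    tgt.updated_at = src.updated_at
WHEN NOT MATCHED THEN INSERT (order_id, status, updated_at)
    VALUES (src.order_id, src.status, src.updated_at)
"""


def run_idempotent_merge() -> None:
    # Placeholder credentials; use a secrets manager in practice.
    with snowflake.connector.connect(
        account="my_account",
        user="etl_user",
        password="***",
        warehouse="ETL_WH",
    ) as conn:
        with conn.cursor() as cur:
            cur.execute(MERGE_SQL)


if __name__ == "__main__":
    run_idempotent_merge()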

Preferred / Nice To Have

  • Experience with AWS data services (S3, Glue, Lambda, IAM).
  • Knowledge of data virtualisation platforms, especially Denodo (cache management, query performance tuning).
  • Certifications in Snowflake, AWS, or Denodo.
  • Degree in Computer Science, Data Engineering, Mathematics, or related field (or equivalent professional experience).

Skills

Data Engineering, Airflow, Python, Snowflake

UST

IT Services and IT Consulting

Aliso Viejo, CA
