Posted: 8 hours ago | Platform: LinkedIn


Work Mode: On-site

Job Type: Full Time

Job Description

Job Title: Senior Data Engineer

Employment Type: Full-Time

Experience Required: 4+ Years

About The Role

TTF is seeking an experienced Senior Data Engineer to build and maintain scalable, secure data pipelines and infrastructure. You will collaborate with data science, analytics, and product teams to deliver best-in-class data solutions using modern cloud and orchestration technologies, ensuring compliance with security and regulatory standards.

Key Responsibilities

  • Design, develop, and maintain robust ETL/ELT pipelines using SQL and Python.
  • Orchestrate data workflows using Dagster (and legacy Airflow).
  • Collaborate across teams to enable self-service analytics and meet data requirements.
  • Implement streaming, batch, and Change Data Capture (CDC) data flows.
  • Use dbt for data transformation and modeling aligned to business needs.
  • Monitor data quality, troubleshoot issues, and optimize pipelines.
  • Ensure adherence to security, privacy, and compliance standards.
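The SQL-and-Python pipeline work described above can be sketched minimally as an ELT step: land raw records, then transform them with SQL (the kind of aggregation a dbt model would own). This is an illustrative sketch only, using an in-memory SQLite store with a hypothetical schema; the actual stack uses Snowflake/ClickHouse.

```python
import sqlite3

# Hypothetical raw records, as they might land from an ingestion tool.
raw_events = [
    ("2024-01-01", "signup", 3),
    ("2024-01-01", "purchase", 1),
    ("2024-01-02", "signup", 5),
]

conn = sqlite3.connect(":memory:")

# Load step: write raw rows into a staging table.
conn.execute(
    "CREATE TABLE raw_events (event_date TEXT, event_type TEXT, cnt INTEGER)"
)
conn.executemany("INSERT INTO raw_events VALUES (?, ?, ?)", raw_events)

# Transform step: aggregate daily totals in SQL.
daily = conn.execute(
    "SELECT event_date, SUM(cnt) FROM raw_events "
    "GROUP BY event_date ORDER BY event_date"
).fetchall()
print(daily)  # [('2024-01-01', 4), ('2024-01-02', 5)]
```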

Required Skills & Experience

  • 4+ years of data engineering experience.
  • Strong SQL skills and hands-on experience with cloud data warehouses (Snowflake, BigQuery, Redshift).
  • Expertise in ETL/ELT processes, batch, and streaming data pipelines.
  • Proficient with AWS data services (S3, DMS, Glue, Athena).
  • Experience with dbt for data modeling and transformation.
  • Fluent in English communication.

Preferred Qualifications

  • Experience with additional AWS services: EC2, ECS, EKS, VPC, IAM.
  • Familiarity with Infrastructure as Code (Terraform, Terragrunt).
  • Proficiency in Python for data engineering tasks.
  • Knowledge of orchestration tools such as Dagster, Airflow, or AWS Step Functions.
  • Understanding of pub-sub and streaming frameworks (AWS Kinesis, Kafka, SQS, SNS).
  • Experience with CI/CD pipelines and automation.

Tech Stack Highlights

  • Languages: SQL, Python
  • Pipeline Orchestration: Dagster, Airflow
  • Data Stores: Snowflake, ClickHouse
  • Cloud Platforms: AWS (ECS, EKS, DMS, Kinesis, Glue, Athena, S3)
  • ETL Tools: Fivetran, dbt
  • Infrastructure: Docker, Kubernetes, Terraform with Terragrunt
Skills: AWS, PySpark, Glue, Python, ETL, ELT, AWS Lambda, Dagster, data transformation, Airflow, S3, SQL, data modeling, dbt
