Data Engineer

5 years

9 - 10 Lacs

Posted: 4 hours ago | Platform: GlassDoor

Work Mode

On-site

Job Type

Part Time

Job Description

About ShyftLabs
ShyftLabs is a fast-growing data product company founded in early 2020, working primarily with Fortune 500 clients. We design and deliver cutting-edge digital and data-driven solutions that help businesses accelerate growth, improve decision-making, and create measurable value through innovation.

Position Overview
We’re looking for an experienced Data Engineer who’s passionate about building scalable, high-performance data solutions. In this role, you’ll collaborate with cross-functional teams, including Data Engineers, Analysts, and Product Managers, to design, implement, and maintain robust data pipelines and systems that power our clients’ most critical business decisions.

Key Responsibilities

  • Design, develop, and maintain data pipelines and ETL/ELT processes using Python.
  • Build and optimize scalable, high-performance data applications.
  • Collaborate with cross-functional teams to define requirements and deliver reliable solutions.
  • Develop and manage real-time streaming pipelines using Pub/Sub or Apache Beam (see the sketch after this list).
  • Participate in code reviews, architecture discussions, and continuous improvement initiatives.
  • Monitor, troubleshoot, and optimize production data systems for reliability and performance.
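
To give a concrete flavour of the streaming responsibility above, here is a minimal sketch of a Pub/Sub-to-BigQuery pipeline using the Apache Beam Python SDK. This is an illustration only, not ShyftLabs' actual pipeline: the project, subscription, table, and field names are hypothetical.

```python
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions
from apache_beam.transforms import window


def run() -> None:
    # streaming=True keeps the pipeline running against the subscription.
    options = PipelineOptions(streaming=True)

    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            # Hypothetical subscription path; replace with a real resource.
            | "ReadEvents" >> beam.io.ReadFromPubSub(
                subscription="projects/example-project/subscriptions/events-sub")
            | "ParseJson" >> beam.Map(json.loads)
            | "KeepValid" >> beam.Filter(lambda event: "user_id" in event)
            # Fixed one-minute windows so counts are emitted continuously.
            | "Window" >> beam.WindowInto(window.FixedWindows(60))
            | "ExtractUser" >> beam.Map(lambda event: event["user_id"])
            | "CountPerUser" >> beam.combiners.Count.PerElement()
            | "FormatRows" >> beam.Map(
                lambda kv: {"user_id": kv[0], "events": kv[1]})
            # Hypothetical destination table for the per-window counts.
            | "WriteCounts" >> beam.io.WriteToBigQuery(
                "example-project:analytics.user_event_counts",
                schema="user_id:STRING,events:INTEGER",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND)
        )


if __name__ == "__main__":
    run()
```

The same pipeline can run locally on the DirectRunner during development and on Dataflow in production by changing only the pipeline options.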

Key Qualifications

  • 5+ years of professional experience in software or data engineering using Python.
  • Strong understanding of software engineering best practices (testing, version control, CI/CD).
  • Proven experience building and optimizing ETL/ELT pipelines and data workflows (an illustrative sketch follows this list).
  • Proficiency in SQL and database concepts.
  • Experience with data processing frameworks (e.g., Pandas).
  • Understanding of software design patterns and scalable architecture principles.
  • Experience with cloud platforms (GCP preferred).
  • Knowledge of CI/CD pipelines and Infrastructure as Code tools.
  • Familiarity with containerization (Docker, Kubernetes).
  • Bachelor’s degree in Computer Science, Engineering, or a related field (or equivalent experience).
  • Excellent problem-solving, analytical, and communication skills.
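
As a flavour of the batch side of the role, below is a minimal ETL sketch in Python using Pandas. The file paths, column names, and revenue calculation are hypothetical placeholders, not part of this posting.

```python
import pandas as pd

# Hypothetical input/output locations.
RAW_ORDERS = "raw/orders.csv"
CURATED_ORDERS = "curated/orders.parquet"


def extract(path: str) -> pd.DataFrame:
    """Read the raw export, parsing order timestamps up front."""
    return pd.read_csv(path, parse_dates=["order_ts"])


def transform(orders: pd.DataFrame) -> pd.DataFrame:
    """Drop incomplete rows and derive a per-order revenue column."""
    cleaned = orders.dropna(subset=["order_id", "customer_id"])
    return cleaned.assign(revenue=cleaned["quantity"] * cleaned["unit_price"])


def load(orders: pd.DataFrame, path: str) -> None:
    """Write the curated dataset as Parquet for downstream consumers."""
    orders.to_parquet(path, index=False)


if __name__ == "__main__":
    load(transform(extract(RAW_ORDERS)), CURATED_ORDERS)
```

Keeping extract, transform, and load as small pure functions makes each step straightforward to unit test, in line with the testing and CI/CD expectations above.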

Preferred Qualifications

  • Experience with GCP services such as Cloud Run and Dataflow.
  • Experience with stream processing technologies (e.g., Pub/Sub).
  • Familiarity with workflow orchestration tools such as Airflow (see the sketch after this list).
  • Exposure to data visualization tools or libraries.
  • Knowledge of GitLab CI/CD and Terraform.
  • Experience with Snowflake, BigQuery, or Databricks.
  • GCP Data Engineer Certification is a plus.
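
For the orchestration piece, here is a minimal daily DAG sketch using the Airflow TaskFlow API (assuming a recent Airflow 2.x release); the DAG id, schedule, and task bodies are illustrative only.

```python
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def orders_daily():
    """Placeholder daily pipeline: extract, transform, then load."""

    @task
    def extract() -> list[dict]:
        # A real task would pull from an API or a source table.
        return [{"order_id": 1, "quantity": 2, "unit_price": 9.99}]

    @task
    def transform(rows: list[dict]) -> list[dict]:
        # Derive revenue per order.
        return [{**row, "revenue": row["quantity"] * row["unit_price"]} for row in rows]

    @task
    def load(rows: list[dict]) -> None:
        # A real task would write to a warehouse such as BigQuery.
        print(f"Loading {len(rows)} rows")

    load(transform(extract()))


orders_daily()
```
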
We offer a competitive salary alongside a strong insurance package, and we take pride in our employees’ growth, providing extensive learning and development resources.
