Posted: 1 day ago | Platform: LinkedIn


Work Mode: Remote

Job Type: Full Time

Job Description

Job Title: Data Engineer

Experience: 2–5 years

Location: Remote

About The Role

We’re seeking a Data Engineer who combines strong technical skills with a structured, collaborative approach to engineering. You’ll be responsible for building reliable data pipelines and optimizing the data systems that power analytics, reporting, and product intelligence. This role requires hands-on expertise in Python, SQL, and Snowflake.

Key Responsibilities

Data Engineering & Development

  • Design, implement, and maintain efficient ETL/ELT workflows using Python & SQL.
  • Build, optimize, and maintain scalable Snowflake data models for analytics and operational use cases.
  • Ensure high data quality, consistency, and accessibility across different teams and systems.
  • Automate repetitive data processes and continuously improve data infrastructure performance.
  • Collaborate closely with data analysts, scientists, and product teams to understand data requirements and deliver actionable datasets.
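To give a flavor of the ETL/ELT work described above, here is a minimal sketch of an extract–transform–load step in Python. It uses the standard library's sqlite3 as a local stand-in for a warehouse such as Snowflake, and the `orders` table and its columns are hypothetical, not taken from this role.

```python
import csv
import io
import sqlite3


def run_etl(raw_csv: str, conn: sqlite3.Connection) -> int:
    """Extract rows from CSV text, normalize them, and load them into `orders`."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders (order_id TEXT PRIMARY KEY, amount REAL)"
    )
    rows = []
    for rec in csv.DictReader(io.StringIO(raw_csv)):
        # Transform: trim whitespace and cast the amount to a float,
        # skipping malformed records instead of failing the whole batch.
        try:
            rows.append((rec["order_id"].strip(), float(rec["amount"])))
        except (KeyError, ValueError):
            continue
    # Load: idempotent upsert-style insert keyed on order_id.
    conn.executemany("INSERT OR REPLACE INTO orders VALUES (?, ?)", rows)
    conn.commit()
    return len(rows)


if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    sample = "order_id,amount\n A-1 ,19.99\nA-2,not-a-number\nA-3,5.00\n"
    print(run_etl(sample, conn))  # 2 valid rows loaded
```

In a production pipeline the same shape holds, with the sqlite connection swapped for a Snowflake session and the load step expressed as a bulk `COPY INTO` or connector batch insert rather than row-by-row statements.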

Engineering Excellence & Collaboration

  • Participate in and lead technical design reviews, ensuring alignment with established engineering standards.
  • Contribute to planning and prioritization across concurrent data projects, breaking down tasks, identifying risks, and maintaining delivery momentum.
  • Provide visibility into team progress, workloads, and potential bottlenecks, escalating issues where needed.
  • Support resourcing efforts by identifying capacity gaps and recommending solutions such as internal alignment or contractor engagement.
  • Encourage cross-team knowledge sharing and alignment on engineering practices.

Leadership & Communication

  • Demonstrate strong project management and organizational abilities, balancing technical depth with delivery focus.
  • Prepare and present technical solutions and recommendations for internal stakeholders and client discussions.
  • Ensure all technical contributions align with product goals and overall data architecture strategy.

Skills & Experience

  • Proficiency in Python (data processing, automation, APIs).
  • Advanced SQL skills (optimization, data transformation, complex queries).
  • Hands-on experience with Snowflake (data warehousing, schema design, performance tuning).
  • Working knowledge of cloud data platforms (AWS) and pipeline orchestration tools (Airflow, dbt, etc.).
  • Working knowledge of AWS services such as Glue, RDS, Aurora, and Redshift.
  • Experience with Git, CI/CD, and modern engineering workflows.
  • Strong analytical, communication, and collaboration skills.

Why You’ll Love Working Here

  • Collaborate with talented engineers on data challenges that drive real business impact.
  • Work with a modern cloud data stack and best-in-class tools.
  • Be part of a culture that values ownership, learning, and continuous improvement.

Location: DGS India - Bengaluru - Manyata N1 Block

Brand: Merkle

Time Type: Full time

Contract Type: Permanent

#DGS
