Posted: 1 day ago | Platform: Naukri


Work Mode

Work from Office

Job Type

Full Time

Job Description

Duties & Responsibilities

Design, develop, and maintain scalable ETL/ELT data pipelines to support business and analytics needs

Write, tune, and optimize complex SQL queries for data transformation, aggregation, and analysis

Translate business requirements into well-designed, documented, and reusable data solutions

Partner with analysts, data scientists, and stakeholders to deliver accurate, timely, and trusted datasets

Automate data workflows using orchestration/scheduling tools (e.g., Airflow, ADF, Luigi, Databricks)

Develop unit tests, integration tests, and validation checks to ensure data accuracy and pipeline reliability

Document pipelines, workflows, and design decisions for knowledge sharing and operational continuity

Apply coding standards, version control practices, and peer code reviews to maintain high-quality deliverables

Proactively troubleshoot, optimize, and monitor pipelines for performance, scalability, and cost efficiency

Support function rollouts, including being available for post-production monitoring and issue resolution
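To illustrate the validation-check duty listed above (this sketch is not part of the posting), a row-level check on pipeline data might look like the following plain-Python example; the function and field names are invented for illustration:

```python
# Minimal sketch of a data-validation check for a pipeline stage.
# All names here (validate_rows, required_fields) are illustrative,
# not taken from the job posting.

def validate_rows(rows, required_fields):
    """Split rows into (valid, errors).

    A row is rejected, with its index and a reason, when any
    required field is absent or null.
    """
    valid, errors = [], []
    for i, row in enumerate(rows):
        missing = [f for f in required_fields if row.get(f) is None]
        if missing:
            errors.append((i, f"missing/null fields: {missing}"))
        else:
            valid.append(row)
    return valid, errors

rows = [
    {"order_id": 1, "amount": 25.0},
    {"order_id": 2, "amount": None},  # fails the null check
]
valid, errors = validate_rows(rows, ["order_id", "amount"])
```

In practice a check like this would run as a task in the orchestrator (Airflow, ADF, etc.) between the load and publish steps, failing the pipeline rather than silently passing bad rows downstream.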

Basic Qualifications

Bachelor's degree in Computer Science, Information Systems, Engineering, or a related field

3–5 years of hands-on experience in data engineering and building data pipelines

At least 3 years of experience writing complex SQL queries in a cloud data warehouse/data lake environment

At least 2 years of experience in BI development (Power BI preferred, or Tableau)

Solid hands-on experience with data warehousing concepts and implementations

At least 3 years of experience with Snowflake or another modern cloud data warehouse

At least 3 years of hands-on Python/PySpark development (scripting, OOP, ETL/ELT automation)

Familiarity with data modeling and data warehousing concepts

Experience with orchestration tools (e.g., Airflow, ADF, Luigi, Databricks Workflow)

Familiarity with at least one cloud platform (AWS, Azure, or GCP)

Strong analytical, problem-solving, and communication skills

Ability to work both independently and as part of a collaborative team
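As a concrete illustration of the SQL qualification above (not part of the posting), this sketch runs a grouped aggregation against an in-memory SQLite database; the table and column names are invented, and a real warehouse (Snowflake, etc.) would use its own connector in place of `sqlite3`:

```python
# Illustrative SQL aggregation of the kind the role calls for,
# run on in-memory SQLite purely for demonstration.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (customer TEXT, amount REAL);
INSERT INTO orders VALUES ('a', 10), ('a', 30), ('b', 5);
""")

# Aggregate revenue per customer, keeping only customers above a threshold.
rows = conn.execute("""
    SELECT customer, SUM(amount) AS revenue
    FROM orders
    GROUP BY customer
    HAVING revenue > 20
    ORDER BY revenue DESC
""").fetchall()
# rows == [('a', 40.0)]
```

The same GROUP BY / HAVING pattern carries over directly to cloud warehouses, though each engine has its own dialect details (e.g., whether column aliases are allowed in HAVING, as SQLite permits here).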

Preferred Qualifications

Experience with dbt (data build tool) for data transformations

Exposure to the Databricks stack

Familiarity with CI/CD and version control (Git) in data engineering projects

Exposure to the e-commerce or customer data domain

Understands the technology landscape, up to date on current technology trends and new technology, brings new ideas to the team

Location: Global Infocity Park, Kodandarama Nagar, Perungudi, Chennai, Tamil Nadu 600096
