2 - 5 years
6 - 16 Lacs
Posted: 1 day ago
Work from Office
Full Time
Design, develop, and maintain scalable ETL/ELT data pipelines to support business and analytics needs
Write, tune, and optimize complex SQL queries for data transformation, aggregation, and analysis
Translate business requirements into well-designed, documented, and reusable data solutions
Partner with analysts, data scientists, and stakeholders to deliver accurate, timely, and trusted datasets
Automate data workflows using orchestration/scheduling tools (e.g., Airflow, ADF, Luigi, Databricks); a brief Airflow sketch follows this list
Develop unit tests, integration tests, and validation checks to ensure data accuracy and pipeline reliability
Document pipelines, workflows, and design decisions for knowledge sharing and operational continuity
Apply coding standards, version control practices, and peer code reviews to maintain high-quality deliverables
Proactively troubleshoot, optimize, and monitor pipelines for performance, scalability, and cost efficiency
Support feature rollouts, including being available for post-production monitoring and issue resolution
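To make the orchestration and validation responsibilities above concrete, here is a minimal sketch of a daily pipeline using Apache Airflow's TaskFlow API (one of the tools named in the list), assuming Airflow 2.x. The DAG name, task bodies, and the data-quality rule are hypothetical placeholders for illustration, not part of any actual codebase for this role.

```python
# Minimal sketch of an orchestrated ETL pipeline with a validation gate.
# All names and logic below are hypothetical placeholders.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def daily_sales_pipeline():
    @task
    def extract() -> list[dict]:
        # Placeholder for pulling raw records from a source system.
        return [{"order_id": 1, "amount": 129.99}]

    @task
    def transform(rows: list[dict]) -> list[dict]:
        # Placeholder transformation: keep only positive-amount orders.
        return [r for r in rows if r["amount"] > 0]

    @task
    def validate(rows: list[dict]) -> list[dict]:
        # Simple data-quality gate: fail the run if nothing survived,
        # so bad data never reaches the load step.
        if not rows:
            raise ValueError("Validation failed: no rows to load")
        return rows

    @task
    def load(rows: list[dict]) -> None:
        # Placeholder for writing to the warehouse (e.g. Snowflake).
        print(f"Loading {len(rows)} rows")

    load(validate(transform(extract())))


daily_sales_pipeline()
```

In practice each task would call into shared, unit-tested extract/load utilities rather than inlining its logic, which is what the testing and code-review responsibilities above point at.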
Bachelor's degree in Computer Science, Information Systems, Engineering, or a related field
3–5 years of hands-on experience in data engineering and building data pipelines
At least 3 years of experience writing complex SQL queries in a cloud data warehouse/data lake environment
At least 2 years of experience in BI development (e.g., Power BI preferred, or Tableau)
Solid hands-on experience with data warehousing concepts and implementations
At least 3 years of experience with Snowflake or another modern cloud data warehouse
At least 3 years of hands-on Python/PySpark development (scripting, OOP, ETL/ELT automation); a short PySpark sketch follows this list
Familiarity with data modeling and data warehousing concepts
Experience with orchestration tools (e.g., Airflow, ADF, Luigi, Databricks Workflow)
Familiarity with at least one cloud platform (AWS, Azure, or GCP)
Strong analytical, problem-solving, and communication skills
Ability to work both independently and as part of a collaborative team
Experience with DBT (Data Build Tool) for data transformations
Exposure to the Databricks stack
Familiarity with CI/CD and version control (Git) in data engineering projects
Exposure to the e-commerce or customer data domain
Understands the technology landscape, stays up to date on current technology trends and new technologies, and brings new ideas to the team
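As a small illustration of the Python/PySpark expectation referenced above, here is a minimal ETL sketch. The input path, column names, and target table are assumptions chosen for the example only.

```python
# Minimal PySpark ETL sketch: deduplicate raw orders and aggregate
# daily revenue per customer. Paths, columns, and table names are
# hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Extract: read raw orders (hypothetical path and schema).
orders = spark.read.parquet("s3://example-bucket/raw/orders/")

# Transform: drop duplicate orders, filter bad rows, aggregate.
daily_revenue = (
    orders.dropDuplicates(["order_id"])
    .filter(F.col("amount") > 0)
    .groupBy("customer_id", F.to_date("order_ts").alias("order_date"))
    .agg(F.sum("amount").alias("daily_revenue"))
)

# Load: write a partitioned table for downstream BI consumption.
daily_revenue.write.mode("overwrite").partitionBy("order_date").saveAsTable(
    "analytics.daily_customer_revenue"
)
```

A production version would typically add schema enforcement and incremental loads rather than a full overwrite, in line with the cost- and performance-optimization duties listed earlier.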
Global Infocity Park, Kodandarama Nagar, Perungudi, Chennai, Tamil Nadu 600096