Posted: 16 hours ago
Platform: On-site
Full Time
About the Role

Data Engineer

As a Data Engineer, you'll be an integral part of the team responsible for building and maintaining the data pipelines and services that power business insights across dashboards and reports. The role sits at the intersection of data engineering and architecture, working within a modern data stack and leveraging technologies such as Snowflake, BigQuery, Airflow, Fivetran, DBT, and Looker.

Key Responsibilities
- Design, build, and run end-to-end ETL/ELT pipelines in collaboration with Data Analysts (see the orchestration sketch at the end of this posting)
- Develop scalable and automated data workflows within a Data Mesh architecture
- Translate business needs into technical requirements (e.g., DBT models, tests, timings, and reports)
- Troubleshoot and resolve technical and data pipeline issues as they arise
- Lead delivery of data models and reports from discovery to deployment
- Perform exploratory data analysis to identify and prevent data quality issues
- Optimize data feed availability and performance, including CDC and delta loading techniques
- Champion data governance through best practices in testing, coding standards, and peer reviews
- Mentor junior engineers and act as a technical go-to within the team
- Build Looker dashboards when required
- Continuously seek ways to enhance pipeline delivery quality and team efficiency

What We're Looking For
- 3+ years of experience with Snowflake or a similar modern data warehouse technology
- Hands-on expertise with DBT, Apache Airflow, Fivetran, AWS, Git, and Looker
- Strong command of advanced SQL and performance tuning
- Experience with custom or SaaS-based data ingestion tools
- Proven skills in data modeling and optimization of large, complex datasets
- Background in ETL, data warehousing, data mining, and data pipeline architecture
- Familiarity with Data Mesh architectures and agile delivery environments (e.g., Scrum)
- High standards for code quality, including CI/CD, code review, and testing
- Strong technical documentation and business communication skills
- Basic understanding of the AWS ecosystem is a plus
- Fintech or digitally native company experience is desirable
- Additional skills in Python, data governance tools (e.g., Atlan, Alation, Collibra), or data quality platforms (e.g., Great Expectations, Monte Carlo, Soda) are advantageous
Company: Tide Software
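For context on the kind of ELT orchestration the responsibilities describe, below is a minimal sketch of an Airflow DAG that sequences an ingestion step ahead of DBT build and test runs. This is illustrative only, not part of the role description: it assumes Airflow 2.x with the dbt CLI available on the worker, and the DAG id, schedule, selector, and placeholder Fivetran trigger are all assumptions.

from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# Illustrative ELT orchestration: ingest, transform with DBT, then test.
with DAG(
    dag_id="example_elt_pipeline",   # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",               # Airflow 2.4+; use schedule_interval on older versions
    catchup=False,
) as dag:
    # Placeholder for triggering a Fivetran (or other SaaS) sync.
    ingest = BashOperator(
        task_id="trigger_ingestion",
        bash_command="echo 'trigger Fivetran sync here'",
    )

    # Build DBT models; the selector is an assumed example.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --select staging+",
    )

    # Run DBT tests so data quality issues fail the pipeline early.
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --select staging+",
    )

    ingest >> dbt_run >> dbt_test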