About Us
Zuru Tech is digitalizing the construction process of buildings all around the world. We have a multi-national team developing the world's first digital building fabrication platform: you design it, we build it!
At ZURU we develop the Zuru Home app, BIM software for the general public as well as architects and engineers. With it, anyone can buy, design, and send to manufacturing any type of building, with complete design freedom. Welcome to the future!
What Are You Going to Do?
Set up, configure, and maintain Airflow workflows for the orchestration and scheduling of data pipelines (see the Airflow sketch after this list).
Configure, deploy, and manage DBT projects for transformation, modeling, and deployment of curated datasets.
Build Python-based frameworks and scripts for data ingestion, validation, and automation of repetitive operational tasks.
Manage cloud infrastructure and services for data operations on AWS / Azure / GCP, including S3/Blob/GCS storage, IAM/RBAC, VPCs, and resource monitoring.
Implement CI/CD pipelines for deployment and operationalization of ETL/ELT workflows and DBT transformations.
Integrate streaming and batch pipelines with event-driven sources such as Kafka or cloud-native messaging services (see the Kafka consumer sketch after this list).
Monitor data pipelines, identify bottlenecks, and proactively resolve failures to ensure high availability and reliability.
Collaborate with Data Engineers, Analysts, and BI teams to enable smooth operations and support Power BI or other downstream analytics.
Maintain data quality, lineage, and operational governance, including alerting and logging frameworks such as ELK and Grafana (see the logging sketch after this list).
Participate in Proof of Concepts (PoCs) for new operational tools, automation frameworks, or cloud services.
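To give a concrete flavor of the Airflow work, here is a minimal sketch of a daily ingestion DAG. It assumes the Airflow 2.x TaskFlow API; the task bodies, record shape, and retry settings are illustrative assumptions, not our production code.

```python
# A minimal sketch (not production code) of a daily ingestion DAG using
# the Airflow 2.x TaskFlow API. Task bodies and settings are illustrative.
from datetime import datetime, timedelta

from airflow.decorators import dag, task


@dag(
    schedule="@daily",  # assumption: Airflow >= 2.4 (older versions use schedule_interval)
    start_date=datetime(2024, 1, 1),
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
    tags=["data-ops"],
)
def daily_ingestion():
    @task
    def extract() -> list:
        # Pull raw records from a source system (stubbed here).
        return [{"id": 1, "value": 42.0}]

    @task
    def validate(records: list) -> list:
        # Fail fast on malformed rows so downstream tasks never see them.
        bad = [r for r in records if "id" not in r or "value" not in r]
        if bad:
            raise ValueError(f"{len(bad)} malformed records")
        return records

    @task
    def load(records: list) -> None:
        # Write validated records to curated storage (stubbed here).
        print(f"loaded {len(records)} records")

    load(validate(extract()))


daily_ingestion()
```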
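For the Kafka integration, a similarly hedged sketch of a consume-validate-load loop, using the confluent-kafka Python client as one possible choice; the broker address, topic, and consumer group are hypothetical.

```python
# A hedged sketch of a Kafka ingestion loop using the confluent-kafka
# client (one possible choice). Broker, topic, and group id are hypothetical.
import json

from confluent_kafka import Consumer

consumer = Consumer(
    {
        "bootstrap.servers": "localhost:9092",  # assumption: local broker
        "group.id": "data-ops-ingest",          # hypothetical consumer group
        "auto.offset.reset": "earliest",
    }
)
consumer.subscribe(["building-events"])  # hypothetical topic name

try:
    while True:
        msg = consumer.poll(timeout=1.0)
        if msg is None:
            continue
        if msg.error():
            # Log and skip broker-side errors instead of crashing the loop.
            print(f"consumer error: {msg.error()}")
            continue
        event = json.loads(msg.value())
        if "id" not in event:
            print(f"dropping malformed event: {event}")
            continue
        # Hand the validated event to a downstream loader (stubbed here).
        print(f"ingested event {event['id']}")
finally:
    consumer.close()
```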
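And on the monitoring side, a short sketch of structured JSON logging that an ELK or Grafana-style stack could pick up; the step wrapper and field names are assumptions for illustration, not a prescribed framework.

```python
# A small sketch of structured JSON logging for pipeline runs; the wrapper
# and field names are assumptions, not a prescribed framework.
import json
import logging
import time

logging.basicConfig(level=logging.INFO, format="%(message)s")
logger = logging.getLogger("pipeline")


def run_step(name, fn):
    """Run one pipeline step and emit a machine-parsable log record."""
    start = time.time()
    status = "ok"
    try:
        fn()
    except Exception as exc:
        status = f"failed: {exc}"
        raise
    finally:
        logger.info(json.dumps({
            "step": name,
            "status": status,
            "duration_s": round(time.time() - start, 3),
        }))


run_step("extract", lambda: time.sleep(0.1))
```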
What Are We Looking For?
✔ 3+ years of experience in data operations, cloud data engineering operations, or infrastructure support for data pipelines.
✔ Hands-on experience setting up Airflow DAGs, DBT projects, and pipeline orchestration frameworks.
✔ Experience building Python-based frameworks or automation scripts for ingestion, validation, or monitoring of data pipelines.
✔ Strong familiarity with cloud platforms (AWS / Azure / GCP) and configuration of data services, storage, and networking.
✔ Working knowledge of streaming or batch pipelines and integration with Kafka or similar messaging/event systems.
✔ Experience implementing CI/CD for data pipelines and transformations.
✔ Familiarity with Power BI or other BI tools to ensure curated datasets are operationally ready.
✔ Understanding of data quality, lineage, and monitoring frameworks (ELK, Grafana, CloudWatch, etc.).
✔ Strong analytical, troubleshooting, and operational skills to proactively resolve pipeline and infrastructure issues.
Required Skills
✔ Cloud Platforms: AWS / Azure / GCP (any one), including storage, networking, and IAM.
✔ Orchestration & Transformation: Airflow, DBT, CI/CD, DevOps for data pipelines.
✔ Programming: Python (for ingestion frameworks, automation, monitoring).
✔ Streaming & Monitoring: Kafka, ELK, Grafana, CloudWatch or equivalent.
✔ Data Pipelines & Infrastructure: Cloud storage, VPCs, IAM/RBAC, resource monitoring, and batch/streaming pipelines.
✔ BI Support: Power BI or similar tools (curated dataset readiness).
✔ Data Ops Practices: Pipeline monitoring, operational governance, alerting, and automation frameworks.
What Do We Offer?
Competitive compensation
Annual Performance Bonus
5 Working Days with Flexible Working Hours
Annual trips & Team outings
Medical Insurance for self & family
Training & skill development programs
Work with a global team and make the most of its diverse knowledge
Several discussions over Multiple Pizza Parties
A lot more! Come and discover us!
#LI-CK1
We may use artificial intelligence (AI) tools to support parts of the hiring process, such as reviewing applications, analyzing resumes, or assessing responses. These tools assist our recruitment team but do not replace human judgment. Final hiring decisions are ultimately made by humans. If you would like more information about how your data is processed, please contact us.