Posted: 2 weeks ago
Work from Office | Full Time
About the Role
We're looking for a hands-on Data Engineer who can take full ownership of the data engineering lifecycle, from data ingestion to deployment, in a hybrid cloud environment built on Microsoft Fabric and Google Cloud Platform (GCP). You'll be stepping into an existing ecosystem with mature pipelines, strong DevOps practices, and a close-knit cross-functional team.
This role is ideal for someone who is comfortable working independently, can manage multiple priorities, and is equally adept at building new pipelines and improving existing ones.
Key Responsibilities
Develop, enhance, and maintain end-to-end data pipelines across Microsoft Fabric and GCP.
Work with tools like Fabric Dataflows, Synapse (Lakehouse), BigQuery, Composer, and others to support both batch and near real-time processing.
Build and maintain ELT/ETL frameworks using Python, PySpark, SQL, and relevant platform-native services.
Deploy pipelines and manage infrastructure via Git-based CI/CD workflows (e.g., Azure DevOps or GitHub Actions).
Collaborate with data analysts, product managers, and other engineers to translate business needs into scalable data models and flows.
Monitor data quality and job performance, and proactively resolve data issues.
Support and improve our data observability, governance, and documentation practices.
Work independently and act as a point of contact for ongoing pipeline or platform-related discussions.
Required Skills
Strong experience with both Microsoft Fabric (Synapse, OneLake, Dataflows) and GCP (BigQuery, Cloud Storage, Composer).
Solid hands-on experience with SQL, Python, and Spark/PySpark.
Proven experience in building and deploying modular, reusable, and production-grade data pipelines.
Experience in managing version-controlled codebases and CI/CD for data workflows.
Good understanding of data warehousing principles, data modeling, and performance tuning.
Familiarity with monitoring tools, job orchestration, and alerting strategies.
Nice to Have
Prior work with Power BI, Looker, or any BI integration layer.
Cloud certifications in GCP or Microsoft are a plus.
Quara Ai Tech