
Data Engineer – GCP & PySpark (Iceberg/BigQuery) || 4–7 years || Noida


Posted: 3 weeks ago | Platform: Indeed


Work Mode: On-site

Job Type: Full Time

Job Description

Location: Noida (in-office/hybrid; client site if required)
Experience: 4–7 years
Type: Full-time | Immediate joiners preferred
Client: Leading Canadian-based tech company

Must-Have Skills:
- GCP (BigQuery, Dataflow, Dataproc, Cloud Storage)
- PySpark/Spark: distributed computing expertise
- Apache Iceberg (preferred), Hudi, or Delta Lake

Role Overview:
Be part of a high-impact data engineering team focused on building scalable, cloud-native data pipelines. You'll support and enhance EMR platforms using DevOps principles, helping deliver real-time health alerts and diagnostics for platform performance.

Key Responsibilities:
- Provide data engineering support to EMR platforms
- Design and implement cloud-native, automated data solutions
- Collaborate with internal teams to deliver scalable systems
- Continuously improve infrastructure reliability and observability

Technical Environment:
- Databases: Oracle, MySQL, MSSQL, MongoDB
- Distributed engines: Spark/PySpark, Presto, Flink/Beam
- Cloud & infra: GCP (preferred), AWS (nice-to-have), Terraform
- Big data formats: Iceberg, Hudi, Delta
- Tools: SQL, data modeling, Palantir Foundry, Jenkins, Confluence
- Bonus: stats/math tools (NumPy, PyMC3), Linux scripting

Ideal for engineers with cloud-native, real-time data platform experience, especially those who have worked with EMR and modern lakehouse stacks.

Job Type: Full-time
Pay: Up to ₹1,000,000.00 per year

Application Questions:
- What is your notice period (in days)?
- What is your current annual compensation (in INR)?
- What is your expected annual salary (in INR)?
Experience (all required):
- GCP services (BigQuery, Dataflow, or Dataproc): 7 years
- Developing or maintaining Spark or PySpark jobs: 7 years
- Apache Iceberg: 7 years
- EMR or similar big data platforms: 7 years
- Terraform projects: 7 years
- SQL development and data modeling: 7 years

Location: Noida, Uttar Pradesh (required)
Work Location: In person
