Posted: 3 days ago
Platform: Remote
Full Time
Experience
- 7+ years in data engineering, with 5+ years of hands-on experience on GCP.
- Proven track record with tools and services such as BigQuery, Cloud Composer (Apache Airflow), Cloud Functions, Pub/Sub, Cloud Storage, Dataflow, and IAM/VPC.
- Demonstrated expertise in Apache Spark (batch and streaming), PySpark, and building scalable API integrations.
- Advanced Airflow skills, including custom operators, dynamic DAGs, and workflow performance tuning.

Certifications
- Google Cloud Professional Data Engineer certification preferred.

Key Skills (Mandatory Technical Skills)
- Advanced Python (PySpark, Pandas, pytest) for automation and data pipelines.
- Strong SQL, including window functions, CTEs, partitioning, and optimization.
- Proficiency in GCP services including BigQuery, Dataflow, Cloud Composer, Cloud Functions, and Cloud Storage.
- Hands-on Apache Airflow experience, including dynamic DAGs, retries, and SLA enforcement (see the sketch after this list).
- Expertise in API data ingestion, Postman collections, and REST/GraphQL integration workflows.
- Familiarity with CI/CD workflows using Git, Jenkins, or Bitbucket.
- Experience with infrastructure security and governance using IAM and VPC.
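To illustrate the Airflow expectations above (dynamic DAGs, retries, SLA enforcement), here is a minimal sketch assuming Airflow 2.4+. The source table list, bucket-free load logic, and DAG names are hypothetical placeholders for illustration only, not part of this posting.

```python
# Minimal sketch: generate one ingestion DAG per source table, with
# retries and an SLA set via default_args. Table names are hypothetical.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator

SOURCE_TABLES = ["orders", "customers", "payments"]  # hypothetical sources


def load_table(table_name: str) -> None:
    """Placeholder load step; a real task would ingest into BigQuery."""
    print(f"Loading {table_name}")


default_args = {
    "retries": 2,                           # retry transient failures
    "retry_delay": timedelta(minutes=5),
    "sla": timedelta(hours=1),              # flag tasks that miss the SLA
}

# Dynamic DAGs: one DAG per entry in SOURCE_TABLES.
for table in SOURCE_TABLES:
    with DAG(
        dag_id=f"ingest_{table}",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
        default_args=default_args,
    ) as dag:
        PythonOperator(
            task_id=f"load_{table}",
            python_callable=load_table,
            op_kwargs={"table_name": table},
        )
    # Register each generated DAG at module level so the scheduler discovers it.
    globals()[dag.dag_id] = dag
```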
Royal Cyber