Posted: 2 days ago
Remote | Full Time
Do not apply if:
● You have less than 10 years or more than 12 years of total IT experience
● You do not have hands-on experience with Python and GCP data engineering projects
● You lack real-world experience in building and deploying ETL/ELT pipelines using GCP services (Dataflow, BigQuery, Composer, etc.)
● You have no exposure to Apache Spark, Kafka, or Airflow
● You are on a notice period longer than 30 days
● You are looking for a purely remote role (onsite/client engagement is required)
● You come from a non-data background (e.g., testing, support, or non-engineering roles)
Our client is a trusted global innovator of IT and business services, present in 50+ countries. They specialize in digital & IT modernization, consulting, managed services, and industry-specific solutions. With a commitment to long-term success, they empower clients and society to move confidently into the digital future.
● Design, develop, test, and maintain scalable ETL data pipelines using Python (a minimal pipeline sketch follows the GCP services list below).
● Architect enterprise solutions with technologies such as Kafka, multi-cloud services, auto-scaling with GKE, load balancers, Apigee API proxy management, DBT, LLMs where the solution calls for them, and redaction of sensitive information with DLP (Data Loss Prevention).
● Work extensively on Google Cloud Platform (GCP) services such as:
○ Dataflow for real-time and batch data processing
○ Cloud Functions for lightweight serverless compute
○ BigQuery for data warehousing and analytics
○ Cloud Composer for orchestration of data workflows (on Apache Airflow)
○ Google Cloud Storage (GCS) for managing data at scale
○ IAM for access control and security
○ Cloud Run for containerized applications
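As a rough illustration of the kind of pipeline this role involves (combining the Dataflow, BigQuery, and GCS services listed above), below is a minimal Apache Beam sketch in Python. The bucket, project, dataset, table, and field names are placeholders invented for illustration and are not details of this posting.

import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_event(line: str) -> dict:
    # Parse one newline-delimited JSON record into a BigQuery row.
    record = json.loads(line)
    return {"user_id": record["user_id"], "amount": float(record["amount"])}


def run() -> None:
    # DirectRunner executes locally; switch to "DataflowRunner" and add
    # project, region, and temp_location options (temp_location is also
    # needed for BigQuery file loads) to run the same code on Dataflow.
    options = PipelineOptions(runner="DirectRunner")
    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadFromGCS" >> beam.io.ReadFromText("gs://example-bucket/events/*.json")
            | "ParseJSON" >> beam.Map(parse_event)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "example-project:analytics.events",
                schema="user_id:STRING,amount:FLOAT",
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )


if __name__ == "__main__":
    run()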
● 10 to 12 years of hands-on experience in Python for backend or data engineering
projects.
● Strong understanding and working experience with GCP cloud services
(especially Dataflow, BigQuery, Cloud Functions, Cloud Composer, etc.).
● Solid understanding of data pipeline architecture, data integration, and
transformation techniques.
● Experience in working with version control systems like GitHub and knowledge of
CI/CD practices.
● Experience with Apache Spark, Kafka, Redis, FastAPI, Airflow, and GCP Composer DAGs (see the DAG sketch after this list).
● Strong experience in SQL with at least one enterprise database (SQL Server,
Oracle, PostgreSQL, etc.).
● Experience in data migrations from on-premise data sources to Cloud platforms.
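For orientation only, here is a minimal Cloud Composer (Airflow 2.x) DAG sketch of the kind of orchestration referred to above: it loads files from GCS into a BigQuery staging table and then runs a SQL transformation. The bucket, project, dataset, and table names are placeholders, not details of this role.

from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator

with DAG(
    dag_id="daily_events_elt",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Load the day's raw JSON files from GCS into a staging table.
    load_raw = GCSToBigQueryOperator(
        task_id="load_raw_events",
        bucket="example-bucket",
        source_objects=["events/{{ ds }}/*.json"],
        destination_project_dataset_table="example-project.staging.events",
        source_format="NEWLINE_DELIMITED_JSON",
        write_disposition="WRITE_TRUNCATE",
        autodetect=True,
    )

    # Transform the staging data into a reporting table with SQL in BigQuery.
    build_report = BigQueryInsertJobOperator(
        task_id="build_daily_report",
        configuration={
            "query": {
                "query": (
                    "SELECT user_id, SUM(amount) AS total_amount "
                    "FROM `example-project.staging.events` GROUP BY user_id"
                ),
                "destinationTable": {
                    "projectId": "example-project",
                    "datasetId": "analytics",
                    "tableId": "daily_report",
                },
                "writeDisposition": "WRITE_TRUNCATE",
                "useLegacySql": False,
            }
        },
    )

    load_raw >> build_report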
✔ Strong hands-on expertise in Python for backend and data engineering projects
✔ Proven experience in Google Cloud Platform (GCP) — especially Dataflow, BigQuery, Cloud Composer, Cloud Functions, GCS, and IAM
✔ Strong understanding of ETL/ELT pipeline design, architecture, and development
✔ Hands-on experience with Apache Spark, Kafka, and Airflow (GCP Composer DAGs)
✔ Experience in developing APIs using the Python FastAPI framework (see the sketch after this checklist)
✔ Experience with GKE (Google Kubernetes Engine) and Cloud Run for deployment
✔ Strong SQL skills with enterprise databases (SQL Server / Oracle / PostgreSQL)
✔ Familiarity with NoSQL databases like MongoDB, Redis, or Bigtable
✔ Proficient in Git/GitHub and CI/CD pipeline deployments for data projects
✔ Experience in data migration from on-premise sources to Cloud platforms
✔ Solid knowledge of data quality, validation, monitoring, and data governance best practices
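As a small, hypothetical example of the FastAPI plus SQL/BigQuery skills in this checklist, the sketch below exposes a BigQuery query result over an HTTP endpoint. The project, dataset, table, and field names are invented placeholders.

from fastapi import FastAPI, HTTPException
from google.cloud import bigquery

app = FastAPI(title="Daily report API")
# Uses Application Default Credentials; requires a GCP project to be configured.
bq_client = bigquery.Client()


@app.get("/reports/{user_id}")
def get_user_total(user_id: str) -> dict:
    # Return the aggregated daily total for one user, using a parameterized query.
    query = """
        SELECT user_id, total_amount
        FROM `example-project.analytics.daily_report`
        WHERE user_id = @user_id
    """
    job_config = bigquery.QueryJobConfig(
        query_parameters=[bigquery.ScalarQueryParameter("user_id", "STRING", user_id)]
    )
    rows = list(bq_client.query(query, job_config=job_config).result())
    if not rows:
        raise HTTPException(status_code=404, detail="user not found")
    return {"user_id": rows[0]["user_id"], "total_amount": rows[0]["total_amount"]}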
People Prime Worldwide
Hyderabad, Telangana, India
Salary: Not disclosed