6 GCP Dataproc Jobs

Set Up a Job Alert
JobPe aggregates results for easy access, but applications are submitted directly on the original job portal.

3.0 - 7.0 years

0 Lacs

Pune, Maharashtra

On-site

We are seeking a skilled and proactive Python / PySpark Developer to join the data engineering or analytics team. Your responsibilities will include building scalable data pipelines, performing large-scale data processing, and collaborating with data scientists, analysts, and business stakeholders. You will design, develop, and optimize ETL data pipelines using PySpark on big data platforms (e.g., Hadoop, Databricks, EMR). Writing clean, efficient, and modular code in Python for data processing and integration tasks is essential. Working with large datasets to extract insights, transform raw data, and ensure data quality will be part of your daily tasks. Collaborating with cross-functional t...
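
For illustration only, a minimal PySpark ETL sketch of the kind of pipeline this role describes; the bucket paths and column names are hypothetical placeholders.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("example-etl").getOrCreate()

# Extract: read raw order data from a placeholder GCS path.
raw = spark.read.option("header", True).csv("gs://example-bucket/raw/orders.csv")

# Transform: drop incomplete rows, cast amounts, and aggregate per day.
clean = (
    raw.dropna(subset=["order_id"])
       .withColumn("amount", F.col("amount").cast("double"))
)
daily_totals = clean.groupBy("order_date").agg(F.sum("amount").alias("total_amount"))

# Load: write the curated result as Parquet to a placeholder output path.
daily_totals.write.mode("overwrite").parquet("gs://example-bucket/curated/daily_totals/")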

Posted 1 month ago

5.0 - 7.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Oracle Cloud is a comprehensive enterprise-grade platform that offers best-in-class services across Software as a Service (SaaS), Platform as a Service (PaaS) and Infrastructure as a Service (IaaS). The Oracle Cloud platform offers choice and flexibility for customers to build, deploy, integrate, and extend applications in the cloud, enabling them to adapt to rapidly changing business requirements, promote interoperability and avoid lock-in. The platform supports numerous open standards (SQL, HTML5, REST, and more), open-source solutions (such as Kubernetes, Hadoop, Spark and Kafka) and a wide variety of programming languages, databases, tools and integration frameworks. Your Opportunity: Values ...

Posted 1 month ago

6.0 - 10.0 years

0 Lacs

Hyderabad, Telangana

On-site

As a GCP Data Engineer, you will be an integral part of our international team, utilizing your expertise in Google Cloud Platform's data tools, specifically DataProc and BigQuery. Your primary focus will be on designing, developing, and optimizing data pipelines and infrastructure to enhance business insights. This role requires strong collaboration skills as you will work remotely with cross-functional teams. Your responsibilities will include designing and maintaining scalable data pipelines and ETL processes on GCP, utilizing DataProc and BigQuery for processing and analyzing large data volumes, writing efficient code in Python and SQL, and developing Spark-based data workflows with PySpa...
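
As a hedged sketch of the DataProc/BigQuery workflow mentioned above: a PySpark job that reads a BigQuery table and writes an aggregate back, assuming the spark-bigquery connector is available on the Dataproc cluster; the project, dataset, and bucket names are placeholders.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("bq-dataproc-example").getOrCreate()

# Read a BigQuery table into a Spark DataFrame via the spark-bigquery connector.
events = (
    spark.read.format("bigquery")
    .option("table", "example-project.analytics.events")
    .load()
)

# Aggregate events per user.
summary = events.groupBy("user_id").agg(F.count("*").alias("event_count"))

# Write the result back to BigQuery, staging through a temporary GCS bucket.
(
    summary.write.format("bigquery")
    .option("table", "example-project.analytics.user_event_counts")
    .option("temporaryGcsBucket", "example-temp-bucket")
    .mode("overwrite")
    .save()
)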

Posted 1 month ago

5.0 - 8.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Line of Service: Advisory. Industry/Sector: Not Applicable. Specialism: Data, Analytics & AI. Management Level: Associate. Job Description & Summary: At PwC, our people in data and analytics focus on leveraging data to drive insights and make informed business decisions. They utilise advanced analytics techniques to help clients optimise their operations and achieve their strategic goals. In data analysis at PwC, you will focus on utilising advanced analytical techniques to extract insights from large datasets and drive data-driven decision-making. You will leverage skills in data manipulation, visualisation, and statistical modelling to support clients in solving complex business problems. Why PwC A...

Posted 2 months ago

5.0 - 9.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a DevOps engineer at C1X AdTech Private Limited, a global technology company, your primary responsibility will be to manage the infrastructure, support development pipelines, and ensure system reliability. You will play a crucial role in automating deployment processes, maintaining server environments, monitoring system performance, and supporting engineering operations throughout the development lifecycle. Our objective is to design and manage scalable, cloud-native infrastructure using GCP services, Kubernetes, and Argo CD for high-availability applications. Additionally, you will implement and monitor observability tools such as Elasticsearch, Logstash, and Kibana to ensure full system...
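
As one small, hedged example of the observability work described here: a health check against Elasticsearch using the official Python client (an 8.x client is assumed); the endpoint URL and index pattern are placeholders.

from elasticsearch import Elasticsearch

es = Elasticsearch("http://elasticsearch.example.internal:9200")

# Report overall cluster health (green / yellow / red).
health = es.cluster.health()
print("cluster status:", health["status"])

# Count application log documents ingested in the last 15 minutes
# in a hypothetical logs index pattern.
recent = es.count(
    index="app-logs-*",
    query={"range": {"@timestamp": {"gte": "now-15m"}}},
)
print("log entries in the last 15 minutes:", recent["count"])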

Posted 2 months ago

5.0 - 8.0 years

15 - 20 Lacs

Hyderabad, Bengaluru

Hybrid

Required key skills. Must have: GCP BigQuery, GCP Composer, GCP DataProc, Airflow, SQL, Hive, HDFS architecture, Python, PySpark. Good to have: other GCP services, other clouds, NoSQL DBs.
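
A hedged sketch of how those skills typically fit together: a Cloud Composer (Airflow) DAG that submits a PySpark job to Dataproc and then runs a BigQuery query, assuming Airflow 2.4+ with the apache-airflow-providers-google package; every project, cluster, bucket, and table name is a placeholder.

from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.dataproc import DataprocSubmitJobOperator
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

PROJECT_ID = "example-project"
REGION = "us-central1"

with DAG(
    dag_id="daily_dataproc_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Submit a PySpark transform to an existing Dataproc cluster.
    run_pyspark = DataprocSubmitJobOperator(
        task_id="run_pyspark_transform",
        project_id=PROJECT_ID,
        region=REGION,
        job={
            "reference": {"project_id": PROJECT_ID},
            "placement": {"cluster_name": "example-cluster"},
            "pyspark_job": {"main_python_file_uri": "gs://example-bucket/jobs/transform.py"},
        },
    )

    # Run a downstream aggregation query in BigQuery.
    load_summary = BigQueryInsertJobOperator(
        task_id="load_summary",
        project_id=PROJECT_ID,
        configuration={
            "query": {
                "query": "SELECT order_date, SUM(amount) AS total FROM `example-project.analytics.orders` GROUP BY order_date",
                "useLegacySql": False,
            }
        },
    )

    run_pyspark >> load_summary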

Posted 4 months ago


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
