666 Dataproc Jobs - Page 16

Set up a Job Alert
JobPe aggregates results for easy application access, but you apply on the job portal directly.

0.0 - 1.0 years

8 - 10 Lacs

Hyderabad

Work from Office

Google Cloud Platform: GCS, Dataproc, BigQuery, Dataflow. Programming languages: Java; scripting languages such as Python, Shell Script, SQL. 5+ years of experience in IT application delivery with proven experience in agile development methodologies. 1 to 2 years of experience in Google Cloud Platform (GCS, Dataproc, BigQuery, Composer, and data processing such as Dataflow).

Posted 2 months ago


3.0 - 6.0 years

6 - 8 Lacs

Noida

Work from Office

Engineer with 3+ years of experience working in a GCP environment and its relevant tools/services (BigQuery, Dataproc, Dataflow, Cloud Storage, Terraform, Tekton, Cloud Run, Cloud Scheduler, Astronomer/Airflow, Pub/Sub, Kafka, Cloud Spanner streaming, etc.). 1-2+ years of strong experience in Python development (object-oriented/functional programming, Pandas, PySpark, etc.). 1-2+ years of strong experience in SQL (CTEs, window functions, aggregate functions, etc.).

Posted 2 months ago


4.0 - 6.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Google Cloud Platform: GCS, Dataproc, BigQuery, Dataflow. Programming languages: Java; scripting languages such as Python, Shell Script, SQL. 5+ years of experience in IT application delivery with proven experience in agile development methodologies. 1 to 2 years of experience in Google Cloud Platform (GCS, Dataproc, BigQuery, Composer, and data processing such as Dataflow).

Posted 2 months ago


4.0 - 8.0 years

22 - 25 Lacs

Bengaluru

Work from Office

Engineer with 3+ years of experience working in a GCP environment and its relevant tools/services (BigQuery, Dataproc, Dataflow, Cloud Storage, Terraform, Tekton, Cloud Run, Cloud Scheduler, Astronomer/Airflow, Pub/Sub, Kafka, Cloud Spanner streaming, etc.). 1-2+ years of strong experience in Python development (object-oriented/functional programming, Pandas, PySpark, etc.). 1-2+ years of strong experience in SQL (CTEs, window functions, aggregate functions, etc.).

Posted 2 months ago


4.0 - 8.0 years

22 - 25 Lacs

Chennai

Work from Office

Engineer with 3+ years of experience working in a GCP environment and its relevant tools/services (BigQuery, Dataproc, Dataflow, Cloud Storage, Terraform, Tekton, Cloud Run, Cloud Scheduler, Astronomer/Airflow, Pub/Sub, Kafka, Cloud Spanner streaming, etc.). 1-2+ years of strong experience in Python development (object-oriented/functional programming, Pandas, PySpark, etc.). 1-2+ years of strong experience in SQL (CTEs, window functions, aggregate functions, etc.).

Posted 2 months ago


4.0 - 8.0 years

22 - 25 Lacs

Hyderabad

Work from Office

Engineer with 3+ years of experience working in a GCP environment and its relevant tools/services (BigQuery, Dataproc, Dataflow, Cloud Storage, Terraform, Tekton, Cloud Run, Cloud Scheduler, Astronomer/Airflow, Pub/Sub, Kafka, Cloud Spanner streaming, etc.). 1-2+ years of strong experience in Python development (object-oriented/functional programming, Pandas, PySpark, etc.). 1-2+ years of strong experience in SQL (CTEs, window functions, aggregate functions, etc.).

Posted 2 months ago


4.0 - 8.0 years

15 - 25 Lacs

Chennai

Work from Office

Role Description: Provides leadership for the overall architecture, design, development, and deployment of a full-stack cloud-native data analytics platform. Designing and augmenting solution architecture for data ingestion, data preparation, data transformation, data load, ML and simulation modelling, Java BE and FE, state machine, and API management and intelligence consumption using data products, on cloud. Understand business requirements and help develop high-level and low-level data engineering and data processing documentation for the cloud-native architecture. Develop conceptual, logical, and physical target-state architecture, engineering, and operational specs. Work with the customer, user...

Posted 2 months ago


5.0 - 10.0 years

20 - 27 Lacs

Gurugram

Work from Office

Description: Agentic AI is a must. Requirements: Education: Bachelor's degree in Computer Science, Software Engineering, AI/ML, or a related field. Certifications in AI/ML, Generative AI, or Cloud platforms (AWS/GCP) are a plus. The candidate should have at least 5 years of experience. Job Description: The Agent Developer will be a key contributor to our initiative to build and launch assistive agents. This role involves designing, developing, and deploying AI-powered agents that enhance efficiency in program tracking, governance, and execution. The Agent Developer will be responsible for bringing the defined requirements and use cases to life, working with a variety of internal tools and data sour...

Posted 2 months ago


10.0 - 15.0 years

0 - 26 Lacs

Noida, Chennai

Work from Office

Roles and Responsibilities: Design, develop, and maintain large-scale data pipelines using Dataproc on Google Cloud Platform (GCP) for big data processing and analytics. Collaborate with cross-functional teams to identify business requirements and design solutions that meet those needs. Develop complex SQL queries to extract insights from massive datasets stored in BigQuery. Ensure high availability, scalability, security, and performance of GCP-based systems. Job Requirements: 10-15 years of experience in the IT services and consulting industry with expertise in Dataproc on GCP. Strong understanding of Google Cloud Platform (GCP), including Compute Engine, Kubernetes Engine, Storage buckets et...

Posted 2 months ago


7.0 - 11.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Data Platform Engineer - Tech Lead at Deutsche Bank in Pune, India, you will be part of the DB Technology global team of tech specialists. Your role involves leading a group of engineers working on cutting-edge technologies in Hadoop, Big Data, GCP, Terraform, Big Query, Data Proc, and data management to develop robust data pipelines, ensure data quality, and implement efficient data management solutions. Your leadership will drive innovation, maintain high standards in data infrastructure, and mentor team members to support data-driven initiatives. You will collaborate with data engineers, analysts, cross-functional teams, and stakeholders to ensure the data platform meets the organiza...

Posted 2 months ago


5.0 - 9.0 years

0 Lacs

Karnataka

On-site

As a GCP Data Engineer-Technical Lead at Birlasoft Office in Bengaluru, India, you will be responsible for designing, building, and maintaining scalable data pipelines and platforms on Google Cloud Platform (GCP) to support business intelligence, analytics, and machine learning initiatives. With a primary focus on Python and GCP technologies such as BigQuery, Dataproc, and Data Flow, you will develop ETL and ELT pipelines while ensuring optimal data manipulation and performance tuning. Your role will involve leveraging data manipulation libraries like Pandas, NumPy, and PySpark, along with SQL expertise for efficient data processing in BigQuery. Additionally, your experience with tools such ...

Posted 2 months ago


3.0 - 7.0 years

0 Lacs

Pune, Maharashtra

On-site

You are looking for a GCP Cloud Engineer for a position based in Pune. As a GCP Data Engineer, you will be responsible for designing, implementing, and optimizing data solutions on Google Cloud Platform. Your expertise in GCP services, solution design, and programming skills will be crucial for developing scalable and efficient cloud solutions. Your key responsibilities will include designing and implementing GCP-based data solutions following best practices, developing workflows and pipelines using Cloud Composer and Apache Airflow, building and managing data processing clusters using Dataproc, working with GCP services like Cloud Functions, Cloud Run, and Cloud Storage, and integrating mul...

Posted 2 months ago


5.0 - 9.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a Data Engineer at our company, you will be responsible for designing scalable and robust AI/ML systems in production, focusing on high-performance and cost-effective solutions. Your expertise in various technologies, including GCP services like BigQuery, Cloud Dataflow, Pub/Sub, Dataproc, and Cloud Storage, along with programming languages such as Python, Java/Scala, and SQL, will be crucial for the success of our projects. Additionally, your experience with data processing tools like Apache Beam, Apache Kafka, and Cloud Dataprep, as well as orchestration tools like Apache Airflow and Terraform, will play a significant role in implementing efficient data pipelines. Knowledge of security ...

Posted 2 months ago


14.0 - 20.0 years

0 Lacs

Maharashtra

On-site

As a Principal Architect - Data & Cloud at Quantiphi, you will bring your 14-20 years of experience in Technical, Solutioning, and Analytical roles to lead the way in architecting, designing, and implementing end-to-end data pipelines and data integration solutions for structured and unstructured data sources and targets. With a focus on Cloud platforms such as GCP, AWS, and Azure, you will be responsible for building and managing Data Lakes, Data Warehouse, Data Integration, Data Migration, and Business Intelligence/Artificial Intelligence solutions. Your role will involve understanding business requirements and translating them into functional and non-functional areas, defining boundaries ...

Posted 2 months ago


5.0 - 9.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

At Capgemini Invent, we believe difference drives change. As inventive transformation consultants, we blend our strategic, creative, and scientific capabilities, collaborating closely with clients to deliver cutting-edge solutions. Join us to drive transformation tailored to our client's challenges of today and tomorrow, informed and validated by science and data, superpowered by creativity and design, all underpinned by technology created with purpose. Your role involves having IT experience with a minimum of 5+ years in creating data warehouses, data lakes, ETL/ELT, data pipelines on cloud. You should have experience in data pipeline implementation with cloud providers such as AWS, Azure, ...

Posted 2 months ago


5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Technology Service Specialist, AVP at our Pune location, you will be an integral part of the Technology, Data, and Innovation (TDI) Private Bank team. In this role, you will be responsible for providing 2nd Level Application Support for business applications used in branches, by mobile sales, or via the internet. Your expertise in Incident Management and Problem Management will be crucial in ensuring the stability of these applications. Partnerdata, the central client reference data system in Germany, is a core banking system that integrates many banking processes and applications through numerous interfaces. With the recent migration to Google Cloud (GCP), you will be involved in opera...

Posted 3 months ago


5.0 - 9.0 years

0 Lacs

Karnataka

On-site

We are seeking experienced and talented engineers to join our team. Your main responsibilities will include designing, building, and maintaining the software that drives the global logistics industry. WiseTech Global is a leading provider of software for the logistics sector, facilitating connectivity for major companies like DHL and FedEx within their supply chains. Our organization is product and engineer-focused, with a strong commitment to enhancing the functionality and quality of our software through continuous innovation. Our primary Research and Development center in Bangalore plays a pivotal role in our growth strategies and product development roadmap. As a Lead Software Engineer, ...

Posted 3 months ago


6.0 - 10.0 years

1 - 1 Lacs

Chennai

Hybrid

Overview: TekWissen is a global workforce management provider throughout India and many other countries in the world. The below client is a global company with shared ideals and a deep sense of family. From our earliest days as a pioneer of modern transportation, we have sought to make the world a better place, one that benefits lives, communities, and the planet. Job Title: Specialty Development Practitioner. Location: Chennai. Work Type: Hybrid. Position Description: At the client's Credit Company, we are modernizing our enterprise data warehouse in Google Cloud to enhance data, analytics, and AI/ML capabilities, improve customer experience, ensure regulatory compliance, and boost operational eff...

Posted 3 months ago


5.0 - 9.0 years

0 Lacs

Maharashtra

On-site

As a Data Warehouse (DWH) professional with relevant experience in Google Cloud Platform (GCP), you will be responsible for developing and implementing robust data architectures. This includes designing data lakes, data warehouses, and data marts by utilizing GCP services such as BigQuery, Dataflow, DataProc, and Cloud Storage. Your role will involve designing and implementing data models that meet business requirements while ensuring data integrity, consistency, and accessibility. Your deep understanding of GCP services and best practices for data warehousing, data analytics, and machine learning will be crucial in this role. You will also be tasked with planning and executing data migratio...

Posted 3 months ago


6.0 - 8.0 years

10 - 15 Lacs

Bengaluru

Work from Office

Job Summary Synechron is seeking a highly skilled and proactive Data Engineer to join our dynamic data analytics team. In this role, you will be instrumental in designing, developing, and maintaining scalable data pipelines and solutions on the Google Cloud Platform (GCP). With your expertise, you'll enable data-driven decision-making, contribute to strategic business initiatives, and ensure robust data infrastructure. This position offers an opportunity to work in a collaborative environment with a focus on innovative technologies and continuous growth. Software Requirements Required: Proficiency in Data Engineering tools and frameworks such as Hive , Apache Spark , and Python (version 3.x)...

Posted 3 months ago


3.0 - 7.0 years

13 - 18 Lacs

Pune

Work from Office

About The Role: Job Title: Technical-Specialist Big Data (PySpark) Developer. Location: Pune, India. Role Description: This role is for an engineer responsible for the design, development, and unit testing of software applications. The candidate is expected to ensure good-quality, maintainable, scalable, and high-performing software applications are delivered to users in an Agile development environment. The candidate should come from a strong technological background and have good working experience in Python and Spark technology. Should be hands-on and able to work independently, requiring minimal technical/tool guidance. Should be able to technically guide and ...

Posted 3 months ago


8.0 - 12.0 years

30 - 35 Lacs

Pune

Work from Office

About The Role: Job Title: Senior Engineer PD, AVP. Location: Pune, India. Role Description: Our team is part of the area Technology, Data, and Innovation (TDI) Private Bank. Within TDI, Partnerdata is the central client reference data system in Germany. As a core banking system, many banking processes and applications are integrated with it and communicate via >2k interfaces. From a technical perspective, we focus on mainframe but also build solutions on on-premise cloud, RESTful services, and an Angular frontend. Next to the maintenance and implementation of new CTB requirements, the content focus also lies on the regulatory and tax topics surrounding a partner/client. We are looking for a very motiv...

Posted 3 months ago


4.0 - 8.0 years

0 Lacs

Karnataka

On-site

We are looking for a skilled Data Governance Engineer to take charge of developing and overseeing robust data governance frameworks on Google Cloud Platform (GCP). Your role will involve leveraging your expertise in data management, metadata frameworks, compliance, and security within cloud environments to ensure the implementation of high-quality, secure, and compliant data practices aligned with organizational objectives. With a minimum of 4 years of experience in data governance, data management, or data security, you should possess hands-on proficiency with Google Cloud Platform (GCP) tools such as BigQuery, Dataflow, Dataproc, and Google Data Catalog. Additionally, a strong command over...

Posted 3 months ago


6.0 - 10.0 years

0 Lacs

Haryana

On-site

Join GlobalLogic as a valuable member of the team working on a significant software project for a world-class company that provides M2M/IoT 4G/5G modules to industries such as automotive, healthcare, and logistics. Your engagement will involve contributing to the development of end-user modules' firmware, implementing new features, maintaining compatibility with the latest telecommunication and industry standards, and analyzing and estimating customer requirements. Requirements: BA/BS degree in Computer Science, Mathematics, or a related technical field, or equivalent practical experience. Proficiency in Cloud SQL and Cloud Bigtable. Experience with Dataflow, BigQuery, Dataproc, Dat...

Posted 3 months ago


4.0 - 8.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

We are looking for a skilled Data Governance Engineer to spearhead the development and supervision of robust data governance frameworks on Google Cloud Platform (GCP). You should have a deep understanding of data management, metadata frameworks, compliance, and security within cloud environments to ensure the adoption of high-quality, secure, and compliant data practices aligned with organizational objectives. The ideal candidate should possess: - Over 4 years of experience in data governance, data management, or data security. - Hands-on expertise with Google Cloud Platform (GCP) tools like BigQuery, Dataflow, Dataproc, and Google Data Catalog. - Proficiency in metadata management, data lin...

Posted 3 months ago
