665 Dataproc Jobs - Page 20

JobPe aggregates results for easy application access, but you apply directly on the job portal.

5.0 - 10.0 years

4 - 8 Lacs

Noida

Work from Office

We are looking for a skilled Senior Azure Data Engineer with 5 to 10 years of experience to design and implement scalable data pipelines using Azure technologies, driving data transformation, analytics, and machine learning. The ideal candidate will have a strong background in data engineering and proficiency in Python, PySpark, and Spark Pools. Roles and Responsibilities: Design and implement scalable Databricks data pipelines using PySpark. Transform raw data into actionable insights through data analysis and machine learning. Build, deploy, and maintain machine learning models using MLlib or TensorFlow. Optimize cloud data integration from Azure Blob Storage, Data Lake, and SQL/NoSQL sources...

Posted 4 months ago

10.0 - 20.0 years

12 - 22 Lacs

Pune

Work from Office

Your key responsibilities: Ensures that the Service Operations team provides an optimum service level to the business lines it supports. Takes overall responsibility for the resolution of incidents and problems within the team. Oversees the resolution of complex incidents. Ensures that Analysts apply the right problem-solving techniques and processes. Assists in managing business stakeholder relationships. Assists in defining and managing OLAs with relevant stakeholders. Ensures that the team understands OLAs, is resourced appropriately, and is aligned to business SLAs. Ensures relevant Client Service teams are informed of progress on incidents, where necessary. Ensures that defined divisional Pr...

Posted 4 months ago

5.0 - 8.0 years

4 - 7 Lacs

Noida

Work from Office

Company: Apptad Technologies Pvt Ltd. Industry: Employment Firms/Recruitment Services Firms. Experience: 5 to 8 years. Job Title: GCP Admin. Job Location: Remote. Job Type: Full Time. Responsibilities: Manages and configures roles/permissions in GCP IAM, following the principle of least-privileged access. Manages the BigQuery service by optimizing slot assignments and SQL queries, adopting FinOps practices for cost control, and troubleshooting and resolving critical data queries. Collaborates with teams such as Data Engineering, Data Warehousing, Cloud Platform Engineering, and SRE for efficient data management and operational practices in GCP. Creates automations and monit...

Posted 4 months ago

3.0 - 7.0 years

25 - 35 Lacs

Hyderabad

Work from Office

We are seeking a highly skilled GCP Data Warehouse Engineer to join our data team. You will be responsible for designing, developing, and maintaining scalable and efficient data warehouse solutions on Google Cloud Platform (GCP). Your work will support analytics, reporting, and data science initiatives across the company. Key Responsibilities: Design, build, and maintain data warehouse solutions using BigQuery. Develop robust and scalable ETL/ELT pipelines using Dataflow, Cloud Composer, or Cloud Functions. Implement data modeling strategies (star schema, snowflake, etc.) to support reporting and analytics. Ensure data quality, integrity, and security across all pipelines and storage. O...
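The star-schema modeling this listing asks for can be illustrated in a few lines. The sketch below uses Python's built-in sqlite3 as a stand-in for BigQuery; the table and column names (`fact_sales`, `dim_product`, `dim_region`) are illustrative assumptions, not from the listing.

```python
# Star-schema sketch: one fact table joined to dimension tables.
# SQLite stands in for BigQuery here; all names are illustrative.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension tables describe the "who/what/where" of each event.
cur.execute("CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT)")
cur.execute("CREATE TABLE dim_region (region_id INTEGER PRIMARY KEY, name TEXT)")
# The fact table holds the measures plus foreign keys into the dimensions.
cur.execute("""CREATE TABLE fact_sales (
    product_id INTEGER, region_id INTEGER, amount REAL)""")

cur.executemany("INSERT INTO dim_product VALUES (?, ?)",
                [(1, "widget"), (2, "gadget")])
cur.executemany("INSERT INTO dim_region VALUES (?, ?)",
                [(10, "EMEA"), (20, "APAC")])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                [(1, 10, 100.0), (1, 20, 50.0), (2, 10, 75.0)])

# A typical reporting query: aggregate the fact table, label via a dimension.
rows = cur.execute("""
    SELECT p.name, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_product p ON p.product_id = f.product_id
    GROUP BY p.name
    ORDER BY p.name
""").fetchall()
print(rows)  # [('gadget', 75.0), ('widget', 150.0)]
```

The same shape carries over to BigQuery: dimensions stay small and descriptive, the fact table grows with events, and reporting queries join and aggregate.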

Posted 4 months ago

3.0 - 8.0 years

11 - 16 Lacs

Pune

Work from Office

Job Title: Lead Engineer. Location: Pune. Corporate Title: Director. As a lead engineer within the Transaction Monitoring department, you will lead and drive forward critical engineering initiatives and improvements to our application landscape whilst supporting and leading the engineering teams to excel in their roles. You will be closely aligned to the architecture function and delivery leads, ensuring alignment with planning and that correct design and architecture governance is followed for all implementation work. You will lead by example and drive and contribute to automation and innovation initiatives with the engineering teams. Join the fight against financial crime with us! Your key responsibi...

Posted 4 months ago

5.0 - 7.0 years

13 - 17 Lacs

Hyderabad

Work from Office

Skilled in multiple GCP services - GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflow, Composer, Error Reporting, Logs Explorer, etc. Must have Python and SQL work experience; proactive, collaborative, and able to respond to critical situations. Ability to analyse data for functional business requirements and to interface directly with customers. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform. Skilled in multiple GCP services - GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflow, Composer, Error Reporting, Logs Explorer. You love colla...

Posted 4 months ago

2.0 - 6.0 years

7 - 11 Lacs

Bengaluru

Work from Office

As an Application Developer, you will lead IBM into the future by translating system requirements into the design and development of customized systems in an agile environment. The success of IBM is in your hands as you transform vital business needs into code and drive innovation. Your work will power IBM and its clients globally, collaborating and integrating code into enterprise systems. You will have access to the latest education, tools and technology, and a limitless career path with the world’s technology leader. Come to IBM and make a global impact! IBM’s Cloud Services are focused on supporting clients on their cloud journey across any platform to achieve their business goals. It en...

Posted 4 months ago

5.0 - 7.0 years

13 - 17 Lacs

Bengaluru

Work from Office

Skilled in multiple GCP services - GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflow, Composer, Error Reporting, Logs Explorer, etc. Must have Python and SQL work experience; proactive, collaborative, and able to respond to critical situations. Ability to analyse data for functional business requirements and to interface directly with customers. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform. Skilled in multiple GCP services - GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflow, Composer, Error Reporting, Logs Explorer. You love colla...

Posted 4 months ago

5.0 - 10.0 years

10 - 14 Lacs

Noida

Hybrid

Data Engineer (SaaS-Based). Immediate Joiners Preferred. Shift: 3 PM to 12 AM IST. Good to have: GCP Certified Data Engineer. Overview of the Role: As a GCP Data Engineer, you'll focus on solving problems and creating value for the business by building solutions that are reliable and scalable to work with the size and scope of the company. You will be tasked with creating custom-built pipelines as well as migrating on-prem data pipelines to the GCP stack. You will be part of a team tackling intricate problems by designing and deploying reliable and scalable solutions tailored to the company's data landscape. Required Skills: 5+ years of industry experience in software development, data en...

Posted 4 months ago

7.0 - 12.0 years

4 - 8 Lacs

Pune

Hybrid

Should be capable of developing/configuring data pipelines in a variety of platforms and technologies. Possess the following technical skills: SQL, Python, PySpark, Hive, ETL, Unix, Control-M (or similar scheduling tools). Can demonstrate strong experience in writing complex SQL queries to perform data analysis on databases: SQL Server, Oracle, Hive, etc. Experience with GCP (particularly Airflow, Dataproc, BigQuery) is an advantage. Have experience with creating solutions which power AI/ML models and generative AI. Ability to work independently on specialized assignments within the context of project deliverables. Take ownership of providing solutions and tools that iteratively increase engineerin...

Posted 4 months ago

6.0 - 9.0 years

8 - 11 Lacs

Hyderabad

Work from Office

Mandatory skill: ETL_GCP_Bigquery. Develop, implement, and optimize ETL/ELT pipelines for processing large datasets efficiently. Work extensively with BigQuery for data processing, querying, and optimization. Utilize Cloud Storage, Cloud Logging, Dataproc, and Pub/Sub for data ingestion, storage, and event-driven processing. Perform performance tuning and testing of the ELT platform to ensure high efficiency and scalability. Debug technical issues, perform root cause analysis, and provide solutions for production incidents. Ensure data quality, accuracy, and integrity across data pipelines. Collaborate with cross-functional teams to define technical requirements and deliver solutions. Work ind...

Posted 4 months ago

4.0 - 9.0 years

10 - 17 Lacs

Chennai

Work from Office

Roles and Responsibilities: Design, develop, test, deploy, and maintain large-scale data pipelines using BigQuery, Dataproc, Pub/Sub, and Cloud Storage on Google Cloud Platform (GCP). Collaborate with cross-functional teams to identify business requirements and design solutions that meet those needs. Troubleshoot issues related to data pipeline failures or errors in real time using log analysis and debugging techniques. Develop automation scripts using Python to streamline data processing tasks and improve efficiency. Ensure compliance with security standards by implementing access controls, encryption, and monitoring mechanisms.
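The "automation scripts using Python" responsibility above is the kind of glue work this sketch illustrates: a retry wrapper for flaky pipeline steps, using only the standard library. The step function, its error, and the backoff parameters are hypothetical, not taken from the listing.

```python
# Illustrative sketch of a small pipeline-automation helper: retry a
# flaky step with exponential backoff before declaring failure.
# All names and parameters here are hypothetical examples.
import time

def run_with_retries(step, max_attempts=3, base_delay=0.01):
    """Run `step()` up to max_attempts times, backing off between tries."""
    for attempt in range(1, max_attempts + 1):
        try:
            return step()
        except Exception:
            if attempt == max_attempts:
                raise  # out of attempts: surface the real error
            # Exponential backoff: 1x, 2x, 4x the base delay, ...
            time.sleep(base_delay * 2 ** (attempt - 1))

# Hypothetical flaky step: fails twice, then succeeds on the third call.
calls = {"n": 0}
def flaky_load():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient pipeline error")
    return "loaded"

result = run_with_retries(flaky_load)
print(result, calls["n"])  # loaded 3
```

In a real pipeline the `step` would be a Dataproc submission or a BigQuery load, and the final `raise` is what an orchestrator like Airflow would catch and alert on.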

Posted 4 months ago

6.0 - 11.0 years

9 - 13 Lacs

Hyderabad

Work from Office

GCP Data Engineer - BigQuery, SQL, Python, Talend ETL programmer; GCP or any cloud technology. Experienced GCP data engineer with BigQuery, SQL, Python, and Talend ETL, on GCP or any cloud technology. Good experience in building pipelines of GCP components to load data into BigQuery and cloud storage buckets. Excellent data analysis skills. Good written and oral communication skills. Self-motivated, able to work independently.

Posted 4 months ago

3.0 - 8.0 years

5 - 10 Lacs

Bengaluru

Work from Office

Capgemini Invent Capgemini Invent is the digital innovation, consulting and transformation brand of the Capgemini Group, a global business line that combines market leading expertise in strategy, technology, data science and creative design, to help CxOs envision and build what’s next for their businesses. Your Role Solution Design & Architecture Implementation & Deployment Technical Leadership & Guidance Client Engagement & Collaboration Performance Monitoring & Optimization Your Profile Bachelor's or Master's degree in Computer Science, Engineering, or a related field. 3-8 years of experience in designing, implementing, and managing data solutions. 3-8 years of hands-on experience working ...

Posted 4 months ago

8.0 - 13.0 years

6 - 10 Lacs

Hyderabad

Work from Office

Skill: Extensive experience with Google Data Products (Cloud Data Fusion, BigQuery, Dataflow, Dataproc, AI Building Blocks, Looker, Dataprep, etc.). Expertise in Cloud Data Fusion, BigQuery & Dataproc. Experience in MDM, Metadata Management, Data Quality and Data Lineage tools. E2E Data Engineering and Lifecycle (including non-functional requirements and operations) management. Experience with SQL and NoSQL modern data stores. E2E Solution Design skills - Prototyping, Usability testing and data visualization literacy. Excellent knowledge of the software development life cycle.

Posted 4 months ago

7.0 - 12.0 years

6 - 9 Lacs

Hyderabad

Work from Office

Understanding of Spark core concepts like RDDs, DataFrames, Datasets, Spark SQL, and Spark Streaming. Experience with Spark optimization techniques. Deep knowledge of Delta Lake features like time travel, schema evolution, and data partitioning. Ability to design and implement data pipelines using Spark with Delta Lake as the data storage layer. Proficiency in Python/Scala/Java for Spark development and integration with ETL processes. Knowledge of data ingestion techniques from various sources (flat files, CSV, API, database). Understanding of data quality best practices and data validation techniques. Other Skills: Understanding of data warehouse concepts and data modelling techniques. Expertise in Git fo...
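The data-validation techniques this listing mentions can be sketched without a Spark cluster; in PySpark the same rules would be expressed as DataFrame filters splitting rows into accepted and quarantined sets. The field names and rules below are illustrative assumptions.

```python
# Pure-Python sketch of row-level data validation; in PySpark these
# checks would be DataFrame filters. Rules and field names are
# illustrative assumptions, not from any specific listing.
import csv, io

REQUIRED = ("id", "amount")

def validate_rows(rows):
    """Split rows into (valid, rejected) based on simple quality rules."""
    valid, rejected = [], []
    for row in rows:
        # Rule 1: required fields must be present and non-empty.
        if any(not row.get(f) for f in REQUIRED):
            rejected.append(row)
            continue
        # Rule 2: amount must parse as a non-negative number.
        try:
            if float(row["amount"]) < 0:
                raise ValueError
        except ValueError:
            rejected.append(row)
            continue
        valid.append(row)
    return valid, rejected

# Sample batch: two clean rows, one negative, one missing id, one unparseable.
raw = "id,amount\n1,10.5\n2,-3\n,7\n3,oops\n4,0\n"
rows = list(csv.DictReader(io.StringIO(raw)))
valid, rejected = validate_rows(rows)
print(len(valid), len(rejected))  # 2 3
```

Keeping rejected rows (rather than dropping them) mirrors the common quarantine pattern: bad records land in a side table for root-cause analysis instead of silently disappearing from the pipeline.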

Posted 4 months ago

4.0 - 9.0 years

10 - 20 Lacs

Bengaluru

Remote

Job Title: Software Engineer GCP Data Engineering Work Mode: Remote Base Location: Bengaluru Experience Required: 4 to 6 Years Job Summary: We are seeking a Software Engineer with a strong background in GCP Data Engineering and a solid understanding of how to build scalable data processing frameworks. The ideal candidate will be proficient in data ingestion, transformation, and orchestration using modern cloud-native tools and technologies. This role requires hands-on experience in designing and optimizing ETL pipelines, managing big data workloads, and supporting data quality initiatives. Key Responsibilities: Design and develop scalable data processing solutions using Apache Beam, Spark, a...

Posted 4 months ago

6.0 - 11.0 years

6 - 9 Lacs

Hyderabad

Work from Office

At least 8+ years of experience in any of the ETL tools: Prophecy, DataStage 11.5/11.7, Pentaho, etc. At least 3 years of experience in PySpark with GCP (Airflow, Dataproc, BigQuery), capable of configuring data pipelines. Strong experience in writing complex SQL queries to perform data analysis on databases: SQL Server, Oracle, Hive, etc. Possess the following technical skills: SQL, Python, PySpark, Hive, ETL, Unix, Control-M (or similar scheduling tools). Ability to work independently on specialized assignments within the context of project deliverables. Take ownership of providing solutions and tools that iteratively increase engineering efficiencies. Design should help embed standard pr...

Posted 4 months ago

8.0 - 12.0 years

30 - 35 Lacs

Pune

Work from Office

Job Title: Senior Engineer PD. Location: Pune, India. Role Description: Our team is part of the area Technology, Data, and Innovation (TDI) Private Bank. Within TDI, Partner Data is the central client reference data system in Germany. As a core banking system, many banking processes and applications are integrated with it and communicate via >2k interfaces. From a technical perspective, we focus on the mainframe but also build solutions on on-premise cloud, RESTful services, and an Angular frontend. Next to the maintenance and the implementation of new CTB requirements, the content focus also lies on the regulatory and tax topics surrounding a partner/client. We are looking for a very motivated candidate f...

Posted 4 months ago

4.0 - 9.0 years

15 - 19 Lacs

Pune

Work from Office

Job Title: Technical Specialist GCP Developer. Location: Pune, India. Role Description: This role is for an Engineer responsible for the design, development, and unit testing of software applications. The candidate is expected to ensure good-quality, maintainable, scalable, and high-performing software applications get delivered to users in an Agile development environment. The candidate/applicant should come from a strong technological background and should have good working experience in Spark and GCP technology. Should be hands-on and able to work independently, requiring minimal technical/tool guidance. Should be able to technically guide and mentor junior resources in the t...

Posted 4 months ago

4.0 - 8.0 years

10 - 15 Lacs

Pune

Work from Office

Job Title: GCP - Senior Engineer - PD. Location: Pune. Role Description: Our team is part of the area Technology, Data, and Innovation (TDI) Private Bank. Within TDI, Partner Data is the central client reference data system in Germany. As a core banking system, many banking processes and applications are integrated with it and communicate via >2k interfaces. From a technical perspective, we focus on the mainframe but also build solutions on on-premise cloud, RESTful services, and an Angular frontend. Next to the maintenance and the implementation of new CTB requirements, the content focus also lies on the regulatory and tax topics surrounding a partner/client. We are looking for a very motivated candidat...

Posted 4 months ago

3.0 - 7.0 years

8 - 13 Lacs

Pune

Work from Office

Job Title: Senior Engineer PD. Location: Pune, India. Role Description: Our team is part of the area Technology, Data, and Innovation (TDI) Private Bank. Within TDI, Partner Data is the central client reference data system in Germany. As a core banking system, many banking processes and applications are integrated with it and communicate via >2k interfaces. From a technical perspective, we focus on the mainframe but also build solutions on on-premise cloud, RESTful services, and an Angular frontend. Next to the maintenance and the implementation of new CTB requirements, the content focus also lies on the regulatory and tax topics surrounding a partner/client. We are looking for a very motivated candidate f...

Posted 4 months ago

8.0 - 13.0 years

35 - 50 Lacs

Hyderabad

Hybrid

Location: Hyderabad. Exp: 8+ Years. Immediate Joiners Preferred. We at Datametica Solutions Private Limited are looking for a GCP Data Architect who has a passion for cloud, with knowledge and working experience of the GCP platform. This role will involve understanding business requirements, analyzing technical options, and providing end-to-end cloud-based ETL solutions. Required Past Experience: 10+ years of overall experience in architecting, developing, testing & implementing Big Data projects using GCP components (e.g. BigQuery, Composer, Dataflow, Dataproc, DLP, BigTable, Pub/Sub, Cloud Function, etc.). Experience and understanding of ETL - Ab Initio. Minimum 4+ years of experience with data manag...

Posted 4 months ago

7.0 - 12.0 years

20 - 35 Lacs

Bengaluru

Work from Office

Job Description: GCP Lead (Google Cloud Platform). Location: Brookefield, Bangalore, India. Department: Software Development. Legal Entity: FGSI. Why Join Fossil Group? At Fossil Group, we are part of an international team that dares to dream, disrupt, and deliver innovative watches, jewelry, and leather goods to the world. We're committed to long-term value creation, driven by technology and our core values: Authenticity, Grit, Curiosity, Humor, and Impact. If you are a forward-thinker who thrives in a diverse, global setting, we want to hear from you. Job Summary: We are seeking a passionate and technically strong GCP Lead to join our Global Technology team at Fossil Group. This role is responsib...

Posted 4 months ago

2.0 - 5.0 years

18 - 21 Lacs

Hyderabad

Work from Office

Overview Annalect is currently seeking a data engineer to join our technology team. In this role you will build Annalect products which sit atop cloud-based data infrastructure. We are looking for people who have a shared passion for technology, design & development, data, and fusing these disciplines together to build cool things. In this role, you will work on one or more software and data products in the Annalect Engineering Team. You will participate in technical architecture, design, and development of software products as well as research and evaluation of new technical solutions. Responsibilities Design, build, test and deploy scalable and reusable systems that handle large amounts of...

Posted 4 months ago
