38 GCP Dataflow Jobs - Page 2

JobPe aggregates results for easy access, but applications are submitted directly on each job portal.

4.0 - 9.0 years

15 - 27 Lacs

Kolkata, Hyderabad, Bengaluru

Hybrid

We have an opportunity for a GCP Data Engineer + Snowflake role in PwC AC. Position: GCP Data Engineer + Snowflake. Experience Required: 4-8 Years. Notice Period: Immediate to 60 Days. Locations: Bangalore, Hyderabad, Kolkata, Chennai, Pune, Gurgaon & Mumbai. Work Mode: Hybrid. Must-Have Skills: 1) GCP 2) GCP Data Services 3) Snowflake data warehousing, including SQL and Snowpipe 4) GCP Dataflow and Cloud Composer. Use GCP services (e.g., Cloud Dataflow, Cloud Composer/Airflow, Pub/Sub) to ingest, transform, and load data into Snowflake. Proficiency in GCP data services (e.g., BigQuery, Cloud Storage, Dataflow, Cloud Composer, Pub/Sub). Strong expertise in Snowflake data warehousing, including SQL, Snowpipe, Streams,...
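As an illustration of the ingest-transform-load flow this posting describes (Pub/Sub messages landed into Snowflake via Dataflow and Snowpipe), here is a minimal, hypothetical Python sketch. The message shape, field names, and staging format are assumptions for illustration, not part of the posting; a real pipeline would use the Apache Beam SDK and Snowflake connectors.

```python
import json

def parse_pubsub_message(raw: bytes) -> dict:
    """Decode a JSON payload as it might arrive from a Pub/Sub subscription."""
    return json.loads(raw.decode("utf-8"))

def transform(record: dict) -> dict:
    """Normalise field names and drop nulls before loading (hypothetical rule)."""
    return {k.lower(): v for k, v in record.items() if v is not None}

def to_stage_rows(records: list) -> list:
    """Serialise records as newline-delimited JSON, a format Snowpipe
    commonly ingests from a cloud-storage stage."""
    return [json.dumps(r, sort_keys=True) for r in records]

raw = b'{"ID": 1, "Name": "alice", "Unused": null}'
rows = to_stage_rows([transform(parse_pubsub_message(raw))])
```

In a real Dataflow job each function above would become a pipeline step, and the staged rows would be picked up by a Snowpipe auto-ingest notification rather than written directly.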

Posted 2 months ago

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

Sabre is a technology company that powers the global travel industry. Leveraging next-generation technology, we create global solutions that address significant opportunities and challenges within the travel sector. Positioned at the heart of the travel industry, we drive innovation by providing advancements that lead to a more connected and seamless ecosystem. Our technology powers various platforms including mobile apps, online travel sites, airline and hotel reservation networks, and more, ultimately connecting people with meaningful experiences. As a global leader in innovative technology within the travel industry, Sabre is constantly seeking talented individuals with a passion for tech...

Posted 2 months ago

6.0 - 10.0 years

30 - 35 Lacs

Bengaluru

Work from Office

We are seeking an experienced PySpark Developer / Data Engineer to design, develop, and optimize big data processing pipelines using Apache Spark and Python (PySpark). The ideal candidate should have expertise in distributed computing, ETL workflows, data lake architectures, and cloud-based big data solutions. Key Responsibilities: Develop and optimize ETL/ELT data pipelines using PySpark on distributed computing platforms (Hadoop, Databricks, EMR, HDInsight). Work with structured and unstructured data to perform data transformation, cleansing, and aggregation. Implement data lake and data warehouse solutions on AWS (S3, Glue, Redshift), Azure (ADLS, Synapse), or GCP (BigQuery, Dataflow). Op...
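The transformation, cleansing, and aggregation steps this posting lists can be sketched in plain Python; the record shape and field names are assumptions for illustration. A real PySpark job would express the same logic with the DataFrame API (filter(), withColumn(), groupBy()), distributed across a cluster.

```python
from collections import defaultdict

def cleanse(rows):
    """Drop rows with missing keys and normalise values - the kind of
    cleansing a PySpark job would do with filter()/withColumn()."""
    return [
        {"region": r["region"].strip(), "amount": float(r["amount"])}
        for r in rows
        if r.get("region") and r.get("amount") is not None
    ]

def aggregate_by_region(rows):
    """Group-and-sum, mirroring df.groupBy('region').sum('amount')."""
    totals = defaultdict(float)
    for r in rows:
        totals[r["region"]] += r["amount"]
    return dict(totals)

raw = [
    {"region": " south ", "amount": "10.5"},
    {"region": "south", "amount": "4.5"},
    {"region": None, "amount": "1.0"},  # dropped by cleansing
]
totals = aggregate_by_region(cleanse(raw))
```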

Posted 2 months ago

0.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job description Some careers shine brighter than others. If you're looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further. HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions. We are currently seeking an experienced professional to...

Posted 2 months ago

12.0 - 15.0 years

0 - 20 Lacs

Noida

Work from Office

Roles and Responsibilities: Design, develop, test, deploy, and maintain large-scale data pipelines using GCP Dataflow. Collaborate with cross-functional teams to gather requirements and design solutions for complex data processing needs. Develop automated testing frameworks to ensure high-quality delivery of data products. Troubleshoot issues related to pipeline failures or errors in a timely manner. Job Requirements: 12-15 years of experience in software development with expertise in data engineering on Google Cloud Platform (GCP). Strong understanding of GCP cloud storage services such as BigQuery, Cloud Storage Bucket, etc. Experience with cloud orchestration tools like Kubernetes Engi...
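The "automated testing frameworks" responsibility above usually comes down to structuring the pipeline as small, pure transform functions that can be asserted on in isolation. The sketch below is a hypothetical plain-Python stand-in (the record shape and step names are assumptions); in a real Dataflow job each function would be a ParDo/Map step tested the same way.

```python
def validate(record: dict) -> bool:
    """Gate malformed events before they reach downstream stages."""
    return isinstance(record.get("user_id"), int) and "event" in record

def enrich(record: dict) -> dict:
    """Attach a derived field, as a ParDo-style step might."""
    return {**record, "event_upper": record["event"].upper()}

def run_pipeline(records):
    """Compose validate -> enrich; each step can be unit-tested alone,
    which is the point of an automated pipeline-testing framework."""
    return [enrich(r) for r in records if validate(r)]

out = run_pipeline([
    {"user_id": 7, "event": "click"},
    {"user_id": "bad", "event": "view"},  # rejected by validate()
])
```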

Posted 3 months ago

10.0 - 18.0 years

0 Lacs

Pune, Maharashtra

On-site

We are looking for a seasoned Senior Data Architect with extensive knowledge in Databricks and Microsoft Fabric to join our team. In this role, you will be responsible for leading the design and implementation of scalable data solutions for BFSI and HLS clients. As a Senior Data Architect specializing in Databricks and Microsoft Fabric, you will play a crucial role in architecting and implementing secure, high-performance data solutions on the Databricks and Azure Fabric platforms. Your responsibilities will include leading discovery workshops, designing end-to-end data pipelines, optimizing workloads for performance and cost efficiency, and ensuring compliance with data governance, security...

Posted 3 months ago

3.0 - 7.0 years

0 Lacs

Maharashtra

On-site

The role of warehousing and logistics systems is becoming increasingly crucial in enhancing the competitiveness of various companies and contributing to the overall efficiency of the global economy. Modern intra-logistics solutions integrate cutting-edge mechatronics, sophisticated software, advanced robotics, computational perception, and AI algorithms to ensure high throughput and streamlined processing for critical commercial logistics functions. Our Warehouse Execution Software is designed to optimize intralogistics and warehouse automation by utilizing advanced optimization techniques. By synchronizing discrete logistics processes, we have created a real-time decision engine that maximi...

Posted 3 months ago

4.0 - 7.0 years

4 - 7 Lacs

Bengaluru, Karnataka, India

On-site

Line of Service: Advisory. Industry/Sector: Not Applicable. Specialism: Data, Analytics & AI. Management Level: Senior Associate. Job Description & Summary: At PwC, our people in data and analytics focus on leveraging data to drive insights and make informed business decisions. They utilize advanced analytics techniques to help clients optimize their operations and achieve their strategic goals. In business intelligence at PwC, you will focus on leveraging data and analytics to provide strategic insights and drive informed decision-making for clients. You will develop and implement innovative solutions to optimize business performance and enhance competitive advantage. Why PwC: At PwC, you will ...

Posted 4 months ago

6.0 - 10.0 years

30 - 35 Lacs

Bengaluru

Work from Office

We are seeking an experienced PySpark Developer / Data Engineer to design, develop, and optimize big data processing pipelines using Apache Spark and Python (PySpark). The ideal candidate should have expertise in distributed computing, ETL workflows, data lake architectures, and cloud-based big data solutions. Key Responsibilities: Develop and optimize ETL/ELT data pipelines using PySpark on distributed computing platforms (Hadoop, Databricks, EMR, HDInsight). Work with structured and unstructured data to perform data transformation, cleansing, and aggregation. Implement data lake and data warehouse solutions on AWS (S3, Glue, Redshift), Azure (ADLS, Synapse), or GCP (BigQuery, Dataflow). Op...

Posted 4 months ago

6.0 - 10.0 years

6 - 10 Lacs

Mumbai, Maharashtra, India

On-site

KEY ACCOUNTABILITIES: 70% of Time - Excellent Technical Work. Design, develop, and optimize data pipelines and ETL/ELT workflows using GCP services (BigQuery, Dataflow, Pub/Sub, Cloud Functions, etc.). Build and maintain data architecture that supports structured and unstructured data from multiple sources. Work closely with statisticians and data scientists to provision clean, transformed datasets for advanced modeling and analytics. Enable self-service BI through efficient data modeling and provisioning in tools like Looker, Power BI, or Tableau. Implement data quality checks, monitoring, and documentation to ensure high data reliability and accuracy. Collaborate with DevOps/Cloud teams to ensure d...
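The "data quality checks" mentioned above typically cover at least null checks on required fields and range checks on numeric values. Here is a minimal, hypothetical sketch of such a check; the field names, bounds, and report shape are assumptions, and production pipelines would usually wire this into monitoring or a framework like Great Expectations.

```python
def quality_report(rows, required, bounds):
    """Count missing required fields and out-of-range numeric values -
    two checks commonly attached to pipeline monitoring."""
    report = {"missing": 0, "out_of_range": 0, "rows": len(rows)}
    for row in rows:
        if any(row.get(f) is None for f in required):
            report["missing"] += 1
        for field, (lo, hi) in bounds.items():
            v = row.get(field)
            if v is not None and not (lo <= v <= hi):
                report["out_of_range"] += 1
    return report

report = quality_report(
    rows=[{"qty": 5}, {"qty": None}, {"qty": 999}],
    required=["qty"],
    bounds={"qty": (0, 100)},  # assumed valid range for illustration
)
```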

Posted 5 months ago

5.0 - 10.0 years

20 - 35 Lacs

Pune, Gurugram

Work from Office

In one sentence: We are seeking a skilled Database Migration Specialist with deep expertise in mainframe modernization and data migration to cloud platforms such as AWS, Azure, or GCP. The ideal candidate will have hands-on experience migrating legacy systems (COBOL, DB2, IMS, VSAM, etc.) to modern cloud-native databases like PostgreSQL, Oracle, or NoSQL. What will your job look like? Lead and execute end-to-end mainframe-to-cloud database migration projects. Analyze legacy systems (z/OS, Unisys) and design modern data architectures. Extract, transform, and load (ETL) complex datasets, ensuring data integrity and taxonomy alignment. Collaborate with cloud architects and application teams to en...

Posted 5 months ago

5.0 - 10.0 years

10 - 20 Lacs

Bengaluru

Remote

Job Description. Job Title: Offshore Data Engineer. Base Location: Bangalore. Work Mode: Remote. Experience: 5+ Years. Job Description: We are looking for a skilled Offshore Data Engineer with strong experience in Python, SQL, and Apache Beam. Familiarity with Java is a plus. The ideal candidate should be self-driven, collaborative, and able to work in a fast-paced environment. Key Responsibilities: Design and implement reusable, scalable ETL frameworks using Apache Beam and GCP Dataflow. Develop robust data ingestion and transformation pipelines using Python and SQL. Integrate Kafka for real-time data streams alongside batch workloads. Optimize pipeline performance and manage costs within GCP...
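A "reusable ETL framework" in the sense this posting describes usually means packaging transforms so pipelines can be assembled from shared steps. The plain-Python sketch below is a hypothetical stand-in for that idea (all names are illustrative); Apache Beam expresses the same pattern with composite PTransforms chained via the | operator.

```python
def compose(*steps):
    """Chain pipeline steps into one callable, loosely mirroring how
    Apache Beam packages transforms into a reusable composite."""
    def pipeline(data):
        for step in steps:
            data = step(data)
        return data
    return pipeline

# Reusable steps such a framework might ship as a shared library.
def normalise(items):
    return [x.lower() for x in items]

def dedupe(items):
    return list(dict.fromkeys(items))  # preserves first-seen order

etl = compose(normalise, dedupe)
result = etl(["Kafka", "kafka", "SQL"])
```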

Posted 5 months ago

3.0 - 8.0 years

0 - 0 Lacs

Hyderabad

Work from Office

Hiring for GCP Cloud Engineer / GCP Data Engineer. We are looking for 3+ years of experience. Skills: Airflow, GCP Cloud, Hadoop, SQL, ETL, Python, BigQuery. We are looking for immediate joiners (15-30 days).

Posted date not available

Page 2 of 2

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
