1594 Data Flow Jobs - Page 11

Set up a Job Alert
JobPe aggregates results for easy access, but you apply directly on the job portal.

2.0 - 7.0 years

4 - 9 Lacs

Karnataka

Work from Office

Description Skills: Proficiency in SQL is a must. PL/SQL to understand the integration SP part. Experience in PostgreSQL is a must. Basic knowledge of Google Cloud Composer (or Apache Airflow); Composer is the managed GCP service for Apache Airflow, and all pipelines are orchestrated and scheduled through Composer. GCP basics: a high-level understanding of the GCP UI and services such as Cloud SQL (PostgreSQL), Cloud Composer, Cloud Storage, and Dataproc. Airflow DAGs are written in Python, so basic knowledge of Python for DAGs is needed. Dataproc is managed Spark in GCP, so a bit of PySpark knowledge is also nice to have. Named Job Posting? (if Yes - needs to be approved by SCSC) Additional Details Global Grade C Level To Be Def...
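The posting's note that all pipelines are orchestrated and scheduled through Composer boils down to dependency-ordered task execution of a DAG. A minimal pure-Python sketch of that idea, using the standard library rather than the Airflow API, with hypothetical task names:

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline tasks mapped to their upstream dependencies,
# mirroring the kind of DAG a Composer/Airflow pipeline declares.
deps = {
    "extract_postgres": set(),
    "transform_dataproc": {"extract_postgres"},
    "load_cloud_storage": {"transform_dataproc"},
}

# A scheduler runs tasks in an order where every dependency finishes first.
order = list(TopologicalSorter(deps).static_order())
print(order)  # ['extract_postgres', 'transform_dataproc', 'load_cloud_storage']
```

A real Composer DAG expresses the same dependency chain with Airflow operators; the scheduling principle is identical.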

Posted 2 weeks ago

AI Match Score
Apply

4.0 - 8.0 years

10 - 15 Lacs

Pune

Work from Office

Job Description: Job Title - GCP - Senior Engineer - PD Location - Pune Role Description Our team is part of the area Technology, Data, and Innovation (TDI) Private Bank. Within TDI, Partner data is the central client reference data system in Germany. As a core banking system, many banking processes and applications are integrated with it and communicate via >2k interfaces. From a technical perspective, we focus on mainframe but also build on-premise and cloud solutions, RESTful services, and an Angular frontend. Next to maintenance and the implementation of new CTB requirements, the content focus also lies on the regulatory and tax topics surrounding a partner/client. We are looking for a very mot...

Posted 2 weeks ago

AI Match Score
Apply

5.0 - 10.0 years

35 - 40 Lacs

Bengaluru

Work from Office

About The Role: Job Title: Cloud Data and AI Engineer, AVP Location: Bangalore, India Role Description We are seeking an experienced Cloud (GCP) Data and AI engineer to play a key role in the development and implementation of AI and data engineering solutions. The candidate will work as an engineering team member in collaboration with the business function and will be responsible for building and deploying robust and scalable AI and data engineering solutions, in line with the bank's AI strategy, driven by business value to improve operational efficiency and enhance customer experience. Deutsche Bank's Corporate Bank division is a leading provider of cash management, trade finance and securities finance. We comp...

Posted 2 weeks ago

AI Match Score
Apply

4.0 - 8.0 years

13 - 17 Lacs

Bengaluru

Work from Office

For one of our clients in the Insurance Segment, we are looking for a hybrid Software Developer with expertise in either Core Java or Mainframe technologies (COBOL, JCL, DB2), combined with experience in data integration using Informatica IDMC. This role will involve both application development and building/maintaining data integration workflows across enterprise systems. Responsibilities Application Development (60-70%): Design, develop, test, and maintain applications using either: Java (Java 8+, JDBC, multithreading, Spring Boot), OR Mainframe technologies (COBOL, JCL, DB2). Implement business logic, batch processes, and integration routines. Participate in system design discussions, cod...

Posted 2 weeks ago

AI Match Score
Apply

3.0 - 6.0 years

6 - 10 Lacs

Noida

Work from Office

We are seeking a highly skilled and experienced Senior Data Analyst to join our dynamic analytics team. The ideal candidate will have a strong background in data analysis, business intelligence, and cloud data platforms, with hands-on experience in Snowflake, Sigma BI, and SQL. You will play a key role in transforming data into actionable insights that drive strategic decision-making across the organization. Key Responsibilities: Design, develop, and maintain dashboards and reports using Sigma BI and other visualization tools. Extract, transform, and load (ETL) data from various sources into Snowflake data warehouse. Collaborate with cross-functional teams to understand business requiremen...
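The extract-transform-load responsibility described above can be sketched in pure Python. This is an illustrative toy, not the role's actual stack: sqlite3 stands in for the Snowflake warehouse, and the table and field names are hypothetical.

```python
import sqlite3

# Extract: raw source rows (in practice pulled from upstream systems).
raw_orders = [
    {"id": 1, "amount": "120.50", "region": " north "},
    {"id": 2, "amount": "80.00", "region": "South"},
]

# Transform: normalise types and clean string fields.
cleaned = [
    (row["id"], float(row["amount"]), row["region"].strip().lower())
    for row in raw_orders
]

# Load: write into the warehouse (an in-memory sqlite3 database here).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, amount REAL, region TEXT)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", cleaned)

total = conn.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
print(total)  # 200.5
```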

Posted 2 weeks ago

AI Match Score
Apply

5.0 - 10.0 years

20 - 35 Lacs

Gurugram

Remote

Role: GCP Specialist Openings: 20 Location: Pan India (Remote) Experience Required: 5-6 years of relevant experience in GCP projects About the Role We are hiring experienced GCP Specialists to join our cloud data engineering team working on cutting-edge global projects. The ideal candidate will have strong expertise in Google Cloud Platform (GCP), with experience in DataProc, BigQuery, Dataflow, and Cloud Composer. If you're passionate about cloud data engineering and eager to make an impact with a leading organization, we'd love to connect! Key Skills Must Have: GCP (DataProc, BigQuery, Dataflow, Cloud Composer) Good to Have: AWS Cloud exposure Experience building and optimizing cloud data pipelines a...

Posted 2 weeks ago

AI Match Score
Apply

5.0 - 8.0 years

15 - 25 Lacs

Pune, Bengaluru

Hybrid

Overview of 66degrees 66degrees is a leading consulting and professional services company specializing in developing AI-focused, data-led solutions leveraging the latest advancements in cloud technology. With our unmatched engineering capabilities and vast industry experience, we help the world's leading brands transform their business challenges into opportunities and shape the future of work. At 66degrees, we believe in embracing the challenge and winning together. These values guide us not only in achieving our goals as a company but also in how we support our people. We are dedicated to creating a significant impact for our employees by fostering a culture that sparks innovation and supports professio...

Posted 2 weeks ago

AI Match Score
Apply


6.0 - 9.0 years

27 - 42 Lacs

Chennai

Work from Office

Job Description
Role: Looker + GCP
Experience: 6 to 9 years
Summary: We are seeking a skilled Looker Developer with strong experience in Google Cloud Platform (GCP) to design, build, and optimize scalable data models, dashboards, and analytics solutions. The ideal candidate should be proficient in LookML, data visualization, and GCP data services such as BigQuery, Cloud Storage, and Dataflow.
Key Responsibilities:
• Develop and maintain Looker dashboards, Looks, and Explores to provide business insights.
• Create and optimize LookML models, views, and derived tables.
• Collaborate with business and data engineering teams to understand reporting needs and translate them into scalable BI solu...

Posted 2 weeks ago

AI Match Score
Apply

6.0 - 11.0 years

6 - 10 Lacs

Bengaluru

Work from Office

a. 2+ years of experience in big data technologies like PySpark, Hadoop, Trino, Druid
b. Strong experience in query optimisation in Trino/PySpark
c. Strong hands-on experience in Airflow/scheduler
d. Expertise in Python
About the Role: Hands-on Data Engineers to build and maintain scalable data solutions and services. The role includes:
a. Maintain and develop data engineering pipelines to ensure seamless data flow for BI applications
b. Create data models to ensure a seamless query system
c. Develop or onboard open-source tools to keep the data platform up to date
d. Optimize queries and scripts over large-scale datasets (TBs) with a focus on performance and cost-efficiency
e. Implement data gover...
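Optimizing queries over large datasets, as this role asks, often comes down to pushing filters down so expensive operations (joins, shuffles) touch fewer rows. A toy pure-Python illustration of the idea; the tables, fields, and data are all hypothetical:

```python
# Two toy "tables": 1000 events and 100 users (hypothetical data).
events = [{"user_id": i % 100, "type": "click" if i % 10 else "buy"}
          for i in range(1000)]
users = {i: {"country": "IN" if i % 2 else "US"} for i in range(100)}

# Unoptimised: join every event to its user, then filter the joined rows.
joined = [(e, users[e["user_id"]]) for e in events]
buys_naive = [e for e, u in joined
              if e["type"] == "buy" and u["country"] == "US"]

# Optimised: push the cheap predicate down, then join only what survives.
buys = [e for e in events if e["type"] == "buy"]  # 100 rows instead of 1000
buys_opt = [e for e in buys if users[e["user_id"]]["country"] == "US"]

assert buys_naive == buys_opt  # same answer, far less join work
```

Engines like Trino and Spark apply the same principle automatically (predicate pushdown), but query authors still control whether a filter can be pushed.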

Posted 2 weeks ago

AI Match Score
Apply

5.0 - 10.0 years

20 - 32 Lacs

Chennai

Hybrid

• Strong proficiency in GCP services such as Dataflow, BigQuery, and Pub/Sub.
• Hands-on experience with Terraform for provisioning and managing GCP infrastructure.
• Proficiency in SQL and Python and an understanding of ETL processes.

Posted 2 weeks ago

AI Match Score
Apply

5.0 - 10.0 years

20 - 32 Lacs

Mumbai

Hybrid

• Strong proficiency in GCP services such as Dataflow, BigQuery, and Pub/Sub.
• Hands-on experience with Terraform for provisioning and managing GCP infrastructure.
• Proficiency in SQL and Python and an understanding of ETL processes.

Posted 2 weeks ago

AI Match Score
Apply

5.0 - 10.0 years

20 - 32 Lacs

Bengaluru

Hybrid

• Strong proficiency in GCP services such as Dataflow, BigQuery, and Pub/Sub.
• Hands-on experience with Terraform for provisioning and managing GCP infrastructure.
• Proficiency in SQL and Python and an understanding of ETL processes.

Posted 2 weeks ago

AI Match Score
Apply

5.0 - 10.0 years

20 - 32 Lacs

Chennai

Work from Office

• Strong proficiency in GCP services such as Dataflow, BigQuery, and Pub/Sub.
• Hands-on experience with Terraform for provisioning and managing GCP infrastructure.
• Proficiency in SQL and Python for data manipulation.

Posted 2 weeks ago

AI Match Score
Apply

5.0 - 10.0 years

20 - 32 Lacs

Pune

Work from Office

• Strong proficiency in GCP services such as Dataflow, BigQuery, and Pub/Sub.
• Hands-on experience with Terraform for provisioning and managing GCP infrastructure.
• Proficiency in SQL and Python for data manipulation.

Posted 2 weeks ago

AI Match Score
Apply

5.0 - 10.0 years

20 - 32 Lacs

Hyderabad

Work from Office

• Strong proficiency in GCP services such as Dataflow, BigQuery, and Pub/Sub.
• Hands-on experience with Terraform for provisioning and managing GCP infrastructure.
• Proficiency in SQL and Python for data manipulation.

Posted 2 weeks ago

AI Match Score
Apply

5.0 - 10.0 years

17 - 27 Lacs

Chennai

Work from Office

• Strong proficiency in GCP services such as Dataflow, BigQuery, and Pub/Sub.
• Hands-on experience with Terraform for provisioning and managing GCP infrastructure.
• Proficiency in SQL and Python for data manipulation and analysis.

Posted 2 weeks ago

AI Match Score
Apply

5.0 - 10.0 years

17 - 27 Lacs

Bengaluru

Work from Office

• Strong proficiency in GCP services such as Dataflow, BigQuery, and Pub/Sub.
• Hands-on experience with Terraform for provisioning and managing GCP infrastructure.
• Proficiency in SQL and Python for data manipulation and analysis.

Posted 2 weeks ago

AI Match Score
Apply

3.0 - 7.0 years

15 - 25 Lacs

Chennai

Work from Office

Mandatory Skills: GCP, BigQuery, Dataflow, Dataplex, Pub/Sub, Python & SQL

Posted 2 weeks ago

AI Match Score
Apply

3.0 - 7.0 years

15 - 25 Lacs

Pune

Work from Office

Mandatory Skills: GCP, BigQuery, Dataflow, Dataplex, Pub/Sub, Python & SQL

Posted 2 weeks ago

AI Match Score
Apply

3.0 - 7.0 years

15 - 25 Lacs

Bengaluru

Work from Office

Mandatory Skills: GCP, BigQuery, Dataflow, Dataplex, Pub/Sub, Python & SQL

Posted 2 weeks ago

AI Match Score
Apply

7.0 - 12.0 years

20 - 35 Lacs

Pune

Work from Office

Design, build, and maintain efficient, reusable, and reliable data pipelines using GCP services. Develop batch and streaming ETL processes using PySpark and BigQuery. Write clean and efficient code in Python for data ingestion and transformation.
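"Clean and efficient Python for data ingestion and transformation" typically means small, composable, streaming-friendly functions. A minimal pure-Python sketch under that reading; the record shape and cleaning rules are hypothetical, and a real pipeline would express the same steps in PySpark against BigQuery:

```python
from typing import Iterable, Iterator

def ingest(lines: Iterable[str]) -> Iterator[dict]:
    """Parse raw comma-separated lines lazily, skipping malformed rows."""
    for line in lines:
        parts = line.strip().split(",")
        if len(parts) != 2:
            continue  # skip malformed rows instead of failing the whole batch
        yield {"name": parts[0], "value": parts[1]}

def transform(records: Iterable[dict]) -> Iterator[dict]:
    """Normalise field types; drop rows with non-numeric values."""
    for rec in records:
        try:
            yield {"name": rec["name"].title(), "value": int(rec["value"])}
        except ValueError:
            continue

# Generators compose into a lazy pipeline: nothing is held in memory at once.
raw = ["alice,10", "bob,20", "broken-row", "carol,not-a-number"]
result = list(transform(ingest(raw)))
print(result)  # [{'name': 'Alice', 'value': 10}, {'name': 'Bob', 'value': 20}]
```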

Posted 2 weeks ago

AI Match Score
Apply

7.0 - 12.0 years

20 - 35 Lacs

Bengaluru

Work from Office

Design, build, and maintain efficient, reusable, and reliable data pipelines using GCP services. Develop batch and streaming ETL processes using PySpark and BigQuery. Write clean and efficient code in Python for data ingestion and transformation.

Posted 2 weeks ago

AI Match Score
Apply

7.0 - 12.0 years

20 - 35 Lacs

Hyderabad

Work from Office

Design, build, and maintain efficient, reusable, and reliable data pipelines using GCP services. Develop batch and streaming ETL processes using PySpark and BigQuery. Write clean and efficient code in Python for data ingestion and transformation.

Posted 2 weeks ago

AI Match Score
Apply

3.0 - 8.0 years

6 - 10 Lacs

Mumbai, Pune, Chennai

Work from Office

We're seeking a talented Data Engineer with hands-on experience in the Google Cloud data ecosystem and a proven track record of working with vector databases and knowledge graphs like Neo4j and StarDog. You'll be instrumental in designing, building, and maintaining our data infrastructure and pipelines, enabling critical insights and supporting data-driven initiatives across the organization. Responsibilities Data Pipeline Development: Design, build, and optimize robust and scalable data pipelines to ingest, transform, and load data from various sources into our data warehouse and knowledge graphs. Cloud Data Stack Expertise: Implement and manage data solutions using Google Cloud Platform (GCP...
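The knowledge-graph side of this role (Neo4j, StarDog) centres on traversing typed relationships between entities. A minimal pure-Python sketch of the idea using an adjacency list; the entities and edge types are hypothetical, and a real system would query this with Cypher or SPARQL:

```python
from collections import deque

# Hypothetical knowledge graph: entity -> list of (relation, target) edges.
graph = {
    "Acme Corp": [("SUBSIDIARY_OF", "Acme Holdings"), ("LOCATED_IN", "Pune")],
    "Acme Holdings": [("LOCATED_IN", "Mumbai")],
    "Pune": [],
    "Mumbai": [],
}

def reachable(start: str) -> set[str]:
    """Breadth-first traversal over all outgoing relationships."""
    seen, queue = {start}, deque([start])
    while queue:
        node = queue.popleft()
        for _relation, target in graph.get(node, []):
            if target not in seen:
                seen.add(target)
                queue.append(target)
    return seen - {start}

print(sorted(reachable("Acme Corp")))  # ['Acme Holdings', 'Mumbai', 'Pune']
```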

Posted 2 weeks ago

AI Match Score
Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
