73 Apache Beam Jobs

Set up a job alert
JobPe aggregates results for easy application access, but you apply on the job portal directly.

6.0 - 10.0 years

12 - 18 Lacs

Hyderabad

Hybrid

Role & Responsibilities
Role Overview: We are seeking a talented and forward-thinking Data Engineer for a large financial services GCC based in Hyderabad, with responsibilities that include designing and constructing data pipelines, integrating data from multiple sources, developing scalable data solutions, optimizing data workflows, collaborating with cross-functional teams, implementing data governance practices, and ensuring data security and compliance.
Technical Requirements:
- Proficiency in ETL, batch, and streaming processing
- Experience with BigQuery, Cloud Storage, and Cloud SQL
- Strong programming skills in Python, SQL, and Apache Beam for data processing
- Understanding of data mod...

Posted 22 hours ago


3.0 - 5.0 years

10 - 15 Lacs

Bengaluru

Hybrid

Must-Have Skills: Python; Apache Beam development, with templates to be built from scratch; knowledge of Cloud Composer and SQL.
Good-to-Have Skills: Healthcare projects and the GCP Healthcare API.

Posted 2 days ago


6.0 - 10.0 years

20 - 25 Lacs

Hyderabad, Chennai

Hybrid

Role & Responsibilities:
- Design, build, and maintain data pipelines on Google Cloud Platform using Dataflow (Apache Beam).
- Develop and manage orchestration workflows using Cloud Composer (Airflow).
- Ingest, transform, and process large-scale data with a focus on performance, scalability, and compliance.
- Collaborate with business analysts and healthcare SMEs to understand workflows and translate them into data solutions.
- Optimize data pipelines for cost efficiency, performance, and scalability.
- Ensure data quality, lineage, and governance across claims datasets.
- Integrate structured and unstructured data sources into data lakes/warehouses.
- Implement data security and HIPAA compliance standar...

Posted 1 week ago


3.0 - 7.0 years

0 Lacs

Karnataka

On-site

Role Overview: You will be responsible for building end-to-end data applications by integrating backend APIs with analytical front-ends. With over 3 years of IT experience, you are expected to have a good understanding of analytics tools for effective data analysis. Your ability to learn new tools and technologies will be crucial for this role. You should have prior experience working with at least one structured database (SQL/Oracle/Postgres) and one NoSQL database. A strong understanding of Data Warehouse (DW), Data Mart, and Data Modelling concepts is essential, and you should have been part of a Data Warehouse design team in at least one project.
Key Responsibilities:
- Develop high performance a...

Posted 1 week ago


2.0 - 6.0 years

0 Lacs

Karnataka

On-site

As an experienced IT professional with over 2 years of experience, you should have a good understanding of analytics tools to analyze data effectively, as well as the ability to learn new tools and technologies. Your previous work experience should include at least one structured database (such as SQL, Oracle, or Postgres) and one NoSQL database. It is essential to have a strong understanding of Data Warehouse (DW), Data Mart, and Data Modelling concepts, and you should have been part of a Data Warehouse design team in at least one project.
Key Responsibilities:
- Be aware of design best practices for OLTP and OLAP systems
- Have exposure to load testing methodologies...

Posted 1 week ago


5.0 - 9.0 years

20 - 30 Lacs

Pune

Hybrid

- Design, develop & maintain data pipelines using GCP services: Dataflow, BigQuery, and Pub/Sub
- Provision infrastructure on GCP as IaC with Terraform
- Implement & manage data warehouse solutions
- Monitor and resolve issues in data workflows

Required Candidate Profile:
- Expertise in GCP, Apache Beam, Dataflow & BigQuery
- Proficient in Python, SQL, and PySpark
- Worked with Cloud Composer for orchestration
- Solid understanding of DWH, ETL pipelines, and real-time data streaming
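The real-time streaming requirement in this posting usually comes down to windowed aggregation over an unbounded event stream. As an illustrative sketch only (plain Python standing in for what Beam's fixed windows do on Dataflow; the 60-second window size and event fields are assumptions, not from the posting), here is the core logic:

```python
# Fixed-window aggregation sketch: the batch equivalent of what a
# Dataflow/Beam streaming job does with 60-second fixed windows.
from collections import defaultdict

WINDOW_SECONDS = 60


def window_start(event_time: int) -> int:
    """Map an event timestamp (seconds) to the start of its fixed window."""
    return event_time - (event_time % WINDOW_SECONDS)


def aggregate(events):
    """Group (event_time, key, value) events into per-window, per-key sums."""
    totals = defaultdict(int)
    for event_time, key, value in events:
        totals[(window_start(event_time), key)] += value
    return dict(totals)


events = [(5, "pageviews", 1), (42, "pageviews", 3), (65, "pageviews", 2)]
print(aggregate(events))  # events at t=5 and t=42 share the [0, 60) window
```

In a real streaming pipeline the stream never ends, so a runner such as Dataflow also needs watermarks and triggers to decide when a window's result can be emitted; this sketch omits that entirely.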

Posted 2 weeks ago


5.0 - 9.0 years

20 - 30 Lacs

Bengaluru

Hybrid

- Design, develop & maintain data pipelines using GCP services: Dataflow, BigQuery, and Pub/Sub
- Provision infrastructure on GCP as IaC with Terraform
- Implement & manage data warehouse solutions
- Monitor and resolve issues in data workflows

Required Candidate Profile:
- Expertise in GCP, Apache Beam, Dataflow & BigQuery
- Proficient in Python, SQL, and PySpark
- Worked with Cloud Composer for orchestration
- Solid understanding of DWH, ETL pipelines, and real-time data streaming

Posted 2 weeks ago


5.0 - 9.0 years

20 - 30 Lacs

Hyderabad

Hybrid

- Design, develop & maintain data pipelines using GCP services: Dataflow, BigQuery, and Pub/Sub
- Provision infrastructure on GCP as IaC with Terraform
- Implement & manage data warehouse solutions
- Monitor and resolve issues in data workflows

Required Candidate Profile:
- Expertise in GCP, Apache Beam, Dataflow & BigQuery
- Proficient in Python, SQL, and PySpark
- Worked with Cloud Composer for orchestration
- Solid understanding of DWH, ETL pipelines, and real-time data streaming

Posted 2 weeks ago


3.0 - 8.0 years

10 - 20 Lacs

Bengaluru

Work from Office

Job Title: GCP Data Engineer
Work Mode: Onsite
Base Location: Bangalore
Experience Required: 3+ Years
Employment Type: Direct Payroll with Client
Job Summary: We are seeking a skilled GCP Data Engineer with strong expertise in Python and hands-on experience designing and developing scalable data pipelines on Google Cloud Platform (GCP). The ideal candidate should have a solid understanding of BigQuery, ETL/ELT processes, and data modeling, along with exposure to core GCP data services such as Dataflow, Pub/Sub, and Dataproc.
Key Responsibilities:
- Design, develop, and maintain scalable data pipelines using Python, BigQuery, and GCP Dataflow.
- Implement efficient data models, partitio...

Posted 2 weeks ago


5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

As a GCP Data Engineer, you will be responsible for designing and implementing solutions on Google Cloud Platform (GCP) utilizing various GCP components.
Key Responsibilities:
- Implementing and architecting solutions on GCP using components such as BigQuery, SQL, Cloud Composer/Python, Cloud Functions, Dataproc with PySpark, Python Injection, and Dataflow with Pub/Sub.
- Experience with Apache Beam, Google Dataflow, and Apache Spark in creating end-to-end data pipelines.
- Proficiency in Python, Hadoop, Spark, SQL, BigQuery, BigTable, Cloud Storage, Datastore, Spanner, Cloud SQL, and Machine Learning.
- Programming expertise in Java, Python, and other relevant technologies.
- Certified in Googl...

Posted 2 weeks ago


3.0 - 7.0 years

0 Lacs

Hyderabad, Telangana

On-site

Role Overview: At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of you. We are counting on your unique voice and perspective to help EY become even better. Join us to build an exceptional experience for yourself and contribute to creating a better working world for all.
Key Responsibilities:
- Develop, deploy, and monitor machine learning models in production environments.
- Automate ML pipelines for model training, validation, and deployment.
- Optimize ML model performance, scalability, and cost efficiency.
- Implement CI/CD workflows for ML model versioning, testing, and deplo...

Posted 2 weeks ago


6.0 - 10.0 years

0 Lacs

Karnataka

On-site

Role Overview: As a GCP BigQuery Engineer at Tredence, you will be responsible for driving the IT strategy to create value across the organization. Your role involves implementing both low-level, innovative solutions and day-to-day tactics to enhance efficiency, effectiveness, and value. Your analytical skills will be key in providing critical content for decision-making and in successful collaboration with business stakeholders.
Key Responsibilities:
- Implement and architect solutions on the Google Cloud Platform (GCP) using components of GCP
- Utilize Apache Beam, Google Dataflow, and Apache Spark to create end-to-end data pipelines
- Work with technologies such as Python, Hadoop, Spark, SQL...

Posted 3 weeks ago


4.0 - 9.0 years

0 Lacs

Hyderabad, Telangana

On-site

As a GCP Data Engineer, your role will involve the following key responsibilities and qualifications:
Role Overview:
- You should have experience in Java and/or Kotlin
- Hands-on experience with GCP is required
- Proficiency in PostgreSQL for database management and Dataflow for data processing
- Familiarity with GraphQL for API development
- Experience with GKE, Kubernetes, and Docker for managing runtime environments
- Knowledge of Confluent Kafka and Schema Registry
- Experience with Apache Beam is preferable
- Previous exposure to Data Engineering and the Retail industry
- Ability to work on data pipelines, processing, design, and development
- A DevOps mindset would be considered a plu...

Posted 3 weeks ago


5.0 - 10.0 years

19 - 22 Lacs

Bengaluru

Work from Office

Job Summary: We are seeking an experienced GCP Data Engineer to join the Data Team at Equinix as a key member, leading end-to-end development of complex Data Engineering use cases and driving the evolution of Equinix's Data Lake platform. You will design and build enterprise-scale data infrastructure and analytics solutions on Google Cloud Platform while providing technical mentorship to the data engineering team. The ideal candidate combines deep technical expertise in cloud-native data technologies with proven leadership skills and a passion for building robust, scalable data platforms that drive strategic business insights.
Responsibilities: Participate in design and implementation of enterpris...

Posted 3 weeks ago


5.0 - 9.0 years

0 Lacs

Delhi

On-site

As an Engineer with 5-8 years of experience, your role will involve working with competencies like Apache Beam, GCP Cloud, and OpenShift. You should have hands-on experience in Java and Kafka Streaming, with knowledge of GCP and Kubernetes being a plus. Your responsibilities will include developing and deploying data- and analytics-led solutions on GCP, as well as designing highly available and scalable systems. You will work on Data Engineering solutions using Cloud BigQuery, Cloud Dataflow, Cloud BigTable, Cloud Storage, Cloud Spanner, and Cloud IAM. It is essential to have a good understanding of Apache Kafka and proficiency in cloud-based ETL/data orchestration tools like Apache Beam and ...

Posted 1 month ago


3.0 - 8.0 years

0 Lacs

Coimbatore, Tamil Nadu

On-site

Role Overview: At Techjays, we are on a mission to empower businesses worldwide by building AI solutions that drive industry transformation. We are seeking a skilled Data Analytics Engineer with 3 to 8 years of experience to join our global team in Coimbatore. As a Data Analytics Engineer at Techjays, you will play a crucial role in designing and delivering scalable, data-driven solutions that support real-time decision-making and deep business insights. Your primary focus will be on developing and maintaining ETL/ELT data pipelines, collaborating with various teams, designing interactive dashboards, and ensuring high reliability of data pipelines. If you are passionate about using tools lik...

Posted 1 month ago


3.0 - 7.0 years

0 Lacs

Punjab

On-site

You will be responsible for the following tasks:
- Experience in GCP.
- Data migration experience from legacy systems, including SQL and Oracle.
- Experience with data lake and data warehouse ETL pipeline build and design on GCP.
- GCP data and analytics services like Cloud Dataflow, Cloud Dataprep, Apache Beam/Cloud Composer, Cloud BigQuery, Cloud Fusion, Cloud Pub/Sub, Cloud Storage & Cloud Functions.
- Using cloud-native GCP CLI/gsutil along with scripting in Python & SQL.
- Experience with Data Governance, Metadata Management, and Data Masking & Encryption using GCP tools like Cloud Data Catalog and GCP KMS.
No additional details about the company are mentioned in the job description.

Posted 1 month ago


5.0 - 8.0 years

15 - 25 Lacs

Pune, Chennai, Delhi / NCR

Work from Office

Location: Chennai / Hyderabad / Kolkata / Delhi / Pune / Bengaluru
Engineer with competencies like Apache Beam, GCP Cloud, and OpenShift, plus experience in Java and Kafka Streaming, GCP, and Kubernetes. Develop and deploy data- and analytics-led solutions on GCP.

Posted 1 month ago


5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

**Job Description**
**Role Overview:** You will be joining the Commercial Bank Tribe, focusing on the special needs of small and medium enterprise clients in Germany, a designated area for further growth and investment within Corporate Bank. Your primary responsibility will involve the digital transformation of 800,000 clients in 3 brands through the establishment of the BizBanking platform, including the development of digital sales and service processes and automation of processes for this client segment. The tribe is on a journey of extensive digitalization of business processes and migration of applications to the cloud. You will work collaboratively in an agile setup with business colle...

Posted 1 month ago


3.0 - 7.0 years

0 Lacs

Maharashtra

On-site

Role Overview: Arista Networks, a leading industry expert in data-driven networking solutions, is looking to grow the Wi-Fi Team at the Pune Development Center. As a Software Engineer at Arista, you will collaborate with the Wi-Fi Data team within the broader Software Engineering team. Your role will be essential in enhancing Arista's Cognitive Wi-Fi solution by managing cloud-based data effectively. This is a great opportunity to be a part of a small and innovative team within Arista.
Key Responsibilities:
- Collaborate closely with Data Scientists to develop and maintain data and AI/ML pipelines at scale.
- Perform tasks such as anomaly detection, root cause analysis, automatic remediation...

Posted 1 month ago


5.0 - 8.0 years

17 - 18 Lacs

Pune

Work from Office

Hi all, we are hiring for the role of GCP Data Engineer.
Experience: 5+ years
Location: Pune
Notice Period: Immediate to 15 days
Budget: 18 LPA only
Skills: Ideally you will have a background in software engineering, along with experience working in a complex cloud environment (Google Cloud Platform) and experience working with Python and SQL. You have experience building data pipelines and ETL frameworks, both batch and real-time, using Python and any of the GCP capabilities such as Apache Beam, Dataflow, and Data Fusion. You have experience using Terraform to build infrastructure as code, and experience working with big data technologies (Spark, Cloud SQL, BigQuery), preferably i...

Posted 1 month ago


2.0 - 6.0 years

0 Lacs

Kerala

On-site

As an application developer, you will be responsible for various tasks including unit testing, code deployment, and technical documentation. Your contributions will involve working on projects, estimating tasks, diagnosing performance issues, and ensuring code and processes are well documented for ease of understanding by other developers. You will be developing high-scale applications from the backend to the UI layer with a focus on operational excellence, security, and scalability. Additionally, you will apply modern software development practices and collaborate across teams to integrate systems.
Key Responsibilities:
- Perform general application development activities such as unit testing, code...

Posted 1 month ago


5.0 - 10.0 years

0 Lacs

Karnataka

On-site

As a Machine Learning Engineer in this role, you will work closely with customers, product teams, and engineering on the following key responsibilities:
- Onboard new clients and configure solutions to meet their data and business requirements.
- Validate data quality and integrity to ensure accurate results.
- Deploy and monitor machine learning models in production environments.
- Execute existing ML pipelines for training new models and evaluating their quality.
- Interpret model performance metrics and provide valuable insights to customers and internal teams.
- Clearly communicate technical concepts to non-technical stakeholders.
- Offer actionable feedback...

Posted 1 month ago


5.0 - 10.0 years

0 - 2 Lacs

Bengaluru

Hybrid

Experience in GCP, Airflow, Dataflow, and Apache Beam is mandatory. Implementation of ETL pipelines using Dataflow and Apache Beam; Docker, Jenkins, SonarQube, JIRA, Nexus, Confluence, Git, Bitbucket, Maven, Gradle, and Rundeck. API development using Java microservices and Spring Boot.

Posted 1 month ago


3.0 - 7.0 years

0 Lacs

Karnataka

On-site

You will have the opportunity to work at Capgemini, a company that empowers you to shape your career according to your preferences. You will be part of a collaborative community of colleagues worldwide, where you can reimagine what is achievable and contribute to unlocking the value of technology for leading organizations to build a more sustainable and inclusive world.
Your Role:
- You should have a very good understanding of the current work, tools, and technologies being used.
- Comprehensive knowledge and clarity on BigQuery, ETL, GCS, Airflow/Composer, SQL, and Python are required.
- Experience with Fact and Dimension tables and SCD is necessary.
- Minimum 3 years of experience in GCP Data Enginee...
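The SCD requirement in this posting typically means Slowly Changing Dimension Type 2: preserve history by closing the current dimension row and inserting a new one whenever a tracked attribute changes. A minimal pure-Python sketch, assuming a simple in-memory row shape invented for the example (in BigQuery this would usually be a MERGE statement instead):

```python
# SCD Type 2 sketch (hypothetical row shape): expire the current row for a
# key and append a new current row when the attributes change.
from datetime import date


def apply_scd2(dimension, key, new_attrs, today):
    """Close the current row for `key` if attrs changed, then append a new one."""
    for row in dimension:
        if row["key"] == key and row["current"]:
            if row["attrs"] == new_attrs:
                return dimension  # no change: keep the current row as-is
            row["current"] = False       # expire the old version
            row["end_date"] = today
    dimension.append({"key": key, "attrs": new_attrs,
                      "start_date": today, "end_date": None, "current": True})
    return dimension


dim = [{"key": "C1", "attrs": {"city": "Pune"},
        "start_date": date(2023, 1, 1), "end_date": None, "current": True}]
apply_scd2(dim, "C1", {"city": "Hyderabad"}, date(2024, 6, 1))
print(len(dim), dim[0]["current"], dim[1]["current"])  # 2 False True
```

The same two-step pattern (UPDATE to expire, INSERT to add the new version) is what a warehouse-side SCD2 implementation performs, keyed on a surrogate key plus effective-date columns.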

Posted 1 month ago

Page 1 of 3