
31 Apache Beam Jobs - Page 2

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

5.0 - 10.0 years

10 - 20 Lacs

Bengaluru

Remote

Job Description

Job Title: Offshore Data Engineer
Base Location: Bangalore
Work Mode: Remote
Experience: 5+ Years

We are looking for a skilled Offshore Data Engineer with strong experience in Python, SQL, and Apache Beam. Familiarity with Java is a plus. The ideal candidate should be self-driven, collaborative, and able to work in a fast-paced environment.

Key Responsibilities:
• Design and implement reusable, scalable ETL frameworks using Apache Beam and GCP Dataflow.
• Develop robust data ingestion and transformation pipelines using Python and SQL.
• Integrate Kafka for real-time data streams alongside batch workloads.
• Optimize pipeline performance and manage costs within GCP services.
• Work closely with data analysts, data architects, and product teams to gather and understand data requirements.
• Manage and monitor BigQuery datasets, tables, and partitioning strategies.
• Implement error handling, resiliency, and observability mechanisms across pipeline components.
• Collaborate with DevOps teams to enable automated delivery (CI/CD) for data pipeline components.

Required Skills:
• 5+ years of hands-on experience in Data Engineering or Software Engineering.
• Proficiency in Python and SQL.
• Good understanding of Java (for reading or modifying codebases).
• Experience building ETL pipelines with Apache Beam and Google Cloud Dataflow.
• Hands-on experience with Apache Kafka for stream processing.
• Solid understanding of BigQuery and data modeling on GCP.
• Experience with GCP services (Cloud Storage, Pub/Sub, Cloud Composer, etc.).

Good to Have:
• Experience building reusable ETL libraries or framework components.
• Knowledge of data governance, data quality checks, and pipeline observability.
• Familiarity with Apache Airflow or Cloud Composer for orchestration.
• Exposure to CI/CD practices in a cloud-native environment (Docker, Terraform, etc.).

Tech stack: Python, SQL, Java, GCP (BigQuery, Pub/Sub, Cloud Storage, Cloud Composer, Dataflow), Apache Beam, Apache Kafka, Apache Airflow, CI/CD (Docker, Terraform)
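For a sense of what an ETL pipeline on Apache Beam and GCP Dataflow looks like in practice, here is a minimal sketch of a batch pipeline in Beam's Python SDK. The project, bucket, dataset, and schema names are hypothetical placeholders, not part of this listing.

    # Minimal Apache Beam batch sketch: read CSV from Cloud Storage,
    # parse rows, and append them to BigQuery via the Dataflow runner.
    # All resource names below are hypothetical placeholders.
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    def parse_row(line):
        # One CSV line -> dict matching the BigQuery schema below.
        user_id, amount = line.split(",")
        return {"user_id": user_id, "amount": float(amount)}

    options = PipelineOptions(
        runner="DataflowRunner",      # use "DirectRunner" for local testing
        project="my-gcp-project",
        region="asia-south1",
        temp_location="gs://my-bucket/tmp",
    )

    with beam.Pipeline(options=options) as pipeline:
        (pipeline
         | "Read" >> beam.io.ReadFromText("gs://my-bucket/input/*.csv")
         | "Parse" >> beam.Map(parse_row)
         | "Load" >> beam.io.WriteToBigQuery(
             "my-gcp-project:analytics.transactions",
             schema="user_id:STRING,amount:FLOAT",
             write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))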

Posted 1 month ago

Apply

2.0 - 3.0 years

4 - 7 Lacs

Hyderabad, Gachibowli

Work from Office

Job Summary

Synechron is seeking a highly motivated and skilled Senior Cloud Data Engineer (GCP) to join our cloud solutions team. In this role, you will collaborate closely with clients and internal stakeholders to design, implement, and manage scalable, secure, and high-performance cloud-based data solutions on Google Cloud Platform (GCP). You will leverage your technical expertise to ensure the integrity, security, and efficiency of cloud data architectures, enabling the organization to derive maximum value from cloud data assets. This role contributes directly to our mission of delivering innovative digital transformation solutions and supports the organization's strategic objectives of scalable and sustainable cloud infrastructure.

Software Requirements

Required Skills:
• Proficiency with Google Cloud Platform (GCP) services (Compute Engine, Cloud Storage, BigQuery, Cloud Pub/Sub, Dataflow, etc.)
• Basic scripting skills with Python, Bash, or similar languages
• Familiarity with virtualization and cloud networking concepts
• Understanding of cloud security best practices and compliance standards
• Experience with infrastructure-as-code tools (e.g., Terraform, Deployment Manager)
• Strong knowledge of data management, data pipelines, and ETL processes

Preferred Skills:
• Experience with other cloud platforms (AWS, Azure)
• Knowledge of SQL and NoSQL databases
• Familiarity with containerization (Docker, GKE)
• Experience with data visualization tools

Overall Responsibilities
• Design, implement, and operate cloud data solutions that are secure, scalable, and optimized for performance
• Collaborate with clients and internal teams to identify infrastructure and data architecture requirements
• Manage and monitor cloud infrastructure and ensure operational reliability
• Resolve technical issues related to cloud data workflows and storage solutions
• Participate in project planning, timelines, and technical documentation
• Contribute to best practices and continuous improvement initiatives within the organization
• Educate and support clients in adopting cloud data services and best practices

Technical Skills (By Category)

Programming Languages:
• Essential: Python, Bash scripts
• Preferred: SQL, Java, or other data processing languages

Databases & Data Management:
• Essential: BigQuery, Cloud SQL, Cloud Spanner, Cloud Storage
• Preferred: NoSQL databases such as Firestore, MongoDB

Cloud Technologies:
• Essential: Google Cloud Platform core services (Compute, Storage, BigQuery, Dataflow, Pub/Sub)
• Preferred: Cloud monitoring, logging, and security tools

Frameworks & Libraries:
• Essential: Data pipeline frameworks, Cloud SDKs, APIs
• Preferred: Apache Beam, Data Studio

Development Tools & Methodologies:
• Essential: Infrastructure as Code (Terraform, Deployment Manager)
• Preferred: CI/CD tools (Jenkins, Cloud Build)

Security Protocols:
• Essential: IAM policies, data encryption, network security best practices
• Preferred: Compliance frameworks such as GDPR, HIPAA

Experience Requirements
• 2-3 years of experience in cloud data engineering, cloud infrastructure, or related roles
• Hands-on experience with GCP is preferred; experience with AWS or Azure is a plus
• Background in designing and managing cloud data pipelines, storage, and security solutions
• Proven ability to deliver scalable data solutions in cloud environments
• Experience working with cross-functional teams on cloud deployments
• Alternative experience pathways: academic projects, certifications, or relevant internships demonstrating cloud data skills

Day-to-Day Activities
• Develop and deploy cloud data pipelines, databases, and analytics solutions
• Collaborate with clients and team members to plan and implement infrastructure architecture
• Perform routine monitoring, maintenance, and performance tuning of cloud data systems
• Troubleshoot technical issues affecting data workflows and resolve performance bottlenecks
• Document system configurations, processes, and best practices
• Engage in continuous learning on new cloud features and data management tools
• Participate in project meetings, code reviews, and knowledge-sharing sessions

Qualifications
• Bachelor's or Master's degree in computer science, engineering, information technology, or a related field
• Relevant certifications (e.g., Google Cloud Professional Data Engineer, Cloud Architect) are preferred
• Training in cloud security, data management, or infrastructure design is advantageous
• Commitment to professional development and staying updated with emerging cloud technologies

Professional Competencies
• Critical thinking and problem-solving skills to resolve complex cloud architecture challenges
• Ability to work collaboratively with multidisciplinary teams and clients
• Strong communication skills for technical documentation and stakeholder engagement
• Adaptability to evolving cloud technologies and project priorities
• Organized, with a focus on quality and detail-oriented delivery
• Proactive learner with a passion for innovation in cloud data solutions
• Ability to manage multiple tasks effectively and prioritize in a fast-paced environment
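As a rough illustration of the Pub/Sub-to-BigQuery streaming pattern this role works with, here is a sketch in Beam's Python SDK. The topic, table, and schema names are hypothetical, and a real pipeline would add error handling and explicit runner options.

    # Streaming sketch: consume JSON events from Pub/Sub, apply fixed
    # one-minute windows, and append to BigQuery. Names are placeholders.
    import json
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions(streaming=True)

    with beam.Pipeline(options=options) as pipeline:
        (pipeline
         | "ReadPubSub" >> beam.io.ReadFromPubSub(
             topic="projects/my-gcp-project/topics/events")
         | "Decode" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
         | "Window" >> beam.WindowInto(beam.window.FixedWindows(60))
         | "Load" >> beam.io.WriteToBigQuery(
             "my-gcp-project:analytics.events",
             schema="event_id:STRING,ts:TIMESTAMP",
             write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))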

Posted 1 month ago

Apply

7.0 - 9.0 years

8 - 15 Lacs

Hyderabad

Hybrid

Role & Responsibilities

Role Overview:
We are seeking a talented and forward-thinking Data Engineer for one of the large financial services GCCs based in Hyderabad. Responsibilities include designing and constructing data pipelines, integrating data from multiple sources, developing scalable data solutions, optimizing data workflows, collaborating with cross-functional teams, implementing data governance practices, and ensuring data security and compliance.

Technical Requirements:
• Proficiency in ETL, batch, and streaming processes
• Experience with BigQuery, Cloud Storage, and Cloud SQL
• Strong programming skills in Python, SQL, and Apache Beam for data processing
• Understanding of data modeling and schema design for analytics
• Knowledge of data governance, security, and compliance in GCP
• Familiarity with machine learning workflows and integration with GCP ML tools
• Ability to optimize performance within data pipelines

Functional Requirements:
• Ability to collaborate with Data Operations, Software Engineers, Data Scientists, and Business SMEs to develop data product features
• Experience in leading and mentoring peers within an existing development team
• Strong communication skills to craft and communicate robust solutions
• Proficiency in working with Engineering Leads, Enterprise and Data Architects, and Business Architects to build appropriate data foundations
• Willingness to work on contemporary data architecture in public and private cloud environments

This role offers a compelling opportunity for a seasoned Data Engineer to drive transformative cloud initiatives within the financial sector, leveraging deep experience and expertise to deliver innovative cloud solutions that align with business imperatives and regulatory requirements.

Qualification: Engineering Graduate / Postgraduate

Criteria:
• Proficient in ETL, Python, and Apache Beam for data processing efficiency
• Demonstrated expertise in BigQuery, Cloud Storage, and Cloud SQL utilization
• Strong collaboration skills with cross-functional teams for data product development
• Comprehensive knowledge of data governance, security, and compliance in GCP
• Experienced in optimizing performance within data pipelines for efficiency
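In BigQuery, "data modeling and schema design for analytics" often comes down to date-partitioned, clustered tables. Below is a sketch using the google-cloud-bigquery Python client; the project, dataset, and field names are hypothetical.

    # Sketch: create a date-partitioned table clustered by account,
    # a common analytics schema pattern. All names are placeholders.
    from google.cloud import bigquery

    client = bigquery.Client(project="my-gcp-project")

    schema = [
        bigquery.SchemaField("txn_id", "STRING"),
        bigquery.SchemaField("account_id", "STRING"),
        bigquery.SchemaField("amount", "NUMERIC"),
        bigquery.SchemaField("txn_date", "DATE"),
    ]

    table = bigquery.Table("my-gcp-project.finance.transactions", schema=schema)
    table.time_partitioning = bigquery.TimePartitioning(field="txn_date")  # daily partitions
    table.clustering_fields = ["account_id"]  # cluster to cut scan costs

    client.create_table(table, exists_ok=True)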

Posted 1 month ago

Apply

6.0 - 9.0 years

7 - 14 Lacs

Hyderabad

Work from Office

Role Overview:
We are seeking a talented and forward-thinking Data Engineer for one of the large financial services GCCs based in Hyderabad. Responsibilities include designing and constructing data pipelines, integrating data from multiple sources, developing scalable data solutions, optimizing data workflows, collaborating with cross-functional teams, implementing data governance practices, and ensuring data security and compliance.

Technical Requirements:
1. Proficiency in ETL, batch, and streaming processes
2. Experience with BigQuery, Cloud Storage, and Cloud SQL
3. Strong programming skills in Python, SQL, and Apache Beam for data processing
4. Understanding of data modeling and schema design for analytics
5. Knowledge of data governance, security, and compliance in GCP
6. Familiarity with machine learning workflows and integration with GCP ML tools
7. Ability to optimize performance within data pipelines

Functional Requirements:
1. Ability to collaborate with Data Operations, Software Engineers, Data Scientists, and Business SMEs to develop data product features
2. Experience in leading and mentoring peers within an existing development team
3. Strong communication skills to craft and communicate robust solutions
4. Proficiency in working with Engineering Leads, Enterprise and Data Architects, and Business Architects to build appropriate data foundations
5. Willingness to work on contemporary data architecture in public and private cloud environments

This role offers a compelling opportunity for a seasoned Data Engineer to drive transformative cloud initiatives within the financial sector, leveraging deep experience and expertise to deliver innovative cloud solutions that align with business imperatives and regulatory requirements.

Qualification: Engineering Graduate / Postgraduate

Criteria:
1. Proficient in ETL, Python, and Apache Beam for data processing efficiency.
2. Demonstrated expertise in BigQuery, Cloud Storage, and Cloud SQL utilization.
3. Strong collaboration skills with cross-functional teams for data product development.
4. Comprehensive knowledge of data governance, security, and compliance in GCP.
5. Experienced in optimizing performance within data pipelines for efficiency.
6. Relevant experience: 6-9 years.

Connect at 9993809253
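In practice, governance and data-quality requirements like these often translate into a dead-letter pattern inside the pipeline, where invalid records are diverted to a side output instead of failing the job. A minimal Beam sketch follows; the field names and validation rule are hypothetical.

    # Sketch: route records failing validation to a "bad" side output
    # (dead-letter pattern). Field names are hypothetical placeholders.
    import apache_beam as beam
    from apache_beam.pvalue import TaggedOutput

    class ValidateRecord(beam.DoFn):
        def process(self, record):
            if record.get("account_id") and record.get("amount") is not None:
                yield record                       # valid -> main output
            else:
                yield TaggedOutput("bad", record)  # invalid -> dead letter

    with beam.Pipeline() as pipeline:
        results = (
            pipeline
            | beam.Create([{"account_id": "a1", "amount": 10.0}, {"amount": None}])
            | beam.ParDo(ValidateRecord()).with_outputs("bad", main="good"))
        results.good | "Good" >> beam.Map(print)
        results.bad | "Bad" >> beam.Map(lambda r: print("dead-letter:", r))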

Posted 1 month ago

Apply

6.0 - 10.0 years

12 - 18 Lacs

Hyderabad

Hybrid

Role & Responsibilities

Role Overview:
We are seeking a talented and forward-thinking Data Engineer for one of the large financial services GCCs based in Hyderabad. Responsibilities include designing and constructing data pipelines, integrating data from multiple sources, developing scalable data solutions, optimizing data workflows, collaborating with cross-functional teams, implementing data governance practices, and ensuring data security and compliance.

Technical Requirements:
• Proficiency in ETL, batch, and streaming processes
• Experience with BigQuery, Cloud Storage, and Cloud SQL
• Strong programming skills in Python, SQL, and Apache Beam for data processing
• Understanding of data modeling and schema design for analytics
• Knowledge of data governance, security, and compliance in GCP
• Familiarity with machine learning workflows and integration with GCP ML tools
• Ability to optimize performance within data pipelines

Functional Requirements:
• Ability to collaborate with Data Operations, Software Engineers, Data Scientists, and Business SMEs to develop data product features
• Experience in leading and mentoring peers within an existing development team
• Strong communication skills to craft and communicate robust solutions
• Proficiency in working with Engineering Leads, Enterprise and Data Architects, and Business Architects to build appropriate data foundations
• Willingness to work on contemporary data architecture in public and private cloud environments

This role offers a compelling opportunity for a seasoned Data Engineer to drive transformative cloud initiatives within the financial sector, leveraging deep experience and expertise to deliver innovative cloud solutions that align with business imperatives and regulatory requirements.

Qualification: Engineering Graduate / Postgraduate

Criteria:
• Proficient in ETL, Python, and Apache Beam for data processing efficiency
• Demonstrated expertise in BigQuery, Cloud Storage, and Cloud SQL utilization
• Strong collaboration skills with cross-functional teams for data product development
• Comprehensive knowledge of data governance, security, and compliance in GCP
• Experienced in optimizing performance within data pipelines for efficiency
• Relevant experience: 6-9 years
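One way "machine learning workflows integrated with GCP ML tools" shows up day to day is BigQuery ML, where a model is trained with SQL issued from Python. A brief sketch, with a hypothetical dataset, model name, and feature columns:

    # Sketch: train a simple BigQuery ML classifier from Python.
    # Dataset, model, and column names are hypothetical placeholders.
    from google.cloud import bigquery

    client = bigquery.Client(project="my-gcp-project")
    client.query(
        """
        CREATE OR REPLACE MODEL `finance.churn_model`
        OPTIONS (model_type = 'logistic_reg', input_label_cols = ['churned']) AS
        SELECT account_age_days, txn_count_90d, churned
        FROM `finance.account_features`
        """
    ).result()  # blocks until training completes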

Posted 1 month ago

Apply

8.0 - 10.0 years

40 - 45 Lacs

Mumbai, New Delhi, Bengaluru

Work from Office

Roles & Responsibilities:

Data Engineering Leadership & Strategy:
• Lead and mentor a team of data engineers, fostering a culture of technical excellence and collaboration.
• Define and implement data engineering best practices, standards, and processes.

Data Pipeline Architecture & Development:
• Design, build, and maintain scalable, robust, and efficient data pipelines for ingestion, transformation, and loading of data from various sources.
• Optimize data pipelines for performance, reliability, and cost-effectiveness.
• Implement data quality checks and monitoring systems to ensure data integrity.
• Work closely with data scientists, analysts, and other stakeholders to understand data requirements and deliver high-quality data solutions.

Cloud-Based Data Infrastructure:
• Design, implement, and manage cloud-based data infrastructure using platforms like AWS, Azure, or GCP.
• Leverage cloud services (e.g., data lakes, data warehouses, serverless computing) to build scalable and cost-effective data solutions.
• Leverage open-source tools such as Airbyte, Mage AI, and similar.
• Ensure data security, governance, and compliance within the cloud environment.

Data Modeling & Warehousing:
• Design and implement data models to support business intelligence, reporting, and analytics.
• Optimize data warehouse performance for efficient querying and reporting.

Collaboration & Communication:
• Collaborate effectively with cross-functional teams including product managers, software engineers, and business stakeholders.

Requirements:
• Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
• 8+ years of proven experience in data engineering, with at least 3 years in a lead role.
• Expertise in building and maintaining data pipelines using tools such as Apache Spark, Apache Kafka, Apache Beam, or similar.
• Proficiency in SQL and one or more programming languages such as Python, Java, or Scala.
• Hands-on experience with cloud-based data platforms (AWS, Azure, GCP) and services.

Locations: Mumbai, Delhi NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune, Remote
Work Timings: 2.30 pm - 11.30 pm IST
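Since the listing pairs Apache Kafka with Apache Beam, here is a minimal sketch of Beam's cross-language Kafka connector in Python. The broker address and topic are hypothetical, and the connector requires a local Java runtime for its expansion service.

    # Sketch: read a Kafka topic with Beam's cross-language Kafka IO.
    # Broker and topic names are placeholders.
    import apache_beam as beam
    from apache_beam.io.kafka import ReadFromKafka

    with beam.Pipeline() as pipeline:
        (pipeline
         | "ReadKafka" >> ReadFromKafka(
             consumer_config={"bootstrap.servers": "broker:9092"},
             topics=["orders"])
         | "Print" >> beam.Map(print))  # elements are (key, value) byte pairs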

Posted 2 months ago

Apply