5.0 - 9.0 years
0 - 0 Lacs
Pune, Maharashtra
On-site
As a Platform SME, your role will involve designing, implementing, and optimizing Airflow Directed Acyclic Graphs (DAGs) for scalable workflows. Additionally, you will be responsible for managing GCP Composer environments, including upgrades, performance tuning, and security. Your expertise will be crucial in troubleshooting orchestration failures and optimizing scheduling efficiency. Collaboration with data engineering teams will be essential to ensure seamless pipeline execution. Key Responsibilities: - Design, implement, and optimize Airflow DAGs for scalable workflows. - Manage GCP Composer environments, including upgrades, performance tuning, and security. - Troubleshoot orchestration f...
Posted 1 week ago
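The core idea behind the Airflow DAGs this role centers on is running tasks in dependency order with no cycles. A minimal sketch of that idea in plain stdlib Python (a topological sort over a hypothetical extract/transform/load task graph, not Airflow's actual API):

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: extract feeds validate and transform; load waits on both.
# graphlib expects each key mapped to the set of its predecessors.
tasks = {
    "extract": set(),
    "validate": {"extract"},
    "transform": {"extract"},
    "load": {"transform", "validate"},
}

# static_order() yields every task after all of its dependencies,
# the same ordering guarantee an Airflow scheduler provides for a DAG run.
order = list(TopologicalSorter(tasks).static_order())
print(order)
```

In a real Composer environment the same dependencies would be declared between Airflow operators, but the scheduling guarantee being optimized is the one shown here.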
5.0 - 9.0 years
0 - 0 Lacs
Pune, Maharashtra
On-site
Role Overview: You will be part of a strong GCP-focused team working on various projects. The role requires expertise in Airflow, GCP Composer, DAGs, BigQuery, Dataflow, Pub/Sub, GCS, Terraform, CI/CD, IAM, Networking, Governance, and Security. Key Responsibilities: - Collaborate with the team to design and implement solutions using GCP services. - Develop and maintain data pipelines using Airflow and Dataflow. - Manage BigQuery datasets and optimize queries for performance. - Implement security measures following IAM best practices. - Work on Terraform scripts for infrastructure provisioning. - Ensure CI/CD pipelines ar...
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
Karnataka
On-site
You will be responsible for designing, deploying, and managing infrastructure using Google Cloud Platform (GCP) services, including Compute Engine, Cloud Storage, IAM, and VPC networking. Additionally, you will manage and optimize Google Kubernetes Engine (GKE) clusters for containerized workloads, build and maintain robust CI/CD pipelines, integrate security and compliance checks into the deployment lifecycle, orchestrate and automate data workflows using GCP Composer, support GCP Dataflow pipelines, integrate with BigQuery, develop and deploy Cloud Functions, manage Cloud Storage buckets, and implement observability solutions using Cloud Monitoring and Cloud Logging. - Design, deploy, and ...
Posted 2 months ago
5.0 - 9.0 years
0 Lacs
Pune, Maharashtra
On-site
Role Overview: You will be responsible for designing, implementing, and optimizing Airflow Directed Acyclic Graphs (DAGs) for scalable workflows. Additionally, you will manage Google Cloud Platform (GCP) Composer environments, troubleshoot orchestration failures, and collaborate with data engineering teams to ensure seamless pipeline execution. Key Responsibilities: - Design, implement, and optimize Airflow DAGs for scalable workflows. - Manage GCP Composer environments including upgrades, performance tuning, and security. - Troubleshoot orchestration failures and optimize scheduling efficiency. - Collaborate with data engineering teams to ensure seamless pipeline execution. Qualifications R...
Posted 2 months ago
5.0 - 9.0 years
0 - 0 Lacs
Pune, Maharashtra
On-site
As a Full-Time (FTE) team member in Pune or Chennai, you will be part of a dynamic team focused on GCP technologies. Your role will involve working with a range of tools including Airflow, GCP Composer, DAGs, BigQuery, Dataflow, Pub/Sub, GCS, Terraform, CI/CD, IAM, Networking, Governance, and Security. **Key Responsibilities:** - Work closely with Platform SMEs, Architects, Business Workspaces (Product/BA & Tech), and Platform Engineers (Specialist & Sr. Specialist). - Utilize your expertise in GCP technologies to contribute to the team's success. **Qualifications Required:** - Minimum of 5 years of experience in the field. - Proficiency in Airflow, GCP Composer, DAGs, BigQuery, Dataflow, Pu...
Posted 2 months ago
0.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job Title: Data Engineer Role Overview: We are seeking a skilled and motivated Data Engineer to design, develop, and maintain scalable data pipelines that process and transform large volumes of structured and unstructured data. The ideal candidate will have a strong background in data engineering, cloud technologies, and data architecture, and will collaborate closely with data scientists, analysts, and business teams to deliver high-quality data solutions. Key Responsibilities: Design and implement scalable data pipelines for processing structured and unstructured data. Build and maintain ETL/ELT workflows for data ingestion from diverse sources including APIs, databases, files, and cloud p...
Posted 3 months ago
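The ETL/ELT workflows this Data Engineer role describes follow an extract-transform-load shape that can be sketched with stdlib Python alone (the CSV source, field names, and aggregation here are invented for illustration; in practice the load step would target a warehouse such as BigQuery):

```python
import csv
import io
import json

# Extract: read raw CSV rows (an in-memory string stands in for an API or file source).
raw = "user_id,amount\n1,10.5\n2,3.25\n1,4.75\n"
rows = list(csv.DictReader(io.StringIO(raw)))

# Transform: cast types and aggregate spend per user.
totals = {}
for row in rows:
    totals[row["user_id"]] = totals.get(row["user_id"], 0.0) + float(row["amount"])

# Load: serialize to JSON, standing in for a warehouse write.
output = json.dumps(totals, sort_keys=True)
print(output)  # {"1": 15.25, "2": 3.25}
```

Production pipelines add schema validation, incremental loads, and retries, but the three-stage structure is the same.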
4.0 - 8.0 years
22 - 27 Lacs
Bengaluru, Karnataka
Hybrid
Work Mode: Hybrid (3-5 days from office) Experience: 5+ Years Job Summary: Looking for a Senior Data Engineer to work on building and managing scalable data pipelines in a GCP environment. Responsibilities: Design, develop, and maintain scalable data pipelines. Manage ETL/ELT workflows from various data sources. Ensure data governance, quality, and transformation. Collaborate with stakeholders to meet data needs. Optimize for cost, scalability, and performance. Required Skill Set: Languages & Frameworks: Python, PySpark, SQL Orchestration Tools: Airflow / GCP Composer Streaming: Kafka Cloud Platform: Google Cloud Platform (Dataproc, BigQuery, Compute, Looker) Concepts: Data Modeling, Data War...
Posted Date not available
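Of the skills this listing names, SQL-based transformation is the most concrete to illustrate. A minimal sketch using stdlib sqlite3 standing in for BigQuery (the table, columns, and rows are invented for the example):

```python
import sqlite3

# In-memory database stands in for a warehouse such as BigQuery.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id TEXT, event_type TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [("u1", "purchase", 20.0), ("u1", "purchase", 5.0), ("u2", "refund", -3.0)],
)

# A typical aggregation a data engineer would write and optimize: spend per user.
query = """
    SELECT user_id, SUM(amount) AS total
    FROM events
    GROUP BY user_id
    ORDER BY user_id
"""
result = conn.execute(query).fetchall()
print(result)  # [('u1', 25.0), ('u2', -3.0)]
```

On BigQuery the optimization work the role mentions would add partitioning and clustering choices on top of queries like this one.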