10 DAG Jobs

Set up a Job Alert
JobPe aggregates job listings for easy access, but applications are submitted directly on each employer's job portal.

5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

Role Overview: You will design, implement, and optimize Airflow Directed Acyclic Graphs (DAGs) for scalable workflows. Additionally, you will manage Google Cloud Platform (GCP) Composer environments, troubleshoot orchestration failures, and collaborate with data engineering teams to ensure seamless pipeline execution.

Key Responsibilities:
- Design, implement, and optimize Airflow DAGs for scalable workflows.
- Manage GCP Composer environments, including upgrades, performance tuning, and security.
- Troubleshoot orchestration failures and optimize scheduling efficiency.
- Collaborate with data engineering teams to ensure seamless pipeline execution.

Qualifications R...
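DAG design, as described in this role, amounts to expressing task dependencies as a directed acyclic graph so the scheduler can execute tasks in a valid order. A minimal conceptual sketch of that ordering — using Python's stdlib `graphlib` rather than Airflow itself, with invented task names — might look like:

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: extract feeds transform and audit,
# both of which must finish before load runs.
# Each key maps to the set of tasks that must run before it.
deps = {
    "transform": {"extract"},
    "audit": {"extract"},
    "load": {"transform", "audit"},
}

# static_order() yields tasks in a dependency-respecting order,
# which is what a workflow scheduler computes from a DAG.
order = list(TopologicalSorter(deps).static_order())
print(order)  # "extract" is always first, "load" always last
```

In Airflow the same dependencies would be declared between operators, but the underlying topological-ordering idea is identical.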

Posted 2 weeks ago


5.0 - 9.0 years

0 - 0 Lacs

Pune, Maharashtra

On-site

As a Full-Time (FTE) team member in Pune or Chennai, you will be part of a dynamic team focused on GCP technologies. Your role will involve working with a range of tools including Airflow, GCP Composer, DAGs, BigQuery, Dataflow, Pub/Sub, GCS, Terraform, CI/CD, IAM, Networking, Governance, and Security.

**Key Responsibilities:**
- Work closely with Platform SMEs, Architects, Business Workspaces (Product/BA & Tech), and Platform Engineers (Specialist & Sr. Specialist).
- Utilize your expertise in GCP technologies to contribute to the team's success.

**Qualifications Required:**
- Minimum of 5 years of experience in the field.
- Proficiency in Airflow, GCP Composer, DAGs, BigQuery, Dataflow, Pu...

Posted 2 weeks ago


5.0 - 9.0 years

7 - 11 Lacs

Pune

Work from Office

Hiring a Data Operations Engineer for a 6-month contractual role based in Pune. The ideal candidate should have 5-9 years of experience in Data Operations, Technical Support, or Reporting QA. You will be responsible for monitoring data health, validating config payloads, troubleshooting Airflow DAGs, documenting best practices, and supporting Ad Tech partner integrations. Proficiency in Snowflake, Airflow, Python scripting, and SQL is mandatory. Excellent communication, problem-solving skills, and a proactive attitude are essential for success in this role.
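The config-payload validation mentioned in this role typically means checking incoming JSON against an expected shape before a pipeline consumes it. As a rough stdlib-only illustration — the field names and types here are invented, not any specific partner schema:

```python
import json

# Hypothetical required fields and expected types for a partner config payload.
REQUIRED_FIELDS = {"partner_id": str, "report_type": str, "retry_limit": int}

def validate_payload(raw: str) -> list[str]:
    """Return a list of validation errors; an empty list means the payload passed."""
    try:
        payload = json.loads(raw)
    except json.JSONDecodeError as exc:
        return [f"invalid JSON: {exc}"]
    errors = []
    for field, expected in REQUIRED_FIELDS.items():
        if field not in payload:
            errors.append(f"missing field: {field}")
        elif not isinstance(payload[field], expected):
            errors.append(f"{field} should be {expected.__name__}")
    return errors

print(validate_payload('{"partner_id": "acme", "report_type": "daily", "retry_limit": 3}'))  # []
print(validate_payload('{"partner_id": "acme"}'))  # two "missing field" errors
```

Returning a list of errors rather than raising on the first one makes it easier to log everything wrong with a payload in a single monitoring alert.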

Posted 1 month ago


5.0 - 9.0 years

6 - 10 Lacs

Pune

Work from Office

Hiring a Data Operations Engineer for a 6-month contractual role based in Pune. The ideal candidate should have 5-9 years of experience in Data Operations, Technical Support, or Reporting QA. You will be responsible for monitoring data health, validating config payloads, troubleshooting Airflow DAGs, documenting best practices, and supporting Ad Tech partner integrations. Proficiency in Snowflake, Airflow, Python scripting, and SQL is mandatory. Excellent communication, problem-solving skills, and a proactive attitude are essential for success in this role.

Posted 1 month ago


5.0 - 8.0 years

4 - 7 Lacs

Hyderabad, Telangana, India

On-site

- Develop and optimize data processing jobs using PySpark to handle complex data transformations and aggregations efficiently.
- Design and implement robust data pipelines on the AWS platform, ensuring scalability and efficiency (Databricks exposure will be an advantage).
- Leverage AWS services such as EC2, S3, etc. for comprehensive data processing and storage solutions.
- Expertly manage SQL database schema design, query optimization, and performance tuning to support data transformation and loading processes.
- Design and maintain scalable, performant data warehouses, employing best practices in data modeling and ETL processes.
- Utilize modern data platforms for collaborative data science, integr...
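The transformation-and-aggregation work listed above usually reduces to group-by-and-aggregate logic. A toy sketch of that pattern — written with the Python stdlib and invented sample data, standing in for what would be a PySpark `groupBy(...).agg(...)` over data read from S3 in practice:

```python
from collections import defaultdict

# Invented sample records standing in for rows loaded from storage.
rows = [
    {"region": "south", "amount": 120.0},
    {"region": "north", "amount": 75.5},
    {"region": "south", "amount": 30.0},
]

# Group by region and sum amounts -- the same shape as
# df.groupBy("region").agg(sum("amount")) in PySpark, but single-node.
totals: dict[str, float] = defaultdict(float)
for row in rows:
    totals[row["region"]] += row["amount"]

print(dict(totals))  # {'south': 150.0, 'north': 75.5}
```

PySpark distributes exactly this computation across a cluster; the value of the framework is in partitioning and shuffling the data, not in changing the aggregation logic itself.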

Posted 2 months ago


6.0 - 10.0 years

10 - 20 Lacs

Pune

Work from Office

Job Description:
Job Role: Data Engineer
Years of Experience: 6+ years
Job Location: Pune
Work Model: Hybrid

Job Summary: We are seeking a highly skilled Data Engineer with strong expertise in DBT, Java, Apache Airflow, and DAG (Directed Acyclic Graph) design to join our data platform team. You will be responsible for building robust data pipelines, designing and managing workflow DAGs, and ensuring scalable data transformations to support analytics and business intelligence.

Key Responsibilities:
- Design, implement, and optimize ETL/ELT pipelines using DBT for data modeling and transformation.
- Develop backend components and data processing logic using Java.
- Build and maintain DAGs in Apache Ai...

Posted 2 months ago


5.0 - 10.0 years

25 - 30 Lacs

Bengaluru

Remote

Senior Data Developer with strong MS/Oracle SQL and Python skills and critical thinking. Description: The EDA team seeks a dedicated, detail-oriented Senior Developer I to join our dynamic team. The successful candidate will handle recurring technical tasks, such as Healthy Planet MS SQL file loads into a data warehouse, monitor Airflow DAGs, manage alerts, and rerun failed processes. The role also involves monitoring various daily and weekly jobs, which may include generating revenue cycle reports and delivering data to external vendors. The ideal candidate will have robust experience with MS/Oracle SQL, Python, Epic Health Systems, ...
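"Rerun failed processes," as described in this role, is often just bounded retry with backoff around a job invocation. A generic sketch under that assumption — the function and job names are invented, not Epic- or Airflow-specific:

```python
import time

def rerun_with_retries(task, attempts: int = 3, delay: float = 0.0):
    """Call `task` until it succeeds or the attempt budget is exhausted."""
    last_error = None
    for attempt in range(1, attempts + 1):
        try:
            return task()
        except Exception as exc:
            last_error = exc
            time.sleep(delay * attempt)  # simple linear backoff between attempts
    raise RuntimeError(f"task failed after {attempts} attempts") from last_error

# Hypothetical flaky load job: fails twice, then succeeds on the third call.
calls = {"n": 0}
def flaky_load():
    calls["n"] += 1
    if calls["n"] < 3:
        raise IOError("transient load failure")
    return "loaded"

print(rerun_with_retries(flaky_load))  # prints "loaded" after two retries
```

In an orchestrator like Airflow this logic is usually configured declaratively (per-task retry counts and delays) rather than hand-written, but the behavior being configured is the same.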

Posted 3 months ago


5.0 - 9.0 years

6 - 10 Lacs

Pune

Work from Office

Hiring a Data Operations Engineer for a 6-month contractual role based in Pune. The ideal candidate should have 5-9 years of experience in Data Operations, Technical Support, or Reporting QA. You will be responsible for monitoring data health, validating config payloads, troubleshooting Airflow DAGs, documenting best practices, and supporting Ad Tech partner integrations. Proficiency in Snowflake, Airflow, Python scripting, and SQL is mandatory. Excellent communication, problem-solving skills, and a proactive attitude are essential for success in this role.

Posted 3 months ago


5.0 - 9.0 years

7 - 11 Lacs

Pune

Work from Office

Hiring a Data Operations Engineer for a 6-month contractual role based in Pune. The ideal candidate should have 5-9 years of experience in Data Operations, Technical Support, or Reporting QA. You will be responsible for monitoring data health, validating config payloads, troubleshooting Airflow DAGs, documenting best practices, and supporting Ad Tech partner integrations. Proficiency in Snowflake, Airflow, Python scripting, and SQL is mandatory. Excellent communication, problem-solving skills, and a proactive attitude are essential for success in this role.

Posted 3 months ago


5.0 - 10.0 years

10 - 20 Lacs

Pune

Work from Office

Job Description:
Job Role: Data Engineer
Years of Experience: 4+ years
Job Location: Pune
Work Model: Hybrid

Job Summary: We are seeking a highly skilled Data Engineer with strong expertise in DBT, Java, Apache Airflow, and DAG (Directed Acyclic Graph) design to join our data platform team. You will be responsible for building robust data pipelines, designing and managing workflow DAGs, and ensuring scalable data transformations to support analytics and business intelligence.

Key Responsibilities:
- Design, implement, and optimize ETL/ELT pipelines using DBT for data modeling and transformation.
- Develop backend components and data processing logic using Java.
- Build and maintain DAGs in Apache Ai...

Posted Date not available
