14 DAGs Jobs

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

3.0 - 5.0 years

0 Lacs

Hyderabad, Telangana, India

Remote

Data Engineering Lead Analyst. Position Overview: The Evernorth Core Platform Engineering team is looking for a Data Engineer to design, develop, and implement robust data pipelines for Packaged Business Capabilities (PBCs). You'll work closely with data scientists, analysts, and other engineers to ensure high data quality, performance, and availability. The ideal candidate brings hands-on experience with big data tools and platforms such as Apache Spark and Databricks, has a strong command of Python, and is comfortable managing ETL workflows, DAGs, and orchestration using tools like Jenkins or Airflow. Experience in cloud-native environments, particularly AWS and AWS Glue, is essential. Abou...

Posted 1 week ago

AI Match Score
Apply

3.0 - 7.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a GCP Data Engineer, you will be responsible for developing data warehousing projects on GCP platforms. The role requires 3+ years of experience in this field. You should possess strong analytical and problem-solving skills, effective business communication abilities, and the capacity to work independently from beginning to end. Key Responsibilities: - Demonstrating strong expertise in SQL & PL/SQL - Utilizing GCP services & BigQuery knowledge; GCP certification is considered an added advantage - Having good experience in GCP Dataproc, Cloud Composer, DAGs, and Airflow - Exhibiting proficiency in Teradata or any other database - Python knowledge is an added advantage - Leading...

Posted 2 weeks ago


5.0 - 8.0 years

10 - 20 Lacs

Gurugram

Work from Office

Job Description: Python Developer (Airflow & ETL Specialist). Experience: 5+ years. Location: Gurgaon / Remote (as applicable). About the Role: We are looking for an experienced Python Developer with strong hands-on experience in Apache Airflow and ETL pipeline development. The ideal candidate will be responsible for designing, building, and maintaining data workflows, ensuring scalability, performance, and reliability of data processing systems. This role requires a deep understanding of data integration, orchestration, and automation within modern cloud or on-prem data ecosystems. Key Responsibilities: Design, develop, and maintain ETL workflows and data pipelines using Python and Apache Airf...

Posted 3 weeks ago


4.0 - 6.0 years

0 Lacs

Pune, Maharashtra, India

On-site

About Certify: At CertifyOS, we're building the infrastructure that powers the next generation of provider data products, making healthcare more efficient, accessible, and innovative. Our platform is the ultimate source of truth for provider data, offering unparalleled ease and trust while making data easily accessible and actionable for the entire healthcare ecosystem. What sets us apart: Our cutting-edge, API-first, UI-agnostic, end-to-end provider network management platform automates licensing, enrollment, credentialing, and network monitoring like never before. With direct integrations into hundreds of primary sources, we have an unbeatable advantage in enhancing visibility into the enti...

Posted 1 month ago


5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

Role Overview: You will be responsible for designing, implementing, and optimizing Airflow Directed Acyclic Graphs (DAGs) for scalable workflows. Additionally, you will manage Google Cloud Platform (GCP) Composer environments, troubleshoot orchestration failures, and collaborate with data engineering teams to ensure seamless pipeline execution. Key Responsibilities: - Design, implement, and optimize Airflow DAGs for scalable workflows. - Manage GCP Composer environments including upgrades, performance tuning, and security. - Troubleshoot orchestration failures and optimize scheduling efficiency. - Collaborate with data engineering teams to ensure seamless pipeline execution. Qualifications R...
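Several of these roles center on designing and optimizing Airflow DAGs. The defining property of a DAG — each task runs only after all of its upstream tasks finish — can be sketched in plain Python without installing Airflow, using the standard library's topological sorter. The task names below are hypothetical, not taken from any posting:

```python
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

# Hypothetical pipeline: one extract feeds two transforms, which both feed a load.
# Each key maps a task to the set of upstream tasks it depends on.
dag = {
    "extract": set(),
    "clean": {"extract"},
    "enrich": {"extract"},
    "load": {"clean", "enrich"},
}

# static_order() yields a valid execution order for the DAG — conceptually the
# same dependency resolution a scheduler like Airflow performs before running tasks.
order = list(TopologicalSorter(dag).static_order())
print(order)
```

Any valid ordering starts with "extract" and ends with "load"; a real Airflow DAG expresses the same dependencies with operators and the `>>` syntax, but the scheduling principle is identical.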

Posted 2 months ago


5.0 - 9.0 years

0 - 0 Lacs

Pune, Maharashtra

On-site

As a Full-Time (FTE) team member in Pune or Chennai, you will be part of a dynamic team focused on GCP technologies. Your role will involve working with a range of tools including Airflow, GCP Composer, DAGs, BigQuery, Dataflow, Pub/Sub, GCS, Terraform, CI/CD, IAM, Networking, Governance, and Security. **Key Responsibilities:** - Work closely with Platform SMEs, Architects, Business Workspaces (Product/BA & Tech), and Platform Engineers (Specialist & Sr. Specialist). - Utilize your expertise in GCP technologies to contribute to the team's success. **Qualifications Required:** - Minimum of 5 years of experience in the field. - Proficiency in Airflow, GCP Composer, DAGs, BigQuery, Dataflow, Pu...

Posted 2 months ago


5.0 - 9.0 years

7 - 11 Lacs

Pune

Work from Office

We are hiring a Data Operations Engineer for a 6-month contract role based in Pune. The ideal candidate should have 5-9 years of experience in Data Operations, Technical Support, or Reporting QA. You will be responsible for monitoring data health, validating config payloads, troubleshooting Airflow DAGs, documenting best practices, and supporting Ad Tech partner integrations. Proficiency in Snowflake, Airflow, Python scripting, and SQL is mandatory. Excellent communication, problem-solving skills, and a proactive attitude are essential for success in this role.
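One listed duty, validating config payloads, can be sketched as a plain-Python check. The required fields and payload shape here are hypothetical illustrations, not taken from the posting:

```python
import json

# Hypothetical required fields for a partner config payload.
REQUIRED_FIELDS = {"partner_id": str, "feed_url": str, "schedule": str}

def validate_payload(raw: str) -> list[str]:
    """Return a list of validation errors; an empty list means the payload is valid."""
    errors = []
    try:
        payload = json.loads(raw)
    except json.JSONDecodeError as exc:
        return [f"invalid JSON: {exc}"]
    for field, expected_type in REQUIRED_FIELDS.items():
        if field not in payload:
            errors.append(f"missing field: {field}")
        elif not isinstance(payload[field], expected_type):
            errors.append(f"wrong type for {field}: expected {expected_type.__name__}")
    return errors

# A complete payload passes; an incomplete one reports what is missing.
print(validate_payload('{"partner_id": "p1", "feed_url": "https://x", "schedule": "@daily"}'))  # []
print(validate_payload('{"partner_id": "p1"}'))
```

In practice such a check would run inside the pipeline (for example as an Airflow task) before the payload is used downstream, so bad configs fail fast with a readable error list.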

Posted 2 months ago


5.0 - 9.0 years

6 - 10 Lacs

Pune

Work from Office

We are hiring a Data Operations Engineer for a 6-month contract role based in Pune. The ideal candidate should have 5-9 years of experience in Data Operations, Technical Support, or Reporting QA. You will be responsible for monitoring data health, validating config payloads, troubleshooting Airflow DAGs, documenting best practices, and supporting Ad Tech partner integrations. Proficiency in Snowflake, Airflow, Python scripting, and SQL is mandatory. Excellent communication, problem-solving skills, and a proactive attitude are essential for success in this role.

Posted 2 months ago


5.0 - 8.0 years

4 - 7 Lacs

Hyderabad, Telangana, India

On-site

Develop and optimize data processing jobs using PySpark to handle complex data transformations and aggregations efficiently. Design and implement robust data pipelines on the AWS platform, ensuring scalability and efficiency (Databricks exposure will be an advantage) Leverage AWS services such as EC2, S3, etc. for comprehensive data processing and storage solutions. Expertly manage SQL database schema design, query optimization, and performance tuning to support data transformation and loading processes. Design and maintain scalable and performant data warehouses, employing best practices in data modeling and ETL processes. Utilize modern data platforms for collaborative data science, integr...

Posted 4 months ago


6.0 - 10.0 years

10 - 20 Lacs

Pune

Work from Office

Job Description: Job Role: Data Engineer. Years of Experience: 6+ years. Job Location: Pune. Work Model: Hybrid. Job Summary: We are seeking a highly skilled Data Engineer with strong expertise in DBT, Java, Apache Airflow, and DAG (Directed Acyclic Graph) design to join our data platform team. You will be responsible for building robust data pipelines, designing and managing workflow DAGs, and ensuring scalable data transformations to support analytics and business intelligence. Key Responsibilities: Design, implement, and optimize ETL/ELT pipelines using DBT for data modeling and transformation. Develop backend components and data processing logic using Java. Build and maintain DAGs in Apache Ai...

Posted 4 months ago


5.0 - 10.0 years

25 - 30 Lacs

Bengaluru

Remote

Senior Data Developer with strong MS/Oracle SQL and Python skills and critical thinking. Description: The EDA team seeks a dedicated and detail-oriented Senior Developer I to join our dynamic team. The successful candidate will handle repetitive technical tasks, such as Healthy Planet MS SQL file loads into a data warehouse, monitor Airflow DAGs, manage alerts, and rerun failed processes. Additionally, the role requires monitoring various daily and weekly jobs, which may include generation of revenue cycle reports and data delivery to external vendors. The ideal candidate will have robust experience with MS/Oracle SQL, Python, Epic Health Systems, ...

Posted 4 months ago


5.0 - 9.0 years

6 - 10 Lacs

Pune

Work from Office

We are hiring a Data Operations Engineer for a 6-month contract role based in Pune. The ideal candidate should have 5-9 years of experience in Data Operations, Technical Support, or Reporting QA. You will be responsible for monitoring data health, validating config payloads, troubleshooting Airflow DAGs, documenting best practices, and supporting Ad Tech partner integrations. Proficiency in Snowflake, Airflow, Python scripting, and SQL is mandatory. Excellent communication, problem-solving skills, and a proactive attitude are essential for success in this role.

Posted 4 months ago


5.0 - 9.0 years

7 - 11 Lacs

Pune

Work from Office

We are hiring a Data Operations Engineer for a 6-month contract role based in Pune. The ideal candidate should have 5-9 years of experience in Data Operations, Technical Support, or Reporting QA. You will be responsible for monitoring data health, validating config payloads, troubleshooting Airflow DAGs, documenting best practices, and supporting Ad Tech partner integrations. Proficiency in Snowflake, Airflow, Python scripting, and SQL is mandatory. Excellent communication, problem-solving skills, and a proactive attitude are essential for success in this role.

Posted 4 months ago


5.0 - 10.0 years

10 - 20 Lacs

Pune

Work from Office

Job Description: Job Role: Data Engineer. Years of Experience: 4+ years. Job Location: Pune. Work Model: Hybrid. Job Summary: We are seeking a highly skilled Data Engineer with strong expertise in DBT, Java, Apache Airflow, and DAG (Directed Acyclic Graph) design to join our data platform team. You will be responsible for building robust data pipelines, designing and managing workflow DAGs, and ensuring scalable data transformations to support analytics and business intelligence. Key Responsibilities: Design, implement, and optimize ETL/ELT pipelines using DBT for data modeling and transformation. Develop backend components and data processing logic using Java. Build and maintain DAGs in Apache Ai...

Posted Date not available


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
