682 Apache Airflow Jobs - Page 8

Set up a job alert
JobPe aggregates job listings for easy access; applications are submitted directly on the original job portal.

7.0 - 8.0 years

10 - 15 Lacs

Bengaluru

Remote

Contract Duration: 4 Months (Extendable based on Performance)
Job Timings: India Evening Shift (till 11:30 PM IST)

We are looking for an experienced Databricks Tech Lead to join our team on a 4-month extendable contract. The ideal candidate will bring deep expertise in data engineering, big data platforms, and cloud-based data warehouse solutions, with the ability to work in a fast-paced remote environment.

Key Responsibilities:
- Lead the design, optimization, and management of large-scale data pipelines using Databricks, Spark (PySpark), and AWS data services.
- Productionize and deploy Big Data platforms and applications across multi-cloud environments (AWS, Azure, GCP).
- Build and ...

Posted 3 weeks ago


4.0 - 5.0 years

3 - 7 Lacs

Surat

Remote

Job Title: Databricks Developer (Contract)
Contract Duration: 4 Months (Extendable based on Performance)
Job Location: Remote
Job Timings: India Evening Shift (till 11:30 PM IST)
Experience Required: 4+ Years

Job Description:
We are seeking a skilled Databricks Developer to join our team on a 4-month contract basis. The ideal candidate will have strong expertise in modern data engineering technologies and the ability to work in a fast-paced, remote environment.

Key Responsibilities:
- Develop, optimize, and manage large-scale data pipelines using Databricks, PySpark, DBT, and AWS S3/Glue/Redshift.
- Work in multi-cloud environments including AWS, Azure, and GCP.
- Implement workflow o...

Posted 3 weeks ago


7.0 - 8.0 years

10 - 15 Lacs

Thane

Remote

Job Title: Databricks Tech Lead (Contract)
Contract Duration: 4 Months (Extendable based on Performance)
Job Location: Remote
Job Timings: India Evening Shift (till 11:30 PM IST)
Experience Required: 7+ Years

Job Description:
We are looking for an experienced Databricks Tech Lead to join our team on a 4-month extendable contract. The ideal candidate will bring deep expertise in data engineering, big data platforms, and cloud-based data warehouse solutions, with the ability to work in a fast-paced remote environment.

Key Responsibilities:
- Lead the design, optimization, and management of large-scale data pipelines using Databricks, Spark (PySpark), and AWS data services.
- Productio...

Posted 3 weeks ago


7.0 - 8.0 years

10 - 15 Lacs

Ahmedabad

Remote

Job Title: Databricks Tech Lead (Contract)
Contract Duration: 4 Months (Extendable based on Performance)
Job Location: Remote
Job Timings: India Evening Shift (till 11:30 PM IST)
Experience Required: 7+ Years

Job Description:
We are looking for an experienced Databricks Tech Lead to join our team on a 4-month extendable contract. The ideal candidate will bring deep expertise in data engineering, big data platforms, and cloud-based data warehouse solutions, with the ability to work in a fast-paced remote environment.

Key Responsibilities:
- Lead the design, optimization, and management of large-scale data pipelines using Databricks, Spark (PySpark), and AWS data services.
- Productio...

Posted 3 weeks ago


7.0 - 8.0 years

10 - 15 Lacs

Mumbai

Remote

Job Title: Databricks Tech Lead (Contract)
Contract Duration: 4 Months (Extendable based on Performance)
Job Location: Remote
Job Timings: India Evening Shift (till 11:30 PM IST)
Experience Required: 7+ Years

Job Description:
We are looking for an experienced Databricks Tech Lead to join our team on a 4-month extendable contract. The ideal candidate will bring deep expertise in data engineering, big data platforms, and cloud-based data warehouse solutions, with the ability to work in a fast-paced remote environment.

Key Responsibilities:
- Lead the design, optimization, and management of large-scale data pipelines using Databricks, Spark (PySpark), and AWS data services.
- Productio...

Posted 3 weeks ago


7.0 - 8.0 years

10 - 15 Lacs

Ludhiana

Remote

We are looking for an experienced Databricks Tech Lead to join our team on a 4-month extendable contract. The ideal candidate will bring deep expertise in data engineering, big data platforms, and cloud-based data warehouse solutions, with the ability to work in a fast-paced remote environment.

Key Responsibilities:
- Lead the design, optimization, and management of large-scale data pipelines using Databricks, Spark (PySpark), and AWS data services.
- Productionize and deploy Big Data platforms and applications across multi-cloud environments (AWS, Azure, GCP).
- Build and manage data warehouse solutions, schema evolution, and data versioning.
- Implement and manage workflow orchestration u...

Posted 3 weeks ago


4.0 - 5.0 years

3 - 7 Lacs

Agra

Remote

We are seeking a skilled Databricks Developer to join our team on a 4-month contract basis. The ideal candidate will have strong expertise in modern data engineering technologies and the ability to work in a fast-paced, remote environment.

Key Responsibilities:
- Develop, optimize, and manage large-scale data pipelines using Databricks, PySpark, DBT, and AWS S3/Glue/Redshift.
- Work in multi-cloud environments including AWS, Azure, and GCP.
- Implement workflow orchestration using Airflow or similar frameworks.
- Design, implement, and manage data warehouse solutions, schema evolution, and data versioning.
- Collaborate with cross-functional teams to deliver high-quality data solutions.

Req...

Posted 3 weeks ago


7.0 - 8.0 years

10 - 15 Lacs

Lucknow

Remote

We are looking for an experienced Databricks Tech Lead to join our team on a 4-month extendable contract. The ideal candidate will bring deep expertise in data engineering, big data platforms, and cloud-based data warehouse solutions, with the ability to work in a fast-paced remote environment.

Key Responsibilities:
- Lead the design, optimization, and management of large-scale data pipelines using Databricks, Spark (PySpark), and AWS data services.
- Productionize and deploy Big Data platforms and applications across multi-cloud environments (AWS, Azure, GCP).
- Build and manage data warehouse solutions, schema evolution, and data versioning.
- Implement and manage workflow orchestration u...

Posted 3 weeks ago


7.0 - 8.0 years

10 - 15 Lacs

Hyderabad

Remote

We are looking for an experienced Databricks Tech Lead to join our team on a 4-month extendable contract. The ideal candidate will bring deep expertise in data engineering, big data platforms, and cloud-based data warehouse solutions, with the ability to work in a fast-paced remote environment.

Key Responsibilities:
- Lead the design, optimization, and management of large-scale data pipelines using Databricks, Spark (PySpark), and AWS data services.
- Productionize and deploy Big Data platforms and applications across multi-cloud environments (AWS, Azure, GCP).
- Build and manage data warehouse solutions, schema evolution, and data versioning.
- Implement and manage workflow orchestration u...

Posted 3 weeks ago


4.0 - 5.0 years

3 - 7 Lacs

Chennai

Remote

We are seeking a skilled Databricks Developer to join our team on a 4-month contract basis. The ideal candidate will have strong expertise in modern data engineering technologies and the ability to work in a fast-paced, remote environment.

Key Responsibilities:
- Develop, optimize, and manage large-scale data pipelines using Databricks, PySpark, DBT, and AWS S3/Glue/Redshift.
- Work in multi-cloud environments including AWS, Azure, and GCP.
- Implement workflow orchestration using Airflow or similar frameworks.
- Design, implement, and manage data warehouse solutions, schema evolution, and data versioning.
- Collaborate with cross-functional teams to deliver high-quality data solutions.

Req...

Posted 3 weeks ago


4.0 - 5.0 years

3 - 7 Lacs

Kanpur

Remote

We are seeking a skilled Databricks Developer to join our team on a 4-month contract basis. The ideal candidate will have strong expertise in modern data engineering technologies and the ability to work in a fast-paced, remote environment.

Key Responsibilities:
- Develop, optimize, and manage large-scale data pipelines using Databricks, PySpark, DBT, and AWS S3/Glue/Redshift.
- Work in multi-cloud environments including AWS, Azure, and GCP.
- Implement workflow orchestration using Airflow or similar frameworks.
- Design, implement, and manage data warehouse solutions, schema evolution, and data versioning.
- Collaborate with cross-functional teams to deliver high-quality data solutions.

Req...

Posted 3 weeks ago


10.0 - 14.0 years

0 Lacs

Karnataka

On-site

As an experienced Lead Data Engineer, your role will involve driving data engineering initiatives, architecting scalable data solutions, and leading a team of engineers. You specialize in Snowflake, Matillion, and cloud platforms (Azure/AWS) and have expertise in BI tools such as Power BI and Tableau. You are required to have strong SQL skills, analytical capabilities, excellent communication, and leadership abilities.

Your responsibilities will include:
- Designing and developing scalable, high-performance data solutions using Snowflake, Matillion, and cloud environments (Azure/AWS).
- Building and optimizing ETL/ELT pipelines leveraging Matillion, Snowflake native capabilities, dbt, and other da...

Posted 3 weeks ago


2.0 - 6.0 years

0 Lacs

Karnataka

On-site

As a Data Engineer II, your role involves managing the deprecation of migrated workflows and ensuring the seamless migration of workflows into new systems. Your expertise in building and maintaining scalable data pipelines, both on-premises and on the cloud, will be crucial. You should have a deep understanding of input and output data sources, upstream and downstream dependencies, and data quality assurance. Proficiency in tools like Git, Apache Airflow, Apache Spark, SQL, data migration, and data validation is essential for this role.

Your key responsibilities will include:
- Workflow Deprecation:
  - Evaluate current workflows' dependencies and consumption for deprecation.
  - Identify, mark, and...

Posted 3 weeks ago


5.0 - 9.0 years

0 Lacs

Hyderabad, Telangana

On-site

As an experienced IT professional with over 5 years of expertise, you possess the skills required to excel at building data pipelines using Apache Airflow. Your proficiency in developing a codebase for generating data pipelines, specifically in Apache Airflow, showcases your strong technical background.

Key Responsibilities:
- Develop and orchestrate scalable data pipelines in Apache Airflow using Python
- Process records from APIs and load them into GCP BigQuery
- Demonstrate understanding and the ability to leverage scalable pipelines effectively
- Utilize the Python programming language for data pipeline development
- Exposure to ML is preferred, although not mandatory
- Utiliz...
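For flavor, the pipeline pattern this listing describes (pull records from an API, reshape them, load them into a warehouse such as BigQuery) reduces to an extract-transform-load loop. A minimal stdlib-only sketch, with hypothetical field names and a stubbed loader standing in for the BigQuery client:

```python
import json

def transform(record: dict) -> dict:
    """Reshape one raw API record into a warehouse row (hypothetical fields)."""
    return {
        "user_id": int(record["id"]),
        "country": record.get("country", "unknown").upper(),
    }

def load(rows: list[dict]) -> int:
    """Stub loader; a real pipeline would call the BigQuery client here."""
    print(json.dumps(rows))
    return len(rows)

def run_pipeline(raw_records: list[dict]) -> int:
    """Extract is assumed done upstream; transform each record, then load."""
    rows = [transform(r) for r in raw_records]
    return load(rows)

loaded = run_pipeline([{"id": "1", "country": "in"}, {"id": "2"}])
```

In Airflow, `transform` and `load` would typically become separate tasks so that each step can be retried independently.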

Posted 3 weeks ago


7.0 - 12.0 years

15 - 20 Lacs

Noida

Hybrid

Job Title: Senior Data Engineer, Microsoft Fabric + SSIS Specialist
Location: Noida, Sector 62 (Onsite/Hybrid, Full-Time, Long-Term Project)
Experience Required: 7+ Years
Availability: Immediate Joiners Only

Role Overview:
We are looking for a highly skilled Senior Data Engineer with 7+ years of experience as a Microsoft Fabric Specialist to lead the design, development, and optimization of modern data solutions. The role involves working extensively with Azure Fabric, ADF, Databricks, SSIS, Power BI, and PySpark to build scalable, high-performance data pipelines and analytics systems.

Key Responsibilities:
Lead the design, development, and implementation of Fabric Pipelines, SSIS Packages, Da...

Posted 3 weeks ago


3.0 - 7.0 years

0 Lacs

Ahmedabad, Gujarat

On-site

As a Senior Data Engineer at Trading Technologies, you will be responsible for developing scalable data pipelines and infrastructure, with a focus on data engineering technologies such as Snowflake, AWS, ETL processes, and tools.

**Key Responsibilities:**
- Data Pipeline Development:
  - Build, optimize, and maintain ETL/ELT pipelines to process large-scale data from various sources.
  - Implement data ingestion and transformation workflows using tools like Fivetran, Apache Airflow, and DBT.
- Cloud Integration:
  - Utilize AWS services (S3, Glue, Athena, Lambda) for efficient data storage and processing.
  - Collaborate with DevOps to ensure cloud resources are optimized for performance and cost.
- ...

Posted 3 weeks ago


10.0 - 14.0 years

0 Lacs

Pune, Maharashtra

On-site

Job Description: As a Principal Software Engineer, you will provide technical leadership within development teams, ensuring architectural coherence, collaboration, and best practices. Your role involves bridging the gap between data engineering, data science, and software engineering to help build scalable, maintainable data solutions. You will design and implement solutions, mentor developers, influence technical direction, and drive best practices. Additionally, you will have line management responsibilities, focusing on the growth and development of team members. The role will involve working in an Azure and Databricks environment, leveraging cloud-native technologies and modern data plat...

Posted 3 weeks ago


7.0 - 11.0 years

0 Lacs

Coimbatore, Tamil Nadu

On-site

As a highly skilled and motivated Cloud Data Engineering Manager at Merkle, your role is critical to the development of a cutting-edge reporting platform designed to measure and optimize online marketing campaigns on Google Cloud Platform (GCP).

**Key Responsibilities:**
- **Data Engineering & Development:**
  - Design, build, and maintain scalable ETL/ELT pipelines for ingesting, processing, and transforming structured and unstructured data.
  - Implement enterprise-level data solutions using GCP services such as BigQuery, Dataform, Cloud Storage, Dataflow, Cloud Functions, Cloud Pub/Sub, and Cloud Composer.
  - Develop and optimize data architectures that support real-time and batch data process...

Posted 3 weeks ago


5.0 - 9.0 years

0 Lacs

Haryana

On-site

As a Data Engineer with 5 to 7 years of experience, your role will involve architecting and maintaining scalable, secure, and reliable data platforms and pipelines. You will be responsible for designing and implementing data lake/data warehouse solutions using technologies such as Redshift, BigQuery, Snowflake, or Delta Lake.

Your key responsibilities will include:
- Building real-time and batch data pipelines using tools like Apache Airflow, Kafka, Spark, and DBT
- Ensuring data governance, lineage, quality, and observability
- Collaborating with stakeholders to define data strategies, architecture, and KPIs
- Leading code reviews and enforcing best practices
- Mentoring junior and mid-leve...

Posted 4 weeks ago


6.0 - 10.0 years

0 Lacs

Karnataka

On-site

You are an experienced GCP Analytics professional with 6+ years of hands-on expertise in data engineering. You are capable of designing, developing, and optimizing cloud-based data solutions using cutting-edge tools on the Google Cloud Platform.

**Key Responsibilities:**
- Design and implement robust data pipelines and workflows on GCP
- Optimize and scale data processes using BigQuery and PySpark
- Collaborate with cross-functional teams to deliver real-time data solutions
- Leverage Apache Airflow for orchestration and automation
- Apply strong programming skills in Python, Java, and Scala for data transformation

**Qualifications Required:**
- 6+ years of experience in data engineering/ana...

Posted 4 weeks ago


7.0 - 11.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Technology Stack Administrator at Bajaj Allianz Life Insurance, your role involves managing and optimizing the on-premises infrastructure comprising RedHat OpenShift, Apache Spark, Apache Airflow, Delta Lake, Apache Kafka, and Debezium. Your responsibilities include:

- **Platform Administration:**
  - Install, configure, and maintain RedHat OpenShift clusters in an on-premises environment.
  - Build and orchestrate data pipelines using Apache Airflow.
  - Implement and manage Delta Lake for ACID-compliant data lake architecture.
  - Manage Apache Spark clusters for distributed data processing.
  - Administer and maintain Apache Spark clusters, ensuring optimal performance and resource utilization...

Posted 1 month ago


3.0 - 7.0 years

0 Lacs

Haryana

On-site

Role Overview:
As a Consultant Engineer for the Liquidity Program in Gurugram, India, you will play a crucial role in designing and building liquidity calculations using the bespoke Data Calculation Platform (DCP) based on documented business requirements. You will be part of a team located in both Sydney and Gurugram, reporting to a manager in Gurugram with project leadership in Sydney. Your primary focus will be ingesting data from producers and implementing essential liquidity calculations within a cutting-edge data platform, ensuring high performance, robustness, and stability to meet the business needs of internal and external stakeholders.

Key Responsibilities:
- Utilize your expertise...

Posted 1 month ago


2.0 - 6.0 years

0 Lacs

Karnataka

On-site

As a Cloud Solution Delivery Sr Advisor at NTT DATA, your role involves leading and directing a team of engineers in implementing end-to-end data solutions using AWS services such as Lambda, S3, Snowflake, DBT, and Apache Airflow. Your responsibilities include cataloguing data, collaborating with cross-functional teams to translate business requirements into technical solutions, and providing documentation for downstream teams to develop, test, and run data products efficiently. You will also be involved in testing tooling, delivering CI/CD and IaC, defining re-usable pipelines using DBT projects, and ensuring the best practice use of version control and Agile methodologies.

Key Responsibili...

Posted 1 month ago


5.0 - 9.0 years

0 Lacs

Haryana

On-site

As a Data Engineer at our company, you will play a critical role in designing, developing, and maintaining data pipeline architecture to ensure the efficient flow of data across the organization.

**Key Responsibilities:**
- Develop robust and scalable data pipelines using AWS Glue, Apache Airflow, and other relevant technologies.
- Integrate various data sources, including SAP HANA, Kafka, and SQL databases, for seamless data flow and processing.
- Optimize data pipelines for performance and reliability.

**Data Management And Transformation:**
- Design and implement data transformation processes to clean, enrich, and structure data for analytical purposes.
- Utilize SQL and Python for data e...

Posted 1 month ago


5.0 - 10.0 years

30 - 35 Lacs

Pune

Hybrid

Greetings! Full-time employment with a product-based company.

Job details:
Role: Sr Data Engineer
Job Type: Full-Time Opportunity
Experience: 6+ years
Location: Pune
Work Mode: Hybrid

We are seeking a highly skilled Data Engineer with strong expertise in DBT, Java, Apache Airflow, and DAG (Directed Acyclic Graph) design to join our data platform team. You will be responsible for building robust data pipelines, designing and managing workflow DAGs, and ensuring scalable data transformations to support analytics and business intelligence.

Required Skills & Qualifications:
- 4+ years of hands-on experience in data engineering roles.
- Strong experience with DBT for modular, testable, and ver...
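The "DAG design" skill this listing asks for boils down to expressing tasks and their upstream dependencies as a directed acyclic graph and executing them in dependency order, which is what Airflow's scheduler does internally. A minimal sketch using only the Python standard library (the task names are hypothetical, not taken from the listing):

```python
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

# Hypothetical pipeline tasks mapped to their upstream dependencies.
dag = {
    "extract_orders": set(),
    "extract_customers": set(),
    "dbt_staging": {"extract_orders", "extract_customers"},
    "dbt_marts": {"dbt_staging"},
    "publish_report": {"dbt_marts"},
}

# A valid execution order runs every task after all of its upstreams.
order = list(TopologicalSorter(dag).static_order())
print(order)
```

Acyclicity matters: if `dbt_staging` also depended on `dbt_marts`, `static_order()` would raise a `CycleError`, which is the same reason Airflow rejects cyclic task graphs.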

Posted 1 month ago
