682 Apache Airflow Jobs - Page 4

Set up a Job Alert
JobPe aggregates results for easy access, but applications are submitted directly on each job portal.

7.0 - 11.0 years

0 Lacs

pune, maharashtra

On-site

As a skilled and experienced Technology Stack Administrator, your role will involve managing and optimizing the on-premises infrastructure comprising RedHat OpenShift, Apache Spark, Apache Airflow, Delta Lake, Apache Kafka, and Debezium. Your responsibilities will include: - Platform Administration: - Installing, configuring, and maintaining RedHat OpenShift clusters in an on-premises environment. - Building and orchestrating data pipelines using Apache Airflow. - Implementing and managing Delta Lake for ACID-compliant data lake architecture. - Managing Apache Spark clusters for distributed data processing. - Developing real-time data streaming and CDC solutions using Apache Kafka and Debezi...
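The posting pairs Apache Kafka with Debezium for real-time CDC. As a conceptual sketch only (not part of the listing), a Debezium change event envelope carries `before`/`after` row images plus an `op` code; a minimal consumer-side handler applying such events to local state might look like this, with the table and `id` field being illustrative assumptions:

```python
import json

# Minimal sketch of applying a Debezium change event (payload portion).
# Field names follow Debezium's documented envelope: "before", "after",
# and "op" ("c"=create, "u"=update, "d"=delete, "r"=snapshot read).
def apply_change(event_json: str, table: dict) -> dict:
    payload = json.loads(event_json)["payload"]
    op, before, after = payload["op"], payload["before"], payload["after"]
    if op in ("c", "r", "u"):   # upsert the new row image
        table[after["id"]] = after
    elif op == "d":             # remove the deleted row
        table.pop(before["id"], None)
    return table

event = json.dumps({"payload": {
    "op": "u",
    "before": {"id": 1, "status": "new"},
    "after": {"id": 1, "status": "shipped"},
}})
state = apply_change(event, {1: {"id": 1, "status": "new"}})
```

In a real deployment the events would arrive from Kafka topics populated by a Debezium connector; the parsing logic above is the same either way.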

Posted 5 days ago

8.0 - 12.0 years

10 - 20 Lacs

pune, chennai, bengaluru

Hybrid

Primary Skills Databricks, Python, Dimensional Modelling & Data warehousing Design, develop, and optimize scalable data pipelines and frameworks on Azure Cloud using services such as Azure Data Lake Gen2, Implement robust ETL/ELT workflows leveraging Databricks and orchestrate data jobs using Apache Airflow. Apply advanced dimensional modelling techniques to support analytics and reporting requirements. Deploy solutions using CI/CD pipelines for seamless deployment and integration of data engineering solutions. Ensure high data quality and system reliability by implementing automated testing, deployment, and monitoring practices. Collaborate with cross-functional teams to gather requirements...
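This role leans on dimensional modelling for analytics. As an illustrative sketch with hypothetical table and column names (using stdlib sqlite3 in place of a warehouse), a star schema keeps measures in a fact table keyed into dimension tables, which analytics queries then join and aggregate:

```python
import sqlite3

# Star-schema sketch (hypothetical names): one fact table with
# foreign keys into dimension tables.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, full_date TEXT);
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE fact_sales (
        date_key INTEGER REFERENCES dim_date(date_key),
        product_key INTEGER REFERENCES dim_product(product_key),
        amount REAL
    );
""")
con.execute("INSERT INTO dim_date VALUES (20240101, '2024-01-01')")
con.execute("INSERT INTO dim_product VALUES (1, 'widget')")
con.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                [(20240101, 1, 9.5), (20240101, 1, 0.5)])
# Typical analytics query: aggregate facts, slice by dimension attributes.
total = con.execute("""
    SELECT d.full_date, p.name, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_date d USING (date_key)
    JOIN dim_product p USING (product_key)
    GROUP BY d.full_date, p.name
""").fetchone()
```

The same layout translates directly to Databricks/Delta tables; only the engine changes, not the modelling technique.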

Posted 1 week ago

8.0 - 12.0 years

4 - 8 Lacs

chennai

Work from Office

Job Title: Python Lead Developer. Location: Chennai, Hybrid (3 Days a Week in Office). Experience: 8-10 Years. Employment Type: Contract. Shift: UK Shift (if applicable). Job Summary: We are looking for a highly skilled Python Lead Developer with strong expertise in Python, SQL, Apache Airflow, and AWS services (S3, Lambda, EC2, Glue, Redshift, RDS, etc.) for deployment, data storage, and automation. The ideal candidate will lead a team of developers to build scalable data pipelines, backend systems, and cloud-based solutions, ensuring high performance, maintainability, and adherence to best practices. Key Responsibilities: Lead and mentor a team of Python developers,...
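The role lists Lambda and S3 for automation. A Lambda handler is plain Python, so the pattern can be sketched without AWS access; the event shape below mirrors S3 "ObjectCreated" notifications, while the bucket and key names are hypothetical:

```python
# Sketch of an AWS Lambda-style handler (hypothetical bucket/key names).
# In a real deployment the function is wired to the bucket via a trigger
# and would typically use boto3 to read the object, omitted here.
def lambda_handler(event, context=None):
    processed = []
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        processed.append(f"s3://{bucket}/{key}")
    return {"statusCode": 200, "processed": processed}

result = lambda_handler({"Records": [
    {"s3": {"bucket": {"name": "raw-data"}, "object": {"key": "orders/2024.csv"}}}
]})
```

Keeping the handler a pure function of the event, as above, makes it straightforward to unit-test locally before deployment.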

Posted 1 week ago

5.0 - 10.0 years

25 - 37 Lacs

bengaluru

Work from Office

About Position: We are seeking a Data Analyst with hands-on experience in Databricks, PySpark, AWS, Apache Airflow, ETL, and SQL. Role: Data Analyst Location: Bengaluru Experience: 5 to 11 years Job Type: Full Time Employment What You'll Do: Designing, creating, testing, and maintaining the complete data management & processing systems. Working closely with the stakeholders & solution architect. Ensuring architecture meets the business requirements. Building highly scalable, robust & fault-tolerant systems. Discovering data acquisition opportunities. Finding ways & methods to derive value from existing data. Improving data quality, reliability & efficiency of the individual components & t...

Posted 1 week ago

6.0 - 10.0 years

0 Lacs

pune, maharashtra

On-site

As a Full Stack MLOps Developer/Engineer, you will be responsible for designing, developing, and deploying scalable machine learning solutions and cloud-native applications, ensuring seamless integration and automation across platforms. Your role will involve collaborating with data scientists and DevOps teams to automate ML workflows and deployment processes. Additionally, you will be involved in developing and maintaining end-to-end full stack applications with a focus on ML deployment and automation. Key Responsibilities: - Develop and maintain end-to-end full stack applications with a focus on ML deployment and automation. - Design, implement, and manage scalable MLOps pipelines utilizin...

Posted 1 week ago

8.0 - 12.0 years

0 Lacs

haryana

On-site

Role Overview: As the Team Lead of Data Engineering, you will be responsible for driving the design, development, and optimization of data engineering pipelines and services with a strong software engineering mindset. Your role will also involve fostering growth and collaboration within the team, mentoring junior colleagues, and refining processes to maintain excellence. If you are eager to have a lasting impact both technically and as a team leader, we are excited to meet you. Key Responsibilities: - Utilize very strong software engineering principles to write maintainable, scalable code, primarily in Python (experience with Rust or C# is a plus). - Perform data processing and manipulation ...

Posted 1 week ago

8.0 - 12.0 years

0 Lacs

hyderabad, telangana

On-site

As an Engineering Manager (Data Engineering) at Amgen, your role will involve leading and mentoring a team of data engineers in the R&D domain of biotech or pharma companies. You will foster a culture of innovation, collaboration, and continuous learning to solve complex problems within the R&D division. Your responsibilities will include overseeing the development of data extraction, validation, and transformation techniques to ensure high-quality data for downstream systems. Additionally, you will guide the team in writing and validating high-quality code for data processing and transformation, as well as drive the development of data tools and frameworks for efficient data management acro...

Posted 1 week ago

3.0 - 7.0 years

0 Lacs

karnataka

On-site

As a Data Specialist, you will play a crucial role in handling data engineering tasks with a focus on GCP cloud data implementation. Your responsibilities will include: - Utilizing ETL Fundamentals, SQL, BigQuery, Dataproc, Python, Data Catalog, Data Warehousing, Composer, Dataflow, Cloud Trace, Cloud Logging, Cloud Storage, Datafusion, and other tools in the GCP cloud data implementation suite. - Demonstrating expertise in very large-scale data architecture, solutioning, and operationalization of data warehouses, data lakes, and analytics platforms. - Performing hands-on tasks with technologies like GBQ Query, Python, Apache Airflow, and SQL, with a preference for BigQuery. - Working extens...

Posted 1 week ago

3.0 - 7.0 years

0 Lacs

haryana

On-site

As a skilled Data Engineer with experience in Google Cloud Platform (GCP), PySpark, SQL, and ETL processes, your role will involve building, optimizing, and maintaining scalable data pipelines and workflows. You will utilize technologies like Apache Airflow, PySpark, and other cloud-native tools. Key Responsibilities: - Design, develop, and maintain efficient and scalable data pipelines using PySpark and SQL. - Build and manage workflows/orchestration with Apache Airflow. - Utilize GCP services such as BigQuery, Cloud Storage, Dataflow, and Composer. - Implement and optimize ETL processes to ensure data quality, consistency, and reliability. - Collaborate with data analysts, data scientists,...
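Several of these roles centre on orchestrating workflows with Airflow DAGs. Conceptually, Airflow resolves declared task dependencies into a valid execution order before scheduling; the idea can be sketched with the stdlib alone (this is not Airflow's actual API, which declares dependencies through operators and `>>`):

```python
from graphlib import TopologicalSorter

# Conceptual sketch: map each task to the set of tasks it depends on,
# then resolve a valid execution order. Airflow's scheduler does the
# equivalent over operator dependencies; task names here are illustrative.
dag = {
    "extract": set(),
    "transform": {"extract"},
    "quality_check": {"transform"},
    "load": {"quality_check"},
}
order = list(TopologicalSorter(dag).static_order())
```

Because this example is a single chain, only one order is valid; real DAGs usually fan out, and any topologically consistent order (or parallel execution of independent tasks) is acceptable.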

Posted 1 week ago

5.0 - 22.0 years

0 Lacs

maharashtra

On-site

Role Overview: As a Data Architect at DATAECONOMY, you will be responsible for leading the design and implementation of scalable, secure, and high-performance data solutions. Your expertise in data architecture, cloud platforms, and modern data frameworks will be crucial in guiding clients and teams in building next-generation data platforms. Key Responsibilities: - Lead end-to-end design and architecture of enterprise-scale data solutions across cloud and hybrid environments. - Define and implement data strategies, governance, and architecture best practices aligned to business needs. - Architect and optimize data pipelines, ETL/ELT frameworks, and real-time data processing solutions. - Pro...

Posted 1 week ago

3.0 - 7.0 years

0 Lacs

karnataka

On-site

As a Cloud Developer specializing in Google Cloud Platform (GCP), Python, and Apache Airflow, your role at Hireflex will involve designing and developing cloud-native applications and APIs. You will work with frameworks such as Django, FastAPI, or Flask, deploying microservices on GCP Cloud Run with Docker, and managing data workflows using Apache Airflow. Your responsibilities will also include maintaining and securing cloud infrastructure using various GCP services, implementing best practices in GCP security, and collaborating with cross-functional teams in an Agile/Scrum environment. Key Responsibilities: - Design and develop cloud-native applications and APIs using Python and frameworks...

Posted 2 weeks ago

5.0 - 9.0 years

0 Lacs

haryana

On-site

As a Data Science Team Lead at Onecom, you will have the opportunity to lead a talented team of data engineers, shape the development and management of internal and external data, and drive technical excellence and continuous improvement. If you are a data-driven leader with a passion for building scalable solutions and empowering teams, this role is perfect for you. **Key Responsibilities:** - Lead and mentor a team of data engineers, supporting technical growth and career development - Define and prioritize team goals aligned with business strategy - Oversee project planning, resource allocation, and delivery timelines - Translate business requirements into scalable technical solutions - E...

Posted 2 weeks ago

6.0 - 10.0 years

0 Lacs

kochi, kerala

On-site

As a Data Engineer with over 6 years of experience, you will be responsible for designing and implementing scalable data solutions. Your deep expertise in cloud data warehousing, ETL/ELT processes, data modeling, and business intelligence will play a crucial role in this position. Key Responsibilities: - Support the design and implementation of end-to-end data solutions by leveraging technologies such as AWS Redshift, Apache Airflow, dbt, and other modern data tools including Databricks. - Develop data models and implement data pipelines to ingest, transform, and load data from various sources into the data warehouse. - Create and maintain Apache Airflow DAGs to orchestrate complex data work...

Posted 2 weeks ago

8.0 - 12.0 years

0 Lacs

karnataka

On-site

Role Overview: As a Senior Data Engineer, your primary responsibility is to design and optimize data pipelines, ETL workflows, and cloud-based data solutions using your expertise in Databricks, Apache Airflow, SQL, and Python. You play a crucial role in driving efficient and scalable data engineering solutions by implementing big data processing techniques and automation. Key Responsibilities: - Design, develop, and optimize data pipelines utilizing Databricks and Apache Airflow. - Implement PySpark-based transformations and processing in Databricks for handling large-scale data efficiently. - Develop and maintain SQL-based data pipelines, focusing on performance tuning and optimization. - C...

Posted 2 weeks ago

2.0 - 6.0 years

0 Lacs

andhra pradesh

On-site

As an experienced ETL Developer, your role will involve developing and optimizing scalable data pipelines for efficient extraction, transformation, and loading (ETL) of data from diverse sources. You will leverage technologies like Spark, Dask, and other modern ETL frameworks to ensure smooth data processing. **Key Responsibilities:** - Developing and implementing data quality controls, including monitoring and remediation processes - Providing technical guidance and mentorship to junior ETL developers - Collaborating with the infrastructure team to ensure the availability, reliability, and scalability of ETL solutions - Participating in code reviews and contributing to the development of co...
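The posting's first responsibility is data quality controls with monitoring and remediation. A minimal stdlib sketch of row-level validation (the rules and field names are hypothetical) separates failing rows for remediation instead of silently dropping them:

```python
# Minimal row-level data quality check (hypothetical rules/field names).
# Each rule returns True for a valid row; failing rows are collected
# together with the names of the rules they violated.
RULES = {
    "id_present": lambda row: row.get("id") is not None,
    "amount_non_negative": lambda row: row.get("amount", 0) >= 0,
}

def validate(rows):
    valid, failures = [], []
    for row in rows:
        failed = [name for name, rule in RULES.items() if not rule(row)]
        if failed:
            failures.append((row, failed))
        else:
            valid.append(row)
    return valid, failures

good, bad = validate([{"id": 1, "amount": 10}, {"id": None, "amount": -5}])
```

In a production pipeline the `failures` list would feed a monitoring dashboard or remediation queue rather than being returned in memory.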

Posted 2 weeks ago

5.0 - 8.0 years

5 - 8 Lacs

mumbai, delhi / ncr, bengaluru

Work from Office

We are seeking a Data Integration Engineer with 5-8 years of experience in building and orchestrating data pipelines using Apache Airflow and integrating data into Snowflake. The role involves designing and maintaining Airflow DAGs, integrating structured and unstructured data via JDBC connectors, REST APIs, and flat file ingestion. The candidate should have strong hands-on experience with Postman for API validation, SQL encryption/decryption techniques for securing sensitive datasets, and Python for transformation and ETL logic. Responsibilities include implementing detailed logging, monitoring, and alerting for pipeline reliability, conducting data quality checks such as row-level validation a...
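The listing asks for techniques to secure sensitive datasets. A related (but deliberately simpler) technique than reversible encryption is deterministic one-way masking, sketched below with stdlib hashlib; the column names and salt are hypothetical, and real encryption/decryption would require a proper key-management setup:

```python
import hashlib

# Sketch: deterministic one-way masking of sensitive columns before rows
# leave the pipeline. NOTE: this is hashing, not reversible encryption;
# column names and the salt are illustrative assumptions.
SALT = b"pipeline-demo-salt"

def mask_row(row, sensitive=("email", "phone")):
    masked = dict(row)
    for col in sensitive:
        if col in masked and masked[col] is not None:
            digest = hashlib.sha256(SALT + str(masked[col]).encode()).hexdigest()
            masked[col] = digest[:16]  # shortened for readability
    return masked

row = mask_row({"id": 7, "email": "user@example.com"})
```

Determinism matters here: the same input always masks to the same token, so joins and row-level validation across datasets still work on masked values.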

Posted 2 weeks ago

5.0 - 8.0 years

5 - 8 Lacs

mumbai, delhi / ncr, bengaluru

Work from Office

We are seeking a Data Integration Engineer with 5-8 years of experience in building and orchestrating data pipelines using Apache Airflow and integrating data into Snowflake. The role involves designing and maintaining Airflow DAGs, integrating structured and unstructured data via JDBC connectors, REST APIs, and flat file ingestion. The candidate should have strong hands-on experience with Postman for API validation, SQL encryption/decryption techniques for securing sensitive datasets, and Python for transformation and ETL logic. Responsibilities include implementing detailed logging, monitoring, and alerting for pipeline reliability, conducting data quality checks such as row-level validation...

Posted 2 weeks ago

4.0 - 5.0 years

3 - 6 Lacs

nashik

Work from Office

Job Title: Databricks Developer (Contract). Contract Duration: 4 Months (Extendable based on Performance). Job Location: Remote. Job Timings: India Evening Shift (till 11:30 PM IST). Experience Required: 4+ Years. We are seeking a skilled Databricks Developer to join our team on a 4-month contract basis. The ideal candidate will have strong expertise in modern data engineering technologies and the ability to work in a fast-paced, remote environment. Key Responsibilities: - Develop, optimize, and manage large-scale data pipelines using Databricks, PySpark, DBT, and AWS S3/Glue/Redshift. - Work in multi-cloud environments including AWS, Azure, and GCP. - Implement workflow orchestration using A...

Posted 2 weeks ago

4.0 - 5.0 years

3 - 6 Lacs

nagpur

Work from Office

Job Title: Databricks Developer (Contract). Contract Duration: 4 Months (Extendable based on Performance). Job Location: Remote. Job Timings: India Evening Shift (till 11:30 PM IST). Experience Required: 4+ Years. We are seeking a skilled Databricks Developer to join our team on a 4-month contract basis. The ideal candidate will have strong expertise in modern data engineering technologies and the ability to work in a fast-paced, remote environment. Key Responsibilities: - Develop, optimize, and manage large-scale data pipelines using Databricks, PySpark, DBT, and AWS S3/Glue/Redshift. - Work in multi-cloud environments including AWS, Azure, and GCP. - Implement workflow orchestration using A...

Posted 2 weeks ago

7.0 - 8.0 years

8 - 12 Lacs

surat

Work from Office

Job Title: Databricks Tech Lead (Contract). Contract Duration: 4 Months (Extendable based on Performance). Job Location: Remote. Job Timings: India Evening Shift (till 11:30 PM IST). Experience Required: 7+ Years. Job Description: We are looking for an experienced Databricks Tech Lead to join our team on a 4-month extendable contract. The ideal candidate will bring deep expertise in data engineering, big data platforms, and cloud-based data warehouse solutions, with the ability to work in a fast-paced remote environment. Key Responsibilities: - Lead the design, optimization, and management of large-scale data pipelines using Databricks, Spark (PySpark), and AWS data services. - Productioni...

Posted 2 weeks ago

4.0 - 5.0 years

3 - 6 Lacs

ahmedabad

Work from Office

Job Title: Databricks Developer (Contract). Contract Duration: 4 Months (Extendable based on Performance). Job Location: Remote. Job Timings: India Evening Shift (till 11:30 PM IST). Experience Required: 4+ Years. We are seeking a skilled Databricks Developer to join our team on a 4-month contract basis. The ideal candidate will have strong expertise in modern data engineering technologies and the ability to work in a fast-paced, remote environment. Key Responsibilities: - Develop, optimize, and manage large-scale data pipelines using Databricks, PySpark, DBT, and AWS S3/Glue/Redshift. - Work in multi-cloud environments including AWS, Azure, and GCP. - Implement workflow orchestration using A...

Posted 2 weeks ago

7.0 - 8.0 years

8 - 12 Lacs

hyderabad

Work from Office

Job Title: Databricks Tech Lead (Contract). Contract Duration: 4 Months (Extendable based on Performance). Job Location: Remote. Job Timings: India Evening Shift (till 11:30 PM IST). Experience Required: 7+ Years. Job Description: We are looking for an experienced Databricks Tech Lead to join our team on a 4-month extendable contract. The ideal candidate will bring deep expertise in data engineering, big data platforms, and cloud-based data warehouse solutions, with the ability to work in a fast-paced remote environment. Key Responsibilities: - Lead the design, optimization, and management of large-scale data pipelines using Databricks, Spark (PySpark), and AWS data services. - Productioni...

Posted 2 weeks ago

7.0 - 8.0 years

8 - 12 Lacs

jaipur

Work from Office

Job Title: Databricks Tech Lead (Contract). Contract Duration: 4 Months (Extendable based on Performance). Job Location: Remote. Job Timings: India Evening Shift (till 11:30 PM IST). Experience Required: 7+ Years. Job Description: We are looking for an experienced Databricks Tech Lead to join our team on a 4-month extendable contract. The ideal candidate will bring deep expertise in data engineering, big data platforms, and cloud-based data warehouse solutions, with the ability to work in a fast-paced remote environment. Key Responsibilities: - Lead the design, optimization, and management of large-scale data pipelines using Databricks, Spark (PySpark), and AWS data services. - Productioni...

Posted 2 weeks ago

3.0 - 7.0 years

0 Lacs

hyderabad, telangana

On-site

As a Grafana Developer at our organization, you will play a critical role in designing, implementing, and maintaining Grafana dashboards to visualize complex data sets. Your responsibilities will include integrating various data sources, such as databases, APIs, and other tools, to provide real-time insights and analytics through effective visualizations. By utilizing your skills in Apache Airflow, you will ensure that data pipelines are efficiently managed, supporting timely and accurate data visualization. Your role is essential for driving data-driven decision-making processes across departments, fostering a culture of transparency and accountability. You will work closely with data engin...

Posted 2 weeks ago

3.0 - 7.0 years

0 Lacs

pune, maharashtra

On-site

As a Data Engineer specializing in DataOps & Support, your role will involve a combination of technical troubleshooting, analytical problem-solving, and proactive management of data integrity issues within enterprise-scale data systems. Your primary focus will be on ensuring the health, reliability, and accuracy of critical data workflows that drive business operations. Key Responsibilities: - Proactively monitor data pipelines for Tune-In & Local reports to ensure ongoing reliability. - Analyze and debug data integrity issues, escalating bugs when necessary. - Troubleshoot DAG execution failures in different environments using tools such as Apache Airflow. - Create and maintain detailed doc...
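Troubleshooting DAG execution failures often comes down to distinguishing transient errors (worth retrying) from persistent ones (worth escalating). Airflow exposes this via the `retries` and `retry_delay` task arguments; the underlying pattern, sketched with the stdlib only (not Airflow's implementation):

```python
import time

# Generic retry-with-backoff pattern. Airflow provides the same idea
# declaratively via the `retries`/`retry_delay` task arguments; this
# sketch shows what that behaviour amounts to.
def run_with_retries(task, max_retries=3, base_delay=0.01):
    for attempt in range(max_retries + 1):
        try:
            return task()
        except Exception:
            if attempt == max_retries:
                raise  # persistent failure: surface it for escalation
            time.sleep(base_delay * (2 ** attempt))  # exponential backoff

calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient failure")
    return "ok"

result = run_with_retries(flaky)
```

A task that still fails after the retry budget is exhausted is exactly the case the posting describes: it should be logged, escalated as a bug, and investigated rather than retried indefinitely.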

Posted 2 weeks ago

