974 Apache Airflow Jobs - Page 6

Set up a job alert
JobPe aggregates results for easy access, but you apply directly on the original job portal.

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

Job Description: RiskSpan Technologies is a leading technology and data solutions company specializing in delivering innovative and scalable solutions to complex challenges in the financial services and technology sectors. Join the dynamic team at RiskSpan Technologies to work on cutting-edge projects and contribute to building scalable and efficient systems. Role Overview: As a Senior Python Developer at RiskSpan Technologies, you will be responsible for designing, developing, and maintaining scalable data applications and optimizing data pipelines. Your role will involve working on data engineering, workflow automation, and advanced analytics projects as an integral part of the R&D and Tec...

Posted 3 weeks ago

AI Match Score
Apply

4.0 - 8.0 years

0 Lacs

Haryana

On-site

You will be responsible for designing, developing, and supporting scalable ETL processes using open-source tools and data frameworks such as AWS Glue, AWS Athena, Redshift, Apache Kafka, Apache Spark, Apache Airflow, and Pentaho Data Integration (PDI). Additionally, you will design, create, and maintain data lakes and data warehouses on the AWS cloud. Your role will involve maintaining and optimizing the data pipeline architecture, formulating complex SQL queries for big data processing, collaborating with product and engineering teams to design a platform for data modeling and machine learning operations, implementing various data structures and algorithms, ensuring data privacy and comp...
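For readers unfamiliar with this kind of AWS-centric ETL work, the sketch below shows one way an orchestration script might start an AWS Glue job with boto3 and wait for it to finish; the job name, region, and polling interval are placeholders rather than details from this listing.

```python
# Minimal sketch: trigger an AWS Glue ETL job with boto3 and wait for it to finish.
# The job name "nightly-sales-etl" and the region are hypothetical placeholders.
import time

import boto3

glue = boto3.client("glue", region_name="ap-south-1")

def run_glue_job(job_name: str, poll_seconds: int = 30) -> str:
    """Start a Glue job run and block until it reaches a terminal state."""
    run_id = glue.start_job_run(JobName=job_name)["JobRunId"]
    while True:
        state = glue.get_job_run(JobName=job_name, RunId=run_id)["JobRun"]["JobRunState"]
        if state in ("SUCCEEDED", "FAILED", "STOPPED", "TIMEOUT"):
            return state
        time.sleep(poll_seconds)

if __name__ == "__main__":
    print("Glue job finished with state:", run_glue_job("nightly-sales-etl"))
```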

Posted 3 weeks ago

AI Match Score
Apply

10.0 - 14.0 years

0 Lacs

Haryana

On-site

As a Senior Data Engineer at Infogain, you will be responsible for leading the design and execution of the Dataproc to Databricks PySpark migration roadmap. Your role will involve defining a modernization strategy encompassing data ingestion, transformation, orchestration, and governance. Additionally, you will architect scalable solutions using Delta Lake and Unity Catalog, ensuring optimal performance and cost efficiency. Key Responsibilities: - Lead the design and execution of Dataproc to Databricks PySpark migration roadmap. - Define a modernization strategy for data ingestion, transformation, orchestration, and governance. - Architect scalable Delta Lake and Unity Catalog-based solution...
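As a loose illustration of the Delta Lake and Unity Catalog work this listing mentions, here is a minimal PySpark sketch that writes a curated Delta table to a three-level Unity Catalog name; the input path and the catalog, schema, and table names are invented, and it assumes a Databricks cluster where Delta Lake and Unity Catalog are already available.

```python
# Sketch: write a curated Delta table to a three-level Unity Catalog name.
# The input path and "main.finance.curated_trades" are hypothetical; assumes a
# Databricks cluster where Delta Lake and Unity Catalog are already available.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dataproc-to-databricks-demo").getOrCreate()

raw = spark.read.parquet("/mnt/landing/trades/")            # raw zone (placeholder path)
curated = (
    raw.filter(F.col("trade_status") == "SETTLED")          # basic quality filter
       .withColumn("ingest_date", F.current_date())         # partition column
)

(curated.write
        .format("delta")
        .mode("overwrite")
        .partitionBy("ingest_date")
        .saveAsTable("main.finance.curated_trades"))         # catalog.schema.table
```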

Posted 3 weeks ago

AI Match Score
Apply

4.0 - 10.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Data Pipeline Engineer / Data Lineage Engineer at Solidatus, your primary responsibility will be to prepare data lineage solutions for clients' existing data pipelines. You will collaborate with cross-functional teams to ensure the integrity, accuracy, and timeliness of data lineage solutions, directly impacting the value clients gain from the product. Your role will involve working closely with clients to help them achieve their contractual goals. **Experience:** - Proven experience as a Data Engineer or in a similar role, with hands-on experience in building and optimizing data pipelines and infrastructure. - Strong problem-solving and analytical skills to diagnose and resolve complex ...

Posted 3 weeks ago

AI Match Score
Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

As a Data Engineer, your role will involve: - Conducting tests on data pipelines and evaluating results against data quality and performance specifications. - Applying and optimizing data models for efficient storage, retrieval, and processing of large datasets. - Communicating and explaining design/development aspects to customers. - Estimating time and resource requirements for developing/debugging features/components. - Participating in RFP responses and solutioning. - Mentoring team members and guiding them in relevant upskilling and certification. Qualifications required: - Bachelor's or Master's degree. The company also values strong development experience in Snowflake, Cloud (AWS, GCP), ...

Posted 3 weeks ago

AI Match Score
Apply

3.0 - 7.0 years

0 Lacs

Pune, Maharashtra

On-site

As a hands-on Data Engineering Architect at Hexagon, you will play a crucial role in designing and building scalable data pipelines, implementing cutting-edge Generative AI features, and architecting robust data solutions. Your responsibilities will include: - **Architecture & Design** - Designing and implementing end-to-end data pipelines supporting batch and real-time processing - Architecting scalable data solutions using modern cloud-native patterns and microservices - Developing comprehensive data strategies integrating traditional databases with cloud data platforms - Leading technical decision-making for data platform evolution and technology stack optimization - **Generative AI & Mac...
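As a hedged illustration of the batch-plus-real-time pattern such a role typically involves, the sketch below consumes a Kafka topic with Spark Structured Streaming and appends the parsed events to a Parquet sink; the broker address, topic, schema, and paths are assumptions, and the Kafka connector must be on the Spark classpath.

```python
# Sketch: real-time ingestion with Spark Structured Streaming from Kafka.
# Broker, topic, schema, and output paths are hypothetical, and the
# spark-sql-kafka connector must be on the Spark classpath.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import DoubleType, StringType, StructField, StructType

spark = SparkSession.builder.appName("streaming-ingest-demo").getOrCreate()

event_schema = StructType([
    StructField("device_id", StringType()),
    StructField("metric", StringType()),
    StructField("value", DoubleType()),
])

events = (
    spark.readStream
         .format("kafka")
         .option("kafka.bootstrap.servers", "broker:9092")
         .option("subscribe", "sensor-events")
         .load()
         .select(F.from_json(F.col("value").cast("string"), event_schema).alias("e"))
         .select("e.*")
)

query = (
    events.writeStream
          .format("parquet")
          .option("path", "/data/streaming/sensor_events")
          .option("checkpointLocation", "/data/checkpoints/sensor_events")
          .outputMode("append")
          .start()
)
query.awaitTermination()
```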

Posted 3 weeks ago

AI Match Score
Apply

2.0 - 6.0 years

0 Lacs

Guwahati, Assam

On-site

As an experienced and highly skilled Software Engineer specializing in Data, your role will involve designing, implementing, and optimizing large-scale data systems. You should have a proven track record in building efficient data pipelines, managing big data systems, and collaborating with cross-functional teams to deliver data-driven solutions. **Key Responsibilities:** - Design and maintain scalable, reliable data pipelines and workflows. - Create data models and implement schemas to support business objectives. - Monitor and optimize database performance and query efficiency. - Collaborate with senior team members to design data models and schemas. - Perform data cleansing, preparation, ...

Posted 3 weeks ago

AI Match Score
Apply

8.0 - 10.0 years

8 - 12 Lacs

Mumbai, Chennai

Hybrid

Location: Chennai, Hybrid (3 days a week in office). Employment Type: Contract (1 year, extendable). Notice Period: Immediate joiners only. Role Overview: We are hiring a Senior Python Developer / Lead with strong experience in Python, SQL, AWS, and Airflow. The candidate will design, develop, and lead backend and data solutions in a cloud-based environment. Key Responsibilities: Lead the design and development of Python-based data pipelines and APIs. Work with AWS services (Lambda, S3, Glue, Redshift, EC2) for deployment and integration. Implement Airflow DAGs for data orchestration and automation. Optimize SQL queries and database performance. Collaborate with Data Engineers and DevOps team...
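To make the "Implement Airflow DAGs for data orchestration" responsibility concrete, here is a minimal DAG sketch that chains extract, transform, and load tasks; the DAG id, schedule, and task bodies are illustrative placeholders only, not details from this posting.

```python
# Minimal Airflow DAG sketch: extract -> transform -> load, scheduled daily.
# The DAG id, schedule, and task bodies are illustrative placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull data from the source system")

def transform():
    print("clean and reshape the extracted data")

def load():
    print("write the results to the warehouse")

with DAG(
    dag_id="example_etl_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    t_extract >> t_transform >> t_load
```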

Posted 3 weeks ago

AI Match Score
Apply

4.0 - 5.0 years

3 - 7 Lacs

Visakhapatnam

Remote

Contract Duration: 4 Months (extendable based on performance). Job Timings: India evening shift (till 11:30 PM IST). Job Description: We are seeking a skilled Databricks Developer to join our team on a 4-month contract basis. The ideal candidate will have strong expertise in modern data engineering technologies and the ability to work in a fast-paced, remote environment. Key Responsibilities: - Develop, optimize, and manage large-scale data pipelines using Databricks, PySpark, DBT, and AWS S3/Glue/Redshift. - Work in multi-cloud environments including AWS, Azure, and GCP. - Implement workflow orchestration using Airflow or similar frameworks. - Design, implement, and manage data warehouse ...

Posted 3 weeks ago

AI Match Score
Apply

7.0 - 8.0 years

10 - 15 Lacs

Lucknow

Remote

Contract Duration: 4 Months (extendable based on performance). Job Timings: India evening shift (till 11:30 PM IST). Job Description: We are looking for an experienced Databricks Tech Lead to join our team on a 4-month extendable contract. The ideal candidate will bring deep expertise in data engineering, big data platforms, and cloud-based data warehouse solutions, with the ability to work in a fast-paced remote environment. Key Responsibilities: - Lead the design, optimization, and management of large-scale data pipelines using Databricks, Spark (PySpark), and AWS data services. - Productionize and deploy Big Data platforms and applications across multi-cloud environments (AWS, Azure, ...

Posted 3 weeks ago

AI Match Score
Apply

4.0 - 5.0 years

3 - 7 Lacs

Bengaluru

Remote

Contract Duration: 4 Months (extendable based on performance). Job Timings: India evening shift (till 11:30 PM IST). Job Description: We are seeking a skilled Databricks Developer to join our team on a 4-month contract basis. The ideal candidate will have strong expertise in modern data engineering technologies and the ability to work in a fast-paced, remote environment. Key Responsibilities: - Develop, optimize, and manage large-scale data pipelines using Databricks, PySpark, DBT, and AWS S3/Glue/Redshift. - Work in multi-cloud environments including AWS, Azure, and GCP. - Implement workflow orchestration using Airflow or similar frameworks. - Design, implement, and manage data warehouse ...

Posted 3 weeks ago

AI Match Score
Apply

4.0 - 5.0 years

3 - 7 Lacs

Jaipur

Remote

Contract Duration: 4 Months (extendable based on performance). Job Timings: India evening shift (till 11:30 PM IST). Job Description: We are seeking a skilled Databricks Developer to join our team on a 4-month contract basis. The ideal candidate will have strong expertise in modern data engineering technologies and the ability to work in a fast-paced, remote environment. Key Responsibilities: - Develop, optimize, and manage large-scale data pipelines using Databricks, PySpark, DBT, and AWS S3/Glue/Redshift. - Work in multi-cloud environments including AWS, Azure, and GCP. - Implement workflow orchestration using Airflow or similar frameworks. - Design, implement, and manage data warehouse ...

Posted 3 weeks ago

AI Match Score
Apply

7.0 - 8.0 years

10 - 15 Lacs

Ludhiana

Remote

Contract Duration: 4 Months (extendable based on performance). Job Timings: India evening shift (till 11:30 PM IST). Job Description: We are looking for an experienced Databricks Tech Lead to join our team on a 4-month extendable contract. The ideal candidate will bring deep expertise in data engineering, big data platforms, and cloud-based data warehouse solutions, with the ability to work in a fast-paced remote environment. Key Responsibilities: - Lead the design, optimization, and management of large-scale data pipelines using Databricks, Spark (PySpark), and AWS data services. - Productionize and deploy Big Data platforms and applications across multi-cloud environments (AWS, Azure, ...

Posted 3 weeks ago

AI Match Score
Apply

7.0 - 8.0 years

10 - 15 Lacs

Hyderabad

Remote

Contract Duration: 4 Months (extendable based on performance). Job Timings: India evening shift (till 11:30 PM IST). Job Description: We are looking for an experienced Databricks Tech Lead to join our team on a 4-month extendable contract. The ideal candidate will bring deep expertise in data engineering, big data platforms, and cloud-based data warehouse solutions, with the ability to work in a fast-paced remote environment. Key Responsibilities: - Lead the design, optimization, and management of large-scale data pipelines using Databricks, Spark (PySpark), and AWS data services. - Productionize and deploy Big Data platforms and applications across multi-cloud environments (AWS, Azure, ...

Posted 3 weeks ago

AI Match Score
Apply

5.0 - 10.0 years

11 - 15 Lacs

Pune

Work from Office

Title and Summary: Senior Software Engineer (PySpark, Hadoop, Python, SQL, Airflow). Job Summary: As a Senior Software Engineer focused on Data Quality, you will lead the design, development, and deployment of scalable data quality frameworks and pipelines. You will work closely with data engineers, analysts, and business stakeholders to build robust solutions that validate, monitor, and improve data quality across large-scale distributed systems. Key Responsibilities: Lead the design and implementation of data quality frameworks and automated validation pipelines using Python, Apache Spark, and Hadoop ecosystem tools. Develop, deploy, and maintain scalable ETL/ELT workflows using Apache Airflow ...
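As a rough sketch of the automated validation such a data quality framework might run, the snippet below checks a PySpark DataFrame's row count and per-column null rates; the table name, checked columns, and thresholds are invented for illustration.

```python
# Sketch: simple PySpark data quality checks (row count and per-column null rate).
# The table name, checked columns, and thresholds are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq-checks-demo").getOrCreate()
df = spark.table("analytics.transactions")        # placeholder table

MIN_ROWS = 1_000
MAX_NULL_RATE = 0.01
checked_columns = ["transaction_id", "amount", "customer_id"]

total = df.count()
assert total >= MIN_ROWS, f"row count {total} below minimum {MIN_ROWS}"

for col_name in checked_columns:
    null_rate = df.filter(F.col(col_name).isNull()).count() / total
    assert null_rate <= MAX_NULL_RATE, (
        f"{col_name}: null rate {null_rate:.2%} exceeds {MAX_NULL_RATE:.2%}"
    )

print("all data quality checks passed")
```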

Posted 3 weeks ago

AI Match Score
Apply

5.0 - 10.0 years

4 - 7 Lacs

Chennai, Bengaluru

Work from Office

Job Title: PySpark Data Engineer. Summary: We are seeking a skilled PySpark Data Engineer to join our team and drive the development of robust data processing and transformation solutions within our data platform. You will be responsible for designing, implementing, and maintaining PySpark-based applications to handle complex data processing tasks, ensure data quality, and integrate with diverse data sources. The ideal candidate possesses strong PySpark development skills, experience with big data technologies, and the ability to work in a fast-paced, data-driven environment. Key Responsibilities: Data Engineering Development: Design, develop, and test PySpark-based applicatio...
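For context, a small PySpark batch transformation of the kind this role describes might look like the sketch below; the input and output paths and column names are assumptions, not details from the posting.

```python
# Sketch: a small PySpark batch job - read, cleanse, aggregate, write.
# Input/output paths and column names are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-daily-summary").getOrCreate()

orders = spark.read.option("header", True).csv("s3://example-bucket/raw/orders/")

daily_summary = (
    orders.dropDuplicates(["order_id"])                      # basic cleansing
          .withColumn("amount", F.col("amount").cast("double"))
          .groupBy("order_date", "region")
          .agg(
              F.count("order_id").alias("order_count"),
              F.sum("amount").alias("total_amount"),
          )
)

daily_summary.write.mode("overwrite").parquet("s3://example-bucket/curated/daily_summary/")
```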

Posted 3 weeks ago

AI Match Score
Apply

5.0 - 10.0 years

6 - 9 Lacs

Chennai, Bengaluru

Work from Office

Develop and maintain data pipelines, ELT processes, and workflow orchestration using Apache Airflow, Python, and PySpark to ensure the efficient and reliable delivery of data. Design and implement custom connectors to facilitate the ingestion of diverse data sources into our platform, including structured and unstructured data from various document formats. Collaborate closely with cross-functional teams to gather requirements, understand data needs, and translate them into technical solutions. Implement DataOps principles and best practices to ensure robust data operations and efficient data delivery. Design and implement data CI/CD pipelines to enable automated and efficient data integrati...
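The custom-connector item above could, for instance, take the form of a small custom Airflow operator; the sketch below is a hypothetical minimal operator that fetches a document over HTTP and hands the payload to downstream tasks via XCom (the class name and endpoint are invented, not part of this listing).

```python
# Sketch: a minimal custom Airflow operator acting as an ingestion connector.
# The class name and endpoint are hypothetical, not from the listing.
import requests
from airflow.models.baseoperator import BaseOperator

class HttpDocumentIngestOperator(BaseOperator):
    """Fetch a document from an HTTP endpoint and push its text to XCom."""

    def __init__(self, endpoint: str, **kwargs):
        super().__init__(**kwargs)
        self.endpoint = endpoint

    def execute(self, context):
        response = requests.get(self.endpoint, timeout=30)
        response.raise_for_status()
        # The return value is pushed to XCom for downstream tasks to consume.
        return response.text
```

In a DAG it would be wired up like any other operator, for example HttpDocumentIngestOperator(task_id="ingest_docs", endpoint="https://example.com/export.json").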

Posted 3 weeks ago

AI Match Score
Apply

2.0 - 5.0 years

6 - 10 Lacs

Mumbai, Bengaluru, Delhi / NCR

Work from Office

We are looking for a Senior Big Data Engineer with deep expertise in large-scale data processing technologies and frameworks. This is a remote, contract-based position suited for a data engineering expert with strong experience in the Big Data ecosystem including Snowflake (Snowpark), Spark, MapReduce, Hadoop, and more. Key Responsibilities Design, develop, and maintain scalable data pipelines and big data solutions Implement data transformations using Spark, Snowflake (Snowpark), Pig, and Sqoop Process large data volumes from diverse sources using Hadoop ecosystem tools Build end-to-end data workflows for batch and streaming pipelines Optimize data storage and retrieval processes in HBase, ...

Posted 3 weeks ago

AI Match Score
Apply

7.0 - 10.0 years

6 - 10 Lacs

Hyderabad

Work from Office

About the Job : We are seeking a highly skilled and experienced Senior Data Engineer to join our dynamic team. In this pivotal role, you will be instrumental in driving our data engineering initiatives, with a strong emphasis on leveraging Dataiku's capabilities to enhance data processing and analytics. You will be responsible for designing, developing, and optimizing robust data pipelines, ensuring seamless integration of diverse data sources, and maintaining high data quality and accessibility to support our business intelligence and advanced analytics projects. This role requires a unique blend of expertise in traditional data engineering principles, advanced data modeling, and a forward-...

Posted 3 weeks ago

AI Match Score
Apply

7.0 - 8.0 years

8 - 12 Lacs

Pune

Work from Office

We are looking for an experienced Databricks Tech Lead to join our team on a 4-month extendable contract. The ideal candidate will bring deep expertise in data engineering, big data platforms, and cloud-based data warehouse solutions, with the ability to work in a fast-paced remote environment. Key Responsibilities: - Lead the design, optimization, and management of large-scale data pipelines using Databricks, Spark (PySpark), and AWS data services. - Productionize and deploy Big Data platforms and applications across multi-cloud environments (AWS, Azure, GCP). - Build and manage data warehouse solutions, schema evolution, and data versioning. - Implement and manage workflow orchestration u...

Posted 3 weeks ago

AI Match Score
Apply

4.0 - 5.0 years

3 - 6 Lacs

Nashik

Remote

Job Description: We are seeking a skilled Databricks Developer to join our team on a 4-month contract basis. The ideal candidate will have strong expertise in modern data engineering technologies and the ability to work in a fast-paced, remote environment. Key Responsibilities: - Develop, optimize, and manage large-scale data pipelines using Databricks, PySpark, DBT, and AWS S3/Glue/Redshift. - Work in multi-cloud environments including AWS, Azure, and GCP. - Implement workflow orchestration using Airflow or similar frameworks. - Design, implement, and manage data warehouse solutions, schema evolution, and data versioning. - Collaborate with cross-functional teams to deliver high-quality d...

Posted 3 weeks ago

AI Match Score
Apply

4.0 - 5.0 years

3 - 6 Lacs

Nagpur

Remote

Job Description: We are seeking a skilled Databricks Developer to join our team on a 4-month contract basis. The ideal candidate will have strong expertise in modern data engineering technologies and the ability to work in a fast-paced, remote environment. Key Responsibilities: - Develop, optimize, and manage large-scale data pipelines using Databricks, PySpark, DBT, and AWS S3/Glue/Redshift. - Work in multi-cloud environments including AWS, Azure, and GCP. - Implement workflow orchestration using Airflow or similar frameworks. - Design, implement, and manage data warehouse solutions, schema evolution, and data versioning. - Collaborate with cross-functional teams to deliver high-quality d...

Posted 3 weeks ago

AI Match Score
Apply

4.0 - 5.0 years

3 - 6 Lacs

Kolkata

Remote

Job Description: We are seeking a skilled Databricks Developer to join our team on a 4-month contract basis. The ideal candidate will have strong expertise in modern data engineering technologies and the ability to work in a fast-paced, remote environment. Key Responsibilities: - Develop, optimize, and manage large-scale data pipelines using Databricks, PySpark, DBT, and AWS S3/Glue/Redshift. - Work in multi-cloud environments including AWS, Azure, and GCP. - Implement workflow orchestration using Airflow or similar frameworks. - Design, implement, and manage data warehouse solutions, schema evolution, and data versioning. - Collaborate with cross-functional teams to deliver high-quality d...

Posted 3 weeks ago

AI Match Score
Apply

3.0 - 7.0 years

10 - 15 Lacs

Bengaluru

Work from Office

Project description: We are looking for a skilled Data Engineer with strong experience in AWS and Apache Airflow to join a dynamic data team. You will be responsible for building and maintaining scalable data pipelines, orchestrating workflows, and ensuring data quality and availability across platforms. Responsibilities: - Data Pipeline Development: Design, build, and maintain robust, scalable, and efficient ETL/ELT pipelines for ingesting, transforming, and loading data from various sources into our data lake and data warehouse. - AWS Expertise: Develop and manage data solutions using a wide range of AWS services, including but not limited to Storage: S3 (for data lakes), RDS (PostgreSQL, MySQL), R...
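As a hedged example of pairing Airflow with AWS services as this role describes, the DAG fragment below lists objects under an S3 data-lake prefix using the Amazon provider's S3Hook; the bucket, prefix, and connection id are placeholders, and the apache-airflow-providers-amazon package is assumed to be installed.

```python
# Sketch: an Airflow task that inspects an S3 data-lake prefix via S3Hook.
# Bucket, prefix, and the "aws_default" connection id are placeholders;
# requires the apache-airflow-providers-amazon package.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.providers.amazon.aws.hooks.s3 import S3Hook

def list_landing_files():
    hook = S3Hook(aws_conn_id="aws_default")
    keys = hook.list_keys(bucket_name="example-data-lake", prefix="raw/events/")
    print(f"found {len(keys or [])} objects under raw/events/")
    return keys  # pushed to XCom for downstream tasks

with DAG(
    dag_id="s3_landing_zone_check",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@hourly",
    catchup=False,
) as dag:
    PythonOperator(task_id="list_landing_files", python_callable=list_landing_files)
```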

Posted 3 weeks ago

AI Match Score
Apply

5.0 - 9.0 years

14 - 18 Lacs

Chennai

Work from Office

Project description: We have an ambitious goal to migrate a legacy system written in HLASM (High-Level Assembler) from the mainframe to a cloud-based Java environment for one of the largest banks in the USA. Responsibilities: We are looking for an experienced Senior DevOps engineer who can design, implement, and maintain scalable cloud infrastructure on AWS, leveraging services like ECS, AWS Managed Workflows for Apache Airflow, Lambda, and S3; develop and manage infrastructure-as-code using Terraform for efficient environment management; build and maintain CI/CD pipelines using tools such as Jenkins, Jules, or Spinnaker; manage ECS clusters, including configuration, deployment, and monitoring (Airflow ...

Posted 3 weeks ago

AI Match Score
Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
