974 Apache Airflow Jobs - Page 8

Set up a Job Alert
JobPe aggregates job listings for easy access; you apply directly on the original job portal.

7.0 - 8.0 years

10 - 15 Lacs

Surat

Remote

We are looking for an experienced Databricks Tech Lead to join our team on a 4-month extendable contract. The ideal candidate will bring deep expertise in data engineering, big data platforms, and cloud-based data warehouse solutions, with the ability to work in a fast-paced remote environment. Key Responsibilities : - Lead the design, optimization, and management of large-scale data pipelines using Databricks, Spark (PySpark), and AWS data services. - Productionize and deploy Big Data platforms and applications across multi-cloud environments (AWS, Azure, GCP). - Build and manage data warehouse solutions, schema evolution, and data versioning. - Implement and manage workflow orchestration u...
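The schema-evolution responsibility above can be illustrated with a minimal sketch: additive column changes are accepted, while incompatible type changes are rejected. This is plain Python with invented helper and column names, not the Databricks API; Delta Lake enforces the same policy natively via options such as `mergeSchema`.

```python
# Illustrative sketch of additive schema evolution: new columns may be
# added, but an existing column's type may not change. Helper and column
# names are hypothetical.

def evolve_schema(current: dict, incoming: dict) -> dict:
    """Merge the schema of an incoming batch into the current table schema."""
    merged = dict(current)
    for column, dtype in incoming.items():
        if column not in merged:
            merged[column] = dtype  # additive change: allowed
        elif merged[column] != dtype:
            raise TypeError(
                f"type change for {column!r}: {merged[column]} -> {dtype}"
            )
    return merged

table = {"id": "bigint", "amount": "double"}
batch = {"id": "bigint", "amount": "double", "currency": "string"}
table = evolve_schema(table, batch)
# 'currency' is now part of the table schema
```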

Posted 4 weeks ago

6.0 - 11.0 years

5 - 9 Lacs

Chennai

Work from Office

We are seeking a passionate and skilled Data Engineer to join our dynamic US OBU Pelican report engineering team. In this role, you will be responsible for designing, building, and maintaining scalable and efficient data pipelines and workflows that support essential business reporting and analytics functions. This is a collaborative and impactful role, where your contributions will directly influence data-driven decision-making across the organization. As a Data Engineer, you will design, develop, and optimize data pipelines using SQL and Python, ensuring they are robust and efficient for handling large volumes of data. You will leverage platforms such as Databricks to implement complex data tr...

Posted 4 weeks ago

7.0 - 8.0 years

10 - 15 Lacs

Nagpur

Remote

We are looking for an experienced Databricks Tech Lead to join our team on a 4-month extendable contract. The ideal candidate will bring deep expertise in data engineering, big data platforms, and cloud-based data warehouse solutions, with the ability to work in a fast-paced remote environment. Key Responsibilities : - Lead the design, optimization, and management of large-scale data pipelines using Databricks, Spark (PySpark), and AWS data services. - Productionize and deploy Big Data platforms and applications across multi-cloud environments (AWS, Azure, GCP). - Build and manage data warehouse solutions, schema evolution, and data versioning. - Implement and manage workflow orchestration u...

Posted 4 weeks ago

7.0 - 8.0 years

10 - 15 Lacs

Nashik

Remote

We are looking for an experienced Databricks Tech Lead to join our team on a 4-month extendable contract. The ideal candidate will bring deep expertise in data engineering, big data platforms, and cloud-based data warehouse solutions, with the ability to work in a fast-paced remote environment. Key Responsibilities : - Lead the design, optimization, and management of large-scale data pipelines using Databricks, Spark (PySpark), and AWS data services. - Productionize and deploy Big Data platforms and applications across multi-cloud environments (AWS, Azure, GCP). - Build and manage data warehouse solutions, schema evolution, and data versioning. - Implement and manage workflow orchestration u...

Posted 4 weeks ago

4.0 - 5.0 years

3 - 7 Lacs

Agra

Remote

We are seeking a skilled Databricks Developer to join our team on a 4-month contract basis. The ideal candidate will have strong expertise in modern data engineering technologies and the ability to work in a fast-paced, remote environment. Key Responsibilities : - Develop, optimize, and manage large-scale data pipelines using Databricks, PySpark, DBT, and AWS S3/Glue/Redshift. - Work in multi-cloud environments including AWS, Azure, and GCP. - Implement workflow orchestration using Airflow or similar frameworks. - Design, implement, and manage data warehouse solutions, schema evolution, and data versioning. - Collaborate with cross-functional teams to deliver high-quality data solutions. Req...
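The workflow-orchestration bullet above boils down to running tasks only after their upstream dependencies complete. A toy sketch using only the standard library (task names invented; a real Airflow DAG adds scheduling, retries, and operators):

```python
from graphlib import TopologicalSorter  # stdlib since Python 3.9

# Toy sketch of what Airflow-style orchestration formalizes: tasks run
# only after every upstream dependency has completed. Each entry maps a
# task to its predecessors.
deps = {
    "extract": [],
    "transform": ["extract"],
    "validate": ["transform"],
    "load": ["validate"],
}

def run(dag):
    """Execute tasks in topological (dependency-respecting) order."""
    completed = []
    for task in TopologicalSorter(dag).static_order():
        completed.append(task)  # a real runner would invoke the task here
    return completed

print(run(deps))  # ['extract', 'transform', 'validate', 'load']
```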

Posted 4 weeks ago

7.0 - 8.0 years

10 - 15 Lacs

Kolkata

Remote

We are looking for an experienced Databricks Tech Lead to join our team on a 4-month extendable contract. The ideal candidate will bring deep expertise in data engineering, big data platforms, and cloud-based data warehouse solutions, with the ability to work in a fast-paced remote environment. Key Responsibilities : - Lead the design, optimization, and management of large-scale data pipelines using Databricks, Spark (PySpark), and AWS data services. - Productionize and deploy Big Data platforms and applications across multi-cloud environments (AWS, Azure, GCP). - Build and manage data warehouse solutions, schema evolution, and data versioning. - Implement and manage workflow orchestration u...

Posted 4 weeks ago

4.0 - 5.0 years

3 - 7 Lacs

Chennai

Remote

We are seeking a skilled Databricks Developer to join our team on a 4-month contract basis. The ideal candidate will have strong expertise in modern data engineering technologies and the ability to work in a fast-paced, remote environment. Key Responsibilities : - Develop, optimize, and manage large-scale data pipelines using Databricks, PySpark, DBT, and AWS S3/Glue/Redshift. - Work in multi-cloud environments including AWS, Azure, and GCP. - Implement workflow orchestration using Airflow or similar frameworks. - Design, implement, and manage data warehouse solutions, schema evolution, and data versioning. - Collaborate with cross-functional teams to deliver high-quality data solutions. Req...

Posted 4 weeks ago

4.0 - 5.0 years

3 - 7 Lacs

Kanpur

Remote

We are seeking a skilled Databricks Developer to join our team on a 4-month contract basis. The ideal candidate will have strong expertise in modern data engineering technologies and the ability to work in a fast-paced, remote environment. Key Responsibilities : - Develop, optimize, and manage large-scale data pipelines using Databricks, PySpark, DBT, and AWS S3/Glue/Redshift. - Work in multi-cloud environments including AWS, Azure, and GCP. - Implement workflow orchestration using Airflow or similar frameworks. - Design, implement, and manage data warehouse solutions, schema evolution, and data versioning. - Collaborate with cross-functional teams to deliver high-quality data solutions. Req...

Posted 4 weeks ago

5.0 - 7.0 years

4 - 8 Lacs

Hyderabad

Work from Office

Description : We are a technology consulting firm operating in Cloud Data Engineering and Analytics, helping enterprise customers build reliable, scalable data platforms and analytics products. Our teams deliver end-to-end data lakes, real-time streaming pipelines, and production-grade ML feature stores using Databricks and modern cloud data tooling. Role & Responsibilities : - Design, build, and maintain scalable batch and streaming ETL pipelines on Databricks using Delta Lake and Delta Live Tables (DLT). - Develop and optimize Spark/PySpark jobs for performance, cost-efficiency, and reliability; tune cluster sizing and autoscaling policies. - Implement data quality, observability, lineage ...

Posted 4 weeks ago

7.0 - 10.0 years

7 - 11 Lacs

Chennai

Work from Office

About the Job : We are seeking a highly skilled and experienced Senior Data Engineer to join our dynamic team. In this pivotal role, you will be instrumental in driving our data engineering initiatives, with a strong emphasis on leveraging Dataiku's capabilities to enhance data processing and analytics. You will be responsible for designing, developing, and optimizing robust data pipelines, ensuring seamless integration of diverse data sources, and maintaining high data quality and accessibility to support our business intelligence and advanced analytics projects. This role requires a unique blend of expertise in traditional data engineering principles, advanced data modeling, and a forward-...

Posted 4 weeks ago

5.0 - 7.0 years

4 - 8 Lacs

Gurugram

Work from Office

We are a technology consulting firm operating in Cloud Data Engineering and Analytics, helping enterprise customers build reliable, scalable data platforms and analytics products. Our teams deliver end-to-end data lakes, real-time streaming pipelines, and production-grade ML feature stores using Databricks and modern cloud data tooling. Role & Responsibilities : - Design, build, and maintain scalable batch and streaming ETL pipelines on Databricks using Delta Lake and Delta Live Tables (DLT). - Develop and optimize Spark/PySpark jobs for performance, cost-efficiency, and reliability; tune cluster sizing and autoscaling policies. - Implement data quality, observability, lineage and monitoring...

Posted 4 weeks ago

12.0 - 16.0 years

4 - 7 Lacs

Pune

Work from Office

Role Overview: An AWS SME with a Data Science background is responsible for leveraging Amazon Web Services (AWS) to design, implement, and manage data-driven solutions. This role combines cloud computing expertise with data science skills to optimize and innovate business processes. Key Responsibilities: Data Analysis and Modelling: Analyzing large datasets to derive actionable insights and building predictive models using AWS services such as SageMaker, Bedrock, and Textract. Cloud Infrastructure Management: Designing, deploying, and maintaining scalable cloud infrastructure on AWS to support data science workflows. Machine Learning Implementation: Developing and deploying mach...

Posted 4 weeks ago

3.0 - 7.0 years

0 Lacs

Maharashtra

On-site

As a Database Engineer at our company, your role will involve designing, implementing, and maintaining scalable databases and ETL pipelines for managing large volumes of time-series and cross-sectional data. The ideal candidate for this position should possess strong database design skills along with hands-on experience in Python and modern data workflow tools. **Key Responsibilities:** - Design, build, and maintain relational databases, primarily PostgreSQL, to support high-volume time-series and cross-sectional datasets - Develop, monitor, and optimize ETL pipelines to ensure reliability, scalability, and data integrity - Implement automated workflows using Apache Airflow or similar DAG or...
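One concrete requirement behind "reliability and data integrity" in such pipelines is idempotent loading: replaying a batch must not duplicate time-series rows. A sketch using stdlib sqlite3 as a stand-in for PostgreSQL (both support the `INSERT ... ON CONFLICT` upsert shown; table and column names are made up):

```python
import sqlite3

# Idempotent time-series load: re-running the same batch must not create
# duplicates. sqlite3 stands in for PostgreSQL here.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE prices (
        ts      TEXT NOT NULL,
        symbol  TEXT NOT NULL,
        price   REAL NOT NULL,
        PRIMARY KEY (ts, symbol)
    )
""")

def load_batch(rows):
    conn.executemany(
        """INSERT INTO prices (ts, symbol, price) VALUES (?, ?, ?)
           ON CONFLICT (ts, symbol) DO UPDATE SET price = excluded.price""",
        rows,
    )
    conn.commit()

batch = [("2024-01-01", "AAPL", 185.0), ("2024-01-01", "MSFT", 370.0)]
load_batch(batch)
load_batch(batch)  # replay is safe: still two rows
count = conn.execute("SELECT COUNT(*) FROM prices").fetchone()[0]
print(count)  # 2
```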

Posted 4 weeks ago

8.0 - 12.0 years

0 Lacs

Pune, Maharashtra

On-site

You will be responsible for designing, developing, and maintaining scalable and efficient data processing pipelines using PySpark and Python. Your key responsibilities will include: - Building and implementing ETL (Extract, Transform, Load) processes to ingest data from various sources and load it into target destinations. - Optimizing PySpark applications for performance and troubleshooting existing code. - Ensuring data integrity and quality throughout the data lifecycle. - Collaborating with cross-functional teams, including data engineers and data scientists, to understand and fulfill data needs. - Providing technical leadership, conducting code reviews, and mentoring junior team members...
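The "data integrity throughout the data lifecycle" responsibility is often implemented as a validation gate inside the transform step: bad records are quarantined for inspection rather than silently dropped. A plain-Python sketch (field names are illustrative, not from any actual pipeline):

```python
# Sketch of a data-integrity gate inside an ETL transform: validate each
# record and quarantine bad rows instead of silently dropping them.
def transform(records):
    good, quarantined = [], []
    for rec in records:
        if rec.get("id") is not None and rec.get("amount", -1) >= 0:
            # normalize to integer cents to avoid float drift downstream
            rec = dict(rec, amount_cents=round(rec["amount"] * 100))
            good.append(rec)
        else:
            quarantined.append(rec)
    return good, quarantined

rows = [{"id": 1, "amount": 9.99}, {"id": None, "amount": 5.0}]
good, bad = transform(rows)
print(len(good), len(bad))  # 1 1
```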

Posted 4 weeks ago

5.0 - 9.0 years

0 Lacs

Thiruvananthapuram, Kerala

On-site

As a Senior Cloud Developer at our company, you will be responsible for designing, developing, and deploying scalable data processing pipelines and orchestration workflows. Your expertise in Java (v17+), Google Cloud Platform (GCP), and data engineering frameworks will be crucial in ensuring high performance, reliability, and maintainability across large-scale systems. Key Responsibilities: - Design, develop, test, and deploy scalable and reliable data processing pipelines using Java 17+ and Apache Beam, executed on GCP Cloud Dataflow - Build and manage complex data orchestration workflows using Apache Airflow or GCP Cloud Composer, including creating and maintaining DAGs with various common...

Posted 4 weeks ago

4.0 - 8.0 years

0 Lacs

Noida, India

On-site

As a Principal Engineer (Data DevOps) at our company, your role will involve leading and guiding the Data DevOps team to build, manage, and optimize high-scale, secure, and reliable big data platforms. You will be responsible for driving best practices in cloud infrastructure, automation, CI/CD, and big data technologies. Your main responsibilities will include: - Leading, mentoring, and cultivating a high-performing Data DevOps team to promote technical excellence and ownership. - Driving the architecture, design, and implementation of large-scale cloud and data infrastructure to ensure scalability, performance, and security. - Collaborating closely with Data Engineering, Data Science, Anal...

Posted 1 month ago

5.0 - 9.0 years

0 Lacs

Pune, India

On-site

Role Overview: As a Senior Data Engineer at DATAECONOMY, you will be responsible for leading the end-to-end development of complex models for compliance and supervision. Your expertise in cloud-based infrastructure, ETL pipeline development, and financial domains will be crucial in creating robust, scalable, and efficient solutions. Key Responsibilities: - Lead the development of advanced models using AWS services such as EMR, Glue, and Glue Notebooks. - Design, build, and optimize scalable cloud infrastructure solutions with a minimum of 5 years of experience. - Create, manage, and optimize ETL pipelines using PySpark for large-scale data processing. - Build and maintain CI/CD pipelines for...

Posted 1 month ago

5.0 - 9.0 years

0 Lacs

Kolkata, India

On-site

As a Data Engineer at our company, you will play a crucial role in designing, developing, and optimizing data pipelines to ensure seamless integration and transformation of data across various systems. Your expertise in Python, SQL, and AWS will be essential in building scalable and secure data solutions. Here is a breakdown of the key responsibilities, required technical skills, and professional attributes for this role: Key Responsibilities: - Design, build, and manage robust and scalable data pipelines for both batch and real-time processing. - Develop ETL processes to acquire, transform, and integrate data from multiple sources. - Build and maintain data warehouses, data lakes, and stora...

Posted 1 month ago

10.0 - 14.0 years

0 Lacs

Chennai, India

On-site

As a Data Engineer at Tanla, you will be responsible for developing, deploying, monitoring, and maintaining ETL Jobs and all data engineering and pipeline activities. Your expertise in SQL queries and DB solutions will be crucial for designing and constructing enterprise procedure constructs. Here is a breakdown of your key responsibilities: - Design and construct enterprise procedure constructs using any ETL tool, preferably PentahoDI - Provide accurate work estimates and manage efforts across multiple lines of work - Design and develop exception handling and data cleansing/standardization procedures - Gather requirements from various stakeholders related to ETL automation - Design and crea...

Posted 1 month ago

5.0 - 9.0 years

0 Lacs

Hyderabad, India

On-site

As a Software Developer at this company in Hyderabad, India, your role will involve developing, updating, and maintaining new and existing applications. You will ensure that these applications meet specified requirements, scale efficiently, and maintain high performance. Your responsibilities will include: - Analyzing and interpreting project requirements to independently design effective solutions while considering the broader product architecture. - Designing, developing, and deploying APIs and web services with a focus on reusable, testable, and efficient code. - Implementing low-latency, scalable applications with optimized performance. - Creating Docker files for containerization and de...

Posted 1 month ago

3.0 - 7.0 years

0 Lacs

Gurugram, India

On-site

This role requires strong proficiency in Python for data processing and ETL tasks, along with advanced SQL skills, including query optimization, indexing, joins, and analytical functions. Hands-on experience with ClickHouse, MongoDB, Redis, and ElasticSearch will be beneficial. Your key responsibilities will include: - Working with Apache Spark/PySpark and handling data lakes effectively - Utilizing ETL and data ingestion tools such as Apache NiFi efficiently - Familiarizing yourself with messaging and streaming platforms like Kafka, RabbitMQ, and ActiveMQ - Implementing workflow orchestration using frameworks li...
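The "query optimization and indexing" skill above can be made concrete with stdlib sqlite3: the same point query goes from a full table scan to an index lookup once an index exists. The reasoning carries over to PostgreSQL or ClickHouse, though their planners and plan output differ; table and index names are invented.

```python
import sqlite3

# Show how an index changes a query plan, using sqlite3's
# EXPLAIN QUERY PLAN on a point query over an indexed column.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id INTEGER, kind TEXT)")

def plan(sql):
    rows = conn.execute("EXPLAIN QUERY PLAN " + sql).fetchall()
    return " ".join(r[-1] for r in rows)  # last column holds the plan detail

query = "SELECT * FROM events WHERE user_id = 42"
before = plan(query)  # full table scan: no usable index yet
conn.execute("CREATE INDEX idx_events_user ON events (user_id)")
after = plan(query)   # now an index lookup on idx_events_user
```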

Posted 1 month ago

5.0 - 10.0 years

0 Lacs

Pune, India

On-site

As a Data Engineer in the Financial Crime Compliance (FCC) domain, your role is crucial in building and maintaining scalable, secure, and performant data infrastructure to support AML, KYC, transaction monitoring, and regulatory reporting workflows. Your collaboration with compliance analysts, product teams, and data scientists will ensure the availability of clean, reliable, and timely data for advanced analytics and operational reporting within the FCC space. **Key Responsibilities:** - **Data Pipeline Development:** Design, develop, and manage data pipelines for ingesting, transforming, and delivering financial and customer data. - **Domain-Specific Data Modeling:** Develop data models to...

Posted 1 month ago

5.0 - 8.0 years

7 - 11 Lacs

Pune

Work from Office

Job Description Role Purpose: The purpose of this role is to provide significant technical expertise in architecture planning and design of the concerned tower (platform, database, middleware, backup, etc.) as well as managing its day-to-day operations. Do: Provide adequate support in architecture planning, migration & installation for new projects in own tower (platform/database/middleware/backup). Lead the structural/architectural design of a platform/middleware/database/backup etc. according to various system requirements to ensure a highly scalable and extensible solution. Conduct technology capacity planning by reviewing the current and future requirements. Utilize and leverage the new feat...

Posted 1 month ago

3.0 - 5.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Job Description Role Purpose: The purpose of this role is to design, test, and maintain software programs for operating systems or applications which need to be deployed at a client end, and ensure they meet 100% quality assurance parameters. Do: 1. Be instrumental in understanding the requirements and design of the product/software. Develop software solutions by studying information needs, systems flow, data usage, and work processes. Investigate problem areas following the software development life cycle. Facilitate root cause analysis of system issues and problem statements. Identify ideas to improve system performance and impact availability. Analyze client requirements and convert re...

Posted 1 month ago

10.0 - 12.0 years

8 - 12 Lacs

Pune

Work from Office

Job Description Skills: Data Science, ELK Stack, AIOps. Experience: 10+ years. Required Skills & Experience - Data Science & Machine Learning: Hands-on proficiency in Python, TensorFlow, PyTorch, Scikit-learn, Pandas, NumPy. Extensive knowledge of ETL techniques: data extraction, transformation, and loading using Apache Airflow, Apache NiFi, Spark, or similar tools. Observability Stack: Hands-on experience with Prometheus, Grafana, ELK Stack, Loki, OpenTelemetry, Jaeger, or Zipkin. Experience with time-series analysis, predictive analytics, and AI-driven observability. Cloud & Infrastructure: Experience with AWS, Azure, or GCP observability services (e.g., CloudWatch, Azure Monitor). D...
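The simplest building block behind the AI-driven observability mentioned above is flagging metric samples that deviate sharply from a trailing window. A stdlib-only sketch; window size, threshold, and the latency series are arbitrary illustrative choices, not part of any real stack:

```python
import statistics

# Flag samples whose z-score against a trailing window exceeds a
# threshold -- a minimal time-series anomaly detector.
def anomalies(series, window=5, threshold=3.0):
    flagged = []
    for i in range(window, len(series)):
        hist = series[i - window:i]
        mean = statistics.mean(hist)
        stdev = statistics.pstdev(hist) or 1e-9  # avoid divide-by-zero
        if abs(series[i] - mean) / stdev > threshold:
            flagged.append(i)
    return flagged

latency_ms = [20, 21, 19, 20, 22, 21, 20, 250, 21, 20]
print(anomalies(latency_ms))  # [7] -- the 250 ms spike
```

A production AIOps system would replace the fixed window with seasonal baselines or a learned model, but the detection contract (score, threshold, alert) stays the same.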

Posted 1 month ago
