974 Apache Airflow Jobs - Page 2

Set up a Job Alert
JobPe aggregates results for easy access, but you apply directly on the original job portal.

4.0 - 9.0 years

20 - 30 Lacs

Gurugram

Hybrid

Role & responsibilities: Team player and self-starter, experienced in Apache Airflow, Apache Spark, Snowflake, SQL, ETL concepts, and data quality assurance. Build and manage end-to-end data workflows using Apache Airflow and Spark. Design ETL processes to support a variety of analytics and business needs. Oversee data validation and quality controls leveraging SQL and Snowflake. Optimize pipeline performance for speed and reliability.
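This posting centers on orchestrating Spark-based ETL with Airflow. As a rough illustration only, here is a minimal sketch of such a DAG skeleton; the DAG id, schedule, and callables are hypothetical and not taken from the listing:

```python
# Hypothetical daily ETL DAG; task names and callables are illustrative placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_orders(**context):
    # Placeholder: pull the day's raw data from an upstream source.
    print("extracting data for", context["ds"])


def transform_and_load(**context):
    # Placeholder: apply quality checks and load into the warehouse.
    print("transforming and loading for", context["ds"])


with DAG(
    dag_id="orders_etl",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_orders", python_callable=extract_orders)
    load = PythonOperator(task_id="transform_and_load", python_callable=transform_and_load)
    extract >> load
```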

Posted 3 days ago

5.0 - 8.0 years

9 - 13 Lacs

Bengaluru

Work from Office

Strong Python programming skills. Experience in workflow orchestration using Apache Airflow. Experience in productionizing machine learning models in the Domino runtime environment. Experience in building data pipelines using Python and PySpark. Strong SQL skills and experience extracting data from different databases (Hive, Snowflake, etc.) and performing data transformations. Experience with the Hadoop ecosystem (HDFS, Hive) is preferable. Experience working with a job scheduler, preferably Control-M. Experience with Unix-based command-line interfaces and Bash scripts. Experience in developing and querying APIs (RESTful, OpenAPI, etc.). Strong understanding of DevOps and CI/CD principles; ...

Posted 3 days ago

5.0 - 10.0 years

6 - 10 Lacs

Bengaluru

Work from Office

Must have 4+ years of IT experience, with at least 2 years of relevant experience in Snowflake. In-depth understanding of data warehousing, ETL concepts, and data modeling principles. Experience working with Snowflake functions; hands-on experience with Snowflake utilities, stage and file-upload features, Time Travel, Fail-safe, procedure writing, tasks, Snowpipe, and SnowSQL. Knowledge of Snowflake architecture. Good knowledge of advanced SQL with joins, views/materialized views, various functions, dimensional data modeling, and ETL procedures using SQL. Experience in Python functional programming (with Pandas, Apache Airflow, etc.) is a must. Basics of ML & Gen AI. Expertise in engineer...
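Since the posting calls out Snowflake stages, Time Travel, and SnowSQL-style operations, here is a hedged Python sketch using the snowflake-connector-python package; the account, credentials, table, and stage names are placeholders, not details from the job:

```python
# Illustrative only: connection parameters, table and stage names are made up.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",   # placeholder
    user="etl_user",        # placeholder
    password="***",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)
cur = conn.cursor()

# Load staged files into a table (stage-upload / Snowpipe-style ingestion pattern).
cur.execute(
    "COPY INTO raw_orders FROM @orders_stage "
    "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
)

# Time Travel: query the table as it looked one hour ago.
cur.execute("SELECT COUNT(*) FROM raw_orders AT(OFFSET => -3600)")
print(cur.fetchone())

cur.close()
conn.close()
```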

Posted 3 days ago

5.0 - 10.0 years

5 - 9 Lacs

Chennai

Work from Office

Strong proficiency in Python and SQL for data manipulation and processing. Experience with data warehouse solutions such as Snowflake, BigQuery, and Databricks. Ability to design and implement efficient data models for data lakes and warehouses. Familiarity with CI/CD pipelines and automation tools to streamline data engineering workflows. Deep understanding of data warehousing principles and cloud architecture for building efficient and scalable data systems. Experience with Apache Airflow and/or AWS MWAA. Experience with Snowflake's distinctive features, including multi-cluster architecture and shareable data features. Expertise in distributed processing frameworks like Apache Spark o...

Posted 3 days ago

5.0 - 10.0 years

3 - 7 Lacs

Pune

Work from Office

Desired Skills: AWS, Data Modelling, Python. Key Technical Skills & Responsibilities: Design, develop, and implement ETL processes to extract, transform, and load data into Snowflake. Utilize DBT for efficient and scalable data transformations to create structured JSON, ensuring data quality and integrity. Implement and manage slowly changing dimensions to support evolving data requirements. Collaborate with cross-functional teams to understand data needs and design solutions that align with business objectives. Optimize and fine-tune data pipelines for performance and scalability. Troubleshoot and resolve data-related issues, ensuring the reliability of the data infrastructure. Stay updated on i...
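Since this role pairs DBT transformations with pipeline orchestration, a hedged sketch of wiring a dbt run and test into an Airflow DAG via BashOperator follows; the project path and model selectors are placeholders, not details from the job:

```python
# Hypothetical orchestration of dbt from Airflow; paths and selectors are illustrative.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="dbt_snowflake_transforms",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Run the selected models, then test them before downstream consumers read the tables.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="cd /opt/dbt/project && dbt run --select staging+ dims+",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="cd /opt/dbt/project && dbt test --select staging+ dims+",
    )
    dbt_run >> dbt_test
```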

Posted 3 days ago

5.0 - 10.0 years

20 - 27 Lacs

Hosur, Ahmedabad, Bengaluru

Hybrid

We are looking for an experienced Senior Data Engineer to join our team in Bengaluru / Hosur / Ahmedabad, someone who can help build scalable, reliable, and secure data analytics solutions. Skills Required: 5+ years in data engineering and Microsoft Azure. Experience in implementing a Data Lake with technologies like Azure Data Factory (ADF), PySpark, Databricks, ADLS, and Azure SQL Database. A comprehensive foundation with working knowledge of the Azure full stack, Event Hub, and Streaming Analytics. A passion for writing high-quality code that is modular, scalable, and free of bugs (debugging skills in SQL, Python, or Scala/Java). Enthusiasm to collaborate with various stakeholders across th...

Posted 3 days ago

5.0 - 9.0 years

0 Lacs

Gurugram, All India

On-site

As a Consultant Engineer for the Liquidity Program based in Gurugram, India, you will play a crucial role in designing and constructing liquidity calculations using the bespoke Data Calculation Platform (DCP) according to documented business requirements. Your responsibilities will include: - Working within a dedicated squad to ingest data from producers and implement essential liquidity calculations on a cutting-edge data platform. - Ensuring the delivery of a high-performing, robust, and stable platform that meets the business needs of internal and external stakeholders. - Bringing in-depth knowledge of big data technologies and a strong desire to work in a DevOps environment with end-to-e...

Posted 3 days ago

3.0 - 7.0 years

11 - 15 Lacs

Gurugram

Work from Office

Consultant, Data Analytics & Reporting - Food. Job Purpose and Impact: The Analytics Engineer will prepare business-relevant information and reduce time to insight by leading technical activities that enable information capture, business intelligence, and analytics competency. In this role, you will ensure teams have the data needed to produce timely, accurate, and actionable insights by investing in the core capabilities of data management, data engineering, and key decision-support applications that consume data. You will be a key partner between business needs and data engineering and will deliver the final information product, including the dashboards that enable business value. Key Accountabilit...

Posted 4 days ago

6.0 - 9.0 years

8 - 11 Lacs

Noida

Remote

Must Have: Experience in analyzing, integrating, modelling, and interpreting large volumes of complex data from multiple sources and technologies. Minimum of 2+ years of working experience with Snowflake as a cloud data warehouse and data lake service. Experience in advanced Snowflake topics like RBAC, dynamic tables, and optimization techniques. Proven track record of designing and implementing ELT data pipelines using tools like dbt with Snowflake as the cloud data warehouse. Working experience in building lean and efficient data transformation pipelines using dbt (data build tool) at an advanced level. Exposure to ETL/ELT and data governance tools (incl. Fivetran, Alation). Strong technical proficiency in ...

Posted 4 days ago

5.0 - 10.0 years

12 - 17 Lacs

Bengaluru

Hybrid

Job Title: Data Engineer (NiFi / Kafka / Airflow / ELK / SQL / Power BI). Experience: 5+ Years. Location: Bangalore. Work Mode: Hybrid. Job Description: We are looking for an experienced Data Engineer to design, develop, and manage real-time and batch data pipelines within our Digital Platforms team. Responsibilities: Build and optimize data pipelines using Apache NiFi, Kafka, and Airflow. Develop and maintain ETL/ELT processes ensuring high data quality. Manage data integration across systems using NiFi flows and Kafka topics. Work with the ELK Stack (Elasticsearch, Logstash, Kibana) for monitoring and analytics. Write efficient and optimized SQL queries for data extraction and reporting. Develop Powe...
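Because this stack pairs Kafka topics with the ELK stack, here is a rough sketch of a consumer that reads a topic and indexes events so they are searchable in Kibana; the broker address, topic, and index names are assumptions, and the elasticsearch-py 8.x style API is assumed:

```python
# Illustrative consumer; broker, topic, and index names are placeholders.
import json

from confluent_kafka import Consumer
from elasticsearch import Elasticsearch

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "digital-platforms-etl",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["orders-events"])

es = Elasticsearch("http://localhost:9200")

try:
    while True:
        msg = consumer.poll(1.0)
        if msg is None:
            continue
        if msg.error():
            print("consumer error:", msg.error())
            continue
        event = json.loads(msg.value())
        # Index the raw event so it shows up in Kibana dashboards.
        es.index(index="orders-events", document=event)
finally:
    consumer.close()
```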

Posted 4 days ago

4.0 - 8.0 years

0 Lacs

Kochi, Kerala

On-site

Role Overview: As an AI Data Engineer, you will be responsible for designing, developing, and maintaining data pipelines and infrastructure crucial for AI and machine learning projects. Your role will involve bridging traditional data engineering with specific AI requirements to ensure high-quality data preparation and efficient data flow into AI and GenAI applications. This is a full-time on-site position based at the office in Infopark, Kochi. Key Responsibilities: - Build, test, and maintain scalable data pipelines for AI and machine learning workflows. - Develop and manage architected data solutions to support generative and predictive AI use cases. - Automate data acquisition, transform...

Posted 4 days ago

2.0 - 5.0 years

5 - 9 Lacs

Bengaluru

Work from Office

About The Role. Project Role: Custom Software Engineer. Project Role Description: Develop custom software solutions to design, code, and enhance components across systems or applications. Use modern frameworks and agile practices to deliver scalable, high-performing solutions tailored to specific business needs. Must have skills: Python (Programming Language). Good to have skills: Java. Minimum 5 year(s) of experience is required. Educational Qualification: 15 year emp. Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure project ...

Posted 5 days ago

5.0 - 8.0 years

8 - 17 Lacs

Mumbai, Hyderabad, Bengaluru

Work from Office

Project Role: Data Engineer. Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems. Must have skills: Apache Airflow. Good to have skills: NA. Minimum 5 year(s) of experience is required. Educational Qualification: 15 years full time education. Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating efficient data pipelines, ensuring the integrity and quality of...

Posted 5 days ago

8.0 - 12.0 years

0 - 2 Lacs

Hyderabad

Remote

Technical Requirements. Core Stack: Python 3.11+ and Django 4.2+ with Django REST Framework; PostgreSQL with multi-tenant architecture (schema-per-tenant isolation); Celery + Redis for asynchronous task processing and caching; Django Channels for WebSocket/real-time features; JWT authentication and role-based access control (RBAC); Apache Airflow (DAGs, scheduling, workflow automation). Key Skills: Building RESTful APIs with DRF (serializers, viewsets, authentication); complex database design and ORM optimization; workflow orchestration using Apache Airflow; distributed task queues and background job processing; WebSocket consumers for real-time updates; multi-tenant SaaS architecture patterns; Git version...
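Given the Celery + Redis items in that stack, a minimal sketch of declaring and enqueuing an asynchronous task follows; the app name, broker URLs, and task body are illustrative assumptions, not details from the posting:

```python
# tasks.py -- illustrative only; broker URLs and task logic are placeholders.
from celery import Celery

app = Celery(
    "saas_backend",
    broker="redis://localhost:6379/0",
    backend="redis://localhost:6379/1",
)


@app.task(bind=True, max_retries=3)
def generate_tenant_report(self, tenant_id: int) -> str:
    """Build a report for one tenant in the background (schema-per-tenant setup assumed)."""
    try:
        # ... query the tenant's schema, render the report, upload it ...
        return f"report-{tenant_id}.pdf"
    except Exception as exc:  # simplified retry handling for the sketch
        raise self.retry(exc=exc, countdown=30)


# From a DRF view you would enqueue it without blocking the request:
# generate_tenant_report.delay(tenant_id=42)
```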

Posted 5 days ago

5.0 - 8.0 years

8 - 17 Lacs

Mumbai

Work from Office

Project Role: Data Engineer. Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems. Must have skills: Apache Airflow. Good to have skills: NA. Minimum 5 year(s) of experience is required. Educational Qualification: 15 years full time education. Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating efficient data pipelines, ensuring the integrity and quality of...

Posted 6 days ago

6.0 - 11.0 years

10 - 14 Lacs

Pune

Work from Office

Job Overview: The Senior Software Engineer will own the design, development, and delivery of advanced big data solutions within the Hadoop ecosystem. This role involves architecting and implementing scalable data processing frameworks, ensuring high performance and reliability across distributed systems. The engineer will research and evaluate emerging technologies to optimize data workflows and meet evolving business needs. Additionally, the position requires hands-on expertise in the Cloudera distribution and related tools, ensuring seamless orchestration and integration of data pipelines. The Senior Software Engineer will also be responsible for maintaining operational excellence, meeting SLA r...

Posted 6 days ago

2.0 - 7.0 years

11 - 15 Lacs

Pune

Work from Office

Title and Summary: The Software Engineer II will contribute to the design, development, and maintenance of big data solutions within the Hadoop ecosystem. The role involves working with distributed data processing frameworks and supporting the creation of reliable, high-performance data pipelines. The engineer will be responsible for hands-on development and troubleshooting using the Cloudera distribution, along with related ecosystem tools such as Apache Ozone, Apache Iceberg, Apache Airflow, and Apache NiFi. The position requires hands-on experience in Apache Spark to support large-scale, massively parallel data processing tasks. The Software Engineer II will collaborate closely with senior tea...

Posted 6 days ago

3.0 - 6.0 years

7 - 11 Lacs

Bengaluru

Work from Office

Educational Requirements: Bachelor of Engineering, Bachelor of Technology. Service Line: Cloud & Infrastructure Services. Responsibilities: A day in the life of an Infoscion - As part of the Infosys delivery team, your primary role would be to ensure effective design, development, validation, and support activities, to assure that our clients are satisfied with the high levels of service in the technology domain. You will gather the requirements and specifications to understand the client requirements in a detailed manner and translate them into system requirements. You will play a key role in the overall estimation of work requirements to provide the right information on project estimations to T...

Posted 1 week ago

1.0 - 4.0 years

4 - 8 Lacs

Bengaluru

Work from Office

About The Role. Project Role: Software Development Engineer. Project Role Description: Analyze, design, code, and test multiple components of application code across one or more clients. Perform maintenance, enhancements, and/or development work. Must have skills: Python (Programming Language). Good to have skills: Apache Spark, Microservices and Lightweight Architecture, Apache Airflow. Minimum 7.5 year(s) of experience is required. Educational Qualification: 15 years full time education. Summary: As a Software Development Engineer, you will engage in a dynamic work environment where you will analyze, design, code, and test various components of application code for multiple clients. Your typ...

Posted 1 week ago

8.0 - 12.0 years

60 - 80 Lacs

Noida

Work from Office

Design and manage cloud-native ML platforms. Build ML/ETL pipelines with Apache Airflow and distributed data workflows with Apache Spark. Containerize and deploy ML workloads. Develop CI/CD pipelines. Implement ML observability using CloudWatch, Grafana, and Prometheus. Required Candidate Profile: Strong MLOps profile; 8+ yrs of DevOps; 4+ yrs in MLOps / ML pipelines; 4+ yrs in Apache Airflow; 4+ yrs in Apache Spark; strong in AWS services - EKS/ECS, Lambda, Kinesis, S3, CloudWatch. Must: Python.
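For the ML observability requirement (Prometheus/Grafana), here is a small illustrative sketch of exposing custom metrics from a Python inference or pipeline service with prometheus_client; the metric names, labels, and port are assumptions:

```python
# Illustrative metrics endpoint; metric names, labels, and port are placeholders.
import random
import time

from prometheus_client import Counter, Histogram, start_http_server

PREDICTIONS = Counter("model_predictions_total", "Predictions served", ["model_version"])
LATENCY = Histogram("model_prediction_latency_seconds", "Prediction latency in seconds")


@LATENCY.time()
def predict(features):
    # Stand-in for the real model call.
    time.sleep(random.uniform(0.01, 0.05))
    PREDICTIONS.labels(model_version="v1").inc()
    return 0.5


if __name__ == "__main__":
    start_http_server(8000)  # Prometheus scrapes http://host:8000/metrics
    while True:
        predict([1.0, 2.0])
```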

Posted 1 week ago

5.0 - 9.0 years

0 Lacs

Haryana

On-site

Role Overview: As a Project Leader at BCN Labs, you will be a key player in delivering cutting-edge analytical solutions by utilizing your data engineering expertise and analytical problem-solving skills. Your role will involve working on robust data platform engineering, software development, and client-oriented delivery, requiring a combination of hands-on implementation capabilities and strategic thinking to address real-world business challenges. You will collaborate with analysts, data scientists, and business stakeholders to frame problems, validate solutions, and lead teams in client delivery. Key Responsibilities: - Architect and Deliver Scalable Data Pipelines: Build, optimize, and ...

Posted 1 week ago

6.0 - 10.0 years

0 Lacs

Delhi

On-site

Role Overview: You will be a hands-on Data Engineering Lead with 6+ years of experience and a proven track record of managing at least 2 Data Engineers. Your primary mission will be to lead a small team, write code, design pipelines, and ensure timely delivery of clean, lineage-rich data to power analytics, AI, and customer-facing products. Key Responsibilities: - Lead a small data engineering team - Manage and mentor 2+ Data Engineers by setting clear goals, providing feedback, and conducting regular 1:1s - Recruit, onboard, and develop the team through code reviews, pairing sessions, and personalized learning plans - Take ownership of ingestion and integration processes - Perform s...

Posted 1 week ago

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

Role Overview: You are expected to have at least 3+ years of professional experience in building and operating production-grade applications and services across the stack, including frontend, backend, and databases. Your strong programming skills in Python and/or Scala and SQL will enable you to write modular, testable, and well-documented code for batch and streaming workloads. You should be well-versed in modern data engineering stacks, including distributed processing tools like Apache Spark (preferably Databricks), PySpark/Scala, orchestration tools such as Azure Data Factory or Apache Airflow, and event-driven patterns with Azure Functions/Logic Apps. Your expertise should also cover st...

Posted 1 week ago

6.0 - 10.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Data Platform Engineer at our company, you will be responsible for designing, building, and maintaining scalable data infrastructure using AWS cloud services. Your expertise in Python, PySpark, EMR, and Apache Airflow will be crucial in developing robust data pipelines and analytics solutions to drive business insights. Key Responsibilities: - Design and implement scalable data pipelines using Apache Airflow - Build and optimize AWS EMR clusters for big data processing - Develop data processing applications using Python and PySpark - Create ETL workflows for data ingestion and transformation - Monitor and troubleshoot data platform performance - Collaborate with data scientists and anal...
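As this role combines PySpark and EMR with Airflow orchestration, here is a rough sketch of the kind of PySpark job an EMR step might run; the S3 bucket paths and column names are assumptions, not details from the posting:

```python
# Illustrative PySpark job; S3 paths and columns are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily-events-aggregation").getOrCreate()

# Read the raw event dump for one day from S3 (EMRFS handles the s3:// scheme on EMR).
events = spark.read.json("s3://example-raw-bucket/events/date=2024-01-01/")

# Aggregate to one row per user for downstream analytics tables.
daily_summary = (
    events
    .filter(F.col("event_type").isNotNull())
    .groupBy("user_id")
    .agg(
        F.count("*").alias("event_count"),
        F.max("event_ts").alias("last_seen"),
    )
)

daily_summary.write.mode("overwrite").parquet(
    "s3://example-curated-bucket/daily_summary/date=2024-01-01/"
)

spark.stop()
```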

Posted 1 week ago

2.0 - 6.0 years

0 Lacs

Karnataka

On-site

As a Python Web Scraper at HiLabs, you will be a key player in designing and building scalable and reliable web scraping solutions using Python/PySpark. Your responsibilities will include developing enterprise-grade scraping services, working with large volumes of structured and unstructured data, implementing robust data validation and monitoring processes, and optimizing data workflows for performance and scalability. You will also collaborate with data scientists, analysts, and engineers to integrate data from disparate sources and ensure smooth data flow between systems. **Key Responsibilities:** - Design and build scalable, reliable web scraping solutions using Python/PySpark. - Develop...
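A hedged sketch of the kind of scraper foundation this posting describes, using requests and BeautifulSoup with basic validation; the target URL, CSS selectors, and field names are hypothetical:

```python
# Illustrative scraper skeleton; URL, selectors, and field names are made up.
import requests
from bs4 import BeautifulSoup


def scrape_listings(url: str) -> list[dict]:
    resp = requests.get(url, timeout=30, headers={"User-Agent": "example-scraper/0.1"})
    resp.raise_for_status()

    soup = BeautifulSoup(resp.text, "html.parser")
    records = []
    for card in soup.select("div.listing-card"):
        title = card.select_one("h2")
        price = card.select_one("span.price")
        # Basic validation: skip incomplete cards rather than emitting bad rows.
        if title is None or price is None:
            continue
        records.append({
            "title": title.get_text(strip=True),
            "price": price.get_text(strip=True),
        })
    return records


if __name__ == "__main__":
    print(scrape_listings("https://example.com/listings"))
```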

Posted 1 week ago
