10863 Airflow Jobs - Page 48

Set up a job alert
JobPe aggregates results for easy access, but applications are submitted directly on each job portal.

7.0 - 12.0 years

9 - 18 Lacs

Hyderabad

Hybrid

7+ years of expertise required in the following skill sets: Data Engineering, Python, SQL, PySpark, Cloud (any), Airflow, DevOps & Orchestration, Agile, Scrum, Data Pipeline Development, Stakeholder Management, Banking/Telecom domain experience.

Posted 1 week ago

10.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Company Description At WorkFox Solutions, we bridge the gap between top talent and forward-thinking companies, helping both grow and thrive. We offer end-to-end recruitment services, executive search, talent acquisition strategies, HR consulting, and personalized career counseling. Our approach combines human intuition with data-driven insights to build trust, transparency, and long-term partnerships. Whether you're an employer searching for the perfect fit or a job seeker looking for your next opportunity, we are here to support your journey. Role Description We are seeking a Senior Consultant – AI & Machine Learning with 8–10 years of experience to provide strategic guidance, technical exp...

Posted 1 week ago

0 years

0 Lacs

Ahmedabad, Gujarat, India

Remote

LLM Orchestration & RAG Development (LangChain/LlamaIndex/PydanticAI focus):
- Architect complex LangChain pipelines for multi-agent financial workflows
- Build production RAG systems using LlamaIndex for financial document retrieval
- Implement agents with strong type safety and structured outputs
- Design and implement:
  - Chain-of-thought reasoning for financial analysis
  - Dynamic prompt routing based on query complexity
  - Memory management for long-running financial conversations
  - Tool integration for agents to access GL, bank feeds, and operational data
- Optimise token usage and response latency for real-time WhatsApp interactions
API Development & Integration (FastAPI focus):
- Build high-performance FastA...

Posted 1 week ago

7.0 years

0 Lacs

Hyderabad, Telangana, India

Remote

🌟 We're Hiring: PySpark Data Engineer! 🌟
We are seeking an experienced PySpark Data Engineer to design, develop, and maintain large-scale data processing systems. The ideal candidate will have expertise in Apache Spark, Python, and big data technologies to build robust data pipelines and analytics solutions.
📍 Location: Hyderabad, India
⏰ Work Mode: Flexible office & remote
💼 Role: PySpark Data Engineer
What You'll Do
🎯 Design and develop scalable data pipelines using PySpark
📊 Optimize data processing workflows for performance and reliability
🔧 Build ETL processes for large datasets across multiple sources
☁️ Deploy and maintain data solutions on cloud platforms
📈 Collaborate with data scie...

Posted 1 week ago

6.0 years

15 - 20 Lacs

India

On-site

Design, develop, and deploy ML models for Agentic AI use cases. Work with AWS AI/ML ecosystem (SageMaker, Bedrock, Lambda, Step Functions, S3, DynamoDB, Kinesis). Preprocess and engineer features from structured, unstructured, and streaming data. Collaborate with data engineers to ensure high-quality, well-curated training datasets. Implement LLM fine-tuning, embeddings, and retrieval-augmented generation (RAG) pipelines. Evaluate and optimize models for accuracy, performance, scalability, and cost-efficiency. Integrate models into production applications and APIs. Work with MLOps teams to automate training, testing, deployment, and monitoring workflows. Perform experimentation, A/B testing,...

Posted 1 week ago

5.0 years

0 Lacs

Kochi, Kerala, India

On-site

Role Description
Key Responsibilities:
- Design, develop, and optimize ETL pipelines using PySpark on Google Cloud Platform (GCP).
- Work with BigQuery, Cloud Dataflow, Cloud Composer (Apache Airflow), and Cloud Storage for data transformation and orchestration.
- Develop and optimize Spark-based ETL processes for large-scale data processing.
- Implement best practices for data governance, security, and monitoring in a cloud environment.
- Collaborate with data engineers, analysts, and business stakeholders to understand data requirements.
- Troubleshoot performance bottlenecks and optimize Spark jobs for efficient execution.
- Automate data workflows using Apache Airflow or Cloud Composer.
- Ensure data qua...
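The data-quality practices this listing mentions can be illustrated with two toy checks. This is a hedged sketch only: plain Python stands in for PySpark so it stays self-contained, and the `order_id`/`amount` column names are hypothetical.

```python
# Illustrative data-quality checks of the kind the listing describes.
# Plain Python stands in for PySpark; column names are hypothetical.
rows = [
    {"order_id": 1, "amount": 120.0},
    {"order_id": 2, "amount": 80.5},
    {"order_id": 3, "amount": 99.9},
]

def check_not_null(rows, column):
    """True only if every row has a non-null value in `column`."""
    return all(r.get(column) is not None for r in rows)

def check_unique(rows, column):
    """True only if `column` contains no duplicate values."""
    values = [r[column] for r in rows]
    return len(values) == len(set(values))

# A pipeline would typically fail fast (or quarantine rows) when a check fails.
ok = check_not_null(rows, "amount") and check_unique(rows, "order_id")
```

In a real PySpark job the same assertions would usually run as DataFrame aggregations (null counts, distinct counts) before loading downstream tables.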

Posted 1 week ago

8.0 years

0 Lacs

Kochi, Kerala, India

On-site

Role Description
Job Title: Data Engineer – AWS & PySpark
Location: Trivandrum/Kochi/Chennai/Hyderabad/Bangalore/Pune/Noida
Experience Required: 5–8 years
Employment Type: Full-Time
Department: Data Engineering / Technology
Job Summary: We are seeking a skilled Data Engineer with 5–8 years of experience to design, implement, and maintain robust, scalable data architectures on AWS. The ideal candidate will be highly proficient in PySpark and experienced with AWS cloud data services including S3, Glue, and Redshift. You will work closely with cross-functional teams to enable seamless data flows and ensure efficient ETL pipelines across our cloud infrastructure.
Key Responsibilities: Design, ...

Posted 1 week ago

8.0 years

0 Lacs

Kochi, Kerala, India

On-site

Role Description
Job Title: Data Engineer – AWS & PySpark
Location: Trivandrum/Kochi/Chennai/Hyderabad/Bangalore/Pune/Noida
Experience Required: 5–8 years
Employment Type: Full-Time
Department: Data Engineering / Technology
Job Summary: We are seeking a skilled Data Engineer with 5–8 years of experience to design, implement, and maintain robust, scalable data architectures on AWS. The ideal candidate will be highly proficient in PySpark and experienced with AWS cloud data services including S3, Glue, and Redshift. You will work closely with cross-functional teams to enable seamless data flows and ensure efficient ETL pipelines across our cloud infrastructure.
Key Responsibilities: Design, im...

Posted 1 week ago

0 years

0 Lacs

Kochi, Kerala, India

On-site

Job Description We are looking for a seasoned Senior Data Engineer with a passion for building scalable, reliable, and cutting-edge data solutions on cloud platforms. The ideal candidate will bring deep expertise in Google Cloud Platform (GCP), BigQuery, and modern data engineering practices, including experience with ingestion and transformation tools, as well as proficiency in medallion architecture to drive data quality and governance. Experience with Databricks on AWS is highly valued and considered a distinct advantage. As a key member of our data engineering team, you will architect and implement end-to-end data pipelines that power actionable business intelligence, advanced analytics,...

Posted 1 week ago

0 years

0 Lacs

Trivandrum, Kerala, India

On-site

Role Description
Role Proficiency: This role requires proficiency in developing data pipelines, including coding and testing for ingesting, wrangling, transforming, and joining data from various sources. The ideal candidate should be adept in ETL tools like Informatica, Glue, Databricks, and DataProc, with strong coding skills in Python, PySpark, and SQL. This position demands independence and proficiency across various data domains. Expertise in data warehousing solutions such as Snowflake, BigQuery, Lakehouse, and Delta Lake is essential, including the ability to calculate processing costs and address performance issues. A solid understanding of DevOps and infrastructure needs is also required.
Outcomes...

Posted 1 week ago

8.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

Role Description
Role Proficiency: This role requires proficiency in developing data pipelines, including coding and testing for ingesting, wrangling, transforming, and joining data from various sources. The ideal candidate should be adept in ETL tools like Informatica, Glue, Databricks, and DataProc, with strong coding skills in Python, PySpark, and SQL. This position demands independence and proficiency across various data domains. Expertise in data warehousing solutions such as Snowflake, BigQuery, Lakehouse, and Delta Lake is essential, including the ability to calculate processing costs and address performance issues. A solid understanding of DevOps and infrastructure needs is also required.
Outcomes...

Posted 1 week ago

8.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

Role Description
Job Title: Jr. ML Engineer
Experience Range: 8+ years
Must-Have Skills:
- Programming: strong expertise in Python or R
- Applied Machine Learning:
  - Problem framing – ability to choose between supervised, self-supervised, or reinforcement learning (RL)
  - Data wrangling – experience with weak/distant supervision, pseudo-labelling, strong EDA, data preparation, labelling, and data augmentation
  - End-to-end modelling in ML, DL, and RL
  - Experience with single models, ensembles, mixture of experts
  - Understanding of mathematical induction, tree induction, deep learning fundamentals, and optimization algorithms like SGD
  - Transfer learning – N-shot learning (or variants), fine-tuning skills
- ML/DL...

Posted 1 week ago

6.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

Role Description
Experience Range: 6-12 years
Location: Trivandrum, Kochi, Bangalore
Mandatory Skills:
- Programming Languages: Scala, Spark, PySpark, Python, SQL
- Big Data Technologies: Hadoop, Hive, Pig, MapReduce
- ETL & Data Engineering: Data Warehouse Design, ETL, Data Analytics, Data Mining, Data Cleansing
- Cloud Platforms: GCP, Azure
- Tools & Frameworks: Apache Hadoop, Airflow, Kubernetes, Containers
- Other Skills: data pipeline creation, optimization, troubleshooting, and data validation
Work Experience:
- 6+ years in Data Warehouse and Big Data technologies
- 4+ years of hands-on experience with Scala, Spark, PySpark, Python, and SQL
- 3+ years in strategic data planning, governance, and standard ...

Posted 1 week ago

8.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Key Responsibilities:
- End-to-End Product Ownership: lead the development of predictive models from exploration and prototype to full-scale production deployment.
- Prediction: build robust regression, time-series, and/or deep learning models to predict prices/values of financial assets, oil, apparel, and other commodities.
- Model Optimization: continuously monitor and fine-tune models for accuracy, performance, and scalability using real-time data feedback.
- ML Ops & Deployment: collaborate with engineering to ensure successful deployment and monitoring of models in production environments.
- Stakeholder Collaboration: translate business problems into analytical frameworks, working closely wit...

Posted 1 week ago

0.0 - 1.0 years

0 Lacs

T. Nagar, Chennai, Tamil Nadu

On-site

Experience: 2 to 3 Years
Employment Type: Full-time
Shift: US Timings
Location: T. Nagar, Chennai
Work Mode: On-site / 5 days a week
Job Description: We are seeking a motivated DevOps Engineer with hands-on experience in ETL tools, DevOps practices, SQL, Python, and AWS. The ideal candidate will be responsible for automating workflows, supporting data pipelines, and ensuring system reliability in a fast-paced environment.
Key Responsibilities:
- Develop and manage CI/CD pipelines and infrastructure automation.
- Work with ETL tools to build, schedule, and monitor data workflows.
- Write and optimize SQL queries for data extraction and reporting.
- Develop Python scripts for automation and int...

Posted 1 week ago

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Experience: 5-10 years
Location: Hyderabad
Google Cloud + Ingestion:
- Experience with BigQuery, Cloud Storage, or equivalent cloud platforms
- Knowledge of BigQuery ingress and egress patterns
- Experience in writing Airflow DAGs
- Knowledge of Pub/Sub, Dataflow, or any declarative data pipeline tool using batch and streaming ingestion
Other GCP Services: Vertex AI, Model Registry, Secret Manager, KMS, Composer, KubeFlow, Container Registry, Artefact Registry, Cloud Build, Cloud Run, OAuth2.0, Scheduler, GKE, MIG, Cloud Function, Pub/Sub
- Extensive experience in Google Cloud Platform (GCP) and related services (e.g. IAM, BigQuery, Cloud Storage, Functions, Compute)
- Creating data models an...
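"Writing Airflow DAGs," which this listing asks for, typically means authoring a small configuration-as-code file like the following. This is a minimal sketch assuming Airflow 2.x; the DAG id, schedule, and task bodies are illustrative placeholders, not any employer's pipeline.

```python
# Minimal Airflow 2.x DAG sketch: a daily extract -> load pipeline.
# dag_id, schedule, and task bodies are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull batch from Cloud Storage")  # placeholder task body

def load():
    print("load batch into BigQuery")  # placeholder task body

with DAG(
    dag_id="example_ingest",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_load = PythonOperator(task_id="load", python_callable=load)
    # The >> operator declares the dependency: extract runs before load.
    t_extract >> t_load
```

Airflow parses this file on a schedule and renders the `extract >> load` dependency as a two-node graph in its UI.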

Posted 1 week ago

6.0 - 10.0 years

0 Lacs

Gurugram, Haryana, India

On-site

About Company: Our Client Corporation provides digital engineering and technology services to Forbes Global 2000 companies worldwide. Our Engineering First approach ensures we can execute all ideas and creatively solve pressing business challenges. With industry expertise and empowered agile teams, we prioritize execution early in the process for impactful results. We combine logic, creativity, and curiosity to build, solve, and create. Every day, we help clients engage with new technology paradigms, creatively building solutions that solve their most pressing business challenges and move them to the forefront of their industry.
Job Title: GCP Big Data Engineer
Key Skills: PySpark, Airflow...

Posted 1 week ago

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

About Client: Our Client is a global IT services company headquartered in Southborough, Massachusetts, USA. Founded in 1996, with revenue of $1.8B and 35,000+ associates worldwide, it specializes in digital engineering and IT services, helping clients modernize their technology infrastructure, adopt cloud and AI solutions, and accelerate innovation. It partners with major firms in banking, healthcare, telecom, and media. Our Client is known for combining deep industry expertise with agile development practices, enabling scalable and cost-effective digital transformation. The company operates in over 50 locations across more than 25 countries, has delivery centers in Asia, Europe, an...

Posted 1 week ago

7.0 - 12.0 years

7 - 11 Lacs

Bengaluru, Karnataka, India

On-site

Key deliverables:
- Enhance and maintain the MDM platform to support business needs
- Develop data pipelines using Snowflake, Python, SQL, and orchestration tools like Airflow
- Monitor and improve system performance and troubleshoot data pipeline issues
- Resolve production issues and ensure platform reliability
Role responsibilities:
- Collaborate with data engineering and analytics teams for scalable solutions
- Apply DevOps practices to streamline deployment and automation
- Integrate cloud-native tools and services (AWS, Azure) with the data platform
- Utilize dbt and version control (Git) for data transformation and management

Posted 1 week ago

5.0 - 10.0 years

5 - 10 Lacs

Hyderabad, Telangana, India

On-site

Key Responsibilities:
- Design, build, and maintain data pipelines (ETL/ELT) using BigQuery, Python, and SQL
- Optimize data flow, automate processes, and scale infrastructure
- Develop and manage workflows in Airflow/Cloud Composer and Ascend (or similar ETL tools)
- Implement data quality checks and testing strategies
- Support CI/CD (DevSecOps) processes, conduct code reviews, and mentor junior engineers
- Collaborate with QA/business teams and troubleshoot issues across environments
Core Skills:
- BigQuery, Python, SQL, Airflow/Cloud Composer, Ascend or similar ETL tools
- Data integration, warehousing, and pipeline orchestration
- Data quality frameworks and incremental load strategies
- Strong exper...
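The "incremental load strategies" this listing lists among its core skills usually mean a high-watermark pattern: track the newest timestamp already loaded and pull only rows beyond it. A hedged toy sketch follows, with sqlite3 standing in for BigQuery and a hypothetical `events` table with a `loaded_at` column.

```python
import sqlite3

# Source system: three event rows with load timestamps.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE events (id INTEGER, loaded_at TEXT)")
src.executemany("INSERT INTO events VALUES (?, ?)",
                [(1, "2024-01-01"), (2, "2024-01-02"), (3, "2024-01-03")])

# Target warehouse: only the first row has been loaded so far.
tgt = sqlite3.connect(":memory:")
tgt.execute("CREATE TABLE events (id INTEGER, loaded_at TEXT)")
tgt.execute("INSERT INTO events VALUES (1, '2024-01-01')")

# High watermark: latest timestamp already present in the target.
(watermark,) = tgt.execute("SELECT MAX(loaded_at) FROM events").fetchone()

# Incremental load: pull only rows newer than the watermark,
# instead of re-copying the whole source table.
new_rows = src.execute(
    "SELECT id, loaded_at FROM events WHERE loaded_at > ?", (watermark,)
).fetchall()
tgt.executemany("INSERT INTO events VALUES (?, ?)", new_rows)

count = tgt.execute("SELECT COUNT(*) FROM events").fetchone()[0]
```

After the load, only the two newer rows moved and the target holds all three; the same query shape maps directly onto BigQuery SQL with a partition or timestamp filter.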

Posted 1 week ago

6.0 - 9.0 years

6 - 9 Lacs

Hyderabad, Telangana, India

On-site

- Python Proficiency: strong understanding of Python, with practical coding experience
- AWS: comprehensive knowledge of AWS services and their applications
- Airflow: creating and managing Airflow DAG scheduling
- Unix & SQL: solid command of Unix commands, shell scripting, and writing efficient SQL scripts
- Analytical & Troubleshooting Skills: exceptional ability to analyze data and resolve complex issues
- Development Tasks: proven capability to execute a variety of development activities with efficiency
- Insurance Domain Knowledge: familiarity with the insurance sector is highly advantageous
- Production Data Management: significant experience in managing and processing production data
- Work S...

Posted 1 week ago

6.0 - 10.0 years

6 - 10 Lacs

Hyderabad, Telangana, India

On-site

Role & responsibilities:
- Bachelor's degree in computer science, engineering, or a related field; Master's degree preferred
- Data: 5+ years of experience with data analytics and data warehousing; sound knowledge of data warehousing concepts
- SQL: 5+ years of hands-on experience with SQL and query optimization for data pipelines
- ELT/ETL: 5+ years of experience in Informatica / 3+ years of experience in IICS/IDMC
- Migration Experience: Informatica on-prem to IICS/IDMC migration
- Cloud: 5+ years of experience working in an AWS cloud environment
- Python: 5+ years of hands-on development experience with Python
- Workflow: 4+ years of experience in orchestration and scheduling tools (e.g. Apache Air...

Posted 1 week ago

5.0 - 8.0 years

5 - 8 Lacs

Hyderabad, Telangana, India

On-site

- Python Proficiency: strong understanding of Python, with practical coding experience
- AWS: comprehensive knowledge of AWS services and their applications
- Airflow: creating and managing Airflow DAG scheduling
- Unix & SQL: solid command of Unix commands, shell scripting, and writing efficient SQL scripts
- Analytical & Troubleshooting Skills: exceptional ability to analyze data and resolve complex issues
- Development Tasks: proven capability to execute a variety of development activities with efficiency
- Insurance Domain Knowledge: familiarity with the insurance sector is highly advantageous
- Production Data Management: significant experience in managing and processing production data
- Work S...

Posted 1 week ago

0.0 years

0 Lacs

Bengaluru, Karnataka

On-site

Bengaluru, Karnataka, India
Job Type: Full Time
About the Role
Skillset: AWS architecture & optimization (S3, Redshift, Glue, Lambda), advanced Python/SQL, designing scalable pipelines, data modeling, orchestration tools (Airflow/Step Functions), API integrations
Education: Any Engineering, Any graduation

Posted 1 week ago

0 years

0 Lacs

Trivandrum, Kerala, India

On-site

Role Description
Role Proficiency: This role requires proficiency in developing data pipelines, including coding and testing for ingesting, wrangling, transforming, and joining data from various sources. The ideal candidate should be adept in ETL tools like Informatica, Glue, Databricks, and DataProc, with strong coding skills in Python, PySpark, and SQL. This position demands independence and proficiency across various data domains. Expertise in data warehousing solutions such as Snowflake, BigQuery, Lakehouse, and Delta Lake is essential, including the ability to calculate processing costs and address performance issues. A solid understanding of DevOps and infrastructure needs is also required.
Outcomes...

Posted 1 week ago

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
