10894 Airflow Jobs - Page 50

Set up a Job Alert
JobPe aggregates job listings for easy access, but applications are submitted directly on the original job portal.

5.0 - 10.0 years

5 - 10 Lacs

hyderabad, telangana, india

On-site

Key Responsibilities:
- Design, build, and maintain data pipelines (ETL/ELT) using BigQuery, Python, and SQL
- Optimize data flow, automate processes, and scale infrastructure
- Develop and manage workflows in Airflow/Cloud Composer and Ascend (or similar ETL tools)
- Implement data quality checks and testing strategies
- Support CI/CD (DevSecOps) processes, conduct code reviews, and mentor junior engineers
- Collaborate with QA/business teams and troubleshoot issues across environments
Core Skills:
- BigQuery, Python, SQL, Airflow/Cloud Composer, Ascend or similar ETL tools
- Data integration, warehousing, and pipeline orchestration
- Data quality frameworks and incremental load strategies
- Strong exper...
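As a hedged illustration of the Airflow/Cloud Composer and BigQuery pipeline work this posting describes (not the employer's actual code), the sketch below shows a minimal daily DAG with one BigQuery ELT step and one data-quality check; the DAG id, dataset names, and SQL are hypothetical, and it assumes Airflow 2.4+ with the Google provider installed.

```python
# Minimal illustrative DAG, assuming Airflow 2.4+ and the Google provider;
# project/dataset names and SQL are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

ELT_SQL = """
CREATE OR REPLACE TABLE analytics.daily_orders AS
SELECT order_date, COUNT(*) AS orders
FROM raw.orders
GROUP BY order_date
"""


def check_row_count():
    # Placeholder data-quality check; a real check would query the target
    # table and fail the task when expectations are not met.
    print("row-count check passed")


with DAG(
    dag_id="daily_orders_elt",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    aggregate_orders = BigQueryInsertJobOperator(
        task_id="aggregate_orders",
        configuration={"query": {"query": ELT_SQL, "useLegacySql": False}},
    )
    quality_check = PythonOperator(task_id="quality_check", python_callable=check_row_count)
    aggregate_orders >> quality_check
```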

Posted 1 week ago


6.0 - 9.0 years

6 - 9 Lacs

hyderabad, telangana, india

On-site

- Python Proficiency: Strong understanding of Python, with practical coding experience
- AWS: Comprehensive knowledge of AWS services and their applications
- Airflow: Creating and managing Airflow DAG scheduling
- Unix & SQL: Solid command of Unix commands, shell scripting, and writing efficient SQL scripts
- Analytical & Troubleshooting Skills: Exceptional ability to analyze data and resolve complex issues
- Development Tasks: Proven capability to execute a variety of development activities with efficiency
- Insurance Domain Knowledge: Familiarity with the Insurance sector is highly advantageous
- Production Data Management: Significant experience in managing and processing production data
- Work S...
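To make the "creating and managing Airflow DAG scheduling" and Unix/SQL items concrete, here is a minimal sketch under assumed names: a cron-scheduled Airflow 2.x DAG chaining a shell step and a SQL step. The DAG id, file paths, and commands are placeholders, not details from the posting.

```python
# Illustrative Airflow 2.x DAG; DAG id, paths, and commands are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="nightly_refresh",
    start_date=datetime(2024, 1, 1),
    schedule="30 1 * * *",  # cron schedule: 01:30 every day
    catchup=False,
) as dag:
    # Unix shell step: decompress and list the incoming extract files.
    stage_files = BashOperator(
        task_id="stage_files",
        bash_command="gzip -dkf /data/incoming/*.gz && ls /data/incoming",
    )
    # SQL step, shown here as a CLI call; in practice a provider operator
    # (Postgres, BigQuery, etc.) would usually run the script instead.
    run_sql = BashOperator(
        task_id="run_sql",
        bash_command="psql -f /sql/refresh_claims.sql",
    )
    stage_files >> run_sql
```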

Posted 1 week ago


6.0 - 10.0 years

6 - 10 Lacs

hyderabad, telangana, india

On-site

Role & responsibilities
- Bachelor's degree in computer science, engineering, or a related field; Master's degree preferred
- Data: 5+ years of experience with data analytics and data warehousing; sound knowledge of data warehousing concepts
- SQL: 5+ years of hands-on experience with SQL and query optimization for data pipelines
- ELT/ETL: 5+ years of experience in Informatica / 3+ years of experience in IICS/IDMC
- Migration Experience: Experience with Informatica on-prem to IICS/IDMC migration
- Cloud: 5+ years of experience working in an AWS cloud environment
- Python: 5+ years of hands-on development experience with Python
- Workflow: 4+ years of experience in orchestration and scheduling tools (e.g. Apache Air...

Posted 1 week ago


5.0 - 8.0 years

5 - 8 Lacs

hyderabad, telangana, india

On-site

- Python Proficiency: Strong understanding of Python, with practical coding experience
- AWS: Comprehensive knowledge of AWS services and their applications
- Airflow: Creating and managing Airflow DAG scheduling
- Unix & SQL: Solid command of Unix commands, shell scripting, and writing efficient SQL scripts
- Analytical & Troubleshooting Skills: Exceptional ability to analyze data and resolve complex issues
- Development Tasks: Proven capability to execute a variety of development activities with efficiency
- Insurance Domain Knowledge: Familiarity with the Insurance sector is highly advantageous
- Production Data Management: Significant experience in managing and processing production data
- Work S...

Posted 1 week ago


0.0 years

0 Lacs

bengaluru, karnataka

On-site

Bengaluru, Karnataka, India | Job Type: Full Time
About the Role
Skillset: AWS architecture & optimization (S3, Redshift, Glue, Lambda), advanced Python/SQL, designing scalable pipelines, data modeling, orchestration tools (Airflow/Step Functions), API integrations
Education: Any Engineering, Any graduation
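A small, assumption-heavy sketch of the AWS (S3, Glue, Python) work listed above: staging a file in S3 with boto3 and triggering a Glue job, then polling until it finishes. Bucket, key, and job names are hypothetical, not from the posting.

```python
# Illustrative only; bucket, key, and Glue job names are hypothetical.
import time

import boto3

s3 = boto3.client("s3")
glue = boto3.client("glue")

# Land an extract in the raw zone of the data lake.
s3.upload_file("orders.csv", "example-raw-bucket", "landing/orders.csv")

# Trigger a Glue ETL job and poll until it reaches a terminal state.
run = glue.start_job_run(JobName="orders_to_redshift")
run_id = run["JobRunId"]
while True:
    state = glue.get_job_run(JobName="orders_to_redshift", RunId=run_id)["JobRun"]["JobRunState"]
    if state in ("SUCCEEDED", "FAILED", "STOPPED"):
        print(f"Glue run finished with state {state}")
        break
    time.sleep(30)
```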

Posted 1 week ago


0 years

0 Lacs

trivandrum, kerala, india

On-site

Role Description
Role Proficiency: This role requires proficiency in developing data pipelines, including coding and testing for ingesting, wrangling, transforming, and joining data from various sources. The ideal candidate should be adept in ETL tools like Informatica, Glue, Databricks, and DataProc, with strong coding skills in Python, PySpark, and SQL. This position demands independence and proficiency across various data domains. Expertise in data warehousing solutions such as Snowflake, BigQuery, Lakehouse, and Delta Lake is essential, including the ability to calculate processing costs and address performance issues. A solid understanding of DevOps and infrastructure needs is also required. Outcomes...
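As a rough sketch of the ingest/wrangle/transform/join pipeline work described above (paths, schema, and the Delta target are assumptions, and Delta Lake libraries must be available on the cluster):

```python
# Sketch only: paths, schema, and the Delta target are assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_enrichment").getOrCreate()

# Ingest two hypothetical sources.
orders = spark.read.option("header", True).csv("/raw/orders/")
customers = spark.read.parquet("/raw/customers/")

# Wrangle, join, and filter.
enriched = (
    orders
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .dropDuplicates(["order_id"])
    .join(customers, on="customer_id", how="left")
    .filter(F.col("amount") > 0)
)

# Writing Delta assumes the Delta Lake libraries are available (e.g. on Databricks).
enriched.write.format("delta").mode("overwrite").save("/curated/orders_enriched")
```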

Posted 1 week ago


3.0 - 10.0 years

0 Lacs

pune, maharashtra

On-site

Role Overview: As a Senior Google Cloud Architect in Pune (Hybrid) with over 10 years of experience, including 3+ years specifically on GCP, you will play a crucial role in leading the design and delivery of comprehensive cloud solutions on Google Cloud Platform. Your responsibilities will involve collaborating with data engineering, DevOps, and architecture teams to create scalable, secure, and cost-effective cloud platforms.
Key Responsibilities:
- Designing scalable data and application architectures utilizing tools such as BigQuery, Dataflow, Composer, Cloud Run, Pub/Sub, and other related GCP services.
- Leading cloud migration, modernization, and CI/CD automation through the use of tec...

Posted 1 week ago


3.0 - 7.0 years

0 Lacs

pune, maharashtra

On-site

As a DataOps Engineer, your primary responsibility will be designing and maintaining scalable ML model deployment infrastructure using Kubernetes and Docker. You will need to implement CI/CD pipelines for ML workflows, ensure adherence to security best practices, and set up monitoring tools for tracking system health, model performance, and data pipeline issues. Collaboration with cross-functional teams to optimize the end-to-end lifecycle of data products and identify performance bottlenecks and data reliability issues in the ML infrastructure will also be part of your role.
Key Responsibilities:
- Design and maintain scalable ML model deployment infrastructure using Kubernetes and Docker
- ...
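One common shape of the "ML model deployment infrastructure" piece is a small inference service that gets containerized with Docker and run on Kubernetes; the sketch below is illustrative only, with a hypothetical model file and feature schema, and is not this team's actual stack.

```python
# Illustrative inference service; model file and feature shape are hypothetical.
import joblib
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()
model = joblib.load("model.joblib")  # loaded once at container start-up


class Features(BaseModel):
    values: list[float]


@app.get("/healthz")
def healthz():
    # Liveness/readiness endpoint for Kubernetes probes.
    return {"status": "ok"}


@app.post("/predict")
def predict(features: Features):
    prediction = model.predict([features.values])[0]
    return {"prediction": float(prediction)}
```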

Posted 1 week ago


5.0 years

0 Lacs

india

Remote

Role: Data Engineer
Location: Remote (Bangalore, Chennai, Pune)
Job type: Full time
Pay: 14 LPA - 18 LPA (based on experience)
Timings: A couple of hours overlap with EST, as the client is Canada-based (till 12 AM IST)
Start Date: 20th October 2025
Job Description: We are seeking an experienced Data Engineer with strong expertise in scheduling/orchestration tools (Autosys, Airflow), Python scripting, and Google Cloud Platform (GCP) to support a critical data engineering initiative. The role focuses on building, optimizing, and automating data pipelines in GCP using BigQuery and DAG orchestration. The ideal candidate is hands-on, proactive, and eager to leverage modern cloud tools to deliver sca...
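For a concrete feel of the GCP/BigQuery side of this role, here is a minimal sketch of a parameterized BigQuery query run from Python, the kind of building block an Airflow or Autosys task would wrap. Project, dataset, and query are placeholders and GCP credentials are assumed to be configured in the environment.

```python
# Illustrative only; project, dataset, and query are placeholders.
from google.cloud import bigquery

client = bigquery.Client()  # assumes GCP credentials are configured in the environment

job_config = bigquery.QueryJobConfig(
    query_parameters=[bigquery.ScalarQueryParameter("run_date", "DATE", "2025-10-20")]
)
query = """
SELECT status, COUNT(*) AS orders
FROM `example_project.sales.orders`
WHERE order_date = @run_date
GROUP BY status
"""
for row in client.query(query, job_config=job_config).result():
    print(row.status, row.orders)
```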

Posted 1 week ago


2.0 - 5.0 years

0 Lacs

gurgaon, haryana, india

On-site

Who You'll Work With
Driving lasting impact and building long-term capabilities with our clients is not easy work. You are the kind of person who thrives in a high-performance/high-reward culture - doing hard things, picking yourself up when you stumble, and having the resilience to try another way forward. In return for your drive, determination, and curiosity, we'll provide the resources, mentorship, and opportunities you need to become a stronger leader faster than you ever thought possible. Your colleagues, at all levels, will invest deeply in your development, just as much as they invest in delivering exceptional results for clients. Every day, you'll receive apprenticeship, coaching, and e...

Posted 1 week ago


4.0 - 6.0 years

0 Lacs

gurgaon, haryana, india

On-site

Job Overview: We are seeking a dynamic Consultant to join our data and analytics team, delivering innovative solutions with a focus on the life sciences industry. The ideal candidate will bring current, hands-on expertise in data warehousing (Snowflake, Redshift, Databricks or similar), master data management (MDM), and report development (Power BI, Tableau, Sigma or similar), leveraging cloud platforms (AWS, Azure, GCP). This role involves leading a small team of 2-3 developers, actively contributing to technical delivery, and engaging with clients in an onshore/offshore model. We are particularly excited to find someone passionate about applying Generative AI (Gen AI) to transform the life...

Posted 1 week ago


4.0 - 8.0 years

0 Lacs

pune, maharashtra, india

On-site

Skill Set: AWS, Snowflake, Kafka, Airflow, GitHub, PySpark
Key Responsibilities:
- Design, develop, and maintain scalable ETL/ELT pipelines
- Ingest data from various sources (APIs, databases, files, etc.)
- Implement both real-time and batch processing solutions based on use case requirements
- Ensure data quality through validation and cleansing processes
- Collaborate with Product Managers and Business Stakeholders to gather and understand data requirements
- Translate business needs into technical specifications
- Ensure data security, access control, and compliance with relevant policies
- Maintain documentation and follow best practices for data engineering
Ideal candidate profile: 4-8 years of hands-...
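To illustrate the real-time ingestion and data-quality items above, a minimal sketch using the kafka-python client follows; the topic, broker address, and event schema are assumptions, not details from the posting.

```python
# Sketch only; topic, brokers, and event schema are assumptions.
import json

from kafka import KafkaConsumer  # kafka-python package

consumer = KafkaConsumer(
    "orders-events",
    bootstrap_servers=["localhost:9092"],
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)

REQUIRED_FIELDS = {"order_id", "customer_id", "amount"}

for message in consumer:
    event = message.value
    # Basic data-quality gate: reject events with missing fields or bad amounts.
    if not REQUIRED_FIELDS.issubset(event) or event["amount"] < 0:
        print(f"rejected event: {event}")
        continue
    print(f"accepted order {event['order_id']}")
```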

Posted 1 week ago


3.0 - 7.0 years

0 Lacs

haryana

On-site

As a Data Engineer at Macquarie, you will have the exciting opportunity to implement the group's data strategy by leveraging cutting-edge technology and cloud services. If you are passionate about working in the private markets space and being part of one of Macquarie's most successful global divisions, then this role is tailor-made for you. At Macquarie, we thrive on diversity and empowerment, enabling our global team to shape limitless possibilities with 56 years of unbroken profitability in 31 markets. Join our friendly and supportive team where everyone's ideas contribute to driving outcomes.
**Key Responsibilities:**
- Design and manage data pipelines utilizing Python, SQL, and tools li...

Posted 1 week ago


5.0 - 9.0 years

0 Lacs

maharashtra

On-site

As a Data Engineer, your role is crucial for ensuring the smooth operation of the Data Platform in Azure/AWS Databricks. You will be responsible for the continuous development, enhancement, support, and maintenance of data availability, data quality, performance enhancement, and stability of the system.
**Key Responsibilities:**
- Designing and implementing data ingestion pipelines from various sources using Azure Databricks
- Ensuring the efficient and smooth running of data pipelines and adhering to security, regulatory, and audit control guidelines
- Driving optimization, continuous improvement, and efficiency in data processes
**Qualifications Required:**
- Minimum of 5 years of experien...

Posted 1 week ago


12.0 - 16.0 years

0 Lacs

hyderabad, telangana

On-site

Role Overview: You are an experienced Snowflake Architect with over 12 years of experience in data warehousing, cloud architecture, and Snowflake implementations. Your expertise lies in designing, optimizing, and managing large-scale Snowflake data platforms to ensure scalability, performance, and security. You are expected to possess deep technical knowledge of Snowflake, cloud ecosystems, and data engineering best practices.
Key Responsibilities:
- Lead the design and implementation of Snowflake data warehouses, data lakes, and data marts.
- Define best practices for Snowflake schema design, clustering, partitioning, and optimization.
- Architect multi-cloud Snowflake deployments with seam...
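As a small, hedged example of the clustering/optimization work a Snowflake architect drives (account, credentials, and table names below are placeholders): applying a clustering key with the Snowflake Python connector and inspecting it with SYSTEM$CLUSTERING_INFORMATION.

```python
# Illustrative only; account, credentials, and table names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345", user="deploy_user", password="...",
    warehouse="TRANSFORM_WH", database="ANALYTICS", schema="SALES",
)
cur = conn.cursor()

# Cluster a large fact table on the columns most queries filter on.
cur.execute("ALTER TABLE FACT_ORDERS CLUSTER BY (ORDER_DATE, REGION)")

# Inspect how well micro-partitions line up with the chosen key.
cur.execute("SELECT SYSTEM$CLUSTERING_INFORMATION('FACT_ORDERS', '(ORDER_DATE, REGION)')")
print(cur.fetchone()[0])

cur.close()
conn.close()
```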

Posted 1 week ago


5.0 - 10.0 years

0 Lacs

pune, maharashtra, india

On-site

Job Title: Snowflake & DBT Data Engineer
Experience: 5-10 years
Employment Type: Full-time
Job Summary: We are seeking a highly skilled Data Engineer with strong hands-on experience in Snowflake and dbt (Data Build Tool) to join our data engineering team. The ideal candidate will be responsible for designing and developing scalable data pipelines, performing advanced data transformations, and ensuring data quality using modern data stack technologies.
Key Responsibilities:
- Design, develop, and optimize data pipelines using dbt and Snowflake.
- Build efficient, reliable, and scalable data transformation models with dbt Core or dbt Cloud.
- Implement Snowflake features such as Snowpipe, Streams, ...

Posted 1 week ago


3.0 - 7.0 years

0 Lacs

pune, maharashtra

On-site

As a Data Engineer at DataNimbus, you will play a crucial role in shaping a sustainable, AI-driven future by working with cutting-edge technologies and revolutionizing workflows in Data+AI solutions. Your contributions will be trusted by global businesses for their scalability, security, and efficiency, allowing you to grow personally and professionally in a culture that values curiosity and continuous learning.
**Key Responsibilities:**
- Design, develop, and maintain a highly scalable data warehouse solution in BigQuery.
- Architect data models, schemas, and partitioning strategies for efficient query performance.
- Build and maintain ETL/ELT pipelines using tools such as Dataflow, Datapro...
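One concrete form the "partitioning strategies for efficient query performance" item can take is a date-partitioned, clustered BigQuery table; the sketch below uses the BigQuery Python client with hypothetical project, dataset, and column names.

```python
# Illustrative only; project, dataset, and column names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()

table = bigquery.Table(
    "example_project.analytics.page_views",
    schema=[
        bigquery.SchemaField("view_ts", "TIMESTAMP"),
        bigquery.SchemaField("user_id", "STRING"),
        bigquery.SchemaField("page", "STRING"),
    ],
)
# Partition by day on the event timestamp and cluster by the most-filtered column.
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY, field="view_ts"
)
table.clustering_fields = ["user_id"]

created = client.create_table(table, exists_ok=True)
print("created", created.table_id)
```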

Posted 1 week ago


4.0 - 6.0 years

0 Lacs

ambala, haryana, india

On-site

Senior Data Engineer - Azure, Airflow & PySpark
We're hiring a Senior Data Engineer (6-9 years of experience) to lead the modernization of our data ecosystem from legacy platforms (Azure Synapse, SQL Server) to a next-gen Microsoft Fabric lakehouse. If you bring deep Azure expertise, strong Airflow orchestration skills, and at least 4 years of hands-on PySpark experience, this role is for you.
Key Responsibilities
- Drive the migration of data pipelines and stored procedures into Microsoft Fabric using PySpark, Delta Lake, and OneLake.
- Build and orchestrate scalable workflows with Apache Airflow and Azure Data Factory (ADF).
- Redesign and optimize legacy ADF pipelines for seamless Fabric integrati...

Posted 1 week ago


2.0 - 6.0 years

0 Lacs

maharashtra

On-site

As a Machine Learning Operation Engineer at Onclusive, your primary focus will be on deploying, scaling, and optimizing backend algorithms, data ingestion pipelines, machine learning services, and data platforms. You will be working with vast amounts of text and analytics data to solve challenging marketing problems using your technical knowledge and Big Data analytics skills. Your role is crucial to the success of Onclusive.
- Designing and building scalable machine learning services and data platforms
- Utilizing benchmarks and metrics to measure and improve services
- Managing systems processing tens of millions of jobs per day
- Researching and implementing cutting-edge algorithms
- Coll...

Posted 1 week ago


5.0 - 8.0 years

0 Lacs

india

Remote

Job Title: Senior Lead Data Engineer
Location: 100% remote
Duration: 12 months with possible extension
Working time zone: Full EST hours
Working hours: 8 hours per day (40 hours per week)
Job Description:
Introduction: The Senior Lead Data Engineer plays a crucial role in the portfolio analytics team, focusing on the transition of code development and migration from SAS to Databricks. This position is vital for enhancing data processing capabilities and optimizing performance within the organization, directly impacting the efficiency and accuracy of analytics operations.
Roles and Responsibilities:
- Lead the migration of code and processes from SAS to Databricks, ensuring seamless integration...
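As a hedged illustration of what SAS-to-Databricks migration work typically involves (the table, columns, and the original SAS step are assumptions, not the client's code), here is a PROC MEANS-style summary rewritten as a PySpark aggregation:

```python
# Hedged illustration; the table, columns, and original SAS step are assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("sas_migration_example").getOrCreate()

claims = spark.read.table("portfolio.claims")  # hypothetical Databricks table

# Roughly: PROC MEANS DATA=claims MEAN SUM; CLASS region; VAR paid_amount; RUN;
summary = (
    claims.groupBy("region")
    .agg(
        F.mean("paid_amount").alias("mean_paid"),
        F.sum("paid_amount").alias("total_paid"),
    )
)
summary.write.mode("overwrite").saveAsTable("portfolio.claims_summary")
```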

Posted 1 week ago


0 years

0 Lacs

pune, maharashtra, india

On-site

Selected Intern’s Day-to-day Responsibilities Include:
- Work on advanced AI products related to LLMs, Generative AI, Prompt Engineering, and RAG.
- AI-Powered UI Development: You will build dynamic and responsive user interfaces in Angular that serve as the primary interaction point for our AI services.
- Use of AI tools like N8n, Claude Coding, and Airflow to build AI agents.
- Use of advanced APIs from Google Gemini, cloud-based LLMs, and private LLM deployments to build Generative AI-based systems.
- API Integration: You will expertly integrate and consume RESTful APIs from our AI backend, ensuring seamless data flow and real-time model interaction within the Angular application.
- Prompt Engineering Imp...

Posted 1 week ago


7.0 - 11.0 years

0 Lacs

hyderabad, telangana

On-site

As a Data Quality Engineer at PepsiCo, your primary role will be to apply best practices for data quality and triaging to reduce data downtime on both existing and new pipelines for our business, customer, and Data Science teams.
**Key Responsibilities:**
- Own data quality from existing data pipelines and develop an end-to-end quality framework covering outages, freshness, accuracy, and issue reporting.
- Define best practices for quality development, engineering, and coding as a vital member of our world-class engineering team.
- Collaborate with the product team to leverage core technology platforms such as Direct Commerce, Supply Chain, Marketing Automation, Mobile, and Data, contributin...
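A minimal sketch of one piece of such a quality framework, a freshness check; the warehouse DSN, table, timestamp column, and 6-hour SLA below are assumptions for illustration only.

```python
# Illustrative freshness check; DSN, table, column, and SLA are assumptions.
from datetime import timedelta

import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("postgresql://user:password@warehouse-host/analytics")
MAX_LAG = timedelta(hours=6)

# Assumes loaded_at is stored as a UTC timestamp in the warehouse.
df = pd.read_sql("SELECT NOW() - MAX(loaded_at) AS lag FROM sales_orders", engine)
lag = df["lag"].iloc[0]

if lag > MAX_LAG:
    # In a real framework this would open an alert/incident rather than raise.
    raise RuntimeError(f"sales_orders is stale: last load was {lag} ago (SLA {MAX_LAG})")
print(f"freshness OK: last load {lag} ago")
```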

Posted 1 week ago


5.0 - 9.0 years

0 Lacs

karnataka

On-site

As a Senior Python Engineer at NTT DATA in Bangalore, Karnataka (IN-KA), India, you will be a valuable member of the C3 Data Warehouse team. You will play a crucial role in developing the next-gen data platform that consolidates data from various technology systems into a centralized data platform, enabling reporting and analytics solutions for the Technology Risk functions at Morgan Stanley.
**Key Responsibilities:**
- Develop components in Python for the unified data pipeline framework
- Establish best practices for optimal Snowflake usage
- Assist in testing and deployment using standard frameworks and CI/CD tooling
- Monitor query performance and data loads
- Provide guidance during QA &...

Posted 1 week ago


3.0 - 7.0 years

0 Lacs

pune, maharashtra

On-site

Role Overview: As a Database Designer / Senior Data Engineer at VE3, your primary responsibility will be to architect and design modern, scalable data platforms on AWS and/or Azure. You will ensure the implementation of best practices for security, cost optimization, and performance. Your tasks will involve developing detailed data models, documenting data dictionaries, and lineage to support data solutions. Moreover, you will be involved in building and optimizing ETL/ELT pipelines using languages such as Python, SQL, Scala, and services like AWS Glue, Azure Data Factory, Spark, and Airflow.
Key Responsibilities:
- Collaborate closely with data analysts, BI teams, and stakeholders to transl...

Posted 1 week ago


5.0 - 8.0 years

0 Lacs

bengaluru, karnataka, india

On-site

InMobi Advertising is a global technology leader helping marketers win the moments that matter. Our advertising platform reaches over 2 billion people across 150+ countries and turns real-time context into business outcomes, delivering results grounded in privacy-first principles. Trusted by 30,000+ brands and leading publishers, InMobi is where intelligence, creativity, and accountability converge. By combining lock screens, apps, TVs, and the open web with AI and machine learning, we deliver receptive attention, precise personalization, and measurable impact. Through Glance AI, we are shaping AI Commerce, reimagining the future of e-commerce with inspiration-led discovery and shopping. Des...

Posted 1 week ago
