Apache Airflow Architect

Experience: 9 - 14 years

Compensation: 35 - 40 Lacs

Posted: 3 weeks ago | Platform: Naukri

Work Mode: Work from Office

Job Type: Full Time

Job Description

Join AiRo Digital Labs and build your career with a leader in emerging digital technologies such as robotic process automation, conversational AI, machine learning, the Internet of Things, voice-based technologies, and cloud enablement. At AiRo, we offer competitive benefits and compensation packages along with the opportunity to learn on the job, develop process knowledge, and grow your career. What's more, you will have fun as you solve some of the most complex business problems.

Responsibilities:
- Design and develop DAGs: Create and maintain DAGs to orchestrate workflows, including data loading, transformation, and reporting.
- Implement ETL jobs: Develop and deploy ETL jobs for data extraction, transformation, and loading.
- Manage and configure workflows: Create, manage, and configure data pipelines, ensuring their efficiency and reliability.
- Ensure data quality: Implement data validation processes to maintain data integrity.
- Write custom operators, sensors, and hooks: Develop custom components to extend Airflow's functionality and integrate with specific tools and systems.
- Collaborate with teams: Work with other data engineers and stakeholders to understand business needs and translate them into Airflow workflows.
- Contribute to a GitHub-driven environment: Follow coding standards and contribute to a collaborative development process.
- Write unit and end-to-end tests: Ensure the reliability and quality of Airflow pipelines.
- Work with data sources and storage: Integrate Airflow with various databases, cloud services, and data lakes.

Years of Experience: 9+ years.

Skills Required:
- Proficiency in Python: Strong programming skills in Python, the language used for Airflow.
- Understanding of Airflow architecture and concepts: Knowledge of Airflow's components, DAGs, operators, and scheduling mechanisms.
- Experience with SQL and database design: Familiarity with SQL for data manipulation and database design for managing data pipelines.
- Experience with ETL processes: Understanding of ETL principles and best practices.
- Knowledge of data warehousing and data lakes: Familiarity with data warehousing concepts and data lake technologies.
- Familiarity with version control systems: Experience with Git and other version control systems for code management.
