General Summary:
As a leading technology innovator, Qualcomm pushes the boundaries of what's possible to enable next-generation experiences and drives digital transformation to help create a smarter, connected future for all. As a Qualcomm Hardware Engineer, you will plan, design, optimize, verify, and test electronic systems, bring-up yield, circuits, mechanical systems, Digital/Analog/RF/optical systems, equipment and packaging, test systems, FPGA, and/or DSP systems that launch cutting-edge, world-class products. Qualcomm Hardware Engineers collaborate with cross-functional teams to develop solutions and meet performance requirements.
Minimum Qualifications:
- Bachelor's degree in Computer Science, Electrical/Electronics Engineering, Engineering, or related field and 6+ years of Hardware Engineering or related work experience.
- OR
- Master's degree in Computer Science, Electrical/Electronics Engineering, Engineering, or related field and 5+ years of Hardware Engineering or related work experience.
- OR
- PhD in Computer Science, Electrical/Electronics Engineering, Engineering, or related field and 4+ years of Hardware Engineering or related work experience.
Role Overview
As an Airflow Infrastructure & Automation Engineer, you will design, scale, and maintain Apache Airflow workflow orchestration systems that enable critical automation for our CPU development lifecycle. From silicon design verification to performance analysis, you'll ensure that our pipelines are highly available, secure, and optimized for large-scale compute environments, supporting the engineering teams that deliver world-class CPU technology.
What You'll Do
- Manage Multi-Instance Airflow Deployments: Bring up and manage multiple Apache Airflow instances across different data centers, ensuring high availability, security, and performance.
- Custom Development: Develop custom Airflow DAGs, operators, providers, and plugins to meet specialized workflow requirements.
- Build & Maintain: Design, optimize, and scale Airflow infrastructure and pipelines for CPU design, verification, and performance workflows.
- Platform Engineering: Deploy and manage Airflow on Kubernetes or other distributed environments for scalability and fault tolerance.
- Integrate: Connect Airflow with EDA tools, simulation environments, and HPC clusters.
- Automate: Streamline repetitive engineering tasks and enable CI/CD for pipeline deployments.
- Collaborate: Work closely with CPU architects, verification engineers, and DevOps teams to ensure seamless orchestration across the design ecosystem.
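The responsibilities above center on Airflow's core abstraction: a DAG of tasks in which each task runs only after all of its upstream dependencies succeed. As a minimal, purely illustrative sketch of that dependency-ordered execution model (plain Python standard library, not Airflow's API; the pipeline stages named here are hypothetical):

```python
# Sketch of the DAG-execution model that Airflow implements: tasks are
# ordered so that every task runs after all of its upstream dependencies.
# Stage names are illustrative, not taken from any real pipeline.
from graphlib import TopologicalSorter

# Hypothetical CPU-development pipeline: task -> set of upstream tasks.
deps = {
    "lint_rtl": set(),
    "run_simulations": {"lint_rtl"},
    "collect_coverage": {"run_simulations"},
    "performance_report": {"run_simulations"},
}

# A valid execution order: upstream tasks always precede their dependents.
order = list(TopologicalSorter(deps).static_order())
print(order)
```

In a real deployment these stages would be Airflow tasks (operators) inside a DAG definition, and the scheduler, rather than a single topological pass, would handle retries, parallelism, and distribution across workers.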
What We're Looking For
Technical Skills:
- Strong Python programming skills.
- Experience managing multiple Apache Airflow instances in production.
- Experience developing workflows, pipelines, and DAGs with Apache Airflow, including scaling and performance tuning.
- Familiarity with Kubernetes, containerization (Docker), and distributed systems.
- Knowledge of cloud platforms (AWS/GCP/Azure) and infrastructure-as-code tools (Terraform, Helm).
- DevOps experience , including CI/CD pipeline design and automation.
- Experience managing GitHub repositories and implementing GitHub Actions for CI/CD workflows.
- SQL proficiency for data integration and workflow orchestration.
- Experience with FastAPI for building APIs and React for front-end integration.
- Familiarity with EDA workflows, HPC clusters, or large-scale compute environments is a plus.
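In this role, "SQL proficiency for data integration and workflow orchestration" typically means querying run metadata from a relational store, for example to compute per-task health metrics. A small self-contained sketch using Python's built-in sqlite3 module (the table, columns, and values are hypothetical, chosen only to illustrate the kind of query involved):

```python
import sqlite3

# In-memory database standing in for a pipeline-metadata store.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE task_runs (dag_id TEXT, task_id TEXT, state TEXT, duration_s REAL)"
)
conn.executemany(
    "INSERT INTO task_runs VALUES (?, ?, ?, ?)",
    [
        ("nightly_cpu_regression", "run_simulations", "success", 5400.0),
        ("nightly_cpu_regression", "run_simulations", "failed", 120.0),
        ("nightly_cpu_regression", "collect_coverage", "success", 900.0),
    ],
)

# Typical orchestration query: per-task success rate and average runtime.
rows = conn.execute(
    """
    SELECT task_id,
           AVG(CASE WHEN state = 'success' THEN 1.0 ELSE 0.0 END) AS success_rate,
           AVG(duration_s) AS avg_duration
    FROM task_runs
    GROUP BY task_id
    ORDER BY task_id
    """
).fetchall()
print(rows)
```

The same pattern scales to a production metadata database (e.g. the Postgres backing an Airflow deployment), where such aggregates feed dashboards and alerting for pipeline health.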
Nice-to-Have
- Experience with Airflow on Kubernetes or managed Airflow solutions (MWAA, GCP Composer).
- Familiarity with EDA tools (Synopsys, Cadence) or silicon design flows.
- Knowledge of security best practices for workflow orchestration systems.