About Delta Tech Hub:
Delta Air Lines (NYSE: DAL) is the U.S. global airline leader in safety, innovation, reliability and customer experience. Powered by our employees around the world, Delta has for a decade led the airline industry in operational excellence while maintaining our reputation for award-winning customer service. With our mission of connecting the people and cultures of the globe, Delta strives to foster understanding across a diverse world and serve as a force for social good. Delta has fast emerged as a customer-oriented, innovation-led, technology-driven business. The Delta Technology Hub will contribute directly to these objectives. It will sustain our long-term aspirations of delivering niche, IP-intensive, high-value, and innovative solutions. It supports various teams and functions across Delta and is an integral part of our transformation agenda, working seamlessly with a global team to create memorable experiences for customers.
Responsibilities:
Data Pipeline Development and Maintenance:
- Design, build, and optimize scalable ETL/ELT pipelines to ingest data from diverse sources such as APIs, cloud platforms, and databases.
- Ensure pipelines are robust, efficient, and capable of handling large volumes of data.
Data Integration and Harmonization:
- Implement data transformation and enrichment processes to support analytics and reporting needs.
Data Quality and Monitoring:
- Develop and implement data validation and monitoring frameworks to ensure data accuracy and consistency.
- Troubleshoot and resolve issues related to data quality, latency, or performance.
Collaboration with Stakeholders:
- Partner with cross-functional teams, analysts, and data scientists to understand data requirements and translate them into technical solutions.
- Provide technical support and guidance on data-related issues or projects.
Tooling and Automation:
- Leverage cloud-based solutions and frameworks (e.g., AWS) to streamline processes and enhance automation.
- Maintain and optimize existing workflows while continuously identifying opportunities for improvement.
Documentation and Best Practices:
- Document pipeline architecture, data workflows, and processes for both technical and non-technical audiences.
- Follow industry best practices for version control, security, and data governance.
Continuous Learning and Innovation:
- Stay current with industry trends, tools, and technologies in data engineering and marketing analytics.
- Recommend and implement innovative solutions to improve the scalability and efficiency of data systems.
What you need to succeed (minimum qualifications):
- Bachelor of Science degree in Computer Science or equivalent
- Extensive experience with databases and data platforms (AWS preferred)
- 2+ years of hands-on experience designing, implementing, and managing large-scale data and ETL solutions using AWS compute, storage, and database services (S3, Lambda, Redshift, Glue, Athena, etc.)
- Proficiency in Python, SQL, PySpark
- 2-3 years of post-degree professional experience as a data engineer developing and maintaining data pipelines
- Experience in Data Quality, Data Modeling, Data Analytics/BI, Data Enrichment, Security, and Governance
- Understanding of concepts such as normalization, slowly changing dimensions (SCD), and change data capture (CDC)
- Experience working with streaming event platforms such as Kafka or Kinesis
- Strong knowledge of relational and non-relational databases
- Proficiency in dbt for data transformation and modeling
- Good understanding of data warehouses, ETL/ELT, and AWS architecture (using Glue, SQS, SNS, S3, Step Functions, etc.)
- Strong understanding of orchestration tools such as Airflow
- Ability to create clean, well-designed code and systems
- Proven ability to work with large and complex datasets
- Strong analytical and programming skills with the ability to solve data-related challenges efficiently
- Strong attention to detail and a commitment to data accuracy
- Proven ability to learn new data models quickly and apply them effectively in a fast-paced environment
- Excellent communication skills, with the ability to present complex data findings to both technical and non-technical audiences
- Ability to work collaboratively in a team environment.
Behavioral Competencies:
- Ability to work in collaborative environments and embrace diverse perspectives.
- Communicate clearly and concisely, express thoughts and ideas effectively, and engage with others respectfully across cultural differences.
- Ability to engage effectively with peers and stakeholders to build strong partnerships.
- Prioritize, maintain focus, and consistently deliver on commitments.
- Proactively understand customer expectations and willingly create customer-focused solutions.
What will give you a competitive edge (preferred qualifications):
- Experience working with AWS to develop data pipelines
- AWS certifications: Solutions Architect Associate or Developer Associate
- Experience migrating data pipelines and systems to modern cloud-based solutions