Designation: Senior Engineer, Data Science and Engineering
Location: 22nd Floor, Tower C, Building No. 5, DLF Epitome, Gurgaon, Haryana.
TBO.com (www.tbo.com)
TBO is a global platform that aims to simplify all buying and selling travel needs of travel partners across the world. The proprietary technology platform aims to simplify the demands of the complex world of global travel by seamlessly connecting the highly distributed travel buyers and travel suppliers at scale.
The TBO journey began in 2006 with a simple goal: to address the evolving needs of travel buyers and suppliers. What started as a single-product air-ticketing company has today become the leading B2A (Business to Agents) travel portal across the Americas, UK & Europe, Africa, the Middle East, India, and Asia Pacific.
Today, TBO's product range spans air, hotels, rail, holiday packages, car rentals, transfers, sightseeing, cruises, and cargo. Apart from these products, our proprietary platform relies heavily on AI/ML to offer unique listings and products that meet specific customer requirements, thus increasing conversions.
TBO's approach has always been technology-first, and we continue to invest in new innovations and offerings to make travel easy and simple. TBO's travel APIs serve large travel ecosystems across the world, while the platform's modular architecture enables new travel products and expansion into new geographies.
Why TBO:
You will influence and contribute to building the world's largest technology-led travel distribution network for a $9 trillion global travel market.
We are an emerging leader in technology-led, end-to-end travel management in the B2B space.
Physical Presence in 47 countries with business in 110 countries.
Our Gross Transaction Volume (GTV) runs into several billions, and we are growing much faster than the industry, backed by a proven and well-established business model.
We are reputed for our long-lasting, trusted relationships. We stand by our ecosystem of suppliers and buyers to serve the end customer.
An open and informal start-up environment that cares.
What TBO offers to the Life Traveler in You:
A chance to work with CXO leaders. Our leadership comes from top IITs and IIMs, or has led significant business journeys for top Indian and global brands.
Enhance your leadership acumen. Join the journey to create global scale and become the world's best.
Challenge yourself to do something path-breaking. Be empowered. The only thing stopping you will be your imagination.
The travel space is likely to see significant growth. Witness and shape this space; it will be an exciting journey.
Key Responsibilities:
- Design, develop, and maintain data pipelines and ETL workflows using Apache Spark and AWS Glue (see the sketch after this list for an illustration).
- Build and optimize large-scale data lake solutions leveraging HDFS, Apache Hudi, and AWS S3.
- Develop and manage data ingestion frameworks, transformations, and data modeling for analytical workloads.
- Enable data discovery and querying using AWS Athena and other analytical tools.
- Implement and monitor data quality, security, and governance practices across the data platform.
- Work closely with DevOps and Cloud teams to ensure data infrastructure is reliable, secure, and cost-efficient.
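To illustrate the kind of pipeline work this role involves, here is a minimal PySpark sketch that upserts an ingested batch into an Apache Hudi table on S3. It is only a sketch: the bucket, table, and column names (orders, order_id, updated_at, order_date) are hypothetical, and it assumes a Spark environment (for example an AWS Glue job or an EMR cluster) that already has the Hudi bundle on its classpath.

```python
from pyspark.sql import SparkSession

# Assumes the Hudi Spark bundle is already available on the cluster
# (e.g. an AWS Glue job or an EMR cluster with Hudi enabled).
spark = (
    SparkSession.builder
    .appName("orders-hudi-upsert")  # hypothetical job name
    .config("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
    .getOrCreate()
)

# Hypothetical raw input and target paths.
raw_path = "s3://example-raw-bucket/orders/2024-06-01/"
hudi_path = "s3://example-lake-bucket/hudi/orders/"

# Read the incoming batch (format and schema are assumptions).
incoming = spark.read.parquet(raw_path)

# Core Hudi write options: record key, precombine field, and partition path.
hudi_options = {
    "hoodie.table.name": "orders",
    "hoodie.datasource.write.recordkey.field": "order_id",
    "hoodie.datasource.write.precombine.field": "updated_at",
    "hoodie.datasource.write.partitionpath.field": "order_date",
    "hoodie.datasource.write.operation": "upsert",
}

# Upsert the batch into the Hudi table; repeated runs deduplicate on order_id.
(
    incoming.write
    .format("hudi")
    .options(**hudi_options)
    .mode("append")
    .save(hudi_path)
)
```

Once such a table is synced to the Glue Data Catalog (for example via Hudi's hive-sync options), it can be queried directly from AWS Athena, which is the discovery-and-querying layer mentioned above.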
Required Skills & Experience:
- 5–8 years of hands-on experience in Data Engineering or related roles.
- Strong proficiency in Apache Spark (PySpark/Scala) and AWS Glue.
- Solid understanding of HDFS, Data Lake architectures, and distributed data processing.
- Hands-on experience with Apache Hudi for incremental data processing and time-travel capabilities (a brief sketch follows this list).
- Experience in the AWS ecosystem (S3, Athena, Lambda, IAM, EMR, etc.).
- Strong SQL skills and experience with query optimization.
- Good understanding of data modeling, ETL design patterns, and data lifecycle management.
- Familiarity with version control (Git) and CI/CD processes for data pipelines.
- Exposure to building an end-to-end data science solution is a plus.
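As a hedged illustration of the Hudi capabilities listed above, the sketch below shows an incremental read (only records committed after a given instant) and a time-travel read (the table as of a past instant). The table path and commit timestamps are hypothetical, and it assumes the same Hudi-enabled Spark environment as the earlier sketch.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("orders-hudi-reads").getOrCreate()

# Hypothetical Hudi table location on S3.
hudi_path = "s3://example-lake-bucket/hudi/orders/"

# Incremental query: fetch only records written after a known commit instant.
# In a real pipeline, the begin instant would be checkpointed between runs.
incremental_df = (
    spark.read.format("hudi")
    .option("hoodie.datasource.query.type", "incremental")
    .option("hoodie.datasource.read.begin.instanttime", "20240601000000")  # hypothetical instant
    .load(hudi_path)
)
incremental_df.createOrReplaceTempView("orders_increment")
spark.sql("SELECT count(*) FROM orders_increment").show()

# Time-travel query: read the table as it looked at a past point in time.
as_of_df = (
    spark.read.format("hudi")
    .option("as.of.instant", "2024-06-01 00:00:00.000")  # hypothetical timestamp
    .load(hudi_path)
)
as_of_df.show(5)
```

The same incremental pattern is what keeps downstream analytical models fresh without full reloads, which is the reason it appears alongside ETL design patterns and data lifecycle management in the requirements above.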