AI Engineer
This is a hands-on engineering role requiring solid backend experience, deep curiosity about AI, and the ability to translate cutting-edge research into scalable, real-world solutions.
Key Responsibilities
- Design and build AI-powered microservices and APIs using Python and .NET 9+.
- Architect and implement MCP servers and AI integration frameworks for multi-model orchestration.
- Collaborate with data scientists to train, evaluate, and deploy machine learning and LLM-based models into production environments.
- Build backend systems and RESTful APIs for AI-driven applications and integrations with internal and external systems.
- Develop and maintain pipelines for model versioning, testing, and CI/CD automation.
- Ensure security, scalability, and observability of deployed AI systems in Azure or multi-cloud environments.
- Work with vector databases, embeddings, and retrieval-augmented generation (RAG) setups.
- Integrate third-party AI APIs (e.g., OpenAI, Anthropic, Azure AI) while optimizing cost and performance.
- Stay ahead of the curve on AI trends (LLMs, multimodal models, MCP, LangChain, semantic caching, agent frameworks).
- Document architecture and contribute to internal AI best practices and engineering standards.
Required Qualifications & Skills
Education:
Bachelor’s or Master’s degree in Computer Science, Mathematics, or Physics (or equivalent experience).
Experience:
- 4–7 years of experience as a backend engineer, ideally with exposure to AI or data-intensive systems.
- Strong background in API development, integration, and distributed systems.
- Proven hands-on experience with Python and .NET 9 or higher.
- Knowledge of Golang is a plus.
AI/ML Skills:
- Understanding of LLM architectures, embeddings, and prompt-engineering workflows.
- Experience deploying or integrating AI models into production applications.
- Familiarity with machine learning pipelines and libraries (e.g., PyTorch, TensorFlow, Scikit-learn).
MCP Expertise:
- Experience in building and managing Model Context Protocol (MCP) servers, ensuring interoperability between AI models and backend systems.
Cloud & DevOps:
- Hands-on experience with Azure services, Docker/Kubernetes, and CI/CD pipelines (Azure DevOps, GitHub Actions, etc.).
Collaboration:
- Strong communication and teamwork skills; ability to partner with data scientists, backend developers, and product stakeholders.
Mindset:
- Avid learner who actively follows the latest developments in AI frameworks, open-source models, and developer tools.
Nice to Have
- Experience with LangChain, LlamaIndex, or agentic orchestration frameworks.
- Familiarity with vector databases (Pinecone, Milvus, Redis Vector).
- Experience integrating observability tools for model monitoring and feedback loops.
- Prior experience in the hospitality, travel, or fintech industry.
- Exposure to event-driven architectures (Kafka, RabbitMQ, Service Bus).
Why Join IOL
At IOL, you’ll work on transformative technology shaping the future of hospitality and travel. You’ll have the freedom to experiment with cutting-edge AI tools, the support of world-class engineers, and the opportunity to bring intelligent systems into a real, large-scale production environment.
Join us to shape the future of AI-driven travel technology.