Bengaluru
INR 35.0 - 45.0 Lacs P.A.
Work from Office
Full Time
Seeking a skilled and experienced Technical Manager to:
- Oversee successful planning
- Lead end-to-end solution delivery, from requirements gathering to implementation
- Coordinate with engineering, product management, and QA teams to ensure seamless execution

Required Candidate Profile:
- Strong technical background
- Experience in software engineering, solution delivery, and project management
- Knowledge of Agile methodologies
- Certifications such as PMP, Scrum Master, ITIL, or RTE
Bengaluru
INR 35.0 - 45.0 Lacs P.A.
Work from Office
Full Time
Seeking an Architect to:
- Design the core technology stack, services, and infrastructure
- Define scalable, reliable, and high-performance system architectures
- Create reusable components and APIs
- Collaborate with DevOps teams to drive CI/CD

Required Candidate Profile:
- Experience designing and scaling large enterprise products/platforms, with at least 5 years in architectural leadership roles
- Certifications such as TOGAF or AWS Certified Solutions Architect
Bengaluru
INR 35.0 - 45.0 Lacs P.A.
Work from Office
Full Time
Title: AI/ML Architect
Location: Onsite, Bangalore
Experience: 10+ years

Position Summary:
We are seeking an experienced AI/ML Architect to lead the design and deployment of scalable AI solutions. This role requires a strong blend of technical depth, systems thinking, and leadership in machine learning, computer vision, and real-time analytics. You will drive the architecture for edge, on-prem, and cloud-based AI systems, integrating third-party data sources, sensor data, and vision data to enable predictive, prescriptive, and autonomous operations across industrial environments.

Key Responsibilities:
Architecture & Strategy
- Define the end-to-end architecture for AI/ML systems, including time-series forecasting, computer vision, and real-time classification.
- Design scalable ML pipelines (training, validation, deployment, retraining) using MLOps best practices.
- Architect hybrid deployment models supporting both cloud and edge inference for low-latency processing.
Model Integration
- Guide the integration of ML models into the IIoT platform for real-time insights, alerting, and decision support.
- Support model-fusion strategies that combine disparate data sources and sensor streams with visual data (e.g., object detection + telemetry + third-party data ingestion).
MLOps & Engineering
- Define and implement ML lifecycle tooling, including version control, CI/CD, experiment tracking, and drift detection (see the sketch below this posting).
- Ensure compliance, security, and auditability of deployed ML models.
Collaboration & Leadership
- Collaborate with Data Scientists, ML Engineers, DevOps, Platform, and Product teams to align AI efforts with business goals.
- Mentor engineering and data teams in AI system design, optimization, and deployment strategies.
- Stay ahead of AI research and industrial best practices; evaluate and recommend emerging technologies (e.g., LLMs, vision transformers, foundation models).

Must-Have Qualifications:
- Bachelor's or Master's degree in Computer Science, AI/ML, Engineering, or a related technical field.
- 8+ years of experience in AI/ML development, with 3+ years architecting AI solutions at scale.
- Deep understanding of ML frameworks (TensorFlow, PyTorch), time-series modeling, and computer vision.
- Proven experience with object detection, facial recognition, intrusion detection, and anomaly detection in video or sensor environments.
- Experience with MLOps tooling (MLflow, TFX, Kubeflow, SageMaker, etc.) and model deployment on Kubernetes/Docker.
- Proficiency in edge AI (Jetson, Coral TPU, OpenVINO) and cloud platforms (AWS, Azure, GCP).

Nice-to-Have Skills:
- Knowledge of stream processing (Kafka, Spark Streaming, Flink).
- Familiarity with OT systems and IIoT protocols (MQTT, OPC-UA).
- Understanding of regulatory and safety compliance in AI/vision for industrial settings.
- Experience with charts, dashboards, and integrating AI with front-end systems (e.g., alerts, maps, command-center UIs).

Role Impact:
As AI/ML Architect, you will shape the intelligence layer of our IIoT platform, enabling smarter, safer, and more efficient industrial operations through AI. You will bridge research and real-world impact, ensuring our AI stack is scalable, explainable, and production-grade from day one.
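By way of illustration of the drift-detection responsibility above (an editorial sketch, not part of the role description): a minimal population stability index (PSI) check in Python. The function name, thresholds, and data are hypothetical; it assumes only numpy and two 1-D samples of the same feature.

```python
import numpy as np

def population_stability_index(expected, actual, bins=10):
    """Hypothetical drift check: PSI between training data and live data.

    Common rule-of-thumb thresholds: < 0.1 no drift, 0.1-0.25 moderate,
    > 0.25 significant.
    """
    # Bin edges are fixed from the reference (training) distribution;
    # live values outside that range simply fall out of all bins.
    edges = np.histogram_bin_edges(expected, bins=bins)
    exp_frac = np.histogram(expected, bins=edges)[0] / len(expected)
    act_frac = np.histogram(actual, bins=edges)[0] / len(actual)
    # Clip away empty bins to avoid log(0) and division by zero.
    exp_frac = np.clip(exp_frac, 1e-6, None)
    act_frac = np.clip(act_frac, 1e-6, None)
    return float(np.sum((act_frac - exp_frac) * np.log(act_frac / exp_frac)))

# Simulated drift: the live feature is shifted and wider than at training time.
rng = np.random.default_rng(0)
train = rng.normal(0.0, 1.0, 10_000)
live = rng.normal(0.4, 1.2, 10_000)
print(f"PSI = {population_stability_index(train, live):.3f}")
```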
Bengaluru
INR 40.0 - 45.0 Lacs P.A.
Work from Office
Full Time
Title: Principal Architect
Location: Onsite, Bangalore
Experience: 15+ years in software and data platform architecture and technology strategy

Role Overview:
We are seeking a Platform & Application Architect to lead the design and implementation of a next-generation, multi-domain data platform and its ecosystem of applications. In this strategic and hands-on role, you will define the overall architecture, select and evolve the technology stack, and establish best practices for governance, scalability, and performance. Your responsibilities will span the full data lifecycle (ingestion, processing, storage, and analytics) while ensuring the platform is adaptable to diverse and evolving customer needs. This role requires close collaboration with product and business teams to translate strategy into actionable, high-impact platforms and products.

Key Responsibilities:
1. Architecture & Strategy
- Design the end-to-end architecture for an on-prem/hybrid data platform (data lake/lakehouse, data warehouse, streaming, and analytics components).
- Define and document data blueprints, data domain models, and architectural standards.
- Lead build-vs.-buy evaluations for platform components and recommend best-fit tools and technologies.
2. Data Ingestion & Processing
- Architect batch and real-time ingestion pipelines using tools like Kafka, Apache NiFi, Flink, or Airbyte.
- Oversee scalable ETL/ELT processes and orchestrators (Airflow, dbt, Dagster); see the orchestration sketch below this posting.
- Support diverse data sources: IoT, operational databases, APIs, flat files, unstructured data.
3. Storage & Modeling
- Define strategies for data storage and partitioning (data lakes, warehouses, Delta Lake, Iceberg, or Hudi).
- Develop efficient data strategies for both OLAP and OLTP workloads.
- Guide schema evolution, data versioning, and performance tuning.
4. Governance, Security, and Compliance
- Establish data governance, cataloging, and lineage-tracking frameworks.
- Implement access controls, encryption, and audit trails to ensure compliance with DPDPA, GDPR, HIPAA, etc.
- Promote standardization and best practices across business units.
5. Platform Engineering & DevOps
- Collaborate with infrastructure and DevOps teams to define CI/CD, monitoring, and DataOps pipelines.
- Ensure observability, reliability, and cost efficiency of the platform.
- Define SLAs, capacity planning, and disaster recovery plans.
6. Collaboration & Mentorship
- Work closely with data engineers, scientists, analysts, and product owners to align platform capabilities with business goals.
- Mentor teams on architecture principles, technology choices, and operational excellence.

Skills & Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 12+ years of experience in software engineering, including 5+ years in architectural leadership roles.
- Proven expertise in designing and scaling distributed systems, microservices, APIs, and event-driven architectures using Java, Python, or Node.js.
- Strong hands-on experience building scalable data platforms in on-premise, hybrid, or cloud environments.
- Deep knowledge of modern data lake and warehouse technologies (e.g., Snowflake, BigQuery, Redshift) and table formats like Delta Lake or Iceberg.
- Familiarity with data mesh, data fabric, and lakehouse paradigms.
- Strong understanding of system reliability, observability, DevSecOps practices, and platform engineering principles.
- Demonstrated success leading large-scale architectural initiatives across enterprise-grade or consumer-facing platforms.
- Excellent communication, documentation, and presentation skills, with the ability to simplify complex concepts and influence at executive levels.
- Certifications such as TOGAF or AWS Solutions Architect (Professional) and experience in regulated domains (e.g., finance, healthcare, aviation) are desirable.
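To ground the orchestration bullet above (again an illustrative sketch, not part of the posting): a minimal daily batch-ingestion DAG in Python for Airflow 2.4+ (where the `schedule` argument replaced `schedule_interval`). The DAG id, task, and ingest function are hypothetical placeholders.

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator

def ingest_daily_batch(**context):
    # Placeholder for a real extract/load step, e.g. pulling from an
    # operational database or API and landing files in the data lake.
    print(f"ingesting partition for {context['ds']}")

with DAG(
    dag_id="daily_batch_ingestion",  # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
):
    PythonOperator(
        task_id="ingest_daily_batch",
        python_callable=ingest_daily_batch,
    )
```

In a real platform this single task would fan out into extract, load, and quality-check stages; the retry settings are where an architect encodes the reliability expectations the posting alludes to.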
Bengaluru
INR 20.0 - 30.0 Lacs P.A.
Work from Office
Full Time
Job Title: Data Analyst
Location: Bangalore
Experience: 6+ years

Role Overview:
We are seeking a skilled Data Analyst to support our platform powering operational intelligence across airports and similar sectors. The ideal candidate will have experience working with time-series datasets and operational information to uncover trends, anomalies, and actionable insights. This role works closely with data engineers, ML teams, and domain experts to turn raw data into meaningful intelligence for business and operations stakeholders.

Key Responsibilities:
- Analyze time-series and sensor data from various sources.
- Develop and maintain dashboards, reports, and visualizations to communicate key metrics and trends.
- Correlate data from multiple systems (vision, weather, flight schedules, etc.) to provide holistic insights.
- Collaborate with AI/ML teams to support model validation and interpret AI-driven alerts (e.g., anomalies, intrusion detection).
- Prepare and clean datasets for analysis and modeling; ensure data quality and consistency.
- Work with stakeholders to understand reporting needs and deliver business-oriented outputs.

Qualifications & Required Skills:
- Bachelor's or Master's degree in Data Science, Statistics, Computer Science, Engineering, or a related field.
- 5+ years of experience in a data analyst role, ideally in a technical/industrial domain.
- Strong SQL skills and proficiency with BI/reporting tools (e.g., Power BI, Tableau, Grafana).
- Hands-on experience analyzing structured and semi-structured data (JSON, CSV, time series).
- Proficiency in Python or R for data manipulation and exploratory analysis.
- Understanding of time-series databases or streaming data (e.g., InfluxDB, Kafka, Kinesis).
- Solid grasp of statistical analysis and anomaly detection methods (see the sketch below this posting).
- Experience working with data from industrial systems or large-scale physical infrastructure.

Good-to-Have Skills:
- Domain experience in airports, smart infrastructure, transportation, or logistics.
- Familiarity with data platforms (Snowflake, BigQuery, or custom-built on open source).
- Exposure to tools like Airflow, Jupyter notebooks, and data quality frameworks.
- Basic understanding of AI/ML workflows and data preparation requirements.
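As a concrete instance of the anomaly-detection methods mentioned above (an illustrative sketch only, not part of the posting): a trailing rolling z-score flag on a time series in Python with pandas. The window, threshold, and data are hypothetical.

```python
import numpy as np
import pandas as pd

def rolling_zscore_flags(series: pd.Series, window: int = 60,
                         threshold: float = 3.0) -> pd.Series:
    """Flag points more than `threshold` sigmas from a trailing rolling mean."""
    mean = series.rolling(window, min_periods=window).mean()
    std = series.rolling(window, min_periods=window).std()
    return ((series - mean) / std).abs() > threshold

# Synthetic sensor feed sampled every minute, with one injected spike.
idx = pd.date_range("2024-01-01", periods=500, freq="min")
values = pd.Series(np.random.default_rng(1).normal(20.0, 0.5, 500), index=idx)
values.iloc[400] += 5.0  # the anomaly we expect to catch
print(values[rolling_zscore_flags(values)])
```

Note that the trailing window includes the current point, which slightly dampens the z-score; a production check would tune the window to the sensor's sampling rate and seasonality.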
Hyderabad
INR 30.0 - 40.0 Lacs P.A.
Work from Office
Full Time
Job Description: Kafka/Integration Architect

Position Brief:
The Kafka/Integration Architect is responsible for designing, implementing, and managing Kafka-based streaming data pipelines and messaging solutions. This role involves configuring, deploying, and monitoring Kafka clusters to ensure the high availability and scalability of data streaming services. The Kafka Architect collaborates with cross-functional teams to integrate Kafka into various applications and ensures optimal performance and reliability of the data infrastructure. Kafka/Integration Architects play a critical role in driving data-driven decision-making and enabling real-time analytics, contributing directly to the company's agility, operational efficiency, and ability to respond quickly to market changes. Their work supports key business initiatives by ensuring that data flows seamlessly across the organization, empowering teams with timely insights and enhancing the customer experience.

Location: Hyderabad

Primary Role & Responsibilities:
- Design, implement, and manage Kafka-based data pipelines and messaging solutions to support critical business operations and enable real-time data processing.
- Configure, deploy, and maintain Kafka clusters, ensuring high availability and scalability to maximize uptime and support business growth.
- Monitor Kafka performance and troubleshoot issues to minimize downtime and ensure uninterrupted data flow, enhancing decision-making and operational efficiency.
- Collaborate with development teams to integrate Kafka into applications and services (see the producer sketch below this posting).
- Develop and maintain Kafka connectors such as JDBC, MongoDB, and S3 connectors, along with topics and schemas, to streamline data ingestion from databases, NoSQL data stores, and cloud storage, enabling faster data insights.
- Implement security measures to protect Kafka clusters and data streams, safeguarding sensitive information and maintaining regulatory compliance.
- Optimize Kafka configurations for performance, reliability, and scalability.
- Automate Kafka cluster operations using infrastructure-as-code tools like Terraform or Ansible to increase operational efficiency and reduce manual overhead.
- Provide technical support and guidance on Kafka best practices to development and operations teams, enhancing their ability to deliver reliable, high-performance applications.
- Maintain documentation of Kafka environments, configurations, and processes to ensure knowledge transfer, compliance, and smooth team collaboration.
- Stay updated on the latest Kafka features, updates, and industry best practices to continuously improve data infrastructure and stay ahead of industry trends.

Required Soft Skills:
- Strong analytical and problem-solving skills.
- Excellent communication and collaboration skills.
- Ability to translate business requirements into technical solutions.

Working Experience and Qualification:
Education: Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
Experience: Proven experience of 8-10 years as a Kafka Architect or in a similar role.
Skills:
- Strong knowledge of Kafka architecture, including brokers, topics, partitions, and replicas.
- Experience with Kafka security, including SSL, SASL, and ACLs.
- Proficiency in configuring, deploying, and managing Kafka clusters in cloud and on-premises environments.
- Experience with Kafka stream processing using tools like Kafka Streams, KSQL, or Apache Flink.
- Solid understanding of distributed systems, data streaming, and messaging patterns.
- Proficiency in Java, Scala, or Python for Kafka-related development tasks.
- Familiarity with DevOps practices, including CI/CD pipelines, monitoring, and logging.
- Experience with tools like ZooKeeper, Schema Registry, and Kafka Connect.
- Strong problem-solving skills and the ability to troubleshoot complex issues in a distributed environment.
- Experience with cloud platforms like AWS, Azure, or GCP.

Preferred Skills (Optional):
- Kafka certification or related credentials, such as:
  - Confluent Certified Administrator for Apache Kafka (CCAAK)
  - Cloudera Certified Administrator for Apache Kafka (CCA-131)
  - AWS Certified Data Analytics - Specialty (with a focus on streaming data solutions)
- Knowledge of containerization technologies like Docker and Kubernetes.
- Familiarity with other messaging systems like RabbitMQ or Apache Pulsar.
- Experience with data serialization formats like Avro, Protobuf, or JSON.

Company Profile:
WAISL is an ISO 9001:2015, ISO 20000-1:2018, and ISO 22301:2019 certified, CMMI Level 3 appraised digital transformation partner for businesses across industries, with a core focus on aviation and related adjacencies. We transform airports and related ecosystems through digital interventions with a strong service-excellence culture. As a leader in our chosen space, we deliver world-class services focused on airports and their related domains, enabled through outcome-focused, next-gen digital/technology solutions. At present, WAISL is the primary technology solutions partner for Indira Gandhi International Airport, Delhi; Rajiv Gandhi International Airport, Hyderabad; Manohar International Airport, Goa; Kannur International Airport, Kerala; and Kuwait International Airport, and we expect to soon provide similar services for other airports in India and globally. As a digital transformation partner, WAISL brings proven credibility in managing and servicing 135+ Mn passengers and 80+ airlines, with core integration, deployment, and real-time management experience of 2000+ applications, delivered vendor-agnostically in highly complex, technology-converging ecosystems. This excellence in managed services has enabled WAISL's customer airports to be rated among best-in-class service providers through Skytrax and ACI awards, and to win many innovation and excellence awards.
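To illustrate the application-integration work described above (a sketch, not part of the posting): a minimal keyed producer in Python using the confluent-kafka client. The broker address and topic are hypothetical; a production setup would add the SSL/SASL security settings the posting mentions.

```python
from confluent_kafka import Producer

# Hypothetical broker and topic for illustration only.
producer = Producer({"bootstrap.servers": "localhost:9092"})

def on_delivery(err, msg):
    # Invoked from poll()/flush() with the broker's delivery result.
    if err is not None:
        print(f"delivery failed: {err}")
    else:
        print(f"delivered to {msg.topic()}[{msg.partition()}] @ {msg.offset()}")

# Keyed messages keep per-key ordering within a partition.
for sensor_id, reading in [("s1", "21.4"), ("s2", "19.8")]:
    producer.produce("sensor-readings", key=sensor_id, value=reading,
                     on_delivery=on_delivery)

producer.flush(10)  # wait up to 10 s for outstanding deliveries
```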
Hyderabad
INR 15.0 - 25.0 Lacs P.A.
Work from Office
Full Time
Senior Java Developer
Experience: 6-10 years | Positions: 3 | Location: Hyderabad

Must-Have Skills:
- Integration: Java, Spring Boot, REST APIs, and Spring Cloud
- Back end: MongoDB / Oracle / MSSQL / MySQL / PostgreSQL
- Testing: JUnit and exposure to Selenium
- DevOps: Git, Jenkins, Docker
- Cloud & Infrastructure: AWS (ECS, EKS, Lambda, S3)
- Containerization & Orchestration: Docker, Kubernetes
- Messaging & Streaming: Kafka, MQTT
- Monitoring & Logging: Grafana, Datadog, Elasticsearch

Position Brief:
- Problem-solving and critical-thinking skills with a focus on building scalable, reliable, and high-performance systems.
- Strong verbal and written communication skills to explain complex technical concepts to non-technical stakeholders.
- Collaborative team player with the ability to lead technical discussions and provide direction.
- Deep understanding of and experience working in Scrum or Kanban agile frameworks.
- Strong understanding of agile principles and tools like Jira and Confluence for project tracking and documentation.

Roles and Responsibilities:
Team Management:
- Lead and mentor junior developers, fostering a culture of knowledge sharing and technical growth within the team.
Requirement Management:
- Communicate effectively with team members and stakeholders to understand requirements and deliver solutions.
- Collaborate with cross-functional teams, including product managers, UX/UI designers, and DevOps engineers, to define, design, and deliver new features in a fast-paced environment.
- Guide the Business Analysts and Testing teams in use case and test case development.
Development:
- Engage actively in agile development processes, including sprint planning, stand-ups, and retrospectives.
- Design, develop, and maintain web applications within the WAISL application ecosystem.
- Build and enhance backend services for application integration, efficient data management, and real-time processing.
- Participate in code reviews to ensure high-quality code and adherence to best practices.
Testing:
- Ensure the application is tested across Unit, Integration, Regression, VAPT, and UAT stages.

Education:
Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent experience).

Other (Good to Have):
- Domain: airport/airline/aviation experience
- Front end: AngularJS
- Integration: NodeJS, Kafka / IBM Queue Manager, Python
- Back end: MongoDB, Cassandra, Elasticsearch, RDS, GraphQL, and JSON Web Tokens (JWT)
Bengaluru
INR 40.0 - 60.0 Lacs P.A.
Work from Office
Full Time
Location: Bangalore, onsite
Experience: 12+ years
Type: Full-time

Role Overview:
We are looking for a Technical Program Manager (TPM) to drive the execution of a next-generation data and AI platform that powers real-time analytics, machine learning, and industrial applications across multiple domains such as aviation, logistics, and manufacturing. You will work at the intersection of engineering, product, architecture, and business, managing the roadmap, resolving technical dependencies, and ensuring delivery of critical platform components across cross-functional and geographically distributed teams.

Key Responsibilities:
Program & Execution Management
- Drive end-to-end delivery of platform features and sector-specific solutions by coordinating multiple scrum teams (AI/ML, Data, Fullstack, DevOps).
- Develop and maintain technical delivery plans, sprint milestones, and program-wide timelines.
- Identify and resolve cross-team dependencies, risks, and technical bottlenecks.
Technical Fluency & Architecture Alignment
- Understand the platform's architecture (Kafka, Spark, data lakes, ML pipelines, hybrid/on-prem deployments) and guide teams toward cohesive delivery.
- Translate high-level product goals into detailed technical milestones and backlog items in collaboration with Product Owners and Architects.
Cross-Functional Collaboration
- Liaise between globally distributed engineering teams, product owners, architects, and domain stakeholders to align on priorities and timelines.
- Coordinate multi-sector requirements and build scalable components that serve as blueprints across industries (aviation, logistics, etc.).
Governance & Reporting
- Maintain clear, concise, and timely program reporting (dashboards, OKRs, status updates) for leadership and stakeholders.
- Champion delivery best practices, quality assurance, and documentation hygiene.
Innovation & Agility
- Support iterative product development with the flexibility to handle ambiguity and evolving priorities.
- Enable POCs and rapid prototyping efforts while planning for scalable production transitions.

Required Skills & Qualifications:
- 12+ years of experience in software engineering and technical program/project management.
- Strong understanding of platform/data architecture, including event streaming (Kafka), batch/stream processing (Spark, Flink), and AI/ML pipelines.
- Proven success delivering complex programs in agile environments with multiple engineering teams.
- Familiarity with DevOps, cloud/on-prem infrastructure (AWS, Azure, hybrid models), CI/CD, and observability practices.
- Excellent communication, stakeholder management, and risk mitigation skills.
- Strong grasp of Agile/Scrum or SAFe methodologies.

Good-to-Have:
- Experience working in or delivering solutions to industrial sectors such as aviation, manufacturing, logistics, or utilities.
- Experience with tools like Jira, Confluence, Notion, Asana, or similar.
- Background in engineering or data (Computer Science, Data Engineering, AI/ML, or related).