Technical Manager
- Oversee successful planning and lead end-to-end solution delivery, from requirements gathering to implementation.
- Coordinate with engineering, product management, and QA teams to ensure seamless execution.
Required Candidate Profile
- Strong technical background, with experience in software engineering, solution delivery, and project management.
- Knowledge of Agile methodologies.
- Certifications such as PMP, Scrum Master, ITIL, or RTE.
Architect
- Design the core technology stack, services, and infrastructure.
- Define scalable, reliable, and high-performance system architectures.
- Create reusable components and APIs.
- Collaborate with DevOps teams to drive CI/CD.
Required Candidate Profile
- Experience designing and scaling large enterprise products/platforms, with at least 5 years in architectural leadership roles.
- Certifications such as TOGAF or AWS Certified Solutions Architect.
Title: AI/ML Architect
Location: Onsite, Bangalore
Experience: 10+ years

Position Summary:
We are seeking an experienced AI/ML Architect to lead the design and deployment of scalable AI solutions. This role requires a strong blend of technical depth, systems thinking, and leadership in machine learning, computer vision, and real-time analytics. You will drive the architecture for edge, on-prem, and cloud-based AI systems, integrating third-party data sources with sensor and vision data to enable predictive, prescriptive, and autonomous operations across industrial environments.

Key Responsibilities:
Architecture & Strategy
- Define the end-to-end architecture for AI/ML systems, including time-series forecasting, computer vision, and real-time classification.
- Design scalable ML pipelines (training, validation, deployment, retraining) using MLOps best practices.
- Architect hybrid deployment models supporting both cloud and edge inference for low-latency processing.
Model Integration
- Guide the integration of ML models into the IIoT platform for real-time insights, alerting, and decision support.
- Support model fusion strategies that combine disparate data sources and sensor streams with visual data (e.g., object detection + telemetry + third-party data ingestion).
MLOps & Engineering
- Define and implement ML lifecycle tooling, including version control, CI/CD, experiment tracking, and drift detection.
- Ensure compliance, security, and auditability of deployed ML models.
Collaboration & Leadership
- Collaborate with Data Scientists, ML Engineers, DevOps, Platform, and Product teams to align AI efforts with business goals.
- Mentor engineering and data teams in AI system design, optimization, and deployment strategies.
- Stay ahead of AI research and industrial best practices; evaluate and recommend emerging technologies (e.g., LLMs, vision transformers, foundation models).
Must-Have Qualifications:
- Bachelor's or Master's degree in Computer Science, AI/ML, Engineering, or a related technical field.
- 8+ years of experience in AI/ML development, with 3+ years architecting AI solutions at scale.
- Deep understanding of ML frameworks (TensorFlow, PyTorch), time-series modeling, and computer vision.
- Proven experience with object detection, facial recognition, intrusion detection, and anomaly detection in video or sensor environments.
- Experience in MLOps (MLflow, TFX, Kubeflow, SageMaker, etc.) and model deployment on Kubernetes/Docker.
- Proficiency in edge AI (Jetson, Coral TPU, OpenVINO) and cloud platforms (AWS, Azure, GCP).
Nice-to-Have Skills:
- Knowledge of stream processing (Kafka, Spark Streaming, Flink).
- Familiarity with OT systems and IIoT protocols (MQTT, OPC-UA).
- Understanding of regulatory and safety compliance in AI/vision for industrial settings.
- Experience with charts, dashboards, and integrating AI with front-end systems (e.g., alerts, maps, command center UIs).
Role Impact:
As AI/ML Architect, you will shape the intelligence layer of our IIoT platform, enabling smarter, safer, and more efficient industrial operations through AI. You will bridge research and real-world impact, ensuring our AI stack is scalable, explainable, and production-grade from day one.
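One responsibility listed above under MLOps & Engineering is drift detection. As an illustrative, dependency-free sketch (the metric, bucketing, thresholds, and synthetic data are assumptions, not part of the role description), a Population Stability Index comparison between a training sample and live inputs:

```python
import math

def psi(expected, actual, buckets=10):
    """Population Stability Index: how far the live distribution
    ('actual') has shifted from the training distribution ('expected')."""
    lo, hi = min(expected), max(expected)
    width = (hi - lo) or 1.0

    def fractions(data):
        counts = [0] * buckets
        for x in data:
            idx = int((x - lo) / width * buckets)
            counts[max(0, min(idx, buckets - 1))] += 1
        # Floor at a tiny value so the log term below is always defined.
        return [max(c / len(data), 1e-6) for c in counts]

    e, a = fractions(expected), fractions(actual)
    return sum((ai - ei) * math.log(ai / ei) for ei, ai in zip(e, a))

train = [float(i % 50) for i in range(500)]           # training sample
live_ok = [float((i * 7) % 50) for i in range(500)]   # same distribution
live_shifted = [x + 30.0 for x in train]              # drifted inputs

print(psi(train, live_ok) < 0.1)       # stable inputs -> low PSI
print(psi(train, live_shifted) > 0.5)  # shifted inputs -> high PSI
```

In a deployed pipeline a check like this would run on a schedule against recent inference inputs and raise an alert when the score crosses a tuned threshold.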
Title: Principal Architect
Location: Onsite, Bangalore
Experience: 15+ years in software & data platform architecture and technology strategy

Role Overview
We are seeking a Platform & Application Architect to lead the design and implementation of a next-generation, multi-domain data platform and its ecosystem of applications. In this strategic and hands-on role, you will define the overall architecture, select and evolve the technology stack, and establish best practices for governance, scalability, and performance. Your responsibilities will span the full data lifecycle (ingestion, processing, storage, and analytics) while ensuring the platform is adaptable to diverse and evolving customer needs. This role requires close collaboration with product and business teams to translate strategy into actionable, high-impact platforms and products.

Key Responsibilities
1. Architecture & Strategy
- Design the end-to-end architecture for an on-prem/hybrid data platform (data lake/lakehouse, data warehouse, streaming, and analytics components).
- Define and document data blueprints, data domain models, and architectural standards.
- Lead build-vs-buy evaluations for platform components and recommend best-fit tools and technologies.
2. Data Ingestion & Processing
- Architect batch and real-time ingestion pipelines using tools like Kafka, Apache NiFi, Flink, or Airbyte.
- Oversee scalable ETL/ELT processes and orchestrators (Airflow, dbt, Dagster).
- Support diverse data sources: IoT, operational databases, APIs, flat files, unstructured data.
3. Storage & Modeling
- Define strategies for data storage and partitioning (data lakes, warehouses, Delta Lake, Iceberg, or Hudi).
- Develop efficient data strategies for both OLAP and OLTP workloads.
- Guide schema evolution, data versioning, and performance tuning.
4. Governance, Security, and Compliance
- Establish data governance, cataloging, and lineage tracking frameworks.
- Implement access controls, encryption, and audit trails to ensure compliance with DPDPA, GDPR, HIPAA, etc.
- Promote standardization and best practices across business units.
5. Platform Engineering & DevOps
- Collaborate with infrastructure and DevOps teams to define CI/CD, monitoring, and DataOps pipelines.
- Ensure observability, reliability, and cost efficiency of the platform.
- Define SLAs, capacity planning, and disaster recovery plans.
6. Collaboration & Mentorship
- Work closely with data engineers, scientists, analysts, and product owners to align platform capabilities with business goals.
- Mentor teams on architecture principles, technology choices, and operational excellence.

Skills & Qualifications
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 12+ years of experience in software engineering, including 5+ years in architectural leadership roles.
- Proven expertise in designing and scaling distributed systems, microservices, APIs, and event-driven architectures using Java, Python, or Node.js.
- Strong hands-on experience building scalable data platforms in on-premise, hybrid, or cloud environments.
- Deep knowledge of modern data lake and warehouse technologies (e.g., Snowflake, BigQuery, Redshift) and table formats like Delta Lake or Iceberg.
- Familiarity with data mesh, data fabric, and lakehouse paradigms.
- Strong understanding of system reliability, observability, DevSecOps practices, and platform engineering principles.
- Demonstrated success leading large-scale architectural initiatives across enterprise-grade or consumer-facing platforms.
- Excellent communication, documentation, and presentation skills, with the ability to simplify complex concepts and influence at executive levels.
- Certifications such as TOGAF or AWS Solutions Architect (Professional) and experience in regulated domains (e.g., finance, healthcare, aviation) are desirable.
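To make the storage-and-partitioning responsibility concrete, here is a minimal sketch of Hive-style date partitioning, one common data lake layout that lets query engines prune by date. The bucket, table name, and path convention are illustrative assumptions, not a prescribed design:

```python
from datetime import datetime, timezone

def partition_path(table: str, event_time: datetime,
                   base: str = "s3://lake/raw") -> str:
    """Build a Hive-style year=/month=/day= partition path for an event,
    so downstream query engines can prune partitions by date."""
    t = event_time.astimezone(timezone.utc)
    return f"{base}/{table}/year={t.year:04d}/month={t.month:02d}/day={t.day:02d}"

ts = datetime(2024, 3, 7, 12, 30, tzinfo=timezone.utc)
print(partition_path("sensor_readings", ts))
# s3://lake/raw/sensor_readings/year=2024/month=03/day=07
```

Table formats such as Delta Lake, Iceberg, or Hudi layer metadata and schema evolution on top of exactly this kind of physical layout.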
Job Title: Data Analyst
Location: Bangalore
Experience: 6+ years

Role Overview
We are seeking a skilled Data Analyst to support our platform powering operational intelligence across airports and similar sectors. The ideal candidate will have experience working with time-series datasets and operational information to uncover trends, anomalies, and actionable insights. This role will work closely with data engineers, ML teams, and domain experts to turn raw data into meaningful intelligence for business and operations stakeholders.

Key Responsibilities
- Analyze time-series and sensor data from various sources.
- Develop and maintain dashboards, reports, and visualizations to communicate key metrics and trends.
- Correlate data from multiple systems (vision, weather, flight schedules, etc.) to provide holistic insights.
- Collaborate with AI/ML teams to support model validation and interpret AI-driven alerts (e.g., anomalies, intrusion detection).
- Prepare and clean datasets for analysis and modeling; ensure data quality and consistency.
- Work with stakeholders to understand reporting needs and deliver business-oriented outputs.

Qualifications & Required Skills
- Bachelor's or Master's degree in Data Science, Statistics, Computer Science, Engineering, or a related field.
- 5+ years of experience in a data analyst role, ideally in a technical/industrial domain.
- Strong SQL skills and proficiency with BI/reporting tools (e.g., Power BI, Tableau, Grafana).
- Hands-on experience analyzing structured and semi-structured data (JSON, CSV, time-series).
- Proficiency in Python or R for data manipulation and exploratory analysis.
- Understanding of time-series databases or streaming data (e.g., InfluxDB, Kafka, Kinesis).
- Solid grasp of statistical analysis and anomaly detection methods.
- Experience working with data from industrial systems or large-scale physical infrastructure.

Good-to-Have Skills
- Domain experience in airports, smart infrastructure, transportation, or logistics.
- Familiarity with data platforms (Snowflake, BigQuery, or custom-built using open source).
- Exposure to tools like Airflow, Jupyter Notebooks, and data quality frameworks.
- Basic understanding of AI/ML workflows and data preparation requirements.
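The anomaly detection skill listed above can be illustrated with a simple rolling z-score flagger for time-series data. The window size, threshold, and synthetic readings below are assumptions for the sketch, not a prescribed method:

```python
from statistics import mean, stdev

def rolling_anomalies(series, window=20, threshold=3.0):
    """Flag indices whose value deviates more than `threshold` standard
    deviations from the mean of the preceding `window` samples."""
    flagged = []
    for i in range(window, len(series)):
        hist = series[i - window:i]
        mu, sigma = mean(hist), stdev(hist)
        if sigma > 0 and abs(series[i] - mu) / sigma > threshold:
            flagged.append(i)
    return flagged

# Steady sensor readings with one injected spike at index 45.
data = [10.0 + 0.1 * (i % 5) for i in range(60)]
data[45] = 25.0
print(rolling_anomalies(data))  # [45]
```

In practice an analyst would tune the window and threshold per signal and pair a check like this with seasonal adjustment for data with daily or weekly cycles.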
Job Description: Kafka/Integration Architect

Position Brief:
The Kafka/Integration Architect is responsible for designing, implementing, and managing Kafka-based streaming data pipelines and messaging solutions. This role involves configuring, deploying, and monitoring Kafka clusters to ensure the high availability and scalability of data streaming services. The Kafka Architect collaborates with cross-functional teams to integrate Kafka into various applications and ensures optimal performance and reliability of the data infrastructure. Kafka/Integration Architects play a critical role in driving data-driven decision-making and enabling real-time analytics, contributing directly to the company's agility, operational efficiency, and ability to respond quickly to market changes. Their work supports key business initiatives by ensuring that data flows seamlessly across the organization, empowering teams with timely insights and enhancing the customer experience.

Location: Hyderabad

Primary Role & Responsibilities:
- Design, implement, and manage Kafka-based data pipelines and messaging solutions to support critical business operations and enable real-time data processing.
- Configure, deploy, and maintain Kafka clusters, ensuring high availability and scalability to maximize uptime and support business growth.
- Monitor Kafka performance and troubleshoot issues to minimize downtime and ensure uninterrupted data flow, enhancing decision-making and operational efficiency.
- Collaborate with development teams to integrate Kafka into applications and services.
- Develop and maintain Kafka connectors such as JDBC, MongoDB, and S3 connectors, along with topics and schemas, to streamline data ingestion from databases, NoSQL data stores, and cloud storage, enabling faster data insights.
- Implement security measures to protect Kafka clusters and data streams, safeguarding sensitive information and maintaining regulatory compliance.
- Optimize Kafka configurations for performance, reliability, and scalability.
- Automate Kafka cluster operations using infrastructure-as-code tools like Terraform or Ansible to increase operational efficiency and reduce manual overhead.
- Provide technical support and guidance on Kafka best practices to development and operations teams, enhancing their ability to deliver reliable, high-performance applications.
- Maintain documentation of Kafka environments, configurations, and processes to ensure knowledge transfer, compliance, and smooth team collaboration.
- Stay updated with the latest Kafka features, updates, and industry best practices to continuously improve the data infrastructure.

Required Soft Skills:
- Strong analytical and problem-solving skills.
- Excellent communication and collaboration skills.
- Ability to translate business requirements into technical solutions.

Working Experience and Qualification:
- Education: Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- Experience: Proven experience of 8-10 years as a Kafka Architect or in a similar role.

Skills:
- Strong knowledge of Kafka architecture, including brokers, topics, partitions, and replicas.
- Experience with Kafka security, including SSL, SASL, and ACLs.
- Proficiency in configuring, deploying, and managing Kafka clusters in cloud and on-premises environments.
- Experience with Kafka stream processing using tools like Kafka Streams, KSQL, or Apache Flink.
- Solid understanding of distributed systems, data streaming, and messaging patterns.
- Proficiency in Java, Scala, or Python for Kafka-related development tasks.
- Familiarity with DevOps practices, including CI/CD pipelines, monitoring, and logging.
- Experience with tools like ZooKeeper, Schema Registry, and Kafka Connect.
- Strong problem-solving skills and the ability to troubleshoot complex issues in a distributed environment.
- Experience with cloud platforms like AWS, Azure, or GCP.
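As a sketch of one Kafka fundamental referenced above (brokers, topics, partitions): keyed records are hashed to a partition, which is what preserves per-key ordering. Kafka's default partitioner hashes keys with murmur2; the md5-based stand-in below is an assumption made only to keep the example dependency-free, and the key name is invented:

```python
import hashlib

def partition_for(key: bytes, num_partitions: int) -> int:
    """Deterministically map a record key to a partition, mimicking the
    contract of Kafka's keyed partitioning (same key -> same partition)."""
    digest = hashlib.md5(key).digest()
    return int.from_bytes(digest[:4], "big") % num_partitions

# Every event for a given flight lands in the same partition, so a
# consumer of that partition sees the flight's events in order.
assert partition_for(b"flight-AI202", 6) == partition_for(b"flight-AI202", 6)
print(partition_for(b"flight-AI202", 6))
```

This contract is also why changing the partition count of an existing topic reshuffles key-to-partition assignments and must be planned carefully.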
Preferred Skills (Optional):
- Kafka certification or related credentials, such as:
  - Confluent Certified Administrator for Apache Kafka (CCAAK)
  - Cloudera Certified Administrator for Apache Kafka (CCA-131)
  - AWS Certified Data Analytics – Specialty (with a focus on streaming data solutions)
- Knowledge of containerization technologies like Docker and Kubernetes.
- Familiarity with other messaging systems like RabbitMQ or Apache Pulsar.
- Experience with data serialization formats like Avro, Protobuf, or JSON.

Company Profile:
WAISL is an ISO 9001:2015, ISO 20000-1:2018, and ISO 22301:2019 certified, CMMI Level 3 appraised digital transformation partner for businesses across industries, with a core focus on aviation and related adjacencies. We transform airports and related ecosystems through digital interventions, with a strong service excellence culture. As a leader in our chosen space, we deliver world-class services focused on airports and their related domains, enabled through outcome-focused, next-gen digital/technology solutions. At present, WAISL is the primary technology solutions partner for Indira Gandhi International Airport, Delhi; Rajiv Gandhi International Airport, Hyderabad; Manohar International Airport, Goa; Kannur International Airport, Kerala; and Kuwait International Airport, and we expect to soon provide similar services for other airports in India and globally. As a digital transformation partner, WAISL brings proven credibility in managing and servicing 135+ million passengers and 80+ airlines, with core integration, deployment, and real-time management experience of 2,000+ applications, vendor-agnostically, in highly complex, technology-converging ecosystems. This excellence in managed services has enabled WAISL's customer airports to be rated among the best-in-class service providers in Skytrax and ACI awards, and to win many innovation and excellence awards.
Senior Java Developer
Experience: 6-10 years | Positions: 3 | Location: Hyderabad

Must-Have Skills:
- Integration: Java, Spring Boot, REST APIs, and Spring Cloud
- Backend: MongoDB / Oracle / MSSQL / MySQL / PostgreSQL
- Testing: JUnit; exposure to Selenium
- DevOps: Git, Jenkins, Docker
- Cloud & Infrastructure: AWS (ECS, EKS, Lambda, S3)
- Containerization & Orchestration: Docker, Kubernetes
- Messaging & Streaming: Kafka, MQTT
- Monitoring & Logging: Grafana, Datadog, Elasticsearch

Position Brief:
- Problem-solving and critical thinking skills with a focus on building scalable, reliable, and high-performance systems.
- Strong verbal and written communication skills to explain complex technical concepts to non-technical stakeholders.
- Collaborative team player with the ability to lead technical discussions and provide direction.
- Deep understanding of and experience working in Scrum or Kanban agile frameworks.
- Strong understanding of agile principles and tools like Jira and Confluence for project tracking and documentation.

Roles and Responsibilities:
Team Management:
- Lead and mentor junior developers, fostering a culture of knowledge sharing and technical growth within the team.
Requirement Management:
- Communicate effectively with team members and stakeholders to understand requirements and deliver solutions.
- Collaborate with cross-functional teams, including product managers, UX/UI designers, and DevOps engineers, to define, design, and deliver new features in a fast-paced environment.
- Guide the Business Analysts and Testing teams on use case and test case development.
Development:
- Engage actively in agile development processes, including sprint planning, stand-ups, and retrospectives.
- Design, develop, and maintain web applications within the WAISL application ecosystem.
- Build and enhance backend services for application integration, efficient data management, and real-time processing.
- Participate in code reviews to ensure high-quality code and adherence to best practices.
Testing:
- Ensure that applications are tested across Unit, Integration, Regression, VAPT, and UAT stages.
Education:
- Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent experience).
Other (Good to Have):
- Domain: Airport / Airline / Aviation experience
- Frontend: AngularJS
- Integration: NodeJS, Kafka / IBM MQ (Queue Manager), Python
- Backend: MongoDB, Cassandra, Elasticsearch, RDS, GraphQL, and JSON Web Tokens (JWT)
Location: Bangalore (Onsite)
Experience: 12+ years
Type: Full-time

---

Role Overview
We are looking for a Technical Program Manager (TPM) to drive the execution of a next-generation data and AI platform that powers real-time analytics, machine learning, and industrial applications across multiple domains such as aviation, logistics, and manufacturing. You will work at the intersection of engineering, product, architecture, and business, managing the roadmap, resolving technical dependencies, and ensuring delivery of critical platform components across cross-functional and geographically distributed teams.

---

Key Responsibilities
Program & Execution Management
- Drive end-to-end delivery of platform features and sector-specific solutions by coordinating multiple scrum teams (AI/ML, Data, Fullstack, DevOps).
- Develop and maintain technical delivery plans, sprint milestones, and program-wide timelines.
- Identify and resolve cross-team dependencies, risks, and technical bottlenecks.
Technical Fluency & Architecture Alignment
- Understand the platform's architecture (Kafka, Spark, data lakes, ML pipelines, hybrid/on-prem deployments) and guide teams toward cohesive delivery.
- Translate high-level product goals into detailed technical milestones and backlog items in collaboration with Product Owners and Architects.
Cross-Functional Collaboration
- Liaise between globally distributed engineering teams, product owners, architects, and domain stakeholders to align on priorities and timelines.
- Coordinate multi-sector requirements and build scalable components that serve as blueprints across industries (aviation, logistics, etc.).
Governance & Reporting
- Maintain clear, concise, and timely program reporting (dashboards, OKRs, status updates) for leadership and stakeholders.
- Champion delivery best practices, quality assurance, and documentation hygiene.
Innovation & Agility
- Support iterative product development with the flexibility to handle ambiguity and evolving priorities.
- Enable POCs and rapid prototyping efforts while planning for scalable production transitions.

---

Required Skills & Qualifications
- 12+ years of experience in software engineering and technical program/project management.
- Strong understanding of platform/data architecture, including event streaming (Kafka), batch/stream processing (Spark, Flink), and AI/ML pipelines.
- Proven success delivering complex programs in agile environments with multiple engineering teams.
- Familiarity with DevOps, cloud/on-prem infrastructure (AWS, Azure, hybrid models), CI/CD, and observability practices.
- Excellent communication, stakeholder management, and risk mitigation skills.
- Strong grasp of Agile/Scrum or SAFe methodologies.

---

Good-to-Have
- Experience working in or delivering solutions to industrial sectors such as aviation, manufacturing, logistics, or utilities.
- Experience with tools like Jira, Confluence, Notion, Asana, or similar.
- Background in engineering or data (Computer Science, Data Engineering, AI/ML, or related).
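The cross-team dependency management described above can be pictured as a small graph problem: sequencing milestones so each starts only after its prerequisites. The milestone names and edges below are invented for illustration; the ordering itself uses Python's standard `graphlib`:

```python
from graphlib import TopologicalSorter

# Hypothetical milestone -> prerequisites map for a platform program.
deps = {
    "data-ingestion": set(),
    "api-layer": {"data-ingestion"},
    "ml-pipeline": {"data-ingestion"},
    "dashboards": {"ml-pipeline", "api-layer"},
}

# A valid delivery sequence that respects every dependency edge.
order = list(TopologicalSorter(deps).static_order())
print(order)
```

In practice a TPM maintains this graph in a planning tool rather than code, but the underlying constraint (no milestone scheduled before its prerequisites) is the same.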
Role Overview
We are seeking a highly skilled and creative Senior UI Developer with deep experience building modern, data-driven dashboards and interactive interfaces using Angular or React. You should be comfortable working with APIs, WebSockets, and CSS/SCSS to deliver intuitive, performant user experiences. You will collaborate closely with product managers, UX designers, and back-end developers to create seamless and scalable interfaces that bring complex data to life.

Key Responsibilities
- Develop rich, interactive, and responsive UI components using Angular or React.
- Build and maintain dashboards that visualize complex data in a user-friendly manner.
- Integrate UI components with RESTful APIs and WebSockets for real-time data handling.
- Write clean, maintainable styles using CSS and SCSS, and adhere to UI/UX best practices.
- Optimize front-end performance for speed and scalability.
- Collaborate with UX designers to implement user-friendly designs and maintain visual consistency.
- Participate in code reviews and contribute to front-end architecture decisions.
- Ensure cross-browser compatibility and mobile responsiveness.

Must-Have Skills
- 7+ years of hands-on experience in front-end development.
- Strong experience with Angular or React (at least one framework in depth).
- Proven experience creating complex dashboards and visualizations.
- Experience integrating with RESTful APIs and handling WebSockets for real-time updates.
- Solid understanding of CSS, SCSS, and responsive design principles.

Good-to-Have Skills
- Familiarity with charting and visualization libraries such as D3.js, Chart.js, Three.js, or WebGL.
- Experience with UI deployment and hosting using Nginx, Docker, or Kubernetes.
- Exposure to map integration using tools like Leaflet, Mapbox, or Google Maps APIs.
- Understanding of UX principles, usability heuristics, and design-system implementation.

Qualifications
- Bachelor's or Master's degree in Computer Science, Design, or a related field.
- 7+ years of UI/front-end development experience in modern JavaScript frameworks.
- A portfolio or demos of dashboards or interactive UI components you've built is a plus.
- Excellent communication and collaboration skills.
- Experience in web development using JavaScript, AngularJS, and Angular.
- Experience in backend development using Java, Spring, and REST.
- Experience in integration with MongoDB, Elasticsearch, and Kafka.
- Experience in computer networking projects would be a plus.
- Strong analytical and problem-solving skills.
- Strong verbal and written communication skills and demonstrated technical leadership.
- Enjoys working as part of agile software teams in a start-up environment.
- Experience in development using Java, Spring, and Spring Boot.
- Experience in creating applications using REST services.
- Experience in integration with MongoDB, Elasticsearch, and Kafka.
- Experience in web development using HTML, CSS, JavaScript, and jQuery.
- Experience in web development using AngularJS and Angular would be a plus.
- Experience in computer networking projects would be a plus.
- Strong analytical and problem-solving skills.
- Strong verbal and written communication skills.
- Enjoys working as part of agile software teams in a start-up environment.
- Install and integrate EPOS for all concessionaires by assessing business types and coordinating with internal and external teams.
- Address L2 incident management issues for EPOS applications.
- Control revenue leakage through CCTV monitoring and sales data reconciliation for billing.
- Understand the business model of the concessionaire and provide solutions based on their business requirements.
- Coordinate with internal and external teams to fulfil new concessionaire requirements, ensuring the benefit of existing users and the business growth of the organization.
- Install and configure software based on the inputs of the respective teams within the stipulated time.
- Understand the POS data source for non-native users and suggest a data pull mechanism complying with GHIAL IT policies.
- Collect the inputs of data fields and develop the tool accordingly in coordination with the vendor.
- Observe the data flow and validation for a significant period after tool installation; declare the outlet live after successful completion of the observation period.
- Resolve issues in coordination with the vendor within the SLA.
- Ensure data availability for all concessionaires.
- Validate sales on a daily and monthly basis, in coordination with BD, to ensure there are no discrepancies.
- Submit the interface file to finance for billing.
- Source MIS reports and share discrepancy reports with the commercial team.
- Monitor CCTV for suspicious transactions leading to revenue leakage and report them to the commercial team.
- Find void and discount transactions through EPOS reports, identify them in CCTV recordings, and report them to the commercial team.
- Develop and maintain storage policies, procedures, and documentation to ensure efficient storage management and data protection.
- Maintain EPOS cameras and install them in all stores for RLC.
- Maintain and issue boarding pass scanners so that all retailers can scan boarding passes for every transaction.
- Document and circulate to users all new features included in EPOS.
- Sign off new concessionaires.

KEY ACCOUNTABILITIES
- Address user tickets and troubleshoot issues.
- Perform preventive maintenance and proactive checks of applications.
- Record technical issues and solutions in logs.
- Direct unresolved issues to the next level of support personnel.
- Help create technical documentation and manuals.
- Ensure data is available and up to date.

SKILLS AND KNOWLEDGE
- Experience supporting various applications in an on-site environment.
- Experience working directly in a customer service environment.
- Excellent interpersonal skills are essential.
- Basic functional knowledge of retail and F&B applications.
- Strong work ethic as well as excellent oral and written communication skills.
Join us in revolutionizing airport ecosystems and shaping the future of aviation.

Operational Excellence
- Maintain wireless services without any issues (login page, OTP, MAC, etc.) and attend to ASQ surveys.
- Support international passengers in provisioning coupons and maintain personal identification data securely.
- Basic knowledge of Cisco wireless controller monitoring.
- Track, update, and close calls in the ticketing tool.
- Ensure uptime, availability, and performance of the individual components of the Wi-Fi infrastructure and the overall service.
- Monitor access point performance periodically and resolve all issues as per SLA in coordination with the NOC and other internal teams.
- Coordinate with the service provider to resolve ISP-related issues.
- Attend L1 network calls by coordinating with the network team.
- Periodically monitor the infrastructure and WLC to identify rogue access points/devices and recommend an action plan to trace and eliminate rogue APs.
- Resolve calls to passenger satisfaction.
- Maximize the use of monitoring tools to drive productivity improvement in operations, and share reports.

Learning Skills
- New technology and networks
- Latest network architecture and devices
- Basic network troubleshooting on networks/Wi-Fi

Educational Qualifications
- Graduation / any degree / three-year diploma
Title: AI/ML Architect Location: Onsite Bangalore Experience: 10+ years Position Summary: We are seeking an experienced AI/ML Architect to lead the design and deployment of scalable AI solutions. This role requires a strong blend of technical depth, systems thinking, and leadership in machine learning , computer vision , and real-time analytics . You will drive the architecture for edge, on-prem, and cloud-based AI systems, integrating 3rd party data sources, sensor and vision data to enable predictive, prescriptive, and autonomous operations across industrial environments. Key Responsibilities: Architecture & Strategy Define the end-to-end architecture for AI/ML systems including time series forecasting , computer vision , and real-time classification . Design scalable ML pipelines (training, validation, deployment, retraining) using MLOps best practices. Architect hybrid deployment models supporting both cloud and edge inference for low-latency processing. Model Integration Guide the integration of ML models into the IIoT platform for real-time insights, alerting, and decision support. Support model fusion strategies combining disparate data sources, sensor streams with visual data (e.g., object detection + telemetry + 3rd party data ingestion). MLOps & Engineering Define and implement ML lifecycle tooling, including version control, CI/CD, experiment tracking, and drift detection. Ensure compliance, security, and auditability of deployed ML models. Collaboration & Leadership Collaborate with Data Scientists, ML Engineers, DevOps, Platform, and Product teams to align AI efforts with business goals. Mentor engineering and data teams in AI system design, optimization, and deployment strategies. Stay ahead of AI research and industrial best practices; evaluate and recommend emerging technologies (e.g., LLMs, vision transformers, foundation models). 
Must-Have Qualifications:
- Bachelor's or Master's degree in Computer Science, AI/ML, Engineering, or a related technical field.
- 8+ years of experience in AI/ML development, with 3+ years architecting AI solutions at scale.
- Deep understanding of ML frameworks (TensorFlow, PyTorch), time-series modeling, and computer vision.
- Proven experience with object detection, facial recognition, intrusion detection, and anomaly detection in video or sensor environments.
- Experience in MLOps (MLflow, TFX, Kubeflow, SageMaker, etc.) and model deployment on Kubernetes/Docker.
- Proficiency in edge AI (Jetson, Coral TPU, OpenVINO) and cloud platforms (AWS, Azure, GCP).

Nice-to-Have Skills:
- Knowledge of stream processing (Kafka, Spark Streaming, Flink).
- Familiarity with OT systems and IIoT protocols (MQTT, OPC-UA).
- Understanding of regulatory and safety compliance for AI/vision in industrial settings.
- Experience with charts, dashboards, and integrating AI with front-end systems (e.g., alerts, maps, command center UIs).

Role Impact: As AI/ML Architect, you will shape the intelligence layer of our IIoT platform, enabling smarter, safer, and more efficient industrial operations through AI. You will bridge research and real-world impact, ensuring our AI stack is scalable, explainable, and production-grade from day one.
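The anomaly detection in sensor environments called for above can be illustrated with a minimal rolling z-score detector over a synthetic telemetry stream. The readings, window size, and 4-sigma threshold are all invented for illustration; production systems typically use learned models and stream-processing infrastructure rather than an in-memory loop.

```python
# Illustrative sketch: flag a telemetry reading whose z-score against the
# recent window exceeds a threshold. All values here are synthetic.
from collections import deque
from statistics import mean, stdev

def detect_anomalies(stream, window=20, threshold=4.0):
    """Yield (index, value) for readings far outside the recent window."""
    recent = deque(maxlen=window)
    for i, x in enumerate(stream):
        if len(recent) == recent.maxlen:
            mu, sigma = mean(recent), stdev(recent)
            if sigma > 0 and abs(x - mu) / sigma > threshold:
                yield i, x
        recent.append(x)

# Steady temperature readings (25.0-25.4) with one injected spike
readings = [25.0 + 0.1 * ((i * 7) % 5) for i in range(100)]
readings[60] = 40.0
anomalies = list(detect_anomalies(readings))
print(anomalies)  # [(60, 40.0)]
```

The same pattern generalizes to per-sensor streams arriving over MQTT or Kafka, with flagged readings feeding the platform's alerting layer.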
We are seeking an experienced Key Account Manager, preferably with a strong background in the airport/aviation domain, to nurture strategic client relationships and drive long-term business success. This role requires a proactive, customer-centric leader who can align client needs with organizational capabilities to deliver high-value solutions.

Work Location: New Delhi / Hyderabad

Role & Responsibilities:
- Contribute to sustaining and growing our business to achieve long-term success.
- Acquire a thorough understanding of key business needs and technical requirements as they evolve.
- Expand relationships with existing customers by continuously proposing solutions that meet their objectives.
- Act as the customer advocate and voice of WAISL; be a value creator and trusted partner.
- Operate as the single point of contact (SPOC) between the various functions and the customer.
- Liaise with all business units within the organization to deliver value to customers.
- Manage contracts, invoicing, billing, AMC, and receivables with customers.
- Resolve conflicts and manage customer relationships seamlessly, improving trust within the ecosystem.
- Play an integral part in winning deals and creating new opportunities within the portfolio.
- Be an integral part of the defined governance and drive it across teams.
- Communicate and report status to stakeholders in a timely manner.

Working Experience and Qualifications
- 15+ years of work experience, with proven experience as a key account manager in the service industry.
- Experience in customer service, sales, and providing solutions based on customer needs.
- Ideally a postgraduate degree in business administration, sales, or a relevant field.
- Strong communication and interpersonal skills, with an aptitude for building relationships with professionals at all organizational levels.
- Excellent organizational skills; strong negotiation and leadership skills.
- Ability in problem-solving and negotiation.