The Horizon 2020-funded HAAWAII project develops a reliable, error-resilient, and adaptable solution to automatically transcribe voice communications between air traffic controllers (ATCOs) and pilots.
Using machine learning, the project builds on very large collections of speech data, organized with minimal expert effort, to develop a new set of speech recognition models for the complex ATM environments of the London terminal area (TMA) and Icelandic en-route airspace. Speech and surveillance data recordings from real-life pilot-controller communications, i.e., directly from the operations rooms, are used.
HAAWAII aims to significantly enhance ATM safety and reduce ATCO workload. The digitization of controller-pilot communication can be used for a wide variety of safety- and performance-related ATM improvements. Proof-of-concept applications include readback error detection, callsign highlighting, and ATCO workload estimation.
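To illustrate the readback-error-detection idea, here is a minimal sketch in Python. It assumes the speech recognition pipeline has already parsed each transmission into (callsign, command type, value) tuples; the function name, the tuple format, and the sample callsigns are illustrative assumptions, not the HAAWAII implementation.

```python
# Minimal sketch of readback error detection. Assumes transcribed
# transmissions were already parsed into (callsign, command_type, value)
# tuples upstream; all names here are hypothetical, not HAAWAII code.

def detect_readback_errors(atco_commands, pilot_readback):
    """Compare controller instructions with the pilot's readback.

    Returns the instructions that were read back incorrectly
    or not read back at all, together with what the pilot said.
    """
    # Index the readback by (callsign, command_type) for lookup.
    readback = {(cs, cmd): val for cs, cmd, val in pilot_readback}
    errors = []
    for callsign, cmd_type, value in atco_commands:
        heard = readback.get((callsign, cmd_type))
        if heard != value:  # missing or mismatched readback
            errors.append((callsign, cmd_type, value, heard))
    return errors

# Example: ATCO clears descent to FL240, pilot reads back FL220.
atco = [("DLH81A", "DESCEND", "FL240"), ("DLH81A", "TURN_LEFT", "HDG090")]
pilot = [("DLH81A", "DESCEND", "FL220"), ("DLH81A", "TURN_LEFT", "HDG090")]
print(detect_readback_errors(atco, pilot))
```

In a real system the hard part is of course the upstream step, robust speech-to-text and command extraction from noisy radio channels, which is exactly what the HAAWAII models address; the comparison itself is then straightforward, as sketched above.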
The short clip below shows the HAAWAII prototype in action, with automatic radar label maintenance, early callsign highlighting, immediate online recognition, and readback error detection: