Periodic Reporting for period 2 - ANGI-HUD (Next Generation Cockpit HUD Integration)
Reporting period: 2018-03-01 to 2019-08-31
The importance to society:
Low-cost, safe flight is an essential part of the modern world. Increased safety and lower flight costs are expected to follow from the project in the following ways:
1) The new HUD architecture will reduce the number of head-down displays by serving as the sole-means Primary Flight Display (PFD). This will reduce onboard hardware and weight, or free up display area for new applications made available to the pilots.
2) A wearable HUD replaces a heavy, complex-to-install fixed HUD and will save costly downtime for retrofitted aircraft.
3) An interactive, intuitive display, including 3D synthetic entities and intuitive control over the avionics, will decrease pilot workload while increasing situational awareness.
4) Reduced deviations in all landing conditions, and thereby fewer hard landings.
Objectives:
1) The main objective of the ANGI-HUD project is to analyse how the capabilities of the Head-Up Display (HUD) could be used to provide new functionalities, in combination with other visualization means, and to demonstrate them on a fixed-base simulator.
2) Contribute to the analysis of potential new functionalities and prototype the intended new Man-Machine Interface (MMI).
3) Provide the Airframer with two representative HUD systems, including rapid prototyping capabilities, and participate in and support bench tests at the Airframer's simulation facilities.
4) Assess and analyse how novel intelligent HUD functionalities can be fully integrated into this next-generation cockpit concept so that the efficiency of the newly researched technologies and concepts is maximized. Furthermore, optimizing cockpit operations in general aims to contribute implicitly to the overall Clean Sky 2 objectives in the best possible way.
Work performed:
2) Prototype the intended new Man-Machine Interface: To achieve this objective, the consortium members developed tools and demonstrators to demonstrate and evaluate the 'eyes out' concept. A 'simple tool' was developed that presents the pilot with a set of avionics data on a screen emulating the HUD, with an 'eye tracking' device to receive selections from the pilot. Several experiments were conducted to choose the hardware and input methods most suitable for that tool. The main aim of the 'simple tool' is to verify the benefits of eye tracking and the workload associated with it (a gaze-selection sketch follows this list). A second demonstrator that was developed is the 'interactive SVS'.
3) Provide the Airframer with two representative HUD systems: Two HUD systems and the necessary hardware and software were delivered and installed in the Bizjet simulator.
4) Participate in and support bench tests at the Airframer's simulation facilities: The ANGI-HUD consortium is supporting the Airframer in all activities related to HUD installation in the Bizjet simulator and the development of rapid prototypes. The consortium also supports the Airframer in performing rapid prototyping on its own.
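The report does not detail how the 'simple tool' maps gaze to selections; the following minimal Python sketch shows one common approach, dwell-time selection, in which a HUD element is activated once the pilot's gaze rests on it for a threshold duration. All names and values here (HudElement, DwellSelector, the 0.3 s threshold) are illustrative assumptions, not the project's actual implementation.

import time
from dataclasses import dataclass

# Illustrative sketch only, not the ANGI-HUD implementation.
@dataclass
class HudElement:
    name: str
    x: float  # left edge, screen pixels
    y: float  # top edge, screen pixels
    w: float
    h: float

    def contains(self, gx: float, gy: float) -> bool:
        return self.x <= gx <= self.x + self.w and self.y <= gy <= self.y + self.h

class DwellSelector:
    # Fires a selection once gaze has rested on one element for dwell_s seconds.
    def __init__(self, elements, dwell_s=0.3):
        self.elements = elements
        self.dwell_s = dwell_s  # assumed threshold, tuned per experiment
        self._target = None
        self._since = 0.0

    def update(self, gx, gy, now=None):
        # Feed one gaze sample; returns the selected element or None.
        now = time.monotonic() if now is None else now
        hit = next((e for e in self.elements if e.contains(gx, gy)), None)
        if hit is not self._target:  # gaze moved to a new element (or off all elements)
            self._target, self._since = hit, now
            return None
        if hit is not None and now - self._since >= self.dwell_s:
            self._since = now  # re-arm so the element does not re-fire every sample
            return hit
        return None

# Usage: gaze dwelling on a speed bug for 0.3 s triggers a selection.
speed_bug = HudElement('speed_bug', 100, 200, 40, 20)
selector = DwellSelector([speed_bug])
assert selector.update(110, 210, now=0.0) is None   # dwell timer starts
assert selector.update(112, 208, now=0.35) is speed_bug

The dwell threshold trades selection speed against accidental activations (the 'Midas touch' problem), which is one reason the experiments compared several input methods before settling on the hardware for the tool.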
Main results achieved:
1) Prototypes:
Two HUD systems installed in the airframer's simulator.
Four rapid-prototype versions of the HUD software, per airframer specifications.
2) ‘Eyes out’ concept demonstrators (for Testing Activities)
3) Experiment results and conclusions for the 'Eyes out', interactive SVS and interactive HUD concepts, covering all MMI-related topics and other future cockpit concepts.
4) Future HUD architecture, including a test bench to demonstrate the HUD as a sole-means PFD.
Results:
1. MMI experiment on the 'Eyes out' concept
The experiments focused on testing the 'Eyes out' concept based on the future HUD architecture. The final experiment used all the data gathered from the earlier experiments to test whether the 'Eyes out' concept reduces pilot workload and increases situational awareness, thereby meeting the project objectives.
The final 'Eyes out' concept experiment was conducted with 10 European airline pilots. The pilots performed the tasks described in the scenarios below (1.1-1.3), both with and without the 'Eyes out' concept (a sketch of such a with/without comparison follows the list).
1.1 Speed, Heading and Altitude changes
1.2 Late Runway Change
1.3 Go Around, flap and gear adjustments
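The report does not state how workload was scored or analysed; one plausible analysis under this within-subjects setup is a paired comparison of per-pilot workload ratings across the two conditions, sketched below in Python. The numbers are placeholders purely for illustration, not project data.

import statistics as st

# Placeholder numbers for illustration only, NOT the ANGI-HUD results.
# Within-subjects comparison: one workload rating per pilot (e.g. a 0-100
# NASA-TLX style score) in each condition, 10 pilots as in the experiment.
baseline = [62, 58, 70, 55, 66, 61, 59, 64, 68, 57]  # without 'Eyes out' (hypothetical)
eyes_out = [51, 49, 60, 50, 55, 52, 48, 57, 59, 50]  # with 'Eyes out' (hypothetical)

diffs = [b - e for b, e in zip(baseline, eyes_out)]  # positive = workload reduced
mean_d = st.mean(diffs)
sd_d = st.stdev(diffs)
n = len(diffs)
t = mean_d / (sd_d / n ** 0.5)  # paired t statistic, df = n - 1

print(f"mean workload reduction: {mean_d:.1f} points")
print(f"paired t({n - 1}) = {t:.2f}")

Pairing each pilot with themselves removes between-pilot variability, which matters with a sample of only 10 pilots.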
Potential exploitation of the results for the consortium would be the use of the HUD as a sole-means PFD and the development of an MMI for increased operational benefits in the future cockpit. In the future, the HUD and eye-gaze interaction may be an enabler for single-pilot operations and may enable the replacement of the existing displays and controls found in present-day cockpits.
Dissemination:
The ANGI-HUD project was published on several websites:
1. Elbit website - https://elbitsystems.com/products/comercial-aviation/innovation-rd/
2. NLR - https://youtu.be/HU7jluJyq-s
https://www.linkedin.com/feed/update/urn:li:activity:6573160782056439809
https://twitter.com/NLR_NL/status/1167396664296038401
Expected impact:
The new HUD architecture will reduce the number of head-down displays by serving as the sole-means PFD.
This will reduce onboard hardware and weight, or free up display area for new applications made available to the pilots.
A wearable HUD replaces a heavy, complex-to-install fixed HUD and will save costly downtime for retrofitted aircraft.
An interactive, intuitive display, including 3D synthetic entities and intuitive control over the avionics, will decrease pilot workload while increasing situational awareness.
By achieving the above, the future HUD can support:
* Reducing two-man cockpits to single-pilot operation.
* Increasing situational awareness of traffic and navigation in a growing, congested aviation environment.
* Reducing deviations in all landing conditions, and thereby hard landings.