Periodic Reporting for period 3 - PercEvite (PercEvite - Sense and avoid technology for small drones)
Reporting period: 2019-09-01 to 2020-08-31
The overall objective of PercEvite is to develop a sensor, communication, and processing suite for small drones. It will enable detecting and avoiding ground-based obstacles and flying air vehicles without necessitating human intervention. To avoid ground-based obstacles, we aim for a lightweight, energy-efficient sensor and processing package that maximizes payload capacity.
The PercEvite consortium consists of two academic partners and one commercial partner: TU Delft, KU Leuven, and Parrot SA. Over the course of three years (starting from September 1, 2017), these partners aim to demonstrate the potential of a lightweight sense-and-avoid package (on the order of 200 grams), based on frontrunner technologies and extensive real-world tests.
PercEvite has developed a sensor, communication, and processing suite for small drones, enabling detection and avoidance of ground-based obstacles and flying air vehicles without necessitating human intervention. In the project, we have built two such suites: a mini suite (~150 grams) and a micro suite (~50 grams). Both suites can avoid ground-based obstacles and perform cooperative avoidance via WiFi, LoRa, and LTE. The mini suite additionally has ADS-B-in for avoiding general aviation aircraft equipped with ADS-B. While the mini suite is based entirely on commercial off-the-shelf products, the micro suite includes a stereo vision system that was custom-designed in PercEvite. We will release the schematics of this stereo vision system under an open hardware license.
Besides the creation of the PercEvite mini and micro suites, we have also developed various algorithms for avoiding ground-based obstacles, for communication-based cooperative avoidance, and for several approaches to non-cooperative avoidance. Many of these algorithms have been successfully tested in real-world environments.
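The cooperative avoidance described above relies on each drone periodically broadcasting its position and velocity (over WiFi, LoRa, or LTE) and checking its separation against received beacons. As a rough illustration only (this is not PercEvite code; the beacon format, function names, and 50 m threshold are hypothetical), a minimal version of such a beacon and separation check could look like:

```python
import json
import math

EARTH_RADIUS_M = 6371000.0

def encode_beacon(drone_id, lat, lon, alt_m, vx, vy, vz):
    """Pack an identity/position/velocity beacon as a small JSON payload,
    ready to be broadcast over any of the available data links."""
    return json.dumps({
        "id": drone_id,
        "lat": lat,
        "lon": lon,
        "alt": alt_m,
        "vel": [vx, vy, vz],
    }).encode("utf-8")

def decode_beacon(payload):
    """Unpack a received beacon payload back into a dictionary."""
    return json.loads(payload.decode("utf-8"))

def horizontal_separation_m(a, b):
    """Approximate horizontal distance between two beacons using a
    flat-earth projection, adequate at the short ranges relevant to
    small-drone avoidance."""
    lat_mid = math.radians((a["lat"] + b["lat"]) / 2.0)
    dx = math.radians(b["lon"] - a["lon"]) * EARTH_RADIUS_M * math.cos(lat_mid)
    dy = math.radians(b["lat"] - a["lat"]) * EARTH_RADIUS_M
    return math.hypot(dx, dy)

def must_avoid(own, other, threshold_m=50.0):
    """True when another vehicle is closer than the well-clear threshold."""
    return horizontal_separation_m(own, other) < threshold_m
```

For example, a drone receiving a beacon from a vehicle roughly 30 m away would trigger the avoidance condition, while one a kilometre away would not; in a real system, the avoidance manoeuvre itself would then be computed from the relative positions and velocities.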
Our investigation has led to four conclusions. First and foremost, it is possible to create very lightweight suites for staying well clear of both static obstacles and other flying air vehicles, requiring minimal adjustments to the hardware and software currently used by drone producers. Second, communication of position and velocity between flying air vehicles is very mature and can be implemented with little effort and a very high gain. Third, ground-based obstacle avoidance is also rather mature, although limitations (flying in the dark or in fog) and edge cases (reflections, transparent surfaces, etc.) exist. These limitations and edge cases can, however, be tackled with additional sensors if necessary. Fourth, although we made important steps towards non-cooperative sense and avoid, it is the least mature technology. In PercEvite we have observed that it mostly suffers from a lack of available data sets for (1) benchmarking performance and (2) machine learning.
Our main recommendation to advance sense-and-avoid technology is for SESAR to set up a call for the creation of big data sets for non-cooperative sense-and-avoid, with the aim of making the data open access for use by the community. The call should in our opinion be open to different data types, as explored in PercEvite (audio, visual, radio frequency (RF), etc.).
Impact will be achieved through the software and hardware developed in the project itself, as these are all open access. However, we think the main impact will come from what we have shown in the project: how widely available lightweight technologies can be a game-changer for drone safety. Finally, we have studied a few topics that are far beyond the state of the art in the market, such as hear-and-avoid of other air traffic. Although this has not yet led to a working system, the insights published in scientific articles and the unique data set that we have gathered and made open access will contribute to the further development of this important element of sense and avoid of other air traffic.