
Vision-based Guidance and Control in Birds, with Applications to Autonomous Unmanned Aircraft

Periodic Reporting for period 4 - HawkEye (Vision-based Guidance and Control in Birds, with Applications to Autonomous Unmanned Aircraft)

Reporting period: 2021-02-01 to 2022-10-31

Birds have been described as “a wing guided by an eye”, but before this project we knew surprisingly little about how birds use vision in flight. Current autonomous air vehicles, or drones, use simple visual sensors to detect drift when hovering and computationally intensive algorithms for visual mapping. Neither approach closely reflects how birds use vision to guide their flight, so understanding this mathematically has the potential to unlock new capabilities for autonomous systems.

Our primary objective was to identify the guidance laws used by birds in pursuit, perching, obstacle avoidance, and gap negotiation. A guidance law is a mathematical relationship describing how sensor information is used to command steering, so using this approach to summarize a bird’s behaviour makes our results ready for implementation in air vehicles. We found that falcons intercept targets using the same guidance law as missiles, but that hawks use a different guidance law to enable collision avoidance when pursuing targets through clutter. As a proof of concept, we demonstrated the same algorithms on small quadrotor drones.
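
The missile guidance law referred to here is proportional navigation (PN), in which the pursuer turns at a rate proportional to the rotation rate of the line of sight to the target. The following minimal 2D simulation is our own illustrative sketch, not the project’s code: the speeds, time step, and intercept radius are arbitrary choices, and the navigation constant N ≈ 2.6 is of the order reported for falcons.

```python
import numpy as np

def pn_pursuit(p0, speed, target_path, N=2.6, dt=0.02, steps=1000):
    """2D proportional navigation: commanded turn rate = N * (line-of-sight
    rotation rate). All numerical values here are illustrative."""
    p = np.array(p0, dtype=float)      # pursuer position (m)
    heading = 0.0                      # pursuer heading (rad)
    los_prev = None
    path = [p.copy()]
    for k in range(steps):
        tgt = target_path(k * dt)
        los = np.arctan2(tgt[1] - p[1], tgt[0] - p[0])   # line-of-sight angle
        if los_prev is not None:
            d_los = (los - los_prev + np.pi) % (2 * np.pi) - np.pi  # wrapped
            heading += N * d_los       # integrate turn rate over one step
        los_prev = los
        p = p + speed * dt * np.array([np.cos(heading), np.sin(heading)])
        path.append(p.copy())
        if np.hypot(*(tgt - p)) < 0.1:  # treat 10 cm as an intercept
            break
    return np.array(path)

# Hypothetical scenario: target crossing at 8 m/s, pursuer at 12 m/s.
target = lambda t: np.array([5.0 + 8.0 * t, 10.0])
trajectory = pn_pursuit((0.0, 0.0), 12.0, target)
```

Because PN steers on the line-of-sight rate alone, it needs no estimate of target range, which is what makes a purely visual implementation plausible.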

Birds use active vision, which means they direct their gaze to enhance the visual information they obtain. Our second objective was therefore to identify how birds use head and eye movements to collect visual information during goal-directed flight behaviours. We found that birds look at objects differently according to whether they are targets or obstacles, tracking the centre of a perch but the edge of an obstacle. Many autonomous systems also use active vision, so our results from birds are ready for implementation in vehicles, and have also helped us design patterns modifying the appearance of wind turbine blades to prevent bird collisions.

Our third objective was to understand how birds transform steering commands into actions. Birds morph their wings and tail to control flight, so modelling this behaviour requires new approaches to describing flight control. We found that birds control their flight to minimize the distance from a perch at which they stall, offering a new approach to optimizing perching in drones using machine learning. We also identified a set of control inputs describing the coupled motions of the wings and tail, and created a morphing-wing flight simulator capable of simulating goal-directed morphing-wing flight in birds and air vehicles.
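
The text does not specify how these coupled control inputs were identified; one standard way to extract a small set of coupled inputs from posture measurements is principal component analysis (PCA). The sketch below is purely illustrative: the eight raw degrees of freedom and the random placeholder data stand in for real wing and tail measurements.

```python
import numpy as np

# Hypothetical sketch: extracting a small set of coupled wing/tail control
# inputs from posture data by PCA. Dimensions and data are placeholders.
rng = np.random.default_rng(0)
frames, raw_dof = 1000, 8        # e.g. sweep/twist per wing, tail spread...
X = rng.standard_normal((frames, raw_dof))   # stand-in for measured angles

Xc = X - X.mean(axis=0)                       # centre each degree of freedom
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = S**2 / np.sum(S**2)               # variance explained per mode

n_inputs = int(np.searchsorted(np.cumsum(explained), 0.95)) + 1
controls = Xc @ Vt[:n_inputs].T               # reduced control inputs per frame
print(f"{n_inputs} coupled inputs capture 95% of posture variance")
```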

Our results provide new biological insight and inspiration for future robotic systems.

Delivering these objectives required us to develop many new experimental techniques, including: (i) a state-of-the-art motion capture facility for indoor flight research, with robotic perches, obstacles, and targets; (ii) miniature onboard sensors for outdoor flight research, including lightweight GPS devices with 2 cm accuracy; and (iii) techniques for 3D video reconstruction in the field. In total, we collected quantitative data on over 20,000 flights.
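
As an illustration of item (iii), field 3D video reconstruction typically reduces to triangulating each tracked point from two or more calibrated camera views. The sketch below shows the standard direct linear transform (DLT) triangulation step as a generic textbook method; it is not necessarily the project’s actual pipeline.

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one 3D point from two calibrated views.
    P1, P2: 3x4 camera projection matrices; x1, x2: pixel coords (u, v)."""
    A = np.vstack([
        x1[0] * P1[2] - P1[0],     # each view contributes two linear
        x1[1] * P1[2] - P1[1],     # constraints on the homogeneous point
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)    # least-squares solution = last right
    X = Vt[-1]                     # singular vector
    return X[:3] / X[3]            # dehomogenise to (x, y, z)
```

In practice the projection matrices come from a prior camera calibration, and points seen in more than two views are triangulated by stacking additional rows into A.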

We showed that falcons use the same guidance law as missiles to intercept prey, but that hawks use a “new” guidance law suited to tail-chasing prey through clutter. We then showed that falcons modify their guidance at different stages of an attack when chasing evasive prey, using computer simulation to evolve attack and evasion strategies. This modelling showed that falcons dive after their prey because this allows them to sustain higher loads for manoeuvring. We then showed how the guidance law that hawks use could be implemented visually by measuring the motion of a target against the visual background. Finally, we showed that hawks target fixed points when attacking swarms, which allows them to identify targets without confusion because targets on a collision course appear stationary against a distant background.
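
The hawks’ guidance law has been described in the project’s publications as mixing proportional navigation (steering on the line-of-sight rate) with proportional pursuit (steering on the deviation angle between the bird’s velocity and the line of sight). A hedged sketch of the commanded turn rate, with purely illustrative gains, is:

```python
def mixed_guidance_turn_rate(lam_dot, delta, N=0.7, K=1.2):
    """Sketch of a mixed guidance command for pursuit through clutter:
      lam_dot : rotation rate of the line of sight to the target (rad/s)
      delta   : deviation angle between velocity and line of sight (rad)
    The gains N and K are illustrative placeholders, not fitted values.
    With K = 0 this reduces to pure proportional navigation (the falcon
    case); the proportional-pursuit term keeps the bird pointed nearly
    at its target, which suits tail-chasing through clutter."""
    return N * lam_dot + K * delta
```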

We found that the same guidance laws could also be used to model obstacle avoidance, and that pigeons and zebra finches make use of brightness cues for gap negotiation. We then extended our analyses by using video rendering to visualize what our birds saw as they avoided obstacles. This allowed us to show that birds look at the edges of obstacles, but at the centres of perches. Finally, we combined our studies of pursuit and obstacle avoidance by putting the two behaviours in conflict. We used this to identify how hawks avoid obstacles during pursuit, using a simple modification of the guidance law that we had previously found.
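
The text does not spell out the modification hawks make, so the following is a hypothetical illustration of the general idea rather than the published law: the pursuit command is superposed with a steering bias away from the nearest obstacle edge, gated by proximity. The function name, gain, and safety distance are all our own inventions.

```python
import numpy as np

def avoid_while_pursuing(turn_cmd, bearing_to_edge, dist_to_edge,
                         k_avoid=2.0, d_safe=1.0):
    """Hypothetical sketch: bias a pursuit turn command away from clutter.
      turn_cmd        : turn rate from the pursuit guidance law (rad/s)
      bearing_to_edge : angle from heading to nearest obstacle edge (rad)
      dist_to_edge    : distance to that edge (m)
    The gain k_avoid and safety distance d_safe are illustrative only."""
    if dist_to_edge < d_safe:
        # Steer opposite to the edge, more strongly the closer it looms.
        turn_cmd -= k_avoid * np.sign(bearing_to_edge) * (1 - dist_to_edge / d_safe)
    return turn_cmd
```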

We challenged our birds to land on moving perches, and found that hawks swoop up to a perch in a way that makes them better able to control their flight in the critical moments before landing. We analysed how they morphed their wings and tail during manoeuvres, identifying a set of control inputs that can be used to describe morphing-wing flight control in birds and air vehicles. These results have applications not only in the design of new morphing-wing air vehicles, but also in the use of machine learning to accomplish tasks like perching.

This project has brought the study of behavioural biomechanics into the era of big data, enabling the collection of data on a scale not possible previously. Tracking data from over 20,000 flights has enabled us to analyse the dynamics of bird flight in unprecedented detail, over many individuals, and over multiple timescales, allowing us to validate the repeatability of our findings. Unexpectedly, it has even allowed us to analyse how birds learn to improve their flight behaviour over time, offering detailed insight into the optimization of flight behaviour and opening an entirely new avenue of research on learning to fly.

The execution of the project coincided with a period of extremely rapid growth in computational techniques, including deep learning, which we applied alongside other machine learning techniques to model aspects of our data. It also coincided with the rapid development of video rendering engines by the film and gaming industries, which we used to visualize what our birds were seeing in flight. This in turn was made possible by combining state-of-the-art motion capture technologies measuring the gaze direction of the birds with cutting-edge 3D reconstruction tools modelling the interior of the lab. This has inspired a further avenue of planned research: using the same tools to visualize how birds see the built environment.
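
At its core, visualizing what a bird saw amounts to rendering the lab model from a virtual camera attached to the measured head pose. The sketch below shows only the central pinhole projection step under our own simplifying assumptions; production rendering engines add full meshes, lighting, and lens models.

```python
import numpy as np

def gaze_view(points, head_pos, R_head, f=1.0):
    """Hypothetical sketch: project 3D lab points into a virtual pinhole
    camera fixed to the bird's head. Columns of R_head are the head axes
    (right, down, forward) expressed in world coordinates; head_pos is the
    head origin; f is an arbitrary focal length."""
    Xc = (np.asarray(points, dtype=float) - head_pos) @ R_head  # world -> head
    ahead = Xc[:, 2] > 0                     # keep points in front of the bird
    uv = f * Xc[ahead, :2] / Xc[ahead, 2:3]  # perspective divide
    return uv                                # image-plane coordinates
```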

The main application of our research that we foresaw at the start of the project was in designing guidance and control algorithms for autonomous vehicles. Whilst our results from birds have indeed been used to design new guidance and control algorithms as envisioned, an unforeseen application of our experiments with moving obstacles has been to inspire new methods of avoiding bird collisions with wind turbine blades. This involves marking turbine blades in ways that are specifically inspired by what we learnt on this project about how birds direct their gaze and avoid obstacles in flight. We expect to begin testing the first blade patterns that we have designed later this year.

Besides inspiring new guidance and control algorithms for autonomous vehicles, and offering the potential to mitigate bird collisions with wind turbines, our research has shed new light on how birds use vision to guide and control their flight. This has proven a remarkably tractable problem to study, and the results provide some of the closest quantitative models of complex natural behaviours achieved to date.
"Rhaegal": one of our flying team of Harris' Hawks emerging from a moving gap in the Flight Lab.