Content archived on 2024-05-24

Real Time COordination and control of Multiple heterogeneous unmanned aerial vehiclES

CORDIS provides links to public deliverables and publications of HORIZON projects.

Links to deliverables and publications of FP7 projects, as well as links to some specific categories of results such as datasets and software, are dynamically retrieved from OpenAIRE.

Final results

A distributed, master-less communication system that can operate over wired and wireless links has been developed. It supports simple peer-to-peer connections but can also interconnect a fleet of (micro) UAVs. Communication is realized using a distributed shared memory approach, whose coherency is maintained by a real-time aware protocol that uses traffic shaping and dynamic routing.

The communication system was developed for a specific purpose: communication is essential for unmanned aerial systems, and micro UAVs in particular require highly efficient communication systems because their payload, energy and processing power are very limited. At the same time, the communication system (CS) must be very flexible in terms of interconnectivity between heterogeneous systems and in dynamic environments.

The communication system was developed, implemented and tested during the project "Real-time coordination and control of multiple heterogeneous unmanned aerial vehicles" (COMETS). During this project, a variable number of unmanned aerial vehicles (UAVs) had to exchange information. These UAVs form the flying segment (FS). On the ground, the fleet is supported by a number of computers, which form the ground segment (GS). Since vital data is transported over the communication links within and between the FS and the GS, the CS had to meet real-time constraints and provide bandwidth assignment to allow prioritization of important over unimportant data. The network topology, especially of the FS, may not be fixed, which imposes the need for a dynamic rerouting mechanism. To gain redundancy, support for communication over multiple physical links was included. Because of the heterogeneous system setup, the CS had to be highly portable to different operating systems and light enough to run on a 16-bit microcontroller.
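The distributed shared memory idea can be illustrated with a small sketch. Everything below (class names, message layout, the last-writer-wins merge rule and the priority field standing in for traffic shaping) is an illustrative assumption, not the COMETS protocol itself:

    # Minimal sketch of a master-less distributed shared memory with
    # priority-ordered updates (illustrative only; not the COMETS protocol).
    import heapq
    import itertools
    import time

    class SharedMemoryNode:
        """Each node holds a local copy of the shared variables and
        merges remote updates using a last-writer-wins rule."""

        def __init__(self, node_id):
            self.node_id = node_id
            self.store = {}            # key -> (timestamp, value)
            self.outbox = []           # priority queue of pending updates
            self._seq = itertools.count()

        def write(self, key, value, priority=1):
            """Update a shared variable locally and queue it for broadcast.
            Lower priority numbers are sent first (traffic-shaping hook)."""
            entry = (time.time(), value)
            self.store[key] = entry
            heapq.heappush(self.outbox, (priority, next(self._seq), key, entry))

        def next_update(self):
            """Pop the most urgent pending update, i.e. what a traffic shaper
            would hand to the radio link next."""
            if not self.outbox:
                return None
            _, _, key, entry = heapq.heappop(self.outbox)
            return key, entry

        def apply_remote(self, key, entry):
            """Merge an update received from another node; newer timestamps win."""
            if key not in self.store or entry[0] > self.store[key][0]:
                self.store[key] = entry

    # Two nodes exchanging a position update, peer to peer.
    uav = SharedMemoryNode("uav1")
    ground = SharedMemoryNode("gcs")
    uav.write("uav1/pose", (37.4, -5.9, 120.0), priority=0)  # high-priority telemetry
    key, entry = uav.next_update()
    ground.apply_remote(key, entry)
    print(ground.store["uav1/pose"][1])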
COMETS is a system for the real-time coordination of multiple heterogeneous UAVs that integrates several components using the COMETS architecture. The COMETS components are:
- A communication system, which provides the transport layer that supports communication between the different components of the COMETS system.
- A control centre with mission planning, mission monitoring and control functionalities.
- A perception system, which integrates the images and sensor data from the UAVs to provide perception capabilities.
- A flying segment composed of multiple heterogeneous UAVs, each one with its Onboard Proprietary Components, which are specific to the UAV, a supervisor that interfaces the UAV with the other COMETS sub-systems and enables the real-time coordination of the UAV, and a deliberative layer when the UAV has autonomous capabilities (a small interface sketch follows below).
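To make the supervisor's role concrete, the following sketch shows one way such a boundary could look: the supervisor wraps a UAV-specific autopilot interface and exposes a uniform API to the rest of the system. All class and method names are invented for illustration; they do not correspond to the actual COMETS interfaces.

    from abc import ABC, abstractmethod

    class ProprietaryAutopilot(ABC):
        """Stand-in for the UAV-specific onboard components."""
        @abstractmethod
        def send_low_level_command(self, command: dict) -> None: ...
        @abstractmethod
        def read_state(self) -> dict: ...

    class Supervisor:
        """Interfaces one UAV with the rest of the system (hypothetical API)."""
        def __init__(self, autopilot: ProprietaryAutopilot):
            self.autopilot = autopilot

        def execute_task(self, task: dict) -> None:
            # Translate a generic task into the platform-specific command set.
            self.autopilot.send_low_level_command({"task": task})

        def report_state(self) -> dict:
            # Uniform state report consumed by the control centre and perception system.
            return self.autopilot.read_state()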
An approach to building a high-resolution 3D map of an environment on the sole basis of stereovision imagery on board a UAV or a UGV has been developed. The approach relies on a classic Simultaneous Localisation and Mapping (SLAM) scheme and can be run online, as the vehicle gathers data.
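The basic building block of such a map is turning a stereo disparity measurement into a world-frame 3D point. The sketch below illustrates only this step; the camera parameters, pose handling and everything else are assumptions for illustration, not the project's actual SLAM implementation.

    import numpy as np

    def stereo_to_camera_frame(u, v, disparity, fx, fy, cx, cy, baseline):
        """Classic pinhole/stereo triangulation: depth Z = fx * B / d."""
        z = fx * baseline / disparity
        x = (u - cx) * z / fx
        y = (v - cy) * z / fy
        return np.array([x, y, z])

    def camera_to_world(p_cam, R_wc, t_wc):
        """Apply the current vehicle/camera pose (rotation + translation)."""
        return R_wc @ p_cam + t_wc

    # Example: one matched pixel observed with 12 px of disparity.
    p_cam = stereo_to_camera_frame(u=420, v=240, disparity=12.0,
                                   fx=700.0, fy=700.0, cx=320.0, cy=240.0,
                                   baseline=0.3)
    p_world = camera_to_world(p_cam, R_wc=np.eye(3), t_wc=np.array([0.0, 0.0, 30.0]))
    print(p_world)   # one more point for the incrementally built 3D map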
The system integrates hardware and software components for fire monitoring by using a UAV flying at low altitude. It has been designed to provide, in real time, information useful for fire fighting. Thus, the system is able to provide measures such as the fire front position, its dynamic evolution and an estimate of the height of the flames. The hardware components consist of visual and infrared cameras, and a housing with the electronic components, including a video server, sensors for vehicle positioning (GPS and a low-cost IMU) and a communication link. The software components implement computer vision techniques to extract important fire parameters from the visual and infrared images. The software can run on a conventional laptop with a wireless link. The positioning data provided by the sensors are used to geolocalise the points of the fire front and to estimate the height of the flames from the results obtained from the images.
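One simple way to geolocalise an image point such as a fire-front pixel is to intersect its viewing ray with the ground. The sketch below assumes a flat ground plane and invented camera parameters; it is an illustration of the idea, not the project's method.

    import numpy as np

    def pixel_to_ground(u, v, fx, fy, cx, cy, R_wc, cam_pos):
        """Back-project pixel (u, v), rotate the ray into the world frame
        and intersect it with the plane z = 0."""
        ray_cam = np.array([(u - cx) / fx, (v - cy) / fy, 1.0])
        ray_world = R_wc @ ray_cam
        if abs(ray_world[2]) < 1e-9:
            raise ValueError("ray parallel to ground plane")
        s = -cam_pos[2] / ray_world[2]          # scale so the ray reaches z = 0
        return cam_pos + s * ray_world

    # Nadir-looking camera 80 m above ground (camera z-axis pointing down).
    R_down = np.array([[1, 0, 0], [0, -1, 0], [0, 0, -1]], dtype=float)
    hit = pixel_to_ground(400, 260, fx=700, fy=700, cx=320, cy=240,
                          R_wc=R_down, cam_pos=np.array([0.0, 0.0, 80.0]))
    print(hit)   # local-frame coordinates of the fire-front point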
The teleoperation system has been specially designed to aid pilots of remotely piloted helicopters. The system facilitates the pilot's work in low-visibility conditions and makes cooperation with other autonomous and remotely piloted vehicles possible. The system integrates both hardware and software components to perform its functions. The hardware is composed of both on-board and on-ground components. The helicopter carries a collection of devices for status monitoring, data processing, the data link with the ground station, and application-oriented devices such as cameras and sensors. The ground segment comprises the pilot's special screen and a notebook for data visualisation. The main functions of the teleoperation software are the communications with the helicopter and the implementation of a Human Machine Interface (HMI).
Organisation concepts of the on-board decisional autonomy functionalities for UAVs:
- Mission execution control concept
- Interaction paradigms and algorithms (mission planning, task allocation, supervision); a toy allocation sketch follows below
These concepts will be used for further investigation on:
- Multi-UAV systems
- Air/ground robot cooperation
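As a toy illustration of the task allocation problem mentioned above, the sketch below greedily assigns each task to the UAV that can reach it at lowest added travel cost. This is a generic textbook heuristic chosen for brevity, not the allocation algorithm developed in the project.

    import math

    def greedy_allocate(uav_positions, task_positions):
        """Return {uav_id: [task indices]} using nearest-available assignment."""
        assignment = {uav: [] for uav in uav_positions}
        current = dict(uav_positions)            # each UAV's current location
        for t_idx, task in enumerate(task_positions):
            best_uav = min(current, key=lambda u: math.dist(current[u], task))
            assignment[best_uav].append(t_idx)
            current[best_uav] = task             # UAV moves on to that task
        return assignment

    uavs = {"heli1": (0.0, 0.0), "blimp1": (100.0, 50.0)}
    tasks = [(10.0, 5.0), (95.0, 60.0), (20.0, -10.0)]
    print(greedy_allocate(uavs, tasks))
    # -> {'heli1': [0, 2], 'blimp1': [1]}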
An autonomous flying helicopter system has been developed. The system is able to start, land and fly a model helicopter of the 2 m class in three-dimensional navigational space, provided the helicopter is fitted with the necessary sensor equipment. The development focused on hardware prototype development, sensor fusion and control algorithms. One goal was to implement the system on a single embedded microcontroller while still being able to use different kinds of sensors.
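On the control side, a single-axis loop of the kind commonly run on small embedded controllers can be sketched with a basic PID regulator. The gains and the toy plant model below are assumptions for illustration, not the controller developed in the project.

    class PID:
        def __init__(self, kp, ki, kd, dt):
            self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
            self.integral = 0.0
            self.prev_error = 0.0

        def update(self, setpoint, measurement):
            error = setpoint - measurement
            self.integral += error * self.dt
            derivative = (error - self.prev_error) / self.dt
            self.prev_error = error
            return self.kp * error + self.ki * self.integral + self.kd * derivative

    # Crude altitude-hold simulation: the command acts as vertical acceleration.
    pid = PID(kp=2.0, ki=0.1, kd=3.0, dt=0.02)
    altitude, climb_rate = 0.0, 0.0
    for _ in range(500):                     # 10 s at 50 Hz
        command = pid.update(setpoint=2.0, measurement=altitude)
        climb_rate += command * 0.02         # toy dynamics
        altitude += climb_rate * 0.02
    print(round(altitude, 2))                # settles close to the 2 m setpoint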
A software system for cooperative detection and localisation by means of a fleet of heterogeneous UAVs has been developed. The software consists of a distributed segment and a centralised segment. The distributed segment is made up of a process attached to each source of data (each UAV) and acts as a plug-in for the detection and localisation system; it can be placed on board vehicles with sufficient processing power or on ground processing units. The centralised segment combines the information provided by the different vehicles to improve the detection and positioning capabilities of the system. The system can use cameras and other presence sensors. Different algorithms can be included in the system to detect objects of interest. These objects are geolocalised using the data provided by the positioning sensors on board the UAVs. Several techniques are used to reduce the false alarm ratio in detection and to increase the accuracy in localisation by fusing the information provided by the different UAVs.
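A minimal way to illustrate this kind of multi-UAV fusion is to combine position estimates by inverse-variance weighting and to confirm an alarm only when enough vehicles agree. The thresholds and the weighting scheme below are assumptions, not the algorithms implemented in the project.

    import numpy as np

    def fuse_detections(detections, min_votes=2):
        """detections: list of (position_xy, variance) tuples, one per UAV.
        Returns (fused_position, fused_variance) or None if unconfirmed."""
        if len(detections) < min_votes:
            return None                      # not enough agreement: likely false alarm
        weights = np.array([1.0 / var for _, var in detections])
        positions = np.array([pos for pos, _ in detections])
        fused = (weights[:, None] * positions).sum(axis=0) / weights.sum()
        fused_var = 1.0 / weights.sum()      # fused estimate is tighter than any single one
        return fused, fused_var

    obs = [((431.2, 118.5), 25.0),   # helicopter camera, 5 m standard deviation
           ((428.9, 121.0), 100.0)]  # airship camera, 10 m standard deviation
    print(fuse_detections(obs))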
An airship prototype has been developed. The system is propelled by electric motors and is fully equipped with sensors, actuators and on-board processing units, so that it can be either remotely or autonomously controlled. The overall take-off mass is 24 kg, including 4 kg of payload (it can therefore be operated in France without any specific authorisation); its length is 9.5 m and its diameter is 2.0 m.
The system can compensate in real time for pure rotational motion or for arbitrary motion over a quasi-planar surface. Images are transformed to match an initial reference frame; the apparent motion of the background is essentially cancelled, while independent motion (vehicles, flames, smoke) is preserved. The system can handle large and erratic apparent motion and can work in non-structured, natural environments. The following steps are performed (see the sketch below):
- Image matching
- Homography matrix computation with outlier rejection
- Optimised image warping
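The three steps can be sketched with OpenCV: match features against a reference image, estimate a homography with RANSAC outlier rejection, and warp the current image into the reference frame so that background motion is cancelled. The use of ORB features and the thresholds below are assumptions; the project's own implementation may differ.

    import cv2
    import numpy as np

    def stabilise(reference_gray, current_gray):
        orb = cv2.ORB_create(nfeatures=1000)
        kp_ref, des_ref = orb.detectAndCompute(reference_gray, None)
        kp_cur, des_cur = orb.detectAndCompute(current_gray, None)

        # 1. Image matching (brute-force Hamming matcher for binary descriptors).
        matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
        matches = matcher.match(des_cur, des_ref)
        src = np.float32([kp_cur[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
        dst = np.float32([kp_ref[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)

        # 2. Homography computation with outlier rejection (RANSAC).
        H, inlier_mask = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)

        # 3. Warp the current image into the reference frame; what still moves
        #    after warping is independent motion (vehicles, flames, smoke).
        h, w = reference_gray.shape
        return cv2.warpPerspective(current_gray, H, (w, h))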
Generic and scalable architecture for the Control Centre software, which allows:
- The breakdown of a user-defined abstract mission plan into sets of atomic procedures directly executable by the UAVs (a toy decomposition sketch follows below).
- The supervision of mission execution to find conditions that may require corrective actions to increase safety and the probability of success.
- Direct re-planning capabilities for evolving environmental and mission conditions.
- A set of dedicated GUIs that allow easy and ergonomic human mission control.
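To illustrate the first point, the sketch below expands a high-level surveillance request into atomic, directly executable procedures. The task vocabulary and the decomposition rules are invented for illustration; they are not the COMETS plan format.

    from dataclasses import dataclass

    @dataclass
    class AtomicTask:
        uav_id: str
        action: str          # e.g. "TAKE_OFF", "GOTO", "TAKE_SHOT", "LAND"
        params: dict

    def decompose_surveillance(uav_id, waypoints):
        """Expand a high-level 'survey these waypoints' request into the
        atomic procedures a UAV supervisor could execute one by one."""
        plan = [AtomicTask(uav_id, "TAKE_OFF", {"altitude": 40.0})]
        for wp in waypoints:
            plan.append(AtomicTask(uav_id, "GOTO", {"waypoint": wp}))
            plan.append(AtomicTask(uav_id, "TAKE_SHOT", {"sensor": "camera"}))
        plan.append(AtomicTask(uav_id, "LAND", {}))
        return plan

    for task in decompose_surveillance("heli1", [(37.41, -5.92), (37.42, -5.91)]):
        print(task.action, task.params)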
The software tool is able to detect and identify faults in the sensors and actuators of an autonomous helicopter while it is flying. The tool continuously monitors the helicopter sensor readings and automatically detects when a fault is present in the system. The tool can be implemented on board the helicopter controller or on the ground station (with reduced effectiveness, due to transmission delays). The information provided by the software tool (presence and type of fault) can be used by the helicopter controller to reconfigure sensors and actuators, and by the central control station for planning purposes.
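A common way to illustrate this kind of monitoring is residual-based fault detection: compare each sensor reading with a model prediction and flag a fault when the residual stays above a threshold for several consecutive samples. The model, threshold and persistence count below are assumptions, not the project's detection method.

    def detect_fault(residuals, threshold, persistence=5):
        """Return the index at which a fault is declared, or None."""
        count = 0
        for i, r in enumerate(residuals):
            count = count + 1 if abs(r) > threshold else 0
            if count >= persistence:
                return i - persistence + 1    # first sample of the faulty run
        return None

    # Simulated altimeter residuals: nominal noise, then a biased sensor.
    residuals = [0.1, -0.2, 0.05, 0.15, 2.4, 2.5, 2.6, 2.4, 2.7, 2.5]
    print(detect_fault(residuals, threshold=1.0))   # -> 4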
The system comprises hardware and software components for fire detection using a single UAV flying at low altitude. The system is able to localise potential alarms in georeferenced coordinates. The hardware components consist of visual and infrared cameras, and a housing with the electronic components, including a video server, sensors for vehicle positioning (GPS and a low-cost IMU) and a communication link. The software components implement computer vision techniques to detect fire alarms in the visual and infrared images, and fuse both sources of information to reduce the false alarm ratio. The software can run on a conventional laptop with a wireless link, or on board if the vehicle has enough payload to carry a processor. The positioning data provided by the sensors are used to geolocalise the fire alarms, which can then be integrated into a GIS system for forest fire fighting.
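The false-alarm reduction idea can be sketched as a simple cross-check: a hot region in the infrared image is only kept if it overlaps a fire-coloured region in the visual image. The bounding-box representation and the overlap threshold are assumptions for illustration, not the project's implementation.

    def iou(a, b):
        """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
        ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
        ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
        inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
        union = ((a[2] - a[0]) * (a[3] - a[1]) +
                 (b[2] - b[0]) * (b[3] - b[1]) - inter)
        return inter / union if union else 0.0

    def confirm_alarms(ir_regions, visual_regions, min_iou=0.3):
        """Keep only IR hot spots that are also detected in the visual channel."""
        return [ir for ir in ir_regions
                if any(iou(ir, vis) >= min_iou for vis in visual_regions)]

    ir_hits = [(100, 100, 140, 150), (300, 40, 330, 70)]   # hot spots
    vis_hits = [(105, 95, 145, 155)]                        # fire-coloured blobs
    print(confirm_alarms(ir_hits, vis_hits))                # only the first survives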
The system can match images taken from widely separated views of the scene, possibly from different UAVs. The implementation is based on robust blob features. Blobs are perceptually salient, homogeneous and compact image regions; each blob is characterised by its colour, its centre point and its shape, represented by its inertia matrix. First, correspondences between blobs are obtained using the mean colour. A RANSAC algorithm then refines these correspondences using the positions of the blobs, together with a shape metric derived from the inertia matrix of each blob. As a result, a set of correspondences is found and a homography model relating the two views is obtained.
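The matching pipeline can be sketched as follows: pair blobs by mean colour, then let RANSAC estimate a homography from the blob centres and discard pairs that do not fit it. Blob extraction itself is omitted, the colour threshold is an assumption, and the shape metric is left out for brevity; this is an illustration, not the project's implementation.

    import numpy as np
    import cv2

    def match_blobs_by_colour(blobs_a, blobs_b, max_colour_dist=30.0):
        """blobs_*: list of dicts with 'centre' (x, y) and 'colour' (B, G, R).
        Return tentative centre correspondences based on mean colour only."""
        pairs = []
        for a in blobs_a:
            dists = [np.linalg.norm(np.subtract(a["colour"], b["colour"]))
                     for b in blobs_b]
            j = int(np.argmin(dists))
            if dists[j] < max_colour_dist:
                pairs.append((a["centre"], blobs_b[j]["centre"]))
        return pairs

    def homography_from_pairs(pairs):
        """Refine the tentative matches with RANSAC; needs at least 4 pairs."""
        src = np.float32([p[0] for p in pairs]).reshape(-1, 1, 2)
        dst = np.float32([p[1] for p in pairs]).reshape(-1, 1, 2)
        H, inliers = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
        return H, inliers.ravel().astype(bool)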
