ReliablE in-Vehicle pErception and decisioN-making in complex environmenTal conditionS

Periodic Reporting for period 1 - EVENTS (ReliablE in-Vehicle pErception and decisioN-making in complex environmenTal conditionS)

Reporting period: 2022-09-01 to 2024-02-29

In our everyday life as drivers, we face unexpected situations that we need to handle in a safe and efficient way. The same is true for Connected and Automated Vehicles (CAVs), which also need to handle these situations, to an extent that depends on their automation level. The higher the automation level, the higher the expectations that the system can cope with these situations. In the context of this project, these unexpected situations in which the normal operation of the CAV is close to being disrupted (e.g. the ODD limit is reached due to traffic changes, harsh weather/light conditions, imperfect data, sensor/communication failures, etc.) are called “events”. EVENTS is also the acronym of this project.
Within the project’s scope, and to cover a wide range of scenarios, these events are clustered under three main use cases: a) Interaction with VRUs and other vehicles, b) Non-Standard and Unstructured Road Conditions and c) Low Visibility and Adverse Weather Conditions.
EVENTS aims to create a robust and self-resilient perception and decision-making system for AVs to manage different kinds of “events” on the horizon. These events push the AV to its ODD limits due to the dynamically changing road environment (VRUs, obstacles) and/or imperfect data (e.g. sensor and communication failures). The AV should continue to operate safely no matter what. When the system cannot handle the situation, an improved minimum risk manoeuvre should be put in place.
The overall objectives of EVENTS can be summarized as follows:
1. Design and implement on-board perception algorithms needed for safe driving of CAVs in complex environmental conditions.
2. Design and implement decision-making algorithms able to cope with a variety of traffic scenarios including non-standard traffic conditions.
3. Develop solutions for continuous perception system self-assessment for safe and resilient operation.
4. Integrate, test and demonstrate the developed perception and decision-making algorithms in both prototype vehicles (real conditions) and simulation environments.
5. Assess the impact of EVENTS developments and determine cost-efficient sensor suites for CAVs.
6. Disseminate and communicate project findings, increase cooperation with international stakeholders and promote project results to standardisation bodies.
During the first 18 months of the project, the Use Cases (UCs), initially described in the grant agreement, were fine-tuned, which led to the definition of the 8 experiments of EVENTS. In turn, having well-defined experiments enabled the efficient and effective elicitation of requirements for these experiments and, consequently, the formation of the system/master (and subsystem/experiment-level) architecture. The architecture, in combination with the requirements, also allowed the identification of risks and potential hazards associated with each UC.
With regards to perception, EVENTS started by exploring a large variety of public datasets, which led to the synthetic augmentation of pre-existing datasets using ML methods and simulations, the creation of new datasets (road debris) and the application of self-supervised and cross-modal supervised learning on joint camera and LiDAR sensor modalities. Additionally, several methods were developed for the detection of road users, traffic signs, road debris and “ghost” reflections, as well as for estimating the current environment state (incl. that of all relevant road users) and predicting how the environment state will evolve over time. Moreover, different methods have been applied for the generation and fusion of collective perception information in different scenarios (e.g. roundabout, unclear intersection) through the integration of V2X messages. Finally, mechanisms for the integrity monitoring (self-assessment) of the perception and localisation systems of automated/autonomous vehicles are being developed.
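As a minimal illustration of how locally sensed objects and V2X collective-perception objects might be combined, the sketch below fuses two detection lists by nearest-neighbour gating. The function name, data layout and gate value are hypothetical assumptions for this sketch, not the project’s actual implementation.

```python
import math

GATE_M = 2.0  # association gate in metres (illustrative value)

def fuse_detections(local, remote):
    """Merge ego-vehicle detections with V2X collective-perception
    detections. Each detection is an (x, y) position in a common frame.
    Detections closer than GATE_M are treated as the same object and
    averaged; the rest are kept as distinct objects."""
    fused = list(local)
    for rx, ry in remote:
        # find the closest already-fused detection within the gate
        best_i, best_d = None, GATE_M
        for i, (fx, fy) in enumerate(fused):
            d = math.hypot(rx - fx, ry - fy)
            if d < best_d:
                best_i, best_d = i, d
        if best_i is None:
            fused.append((rx, ry))  # object seen only via V2X
        else:
            fx, fy = fused[best_i]
            fused[best_i] = ((fx + rx) / 2, (fy + ry) / 2)  # simple average
    return fused
```

In a real system the gating would use track covariances and the averaging would be replaced by a proper filter update, but the sketch shows why collective perception extends the ego vehicle’s field of view: remote-only objects survive the merge as new entries.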
With regards to decision-making, based on the master architecture, the motion planning algorithms for the different experiments were designed and are currently being implemented, enabling collision avoidance with other vehicles, VRUs and cyclists, as well as lane merging and the avoidance of abrupt vehicle behaviour. Moreover, building on the motion planning modules, the behavioural decision-making modules of CAVs are being developed.
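The trade-off described above, avoiding collisions while also avoiding abrupt behaviour, can be sketched as a candidate-selection step in a lateral planner. Everything below (names, thresholds, the cost weight) is an illustrative assumption, not the EVENTS planner itself.

```python
def plan_lateral_offset(candidates, obstacle_y, current_offset,
                        safety_margin=1.0, w_comfort=0.5):
    """Pick a lateral offset (metres) from a set of candidate offsets.
    Candidates closer than `safety_margin` to the obstacle's lateral
    position are discarded (collision avoidance); among the rest, the
    one minimising deviation from the current offset is chosen, which
    penalises abrupt lateral moves. Returns None if no candidate is
    safe, signalling that a minimum risk manoeuvre should be considered
    instead of normal planning."""
    safe = [c for c in candidates if abs(c - obstacle_y) >= safety_margin]
    if not safe:
        return None
    return min(safe, key=lambda c: w_comfort * abs(c - current_offset))
```

Returning `None` rather than the “least bad” unsafe option mirrors the report’s point that the planner must hand over to a minimum risk manoeuvre when no safe trajectory exists.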
The perception and decision-making work has already started to be implemented both in simulations and in the EVENTS demo vehicles. This implementation is fully or almost completed in all experiments, and testing will start in the forthcoming months.
Lastly, one of the most important components of any AD project is the evaluation plan. EVENTS’ evaluation plan is currently being developed and will soon be available for all 8 experiments.
Beyond the state of the art in perception, EVENTS develops:
• Novel ML-based scene recognition algorithms to extend the robustness of AD functions under different environmental conditions via:
o Scene completion when parts of the scene are occluded or not visible due to bad weather.
o Augmentation of training data for object detectors, improving their performance in low visibility and adverse weather conditions.
• Joint VRU perception and intention prediction algorithms dealing with scarcity of annotated data.
• A new benchmark dataset with annotated VRU instances based on camera, radar and LiDAR perception, enabling performance comparison of various sensor combinations.
• A new and original dataset on road debris, in which 47 different objects were used during data collection under light-rain and dry weather conditions.
• Collective perception (CP) scenario-based testing for different use cases (e.g. roundabout navigation) using Bayesian filters.
• Computation-efficient on-board CP messages fusion.
• High precision localisation under harsh weather conditions via dense 3D mapping of the surroundings using multiple sensors (camera, LiDAR, radar).
• Robust motion prediction for VRUs and vehicles that: i. incorporates new spatio-temporal features, ii. integrates high-precision 3D map information, iii. handles the tracking of multiple objects, iv. applies novel ML solutions to infer the intention of VRUs from their body language.
• Joint scene prediction and ego-vehicle manoeuvre planning algorithms.
• Two-layer runtime monitoring of perception data, covering both raw sensor data and perception object data (self-assessment), using subjective logic methods and ML-based anomaly detection.
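The self-assessment bullet above mentions subjective logic. As a minimal sketch of the idea, assuming standard binomial opinions and the textbook cumulative fusion operator rather than the project’s actual monitor, two sensor-level opinions about “the perception output is trustworthy” can be fused as follows:

```python
def cumulative_fusion(op_a, op_b):
    """Fuse two binomial subjective-logic opinions, each given as
    (belief, disbelief, uncertainty) summing to 1, using the standard
    cumulative fusion operator. Fusing independent evidence lowers the
    uncertainty of the combined opinion."""
    b1, d1, u1 = op_a
    b2, d2, u2 = op_b
    k = u1 + u2 - u1 * u2
    if k == 0:  # both opinions dogmatic (u = 0): fall back to averaging
        return ((b1 + b2) / 2, (d1 + d2) / 2, 0.0)
    return ((b1 * u2 + b2 * u1) / k,
            (d1 * u2 + d2 * u1) / k,
            (u1 * u2) / k)
```

For example, fusing a camera opinion (0.7, 0.1, 0.2) with a LiDAR opinion (0.6, 0.2, 0.2) yields an uncertainty of about 0.11, lower than either input, which is the property a runtime monitor exploits when deciding whether the perception output can be trusted.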

Beyond the state of the art in decision-making, EVENTS develops:
• High-level manoeuvre selection for non-standard traffic conditions, including deciding whether a minimum risk manoeuvre is necessary.
• Generation of the specific motion trajectory to be executed in the presence of perception uncertainties.
• Improved and dynamically adaptive decision-making under harsh environmental conditions.
• Improved and dynamically adaptive motion planning algorithms for slippery road conditions.
• Fail-safe planning despite occluded objects and incomplete data, e.g. missing lane markings.
• Adaptive and real-time emergency motion planning based on current road conditions.
• A robust dynamic-model-based controller that gradually degrades the AD function and applies an improved minimum risk manoeuvre when the system self-assessment advises doing so.
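The link between self-assessment and gradual degradation in the last bullet can be sketched as a simple mode selector: the monitor’s confidence drives the choice between full operation, degraded operation and a minimum risk manoeuvre. The thresholds and mode names below are illustrative placeholders, not values from the project.

```python
NORMAL, DEGRADED, MRM = "normal", "degraded", "minimum_risk_manoeuvre"

def select_mode(perception_confidence, t_degrade=0.7, t_mrm=0.4):
    """Map a self-assessment confidence in [0, 1] to an operating mode:
    full AD function, gradually degraded operation (e.g. reduced speed,
    larger headway), or a minimum risk manoeuvre bringing the vehicle
    to a safe stop."""
    if perception_confidence >= t_degrade:
        return NORMAL
    if perception_confidence >= t_mrm:
        return DEGRADED
    return MRM
```

The intermediate `DEGRADED` band is what makes the degradation gradual: rather than jumping straight from normal driving to an emergency stop, the controller first restricts the AD function and only escalates to the minimum risk manoeuvre when confidence drops further.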