Content archived on 2024-06-18

GAZE-BASED PERCEPTUAL AUGMENTATION

Final Report Summary - DEEPVIEW (GAZE-BASED PERCEPTUAL AUGMENTATION)

DeepView investigated and developed novel interactive displays that take advantage of new possibilities offered by gaze-tracking technologies. Gaze-tracking input (computer input based on where on the screen the user is looking at any given time) is becoming available to regular consumers through inexpensive devices such as the Tobii EyeX, which made this project particularly timely. The investigation looked at different ways in which combining dynamically changing images with eye trackers could provide perceptual benefits, and laid the foundations and understanding needed to take advantage of this technology in the future. In the first part of the project, the team used a gaze-contingent display and blur to simulate accommodation.
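
To make the idea concrete, the following minimal sketch illustrates the basic gaze-contingent loop: read the gaze point, look up the scene depth under it, and re-render the image so that this depth is in focus. The helper names (get_gaze_position, render, display) are hypothetical stand-ins for a real eye-tracker SDK and rendering pipeline, not the project's actual code.

```python
# Minimal sketch of a gaze-contingent rendering loop (hypothetical helpers).
# get_gaze_position() stands in for a real eye-tracker SDK call and is assumed
# to return the current gaze point in pixel coordinates.

def gaze_contingent_loop(image, depth_map, render, get_gaze_position, display):
    """Continuously re-render `image` so that the depth plane under the
    viewer's gaze is sharp and other depths are blurred."""
    while True:
        gx, gy = get_gaze_position()                # current gaze point (pixels)
        focal_depth = depth_map[int(gy), int(gx)]   # depth under the gaze point
        frame = render(image, depth_map, focal_depth)  # e.g. depth-of-field blur
        display(frame)
```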

Initially, the team developed software for processing images and data, and built multiple versions of a full display system that provides gaze-contingent depth-of-field. These versions allowed the team to run an experiment on gaze-contingent depth-of-field indicating that the method produces a stronger subjective perception of depth when used to represent realistic images. A second experiment showed that gaze-contingent depth-of-field used in isolation can provide some information about depth, but is in itself probably too unreliable for applications where accuracy is important (for example, information visualisation). A third experiment showed that chromatic aberration is unlikely to make gaze-contingent depth-of-field much more precise than what was found in the previous DeepView experiments, since it does not appear to resolve the technique's inherent sign ambiguity.
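
As an illustration of the kind of rendering involved (a simplified sketch, not the project's actual renderer), gaze-contingent depth-of-field can be approximated by splitting the scene into depth layers, blurring each layer in proportion to its distance from the focal depth selected by the gaze point, and compositing the result. The parameters blur_scale and n_layers below are assumed tuning values, and the soft layer mask assumes depth values on a scale comparable to the layer spacing.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def depth_of_field(image, depth_map, focal_depth, blur_scale=2.0, n_layers=8):
    """Blur each depth layer in proportion to its distance from the focal
    depth, then composite the layers back into one image."""
    img = np.asarray(image, dtype=float)             # H x W x 3
    result = np.zeros_like(img)
    weight = np.zeros(depth_map.shape)
    for d in np.linspace(depth_map.min(), depth_map.max(), n_layers):
        sigma = blur_scale * abs(d - focal_depth)    # crude circle-of-confusion proxy
        blurred = gaussian_filter(img, sigma=(sigma, sigma, 0)) if sigma > 0 else img
        mask = np.exp(-(depth_map - d) ** 2)         # soft membership in this depth layer
        result += blurred * mask[..., None]
        weight += mask
    return result / weight[..., None]
```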

The latter part of the project, which has just concluded, examined gaze-contingent contrast and colour. The team looked at whether manipulating the colour and brightness of peripheral areas of vision (the areas of the image that are not currently being looked at) can increase perceived differences in the represented colours. The DeepView team carried out two experiments. The first investigated whether dynamically changing peripheral colour or brightness can change the perception of the colour that is being looked at. The evidence is conclusive, and the answer is yes. This result is important because it opens the door to realising high-dynamic-range and extended-colour-gamut displays by adding eye-tracking capabilities, without changing the physical (optical) capabilities of displays at all. The impact of this important result is only starting to be explored in follow-on work.
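
As one concrete, simplified illustration of such a manipulation (not the exact stimuli used in the experiments), the sketch below darkens everything outside a window around the gaze point, so that through simultaneous contrast the region being looked at can appear brighter than the display physically shows. The parameters fovea_radius_px and peripheral_gain are assumed values for illustration only.

```python
import numpy as np

def dim_periphery(image, gaze_xy, fovea_radius_px=80, peripheral_gain=0.6):
    """Return a copy of `image` with brightness scaled down outside a window
    around the gaze point, leaving the foveal region unchanged."""
    img = np.asarray(image, dtype=float)
    h, w = img.shape[:2]
    yy, xx = np.mgrid[0:h, 0:w]
    dist = np.hypot(xx - gaze_xy[0], yy - gaze_xy[1])   # distance to gaze point
    # Smooth ramp from full brightness (fovea) down to peripheral_gain (periphery)
    t = np.clip((dist - fovea_radius_px) / fovea_radius_px, 0.0, 1.0)
    gain = 1.0 - t * (1.0 - peripheral_gain)
    return np.clip(img * gain[..., None], 0, 255)
```
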
The second study looked at whether these gaze-contingent peripheral colour and brightness manipulations can be leveraged to increase viewers' ability to distinguish colours. Being able to tell colours apart is valuable, for example, in information visualisations where colour encodes quantities, such as heat maps of ocean temperature. The team demonstrated techniques that can increase the differentiability of colours, although these were not the same techniques that extended the colour gamut in the previous experiment. The results are promising, but they also indicate that gaze-contingent manipulation is not trivial and that more evidence is needed before it can be harnessed efficiently in real-life scenarios.
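
How distinguishable two displayed colours are is commonly quantified with a perceptual colour difference such as CIELAB ΔE. The short sketch below (using scikit-image, and not the project's own measure) shows how such a difference could be computed when evaluating this kind of technique.

```python
import numpy as np
from skimage import color

def delta_e(rgb1, rgb2):
    """CIE76 colour difference between two sRGB colours given as floats in [0, 1]."""
    lab1 = color.rgb2lab(np.asarray(rgb1, dtype=float).reshape(1, 1, 3))
    lab2 = color.rgb2lab(np.asarray(rgb2, dtype=float).reshape(1, 1, 3))
    return float(np.linalg.norm(lab1 - lab2))
```
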
In parallel with these experiments, in this second part of the project the team developed software that allows the public to experience the gaze-contingent techniques developed in the first part of the project. The DeepView team developed and released GAZER, an open-source application that is freely available to download from the project's website (http://deepview.cs.st-andrews.ac.uk). GAZER lets anyone experience gaze-contingent depth perception through blur and is the first available application to combine light-field photography and eye-tracking for a new way of perceiving 3D in pictures.
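
GAZER's internals are not described in this summary, but the general idea of combining light-field photography with eye-tracking can be sketched as follows: given a focal stack exported from a light-field picture (images refocused at increasing depths) and an approximate per-pixel depth map, show the slice that is in focus at the depth under the current gaze point. The data layout here is a hypothetical illustration, not GAZER's actual implementation.

```python
import numpy as np

def gaze_refocus(focal_stack, depth_map, gaze_xy):
    """Return the focal-stack slice that is in focus at the depth under the gaze point.

    focal_stack: list of images refocused at increasing depths.
    depth_map:   per-pixel index (float) into the focal stack.
    gaze_xy:     current gaze point in pixel coordinates (x, y).
    """
    gx, gy = int(gaze_xy[0]), int(gaze_xy[1])
    depth_index = int(np.clip(round(float(depth_map[gy, gx])), 0, len(focal_stack) - 1))
    return focal_stack[depth_index]
```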

This CIG has been instrumental in reintegrating the fellow. The fellow's initial contract was for five years, but thanks to the support provided, the contract has been made permanent. Moreover, the initial support provided by the fellowship has enabled Dr Nacenta to develop a world-leading programme of research and an independent research group that regularly publishes in the top venues in the area.