Content archived on 2024-06-18

Sensory integration for limb localization and action

Understanding how we locate our limbs improves computer interfaces

Proprioception is the ability to sense the location, orientation and movement of the body and its parts, and it is what lets us know the precise position of our limbs. An EU-funded initiative investigated this phenomenon in order to improve interaction between people and computers.

Digital Economy

Thanks to proprioception, we can easily describe the location of our left hand or point to it with the right, even with our eyes closed. What's more, we never feel that our hand is in two places at once, even though we simultaneously process visual and proprioceptive information about its current location. The EU-funded LOCANDMOTION (Sensory integration for limb localization and action) project investigated how vision and proprioception are combined to generate estimates of hand and target locations. The initiative tested the hypothesis that proprioception resists visual realignment and that intersensory alignment is not needed for effective action.

A review of the relevant literature, together with the project's experiments, showed that proprioception may not be very useful for identifying the position of the hand relative to a target. In fact, repeated movements made in the dark can cause the hand to drift away from the target without its owner realising it. Researchers showed that this drift stems from an individual's prior belief that their ability to find a target is very high, possibly a result of daily experience with accurate full-vision movements (an illustrative simulation of this account is sketched below). Movement errors accumulate until the error signalled by proprioception outweighs trust in the motor command (which tells the hand where to go). A separate study, in which participants reported the perceived position of their unseen hand, revealed that expectations about the accuracy of motor commands can override sensory input.

These two studies concerned locating the stationary hand after movement; another two focused on localisation during movement. The first showed that when the target was visual, visual information dominated online control, whereas when the target was proprioceptive, participants relied more on proprioception. The second described a model explaining why people tend to overestimate the position of their unseen hand during movement yet underestimate it when reaching for a target.

LOCANDMOTION has increased understanding of how people use proprioception, which is less well understood than vision. This knowledge of sensory integration will help improve how people adapt to virtual reality, mixed reality and teleoperation systems, leading to better designs for human-computer interfaces.
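
One way to picture the drift account described above is as precision-weighted combination of a motor-command prediction with proprioceptive feedback. The Python sketch below is illustrative only and is not the project's model: the one-dimensional set-up, the variance values and the noise levels are assumptions chosen to show how high trust in the prediction lets real-world errors accumulate while the believed hand position stays close to the target.

# Illustrative sketch (not the project's actual model): precision-weighted
# fusion of a motor-command prediction with noisy proprioceptive feedback.
# All variances and noise levels are assumed values for demonstration.
import random

def combine(pred, prop, var_pred, var_prop):
    """Precision-weighted (maximum-likelihood) fusion of two position cues."""
    w_pred = (1 / var_pred) / (1 / var_pred + 1 / var_prop)
    return w_pred * pred + (1 - w_pred) * prop

random.seed(1)

target = 0.0          # target position (1-D, arbitrary units)
actual = 0.0          # where the hand really is
believed = 0.0        # where the participant thinks it is

var_pred = 0.05       # assumed variance of the motor prediction (high trust)
var_prop = 1.0        # assumed variance of proprioception (low trust)
motor_noise = 0.3     # execution noise per movement
prop_noise = 0.5      # proprioceptive sensing noise

for trial in range(1, 11):
    # Planned movement: bring the believed hand position back onto the target.
    planned = target - believed
    # Execution adds unnoticed motor noise to the real hand position.
    actual += planned + random.gauss(0.0, motor_noise)
    # Predicted outcome of the command (assumed noiseless by the mover).
    predicted = believed + planned
    # Noisy proprioceptive reading of the real position.
    sensed = actual + random.gauss(0.0, prop_noise)
    # Belief update: the prediction dominates because var_pred << var_prop.
    believed = combine(predicted, sensed, var_pred, var_prop)
    print(f"trial {trial:2d}: actual {actual:+.2f}  believed {believed:+.2f}")

Run over a handful of trials, the printed 'actual' position wanders away from the target while the 'believed' position stays close to it, mirroring the unnoticed drift the researchers describe; raising var_pred (i.e. trusting the motor command less) makes the proprioceptive reading pull the two back together sooner.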

Keywords

Computer interfaces, proprioception, sensory integration, limb localization, motor command
