Within the TH project, we investigated human haptic and multimodal perception. We verified that humans often integrate multimodal information in a statistically optimal fashion. This holds for spatial as well as temporal properties. To describe our integration results, we developed quantitative statistical models (maximum-likelihood and Bayesian estimation).
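As an illustration of the maximum-likelihood rule underlying such models, the sketch below combines two independent Gaussian cue estimates by inverse-variance weighting, so that the more reliable cue receives the larger weight. The function name and the example numbers are illustrative, not taken from the project's data.

```python
import numpy as np

def mle_integrate(estimates, variances):
    """Combine independent cue estimates by inverse-variance weighting.

    Each cue's weight is proportional to its reliability (1/variance),
    which is the statistically optimal (maximum-likelihood) combination
    rule for independent Gaussian-distributed cues.
    """
    estimates = np.asarray(estimates, dtype=float)
    reliabilities = 1.0 / np.asarray(variances, dtype=float)
    weights = reliabilities / reliabilities.sum()
    fused = float(np.dot(weights, estimates))
    # The fused variance is never larger than any single cue's variance.
    fused_variance = 1.0 / reliabilities.sum()
    return fused, fused_variance

# Hypothetical example: a visual size estimate (low noise)
# and a haptic one (high noise)
size, var = mle_integrate([10.0, 12.0], [1.0, 4.0])
# weights are 0.8 (visual) and 0.2 (haptic) -> fused estimate 10.4, variance 0.8
```

A key prediction of this rule, confirmed in behavioral studies, is that the variance of the combined estimate is lower than that of either unimodal estimate.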
Furthermore, we verified the hypothesis that object recognition, movement detection, and spatial localization are commonly processed in the visual ventral and dorsal extrastriate cortical areas, independently of the sensory modality conveying the information to the brain.
For instance, we used fMRI to measure patterns of response evoked during visual and tactile recognition of faces and man-made objects in sighted subjects, and during tactile recognition in blind subjects. Visual and tactile recognition evoked category-related patterns of response that were correlated across modalities for man-made objects in the inferior temporal gyrus in both sighted and blind individuals. Blind adults also showed category-related patterns in the fusiform gyrus, indicating that these patterns are not due to visual imagery and do not require visual experience to develop.
The dorsal extrastriate cortical areas and area MT/V5 are likewise involved in both visual and tactile spatial discrimination and movement-detection tasks, in both sighted and congenitally blind individuals.
These brain areas, which share supramodal features, should be able to integrate information conveyed by different sensory modalities (vision and touch), or by multiple receptor sites within the same modality (touch), to estimate the properties of an object (e.g., size, shape, position, motion).
We studied the psychophysical and functional correlates of visual-haptic integration. The fMRI results showed correlations between neural activity and cue weights in areas involved in shape processing during bimodal, but not unimodal, sensory stimulation, which may indicate that these cortical areas are involved in the multisensory integration of shape information. In addition, we showed with psychophysical studies how unreliable haptic position and slope information delivered to the fingers is integrated over space and time, and with imaging and neuropsychological studies how posterior parietal regions are involved in these functions.