Synthesising haptic interactions
The TOUCH-HAPSYS project developed a new generation of high-fidelity haptic display technologies that address haptic interaction directly and enrich haptic information with visual and auditory cues. New technologies for enhancing haptic displays were developed and explored, and the psychophysical basis of human haptic perception was examined. The underlying idea is to exploit haptic illusions to overcome fundamental technological limitations.

A primary objective was to investigate how the sensation of haptic presence can be generated in relation to vision. The fundamentals of touch phenomena were researched, and the cues necessary for generating the sensation of presence and of touch were verified. Interactions between touch and vision in virtual environments were also examined in order to achieve the best possible feeling of immersion and presence, with emphasis on object representation and recognition, attention, and the integration of information from vision and touch.

To validate the developed methods and allow them to evolve into the next generation of haptic systems, demonstrators were designed to serve as platforms. One of these is the I-TOUCH software, which can be used to produce sample haptic applications with both commercial and non-commercial devices. Another is an interactive 3D data navigation system that has already been applied in a clinical setting. This multi-modal segmentation system, which works with arbitrary haptic devices, has been ported to the Linux operating system to reduce purchase costs.

Many possibilities continue to emerge for practical and commercially usable haptic feedback systems, which is good news for device manufacturers and healthcare institutions alike.
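The source does not describe the I-TOUCH programming interface itself, but the idea of writing haptic applications once and running them on both commercial and non-commercial devices rests on a device-abstraction layer. The sketch below illustrates that general pattern under assumed, hypothetical names (HapticDevice, DummyDevice, renderWall are not taken from I-TOUCH): an application renders forces against an abstract device interface, and each driver implements that interface for its own hardware.

```cpp
#include <array>
#include <cstdio>

// Hypothetical device-abstraction sketch; names and methods are illustrative,
// not the actual I-TOUCH API.
using Vec3 = std::array<double, 3>;

// Common interface any haptic device driver could implement.
class HapticDevice {
public:
    virtual ~HapticDevice() = default;
    virtual Vec3 getPosition() const = 0;      // end-effector position in metres
    virtual void setForce(const Vec3& f) = 0;  // commanded force in newtons
};

// Stand-in "device" so the example runs without hardware.
class DummyDevice : public HapticDevice {
    Vec3 pos_{0.0, 0.0, -0.002};  // pretend the probe is 2 mm inside a wall at z = 0
public:
    Vec3 getPosition() const override { return pos_; }
    void setForce(const Vec3& f) override {
        std::printf("force command: (%.2f, %.2f, %.2f) N\n", f[0], f[1], f[2]);
    }
};

// Device-independent rendering of a stiff virtual wall at z = 0:
// penetration depth times stiffness gives the restoring force.
void renderWall(HapticDevice& dev, double stiffness) {
    Vec3 p = dev.getPosition();
    double penetration = (p[2] < 0.0) ? -p[2] : 0.0;
    dev.setForce({0.0, 0.0, stiffness * penetration});
}

int main() {
    DummyDevice device;
    renderWall(device, 1000.0);  // 1000 N/m wall stiffness
}
```

In practice such a render loop runs at around 1 kHz, and swapping hardware means supplying a different HapticDevice implementation while the application code stays unchanged, which is what makes a framework usable with arbitrary commercial and non-commercial devices.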