
Mixed Haptic Feedback for Mid-Air Interactions in Virtual and Augmented Realities

Periodic Reporting for period 2 - H-Reality (Mixed Haptic Feedback for Mid-Air Interactions in Virtual and Augmented Realities)

Reporting period: 2019-10-01 to 2022-03-31

Digital content today remains focused on visual and auditory stimulation. Even in the realm of VR and AR, sight and sound remain paramount. In contrast, methods for delivering haptic (sense of touch) feedback in commercial media are significantly less advanced than those for graphics and audio. Yet without a sense of touch, experiences ultimately feel hollow, virtual realities feel false, and Human-Computer Interaction (HCI) becomes unintuitive. Our vision was to be the first to imbue virtual objects with a physical presence, providing a revolutionary, untethered, virtual-haptic reality: H-Reality.

The implications of this technology will be far-reaching. The computer touchscreen will be brought into the third dimension, so that swipe gestures are augmented with instinctive rotational gestures, allowing intuitive manipulation of 3D data sets and letting users stroll about the desktop as a virtual landscape of icons, apps, and files. H-Reality will transform online interactions: dangerous machinery will be operated virtually from the safety of the home, and surgeons will practise their skills on thin air.

The overall objectives of the project are three-fold:

1) Create Mixed Haptic Interface (MHI) prototypes, and demonstrate how these can unlock the next generation of HCI applications.

2) Develop ergonomic interaction techniques and vibrotactile libraries for our MHI prototypes.

3) Provide scientific models and empirical analysis that underpin and enable the MHI prototypes.

H-Reality has successfully broken down traditional disciplinary boundaries, bringing together the commercial pioneers of ultrasonic non-contact haptics and state-of-the-art vibrotactile actuators, novel mathematical and tribological modelling of the skin and the mechanics of touch, and experts in the psychophysical rendering of sensation. The novel haptic interface we have developed uses high-frequency pulses of air, a technology commercialised by Ultraleap Ltd for non-contact haptic feedback, together with a set of miniaturised wearable haptic sensors and actuators, commercialised by Actronika SAS, for contact vibrotactile feedback. The Ultraleap system focuses acoustic pressure to induce microscale skin deformations; when modulated in time and space, these pressure points can be perceived by the brain as textures, such as foam or velvet, or as 3D objects. The Actronika system generates rich vibrotactile input over a large range of frequencies, e.g. it can simulate the collision of objects with the hand with a high degree of realism. By combining these systems, the H-Reality MHI accurately renders the sensation of touch, enabling mid-air interactions with cyber-physical objects in real space without the need for limiting, cumbersome hardware (e.g. force gloves). The result is an untethered experience of virtual objects and surfaces that embodies their physical properties. The project has empowered users to reach out and interact with a digital reality, perceiving its semantic physical properties, accentuated by synchronised visual and auditory feedback: an immersive haptic reality that we call H-Reality.
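
Ultraleap's implementation is proprietary and not detailed in this report, but the underlying principle of focusing acoustic pressure can be illustrated: each transducer in a phased array is phase-delayed so that its 40 kHz wave arrives in phase with the others at a chosen focal point, and the focus is then amplitude-modulated at a frequency the skin can perceive. The Python sketch below is a minimal illustration of that idea; the array geometry, pitch, and 200 Hz modulation rate are assumed values for demonstration, not project specifications.

```python
import numpy as np

SPEED_OF_SOUND = 343.0   # speed of sound in air [m/s]
CARRIER_HZ = 40_000.0    # 40 kHz carrier, typical of ultrasonic haptic arrays
WAVELENGTH = SPEED_OF_SOUND / CARRIER_HZ

def focus_phases(elements, focal_point):
    """Phase offset per transducer so all waves arrive in phase at the focus.

    elements: (N, 3) array of transducer centre positions [m]
    focal_point: (3,) target point in space [m]
    """
    distances = np.linalg.norm(elements - focal_point, axis=1)
    # Delaying each element by its own path length aligns all arrivals in phase.
    return (-2.0 * np.pi * distances / WAVELENGTH) % (2.0 * np.pi)

# Hypothetical 16 x 16 array at 10.5 mm pitch, centred on the origin (z = 0)
pitch = 0.0105
xs = (np.arange(16) - 7.5) * pitch
grid = np.stack(np.meshgrid(xs, xs), axis=-1).reshape(-1, 2)
elements = np.column_stack([grid, np.zeros(len(grid))])

# Focus 20 cm above the centre of the array
phases = focus_phases(elements, np.array([0.0, 0.0, 0.20]))

# Skin mechanoreceptors cannot follow a 40 kHz carrier, so the focal-point
# intensity is amplitude-modulated at a tactile frequency (here 200 Hz).
t = np.linspace(0.0, 0.05, 2000)
envelope = 0.5 * (1.0 + np.sin(2.0 * np.pi * 200.0 * t))
```

Sweeping the focal point along a path over time, while modulating its intensity, is what allows textures and 3D contours to be rendered, as described above.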

The development of the MHIs has been critically dependent on computer software that seamlessly controls the contact and non-contact devices at the same time. The software can analyse different real and virtual objects to find the grasping strategy that best matches the resulting haptic pinching sensations. As a result, computational renderings of specific materials can be distinguished via their surface properties.
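
The project's control software itself is not reproduced in this report; the sketch below shows one plausible shape such a combined controller could take, with hypothetical `UltrasoundArray` and `WearableActuator` wrappers standing in for the real Ultraleap and Actronika drivers. The per-frame split (sustained mid-air pressure for surface presence, a short wearable transient for the moment of impact) mirrors the division of labour described above.

```python
from dataclasses import dataclass

@dataclass
class ContactState:
    touching: bool       # is the virtual fingertip in contact with the object?
    position: tuple      # fingertip position (x, y, z) in metres
    impact_speed: float  # approach speed at the moment of contact [m/s]

class MixedHapticRenderer:
    """Illustrative dispatcher: routes each frame's contact state to both a
    non-contact ultrasound array and a wearable vibrotactile actuator.
    Both device classes here are hypothetical stand-ins, not project APIs.
    """
    def __init__(self, array, wearable):
        self.array = array
        self.wearable = wearable

    def render_frame(self, state: ContactState):
        if state.touching:
            # Sustained mid-air pressure conveys the presence of a surface...
            self.array.set_focal_point(state.position, modulation_hz=200)
            # ...while a brief wearable transient conveys the impact itself.
            if state.impact_speed > 0.05:
                self.wearable.play_transient(
                    amplitude=min(1.0, state.impact_speed / 0.5),
                    duration_ms=15,
                )
        else:
            self.array.disable()
```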

The research has been underpinned by a vibrotactile library based on measuring the vibrations induced in the hands of subjects as they slide a finger over a particular surface or assess the softness of a material by pressing down with a finger. Advanced mathematical analyses have been developed to process the results and capture the key features that a subject uses in such tactile appraisals. Perceptual limits for materials and objects have been determined for the contacting and non-contacting haptic prototypes, leading to perceptual verification of device efficacy through absolute detection thresholds for the MHI. A virtual hand has been developed to assess how tactile vibrations interact with our biological touch sensors (mechanoreceptors). The model is sufficiently powerful that, given the texture of a particular material, it can predict the tactile perception experienced by a real subject.
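
The report does not state which psychophysical procedure was used to measure the absolute detection thresholds; a standard choice for such measurements is the adaptive one-up/two-down staircase (Levitt, 1971), sketched below in Python with a simulated observer standing in for a human participant. The starting level, step size, and reversal count are illustrative assumptions.

```python
import random

def staircase_threshold(detects, start_level=0.0, step=2.0, max_reversals=8):
    """One-up/two-down adaptive staircase: lowers the stimulus after two
    consecutive detections, raises it after each miss, and converges on the
    ~70.7% detection point. `detects(level)` runs one trial and returns True
    if the participant reported feeling the stimulus.
    """
    level = start_level
    hits_in_a_row = 0
    direction = None
    reversals = []
    for _ in range(500):                      # hard cap on trial count
        if len(reversals) >= max_reversals:
            break
        if detects(level):
            hits_in_a_row += 1
            if hits_in_a_row == 2:            # two detections -> make it harder
                hits_in_a_row = 0
                if direction == "up":
                    reversals.append(level)   # direction change = reversal
                direction = "down"
                level -= step
        else:                                 # one miss -> make it easier
            hits_in_a_row = 0
            if direction == "down":
                reversals.append(level)
            direction = "up"
            level += step
    # Threshold estimate: mean stimulus level over the final reversals
    tail = reversals[-6:] or [level]
    return sum(tail) / len(tail)

# Simulated observer with a true threshold of -10 dB (demonstration only):
# detection probability rises smoothly with stimulus level.
estimate = staircase_threshold(
    lambda lvl: random.random() < 1.0 / (1.0 + 10.0 ** ((-10.0 - lvl) / 4.0))
)
print(f"estimated absolute detection threshold: {estimate:.1f} dB")
```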

Our results have been made publicly available via our website (www.hreality.eu) which includes the main concepts of the project with photos and videos, details of the team members and contact information, a media page covering demonstrations given at conferences, workshops, and exhibitions, some open-source tools, and a publication list (6 journal articles and 21 conference papers). There have also been over 30 public presentations.

The best opportunities for exploitation have been identified; these include gaming, e-commerce, healthcare, and the automotive sector. There have been 4 patent applications, and 8 innovations have been recognised by the EU's Innovation Radar scheme.
Our long-term vision is to enable a completely novel and natural way of interacting with digital content. We foresee a future where data processing is powered by ubiquitous computing and wireless connectivity, and where artificial intelligence (AI) and robotics will reshape the job market; a future where digital content will exist not only as big data sets, gigapixel images, and HD audio and video formats, but also in haptic formats. Today's graphical user interfaces (GUIs) will be replaced by 3D VR/AR interfaces with portable interaction components that support expressive haptic feedback. Our MHIs will address socio-economic and wider societal needs:

* Software programmability and universality: virtual objects can take many shapes and forms, and the MHI will render these as well as their textural information.

* Efficient communication, training, and collaboration around complex concepts, e.g. reducing the cost of the transition from novice to expert through more effective training.

* Rich haptic VR/AR interfaces can make remote working more efficient, increase user adoption for a variety of tasks (e.g. engineering site visits), and reduce the environmental cost of travel.

* Accessibility and inclusion: some user groups (e.g. people with visual or auditory impairments, or hand muscle and nerve disorders) can be better supported and included by our software-programmable haptic interface, e.g. through advanced sensory-supplementation or substitution techniques.

* Given the success of current force-feedback devices, the untethered, ungrounded MHI will greatly improve outcomes for stroke patients.

* Artists, designers, students, educators, and engineers will be able to take advantage of the increased engagement offered by the MHI, with new and powerful active-exploration techniques that further stimulate reasoning, creative, and analytical skills.
Redefining digital content as something that can be touched and felt.
Exploring a 3D mixed haptic interaction prototype system in VR.
Experiencing a wearable encounter-type haptic interface.
Demonstrating a touchable beating bio-hologram heart.