Periodic Reporting for period 2 - Levitate (Levitation with localised tactile and audio feedback for mid-air interactions)
Reporting period: 2019-01-01 to 2021-03-31
WP2 is designing and implementing physical prototypes to enable new user interfaces. We have taken two general approaches: improving the control mechanisms for transducer arrays, and using physical metamaterials to create complex sound fields. The former approach has been used in a variety of novel interfaces, e.g. to levitate food particles and to create interactive levitating objects. The latter has been realised through diverse fabrication approaches, including optimised reflective surfaces and metamaterial platters that sit on top of a standard ultrasound array. A key achievement in this work was creating a self-bending sound beam, enabling a complex sound field to be created behind an obstacle. This method can also be applied to combinations of modalities, e.g. bending a haptic sound field around a levitating object. The physical prototypes developed in WP2 are enabling new user interfaces and interaction techniques, which are investigated by WP3 and WP4.
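Control of transducer arrays of this kind typically rests on phased-array focusing: choosing a phase delay for each transducer so that the emitted waves arrive in phase at a target point. The sketch below is a minimal illustration of this principle, not the project's actual firmware; the 16 x 16 grid, 10.5 mm pitch, 40 kHz carrier and speed of sound are all assumed values.

```python
import numpy as np

# Minimal sketch (not the project's firmware) of phased-array focusing:
# per-transducer phase delays are chosen so that all waves arrive at a
# target point in phase. The 16 x 16 grid, 10.5 mm pitch, 40 kHz carrier
# and speed of sound are illustrative assumptions.

SPEED_OF_SOUND = 343.0   # m/s, in air at roughly 20 degrees C
FREQUENCY = 40_000.0     # Hz, a typical airborne ultrasound carrier
WAVELENGTH = SPEED_OF_SOUND / FREQUENCY

def transducer_grid(n=16, pitch=0.0105):
    """Positions (metres) of an n x n transducer array in the z = 0 plane."""
    coords = (np.arange(n) - (n - 1) / 2) * pitch
    xs, ys = np.meshgrid(coords, coords)
    return np.stack([xs.ravel(), ys.ravel(), np.zeros(n * n)], axis=1)

def focus_phases(positions, target):
    """Phase offset (radians) per transducer so waves align at `target`."""
    distances = np.linalg.norm(positions - target, axis=1)
    # Delaying each emission by its travel phase brings all waves into
    # phase at the focal point.
    return (-2.0 * np.pi * distances / WAVELENGTH) % (2.0 * np.pi)

phases = focus_phases(transducer_grid(), target=np.array([0.0, 0.0, 0.15]))
```

A bare focal point like this is sufficient for haptic feedback, but stable levitation requires a trapping field; in the acoustic levitation literature this is often achieved by, for example, adding a pi phase offset to half the array to form a twin trap.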
WP3 is creating high-quality multisensory user interfaces. As part of this work, we have investigated ways of improving ultrasound haptic feedback, which provides the important physical aspect of our multisensory applications. New haptic rendering techniques have been developed and evaluated. Findings from WP3 have identified requirements for better physical prototypes (WP2); e.g. new haptic rendering firmware was developed to improve the quality of feedback. Another important aspect of WP3 is to evaluate the interaction techniques we develop in WP4. A series of user studies has investigated the performance and user satisfaction of new interaction techniques. These studies show the efficacy of our new interaction techniques, and their findings are informing the design of new interfaces based on combinations of levitation, haptics and parametric audio.
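As a rough illustration of what a haptic rendering technique involves, the sketch below shows amplitude modulation, one common approach in ultrasound haptics: the focal point's intensity is modulated at a low frequency so that skin mechanoreceptors, which cannot perceive the 40 kHz carrier itself, can feel it. The parameter values are illustrative assumptions, not the project's firmware settings.

```python
import numpy as np

# A minimal sketch of amplitude modulation for ultrasound haptics: the
# focal point's output is modulated at a low frequency (here 200 Hz) so
# that skin mechanoreceptors, insensitive to the ultrasonic carrier, can
# perceive the feedback. All values are illustrative assumptions.

def am_envelope(duration_s, mod_freq_hz=200.0, update_rate_hz=16_000):
    """Amplitude envelope (0..1) applied to the focal point over time."""
    t = np.arange(int(duration_s * update_rate_hz)) / update_rate_hz
    return 0.5 * (1.0 + np.sin(2.0 * np.pi * mod_freq_hz * t))

# Each envelope sample scales the drive amplitude of every transducer at
# that instant, while the focusing phases (see the WP2 sketch) are kept.
envelope = am_envelope(duration_s=0.1)
```

An alternative family of techniques, spatiotemporal modulation, instead moves a constant-amplitude focal point rapidly along a path on the skin; the choice between the two is one of the rendering questions a work package like WP3 evaluates.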
WP4 is developing and refining techniques to enable users to interact with our new user interfaces. Interfaces composed of levitating objects are highly novel, and applying existing input techniques is not straightforward. We started with object selection, a fundamental part of interacting with any type of display. Selecting an object for manipulation is not trivial because users cannot necessarily touch the levitating objects; instead, we used mid-air pointing for selection. Our pointing technique and novel feedback mechanism were successful, as verified through WP3 user studies. We then extended this technique to allow users to reposition a levitating object, by mapping the object position to an extended fingertip position. A user study investigated the degree of control users could exert over a levitating object. In the next period, the project will expand the vocabulary of interaction techniques, so that users can fully manipulate content composed of levitating objects (e.g. to rotate or scale them). Demonstrators in this work package showcase the new user interfaces we are developing on the LEVITATE project.
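To make these two interactions concrete, the sketch below gives a hedged, simplified picture of how they might work: selection by casting a pointing ray from the hand and testing it against each object's bounding sphere, and repositioning by having the selected object track a point extended beyond the fingertip along the pointing direction. The object radius and extension distance are illustrative values, not measured project parameters.

```python
import numpy as np

# Hedged sketch of (1) ray-cast selection of a levitating object and
# (2) repositioning via an extended fingertip point. Object radii and
# the extension distance are illustrative assumptions.

def select_object(ray_origin, ray_dir, centres, radius=0.01):
    """Return the index of the nearest object the ray hits, or None."""
    ray_dir = ray_dir / np.linalg.norm(ray_dir)
    best, best_t = None, np.inf
    for i, centre in enumerate(centres):
        to_centre = centre - ray_origin
        t = float(np.dot(to_centre, ray_dir))  # closest approach along the ray
        if t < 0:
            continue                           # object is behind the hand
        miss = np.linalg.norm(to_centre - t * ray_dir)
        if miss <= radius and t < best_t:
            best, best_t = i, t
    return best

def extended_fingertip(fingertip, pointing_dir, extension=0.05):
    """Target position for a selected object: a point beyond the fingertip."""
    pointing_dir = pointing_dir / np.linalg.norm(pointing_dir)
    return fingertip + extension * pointing_dir
```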
A key aim of the project is to disseminate and communicate our work to a broad variety of audiences (WP5). We have undertaken several communication activities to engage with the general public, especially younger people. Through science fairs, workshops and television appearances, we have communicated the novel aspects of our research in a non-academic context. We aim to educate and inspire the next generation of scientists through thought-provoking demonstrations of what user interfaces of the future might look, feel and sound like. We are also advancing our research agenda in the scientific community, with strong publications and dissemination activities in the HCI, haptics and acoustics fields. We communicate our work online as well, through an engaging project website and an active social media presence.