Project description
A multimodal display floating in space
Today we use numerous applications based on touch and touchless interaction, but they lack physical feedback and the overall physicality of the experience. That is about to change thanks to the EU-funded Levitate project. Its goal is to develop the first physical interface system of its kind, revolutionising the user experience. The project will use parametric audio to modulate the sound waves produced by levitating atoms, allowing users to control the audible sound they generate. In addition, projecting images onto these atoms will provide a further layer of sensory stimulation, creating a unique and engaging experience for viewers. The Levitate prototype will pave the way for virtual interactions that closely resemble real life, in which visual projection onto the objects creates a rich multimodal display floating in space.
Objective
"This project will be the first to create, prototype and evaluate a radically new human-computer interaction paradigm that
empowers the unadorned user to reach into levitating matter, see it, feel it, manipulate it and hear it. Our users can interact
with the system in a walk-up-and-use manner without any user instrumentation.
As we are moving away from keyboards and mice to touch and touchless interactions, ironically, the main limit is the lack of
any physicality and co-located feedback. In this project, we propose a highly novel vision of bringing the physical interface to
the user in mid-air. In our vision, the computer can control the existence, form, and appearance of complex levitating objects
composed of "levitating atoms". Users can reach into the levitating matter, feel it, manipulate it, and hear how they deform it
with all feedback originating from the levitating object's position in mid-air, as it would with objects in real life. This will
completely change how people use technology as it will be the first time that they can interact with technology in the same
way they would with real objects in their natural environment.
We will draw on our understanding of acoustics to implement all of the components in a radically new approach. In particular,
we will draw on ultrasound beam-forming and manipulation techniques to create acoustic forces that can levitate particles
and to provide directional audio cues. By using a phased array of ultrasound transducers, the team will create levitating
objects that can be individually controlled and at the same time create tactile feedback when the user manipulates these
levitating objects. We will then demonstrate that the levitating atoms can each become sound sources through the use of
parametric audio with our ultrasound array serving as the carrier of the audible sound. We will visually project onto the objects to create a rich multimodal display floating in space."
Scientific field
- natural sciences > computer and information sciences > software
- natural sciences > biological sciences > biochemistry > biomolecules > proteins
- engineering and technology > electrical engineering, electronic engineering, information engineering > information engineering > telecommunications > mobile phones
- natural sciences > physical sciences > acoustics > ultrasound
Keywords
Programme(s)
Call for proposal
Sub-call
H2020-FETOPEN-1-2016-2017
Funding scheme
RIA - Research and Innovation action
Coordinator
G12 8QQ Glasgow
United Kingdom