InteractiveSkin: Digital Fabrication of Personalized On-Body User Interfaces

Periodic Reporting for period 4 - InteractiveSkin (InteractiveSkin: Digital Fabrication of Personalized On-Body User Interfaces)

Reporting period: 2021-07-01 to 2022-06-30

User interfaces are moving onto the human body. However, today’s rigid and mass-fabricated devices do not conform closely to the body, nor are they customized to fit individual users. This drastically restricts their interactive capabilities.

This project aims to lay the foundations for a new generation of body-worn UIs: interactive skin. Our approach is unique in proposing computational design and rapid manufacturing of stretchable electronics as a means to customize on-body UIs. Our vision is that laypeople design highly personalized interactive skin devices in a software tool and then print them. Interactive skin is very thin, stretchable, and of custom geometry, with embedded sensors and output components. This allows it to be worn as highly conformal interactive patches at various body locations and used in many mobile tasks, leveraging the many degrees of freedom of body interaction.

This project makes contributions at the intersection of on-body interaction, stretchable electronics, and digital fabrication:
1) We contribute an automatic method to generate printable electronic layouts for interactive skin from a high-level design specification.
2) We contribute multimodal interaction primitives that address the unique challenges of skin interaction.
3) We develop principles for design tools that allow end-users to easily design a personalized interactive skin device.
4) We use the newly developed methodology to realize and empirically evaluate interactive skin in unsolved application cases.
With this approach, we contribute to a deep and systematic understanding of the on-body interaction space and show how to build UIs with unprecedented body compatibility and interactive capabilities.
This project has made the following main contributions to the field of interactive skin:

1. Computational fabrication of interactive skin
We developed functional designs for sensors and output components that can be embedded in interactive skin devices. These components are ultra-thin (typically between 1 and 50 microns) and elastic, so the device conforms closely to the user’s skin and can be worn ergonomically even on highly curved and stretchable body locations. As skin is inherently multi-modal and our goal is to support rich and expressive interfaces, we contributed designs for a wide variety of components. These include multi-touch, sliding, bending, squeezing, stretching, and pulling input, as well as physiological sensors for electromyography, heart rate, and electro-dermal activity. A highlight result is Tacttoo, the thinnest wearable tactile matrix interface presented in the literature: it allows the user to feel real-world objects through the interface while these are augmented with computer-generated tactile output.
Our vision is to ultimately print a personalized electronic device much like we print a document today, or even apply make-up. To this end, we have developed an award-winning method that allows people to use an ordinary desktop inkjet printer to print functional interactive skin devices within minutes. More recently, we have developed a method for fabricating interactive skin devices by sketching directly on one’s body.
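To make the multi-touch component concrete, the following is a minimal sketch, under our own assumptions, of how a printed row/column electrode matrix could be scanned for touch points. The grid size, threshold, and read_adc() placeholder are illustrative; this is not the project's actual acquisition code.

    # Hedged sketch: scanning a hypothetical printed multi-touch matrix.
    import random

    ROWS, COLS = 4, 6          # assumed electrode grid on the skin patch
    TOUCH_THRESHOLD = 0.5      # normalized signal level counted as a touch

    def read_adc(row: int, col: int) -> float:
        """Placeholder for sampling the crossing point (row, col); returns a value in 0..1."""
        return random.random()  # stand-in: replace with real acquisition hardware

    def scan_touch_points() -> list[tuple[int, int]]:
        """Scan every row/column crossing of the printed matrix and return touched cells."""
        touches = []
        for r in range(ROWS):
            for c in range(COLS):
                if read_adc(r, c) > TOUCH_THRESHOLD:
                    touches.append((r, c))
        return touches

    if __name__ == "__main__":
        print("touched cells:", scan_touch_points())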

2. Interacting with skin interfaces
Skin is a fundamentally different interaction surface than conventional touchscreens. A central aim of this project was to investigate how we can leverage the unique properties of skin for user interaction. For one, we proposed and investigated the concept of body landmarks: haptic or visual features of the human body, such as knuckles or wrinkles, that can be used to guide interaction on the skin. A set of interaction techniques and functional prototypes demonstrated the power of this principle (a toy illustration follows below). Second, we have investigated how the skin on the user’s fingers can offer a surface for microgestures performed between fingers. Such gestures offer a rapid, versatile, and discreet means of input in mobile contexts. We have conceptually explored the design space of these gestures and demonstrated new gestures for settings where the hands are busy holding an object. Moreover, we have conducted in-depth investigations of deformation-based input and output; the results can lead to new and better ways of interacting on the skin than through conventional touch contact alone.
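As a toy illustration of the body-landmark principle, the sketch below snaps a sensed touch position to the nearest landmark and maps it to a command. The landmark coordinates, distance threshold, and command mapping are hypothetical and only meant to convey the idea, not any technique published by the project.

    # Hedged sketch: landmark-guided on-skin input with made-up coordinates.
    from math import dist

    # hypothetical landmark positions in the sensor's 2D coordinate frame (mm)
    LANDMARKS = {
        "index_knuckle": (10.0, 5.0),
        "middle_knuckle": (22.0, 5.5),
        "wrist_wrinkle": (0.0, 40.0),
    }
    COMMANDS = {
        "index_knuckle": "play/pause",
        "middle_knuckle": "next track",
        "wrist_wrinkle": "volume mode",
    }

    def landmark_for_touch(x: float, y: float, max_distance: float = 8.0):
        """Return the closest landmark within max_distance of the touch, or None."""
        name, pos = min(LANDMARKS.items(), key=lambda kv: dist((x, y), kv[1]))
        return name if dist((x, y), pos) <= max_distance else None

    touch = (21.0, 6.0)  # a sensed touch position, in the same (mm) frame
    print(COMMANDS.get(landmark_for_touch(*touch), "no landmark hit"))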

3. Design tools for personalized interactive skin
Our goal is to offer computational support that helps a wide audience design personalized and customized interactive skin devices. This includes personalization to a specific user’s body proportions and customization to specific application contexts, body locations, or aesthetic goals. We aimed at establishing the foundations for tools that algorithmically generate functional devices based on a high-level design specification. We have contributed algorithmic principles and foundational design tools for generating custom-shaped multi-touch sensors, electro-physiological sensor tattoos, and tactile output devices. Together, our approaches allow non-experts to readily design and fabricate interactive skin devices with desired properties by deferring the detailed technical realization to a computational tool.
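The following is a minimal sketch of the design-tool idea under stated assumptions: the user provides a high-level specification (body location, number of touch buttons, patch dimensions), and a generator derives concrete electrode positions plus a simple wiring plan. The names, data structures, and trivial layout rule are illustrative, not the project's algorithm.

    # Hedged sketch: from a high-level spec to a (very simplified) electrode layout.
    from dataclasses import dataclass

    @dataclass
    class SkinPatchSpec:
        body_location: str   # e.g. "forearm"
        num_buttons: int     # how many touch electrodes the user wants
        length_mm: float     # usable patch length along the limb
        width_mm: float      # usable patch width

    def generate_layout(spec: SkinPatchSpec) -> dict:
        """Place electrodes on an evenly spaced line and route each one straight to the patch edge."""
        pitch = spec.length_mm / (spec.num_buttons + 1)
        electrodes = [
            {"id": i, "x_mm": round(pitch * (i + 1), 2), "y_mm": spec.width_mm / 2}
            for i in range(spec.num_buttons)
        ]
        # trivial routing rule: one straight trace from each electrode down to the y=0 edge
        traces = [{"from": e["id"], "to_edge_x_mm": e["x_mm"]} for e in electrodes]
        return {"location": spec.body_location, "electrodes": electrodes, "traces": traces}

    print(generate_layout(SkinPatchSpec("forearm", num_buttons=3, length_mm=80.0, width_mm=30.0)))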

4. Evaluation and synthesis of findings
To systematically investigate the usability of on-body user interfaces and derive implications for future designs, we have conducted extensive empirical studies. Highlight results include a systematic investigation of the effect of interactive skin devices on the user’s haptic perception. This knowledge informs the material design of interactive skin devices. Furthermore, we have investigated the use of interactive skin in several real-world application cases, including an award-winning paper on text entry. In the last year of the project, we have synthesized our findings and identified opportunities and challenges for future work in epidermal computing.
Results from the project contribute to a deep and systematic understanding of the on-body interaction space and show how to build user interfaces with unprecedented body compatibility and interactive capabilities. Our results show that computational fabrication offers a strong complement to existing mass manufacturing of wearable devices. The project has contributed the methods needed to realize a full end-to-end pipeline: first, laypeople specify the desired high-level properties of the device in an interactive, user-friendly design tool; next, the tool, with built-in models of electronic components and human anatomy, automatically generates a valid, printable electronic layout; finally, a commodity printer realizes a customized, micron-thin interactive surface that can be applied to the body. We have also presented several breakthrough results for interactive, multi-modal skin interfaces and for feel-through tactile output, and have demonstrated the applicability of interactive skin to real-world use cases in mobile computing, AR/VR, body monitoring, and body aesthetics.
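As a purely schematic sketch of the end-to-end pipeline described above, each stage is reduced here to a placeholder function; the real design tool, anatomy and component models, and printer drivers are far more involved, and all names are assumptions rather than the project's software.

    # Hedged sketch: the three pipeline stages as placeholder functions.
    def capture_user_spec() -> dict:
        """Stage 1: a layperson specifies high-level properties in the design tool."""
        return {"location": "back of hand", "inputs": ["slider", "button"], "size_mm": (60, 40)}

    def generate_printable_layout(spec: dict) -> dict:
        """Stage 2: generate an electronic layout, checked against component and anatomy models."""
        return {"layers": ["conductive", "dielectric"], "outline_mm": spec["size_mm"]}

    def print_on_commodity_printer(layout: dict) -> str:
        """Stage 3: hand the layout to an off-the-shelf inkjet printer for fabrication."""
        return f"printed a {len(layout['layers'])}-layer patch, {layout['outline_mm']} mm"

    print(print_on_commodity_printer(generate_printable_layout(capture_user_spec())))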
High-level graphical design tool for physiological sensor tattoos
Fabricating interactive skin directly on the body
Fabricating micron-thin interactive skin devices with an off-the-shelf inkjet printer
Sensors and output components of interactive skin