Hand-to-object interaction reaches new heights thanks to VirtualGrasp
One of the main challenges facing Virtual Reality (VR) application developers is hand-to-object interaction. VR headsets do a great job of immersing us in realistic virtual worlds. They have even become a must for professional training, especially amidst this unprecedented pandemic. But in both cases – just like in other VR application fields – hand-to-object interaction is far from feeling as natural as it does in the real world. Jakob Way, CEO of VR and robotics start-up Gleechi, summarises the current state of play: “VR training participants primarily use handheld controllers. They must be able to experience natural hand interaction to develop new skills and have the confidence to apply them in the real world. Meanwhile, although VR game players can already interact with objects, manually creating and animating hand interaction has so far proven impossible due to the time needed to manually animate many different grasps for all different objects.”

The latter point is key to understanding the current limitations of hand-to-object interaction. For each possible interaction, developers must first define how an object can be grasped and then manually animate each of these grasps. This inherently limits interaction: objects can only be used in predetermined ways, and achieving even these limited results is very time-consuming.
Algorithm-based interaction builder
This is where VirtualGrasp (Speeding up the virtual reality revolution with realistic & real-time animation of hand-to-object interaction) comes into its own. By combining machine learning and predictive algorithms, Gleechi provides VR users with complete freedom of interaction. VirtualGrasp is particularly applicable in three distinct fields of application: training, games and stroke rehabilitation. For training, the team developed an in-depth understanding of how each object is held and used, which is vital to providing a truly immersive learning experience. For VR games, VirtualGrasp automates the creation of interactions between hands and 3D objects. Finally, for stroke patients, the system can predict patient grasps and convert them into VR interactions. Way explains: “Patients have been able to conduct rehabilitation exercises in a VR environment where they would perform motivating tasks such as playing games or planting flowers. However, due to limited mobility in the patient’s hands and fingers, they often struggle to perform the required precision grasps. By predicting these grasps, we hope to accelerate the rehabilitation process through visual amplification.”

Trial results in these three fields have been very encouraging, but COVID-19 lockdowns prompted an increased focus on commercial applications in the field of training. There, VirtualGrasp proved to enhance the impact of VR training and significantly simplify the creation of VR training applications. After observing a 50 % increase in knowledge retention among VR training participants, Gleechi expedited the development of self-service tooling to enable industrial companies to create their own VR training. “SAAB Aeronautics is actively testing VirtualGrasp for advanced VR training where participants learn to use specialised tools and equipment during the assembly process,” Way says.
“The natural interaction makes it possible for SAAB Aeronautics to deliver remote training while retaining exceptionally high standards of quality.” Likewise, YrkesAkademin – a major provider of labour market training – has been using VirtualGrasp to rapidly provide vital skills to healthcare workers, with a focus on working in sterile environments. “The training requires careful handling of complex medical tools and equipment where following procedures is essential to maintain a sterile environment. By being able to interact naturally, it becomes possible for participants to develop practical experience where access to real-world sterile environments for training purposes is problematic,” Way notes. Gleechi has raised EUR 2.4 million in funding to start commercialising its software for VR training, and R&D work will continue thanks to an additional grant project aiming to test the same interaction technology in the field of robotics.
Keywords
VirtualGrasp, virtual reality, VR, hand-to-object interaction, algorithms, training, Gleechi