Periodic Reporting for period 2 - EXTEND (Resolution Revolution to Extend Reality)
Reporting period: 2019-01-01 to 2020-03-31
At the start of the project, we were finalising the launch of our first-generation commercial product, which utilises our Bionic Display™. The SMEI project focused on the next big step in our business strategy: commercialising our video-see-through technology and becoming an XR solution provider.
The project was planned as a 24-month action and the project activities were divided into three categories:
1. Product development
Finalise the product development of the video-see-through front-plate hardware, and design and implement hand-tracking that truly complements the unparalleled potential of our human-eye-resolution-capable product. Build software for operating both and prepare for mass manufacturing.
2. Pilots
Pilot our XR system with first professional users to collect and analyse market segment specific customer feedback.
3. Commercialisation activities to support the market launch
To take full advantage of the improved product offering, update the commercialisation plan, carry out wide-scale dissemination and communication activities among potential customers, and build strong sales channels suitable for global commercialisation.
The expected outcomes of the EXTEND project were:
• Market ready XR system (head-mounted display with hand-tracking);
• Capability to start mass-manufacturing of the XR system;
• Strong market reference and validation for the XR system in the form of successful pilots;
• Wholesale agreements and VAR network suitable for global expansion.
Work Package 1
By the end of the project, we can state that we achieved the desired results in WP1, and we started mass production in Q4/2019. Project execution took slightly longer than originally anticipated, requiring a three-month extension of the work package. This was related to different aspects of product and production quality and maturity. With the extension, we were able to reach higher product quality for the sales launch. We have demonstrated technological leadership in the field.
Work Package 2
In WP2, we made the biggest change of the project: we shifted our focus from developing a physical controller to integrated hand-tracking. We were able to put in place the needed enablers for hand-tracking from a hardware perspective. From a software perspective, we delivered many hand-tracking features, including hand occlusion and depth mapping, which were seen as critical needs of the target customers in their use cases.
While we were not able to release hand gestures and interactions by the end of the project, we created most of the enablers to do so in the future.
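The hand-occlusion feature mentioned above can be illustrated with a minimal depth-compositing sketch: a per-pixel depth test shows the virtual layer only where it is closer to the viewer than the real scene (e.g. the user's hands). This is a generic technique under illustrative assumptions, not Varjo's actual implementation; all array names and values here are hypothetical.

```python
import numpy as np

def composite_with_occlusion(real_rgb, real_depth, virt_rgb, virt_depth):
    """Per-pixel depth test: keep the real camera image wherever the
    real scene (e.g. a hand) is nearer than the virtual content."""
    virt_in_front = virt_depth < real_depth          # True where virtual wins
    return np.where(virt_in_front[..., None], virt_rgb, real_rgb)

# Tiny 2x2 example: a hand at 0.5 m occludes a virtual object at 1.0 m
# in the left column; the right column shows background at 2.0 m.
real_rgb = np.tile(np.array([255, 200, 180]), (2, 2, 1))   # skin-like colour
virt_rgb = np.tile(np.array([0, 0, 255]), (2, 2, 1))       # virtual object
real_depth = np.array([[0.5, 2.0], [0.5, 2.0]])
virt_depth = np.full((2, 2), 1.0)
out = composite_with_occlusion(real_rgb, real_depth, virt_rgb, virt_depth)
```

In a real headset pipeline the same comparison runs per frame on the GPU with a soft blend at the mask edges, but the core idea is this single depth comparison.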
Work Package 3
WP3 concentrated on pilots, which turned out to be very beneficial for the project overall. We gained many valuable insights about the use cases and needs of potential customers, about the functionality and usability of our product, about the compatibility and performance of different setups from both software and hardware perspectives, and about the maturity and technical readiness of our hardware and software. Pilot partners benefited by getting early access to the technology and, in part, our support for their use-case development.
Our initial thinking about the partners needed to be revised during the project as we learned more about the different partners' capabilities and willingness to put effort into the project. Ultimately, we feel that the selected partners were the right ones to reach the best possible outcome for the project and for Varjo overall.
Work Package 4
We launched the XR-1 in May 2019, and announced commercial availability and started shipments in December 2019. High-quality materials were produced in the form of press and media materials, videos, social media posts and a website. Varjo was presented at multiple important trade shows during the project.
We built a strong sales channel of over 25 value-added resellers, enabling local presence in our key markets.
One of the key metrics for video-see-through (VST) is latency: the time from light hitting the sensor to the representation of that light being emitted from the display. We have achieved an imperceptible delay, which keeps the visuals true to life. Additionally, Varjo has developed new algorithms that fuse traditional VR tracking data, enabling virtual objects to be registered accurately and stably in the real world.
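The sensor-to-display ("photon-to-photon") latency described above is usually reasoned about as a budget summed over pipeline stages. The sketch below shows that bookkeeping with purely illustrative stage names and figures; these are not Varjo's actual numbers.

```python
# Hypothetical latency budget (milliseconds) for a video-see-through
# pipeline. Stage names and values are illustrative assumptions only.
stages_ms = {
    "sensor_exposure_and_readout": 5.0,
    "image_signal_processing": 4.0,
    "render_and_composite": 5.0,
    "display_scanout": 6.0,
}

# Photon-to-photon latency is the end-to-end sum of the stages.
photon_to_photon_ms = sum(stages_ms.values())
print(f"photon-to-photon latency: {photon_to_photon_ms:.1f} ms")
```

Keeping this sum below roughly the duration of a frame or two is what makes the delay imperceptible to the wearer.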
Our mixed reality software has taken huge steps forward during the project. For example, Varjo was the first company to deliver chroma keying (the industry-standard technique known as 'green-screening') in real time on a mixed reality device. This enables seamless blending of real and virtual objects in the mixed reality experience. Using visual markers, another software feature from Varjo, professional users can instantly anchor virtual objects to the real world.
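Chroma keying itself is a well-established technique: pixels in the camera image close to a key colour (typically green) are replaced by virtual content. The sketch below shows the basic hard-mask version under illustrative assumptions; production systems, including real-time headset pipelines, use more sophisticated soft matting, and the threshold and colours here are hypothetical.

```python
import numpy as np

def chroma_key(camera_rgb, virtual_rgb, key=(0, 255, 0), tol=80.0):
    """Replace camera pixels near the key colour (the 'green screen')
    with the virtual layer; keep the real image everywhere else."""
    dist = np.linalg.norm(camera_rgb.astype(float) - np.array(key, float),
                          axis=-1)
    mask = dist < tol                     # True where the screen is green
    return np.where(mask[..., None], virtual_rgb, camera_rgb)

# 1x2 example: left pixel is pure green screen, right is a real object.
cam = np.array([[[0, 255, 0], [120, 80, 60]]], dtype=np.uint8)
virt = np.array([[[10, 10, 200], [10, 10, 200]]], dtype=np.uint8)
out = chroma_key(cam, virt)
```

A soft matte (a fractional alpha instead of a boolean mask) removes the hard edge this simple version produces around keyed regions.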
Varjo has co-operated with industry-leading partners in multiple fields and used the insights gained in its development work. These activities are yielding fruitful long-term partnerships, both in product and technology development and commercially.