Periodic Reporting for period 2 - HARMONY (Enhancing Healthcare with Assistive Robotic Mobile Manipulation)
Reporting period: 2022-01-01 to 2022-12-31
WP1 (System requirements and evaluation) and WP2 (System architecture and integration) identified the key steps required by the use cases, including opening boxes, manipulating their content, and transporting it. Bringing mobility and manipulation together was deemed necessary to enable the flexibility required to deal with changes in workload. In the second period, we refined the use cases and adjusted them based on initial learnings. Several integration meetings were held, first to incorporate the sensor stack on the two platforms before joining several technical components together to execute more complex tasks, such as navigating in dynamic environments or locating and manipulating boxes.
WP3 (Perception) developed the Harmony sensor suite prototype based on the requirements stipulated by the use cases and the associated Harmony technologies. The system provides all-around camera and LiDAR coverage with forward-facing RGB-D sensing to enable situational awareness, manipulation, and mapping operations. The sensors have been validated on several platforms. Components processing the data include a vision-based semantic and instance segmentation system and a pose estimation method. Objects of interest are represented in a multi-modal database.
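A multi-modal object database of the kind described can be pictured as a keyed store whose entries combine a semantic class, a pose, and per-modality data. The following is an illustrative sketch only; the class and field names are assumptions, not the actual Harmony schema.

```python
from dataclasses import dataclass, field

@dataclass
class ObjectEntry:
    """One object of interest; field names are illustrative assumptions."""
    object_id: int
    semantic_class: str            # e.g. "box", from the segmentation system
    pose: tuple                    # (x, y, z, roll, pitch, yaw) in the map frame
    modalities: dict = field(default_factory=dict)  # e.g. RGB crop, point cloud

class ObjectDatabase:
    """Keyed store allowing lookup by id or by semantic class."""
    def __init__(self):
        self._entries = {}

    def add(self, entry: ObjectEntry):
        self._entries[entry.object_id] = entry

    def by_class(self, semantic_class: str):
        return [e for e in self._entries.values()
                if e.semantic_class == semantic_class]

db = ObjectDatabase()
db.add(ObjectEntry(1, "box", (2.0, 0.5, 0.9, 0.0, 0.0, 0.0)))
db.add(ObjectEntry(2, "door", (5.0, 1.0, 0.0, 0.0, 0.0, 1.57)))
print(len(db.by_class("box")))  # 1
```

Keeping per-object entries rather than raw feature maps is what lets downstream modules (localisation, manipulation) query the world at the object level.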
WP4 (Localisation and mapping) worked on improving localisation in dynamic environments by combining odometry information with object information in a Monte Carlo localisation framework. The results confirm the benefits of multi-modal odometry estimation, demonstrating robustness to different failure cases. Mapping approaches using object information have achieved robust performance in dynamic environments. Tools to annotate such object-based maps by non-experts have been created, and the use of neural implicit representations, as used in WP3, is being investigated.
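The core idea of Monte Carlo localisation with object information can be sketched in a few lines: particles are propagated with noisy odometry and then weighted by how well a measured displacement to a known object matches each particle's prediction. This is a minimal one-dimensional illustration, not the project's implementation; the landmark position, noise levels, and motion are assumed values.

```python
import random, math

random.seed(0)
OBJECT_X = 5.0                      # known map position of a landmark object
particles = [random.uniform(0.0, 10.0) for _ in range(500)]

def motion_update(particles, odom, noise=0.1):
    """Propagate every particle with the odometry reading plus Gaussian noise."""
    return [p + odom + random.gauss(0.0, noise) for p in particles]

def measurement_update(particles, measured, sigma=0.3):
    """Weight particles by the object observation, then resample."""
    weights = [math.exp(-((OBJECT_X - p) - measured) ** 2 / (2 * sigma ** 2))
               for p in particles]
    total = sum(weights) or 1.0
    weights = [w / total for w in weights]
    return random.choices(particles, weights=weights, k=len(particles))

true_x = 2.0
for _ in range(10):                 # robot drives forward in 0.5 m steps
    true_x += 0.5
    particles = motion_update(particles, 0.5)
    particles = measurement_update(particles, OBJECT_X - true_x)

estimate = sum(particles) / len(particles)
print(round(estimate, 1))           # converges near true_x = 7.0
```

The object observation corrects the drift that pure odometry would accumulate, which is the robustness benefit the paragraph describes.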
WP5 (Planning and scheduling) developed a model predictive control (MPC) system that leverages deep reinforcement learning to improve navigation in crowded environments. The method learns from experience and predicts human motions, taking them into account to navigate in a socially acceptable manner. A second line of work is high-level planning that optimally allocates a fleet of robots to a set of delivery tasks. The allocation considers human preferences and prior knowledge about the environment, such as congestion data. Work has also started on efficiently combining human supervision with task automation.
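The fleet-allocation problem described above can be illustrated with a toy instance: assign each robot to one delivery task so that the summed cost is minimal, where the cost blends travel time with a congestion penalty drawn from prior knowledge. The numbers and the additive congestion model are assumptions for this sketch; real deployments would use a dedicated solver rather than brute force.

```python
from itertools import permutations

travel_time = [
    [4.0, 2.0, 7.0],   # robot 0 to tasks 0..2 (minutes, illustrative)
    [3.0, 6.0, 5.0],   # robot 1
    [8.0, 4.0, 1.0],   # robot 2
]
congestion = [1.5, 0.0, 0.5]   # expected extra delay on each task's route

def total_cost(assignment):
    """assignment[r] = task index given to robot r."""
    return sum(travel_time[r][t] + congestion[t]
               for r, t in enumerate(assignment))

# Brute-force search over all one-to-one assignments (fine for 3 robots).
best = min(permutations(range(3)), key=total_cost)
print(best, total_cost(best))   # (1, 0, 2) 8.0
```

For realistic fleet sizes the same objective would be handed to the Hungarian algorithm or an integer-programming solver, since the permutation search grows factorially.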
WP6 (Grasping and manipulation) has developed dexterous and tactile grippers enabling delicate manipulation of objects. In parallel with this hardware work, software enabling immersive control interfaces was developed. These interfaces allow collecting datasets of human demonstrations for learning-based control methods. Methods interweaving higher-level reasoning with control have also been explored to optimise long-horizon manipulation tasks.
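Collecting demonstrations through such an interface amounts to logging synchronized observation-action pairs per teleoperated episode, which later feed a learning-based controller. The sketch below is a hypothetical recorder; the callback and field names are assumptions, not the Harmony API.

```python
class DemonstrationRecorder:
    """Accumulates (observation, action) pairs into per-episode lists."""
    def __init__(self):
        self.episodes = []
        self._current = []

    def record_step(self, observation, action):
        # Called once per control tick while the human teleoperates.
        self._current.append({"obs": observation, "act": action})

    def end_episode(self):
        self.episodes.append(self._current)
        self._current = []

rec = DemonstrationRecorder()
for t in range(3):   # stand-in for three teleoperation ticks
    rec.record_step({"gripper_pose": (0.1 * t, 0.0)},
                    {"cmd": "close" if t == 2 else "move"})
rec.end_episode()
print(len(rec.episodes), len(rec.episodes[0]))  # 1 3
```

Episode boundaries matter because imitation-learning methods typically treat each demonstration as one trajectory rather than a flat stream of samples.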
WP7 (Whole-body control) worked on the full-body dynamical model identification of the ABB mobile YuMi system, which is used in subsequent whole-body control. As part of this work, low-level torque control interfaces were developed, which enable compliant control and improve robustness to perception inaccuracies. Non-prehensile manipulation received much attention, enabling the manipulation of objects without having a controlling grasp on them. Finally, work has started on exploiting human demonstrations and transferring them onto the robot's morphology.
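The robustness benefit of torque-level interfaces comes from compliance: instead of rigidly tracking a possibly inaccurate target, each joint is pulled toward it with spring-damper behaviour, so a few centimetres of perception error produce bounded contact forces. A minimal joint-space impedance law is sketched below; the gains and two-joint example are illustrative assumptions, not identified parameters of the mobile YuMi.

```python
def impedance_torque(q, dq, q_des, stiffness, damping):
    """Per-joint impedance law: tau_i = K_i (q_des_i - q_i) - D_i dq_i."""
    return [k * (qd - qi) - d * dqi
            for qi, dqi, qd, k, d in zip(q, dq, q_des, stiffness, damping)]

# Two-joint example: joint 0 is 0.2 rad from its target and moving slightly,
# joint 1 is already on target and at rest.
tau = impedance_torque(q=[0.0, 0.5], dq=[0.1, 0.0],
                       q_des=[0.2, 0.5],
                       stiffness=[50.0, 50.0], damping=[5.0, 5.0])
print(tau)  # [9.5, 0.0]
```

Lowering the stiffness gains makes the arm yield more readily on contact, which is the trade-off a compliant whole-body controller exploits near humans and uncertain objects.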
WP8 (Safety and acceptability) aims to better understand the context of use and the users, and to set the stage for conducting user studies in a safe and ethical manner. This involved interviews with hospital staff about their current work practices, how they envision their work in the future, and a risk evaluation of the Harmony use cases. The use of semantic-free speech is being explored to give a “voice” to the robot. These efforts were integrated into the design of a novel robot for hospital assistance, which will be used for further studies.
Specifically, Harmony is extending the state of the art by:
- Formalising object-based (as opposed to classical feature-based) world representations for robotic perception,
- Developing novel robot localisation and mapping algorithms that exploit object-based world representations,
- Providing new algorithms for coupling task scheduling and motion planning that provide adaptive, congestion-free plans in shared human spaces,
- Developing an immersive control interface that provides a natural and intuitive way for medical staff to deliver and verify new robotic manipulation capabilities at aided, semi-, and autonomous levels,
- Developing robust and compliant whole-body motion planning and control for interacting with unknown objects while in close proximity or direct collaboration with human co-workers,
- Delivering strict guarantees on aspects such as safety, accuracy and patient privacy during navigation, handover, carrying and co-working tasks.
Together, the capabilities that we develop throughout Harmony will lead to faster industry take-up of assistive robotic technologies in healthcare environments and beyond.