Enhancing Healthcare with Assistive Robotic Mobile Manipulation

Periodic Reporting for period 2 - HARMONY (Enhancing Healthcare with Assistive Robotic Mobile Manipulation)

Reporting period: 2022-01-01 to 2022-12-31

Harmony will develop assistive robotic mobile manipulation technologies for use in hospital environments. Our two targeted use cases, 1) the automation of on-demand delivery tasks around the hospital, and 2) the automation of bioassay sample flow, highlight existing processes where there is a need for fast, reliable and flexible automation to undertake the dull and repetitive tasks that are currently conducted by over-qualified staff. While existing systems can automate parts of these processes, these form “islands of automation” that are limited in scope, rigid to changing demands, and still rely on staff to manually distribute goods and samples across the islands. Mobile manipulation technology is a compelling solution to this problem since it offers the capability to bridge these gaps while maintaining a high degree of flexibility to adjust to varying service demands and adapt to different user requirements and preferences.
In the first two years, the project defined the system-level requirements of the Harmony use cases. These requirements inform the hardware and software research. User studies performed in parallel provide insight into aspects to consider when designing a new robotic platform. The research efforts in the technical work packages have resulted in numerous scientific publications.

WP1 (System requirements and evaluation) and WP2 (System architecture and integration) identified the key steps required by the use cases, including opening boxes, manipulating their contents, and transporting them. Bringing mobility and manipulation together was deemed necessary to provide the flexibility required to deal with changes in workload. In the second period, we refined the use cases and adjusted them based on initial learnings. Several integration meetings were held, first to incorporate the sensor stack on the two platforms and later to combine several technical components to execute more complex tasks, such as navigating in dynamic environments or locating and manipulating boxes.

WP3 (Perception) developed the Harmony sensor suite prototype based on the requirements stipulated by the use cases and the associated Harmony technologies. The system provides all-around camera and LiDAR coverage with forward-facing RGB-D sensing to support situational awareness as well as manipulation and mapping operations. The sensors have been validated on several platforms. Components processing the data include a vision-based semantic and instance segmentation system and a pose estimation method. Objects of interest are represented in a multi-modal database.
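
The report does not detail how this multi-modal database is structured. Purely as an illustration, the kind of object record it might hold can be sketched as below; every class, field and name here is hypothetical, not the Harmony implementation.

```python
from dataclasses import dataclass, field

import numpy as np


@dataclass
class ObjectEntry:
    """One object instance in a hypothetical multi-modal object database."""
    object_id: int
    semantic_class: str      # e.g. "sample_box", from the instance segmentation system
    pose: np.ndarray         # 4x4 homogeneous transform in the map frame (pose estimate)
    mask: np.ndarray         # HxW boolean instance mask from the RGB image
    point_cloud: np.ndarray  # Nx3 points back-projected from the RGB-D sensor
    metadata: dict = field(default_factory=dict)


class ObjectDatabase:
    """Keeps the latest estimate of every detected object of interest."""

    def __init__(self):
        self._entries = {}  # object_id -> ObjectEntry

    def upsert(self, entry):
        # A real system would fuse detections over time; here we simply overwrite.
        self._entries[entry.object_id] = entry

    def by_class(self, semantic_class):
        return [e for e in self._entries.values() if e.semantic_class == semantic_class]
```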

WP4 (Localisation and mapping) worked on improving localisation in dynamic environments by combining odometry information with object information in a Monte Carlo localisation framework. The results confirm the benefits of multi-modal odometry estimation, demonstrating robustness to different failure cases. Object-based mapping approaches have shown robust performance in dynamic environments. Tools that allow non-experts to annotate such object-based maps have been created, and the use of neural implicit representations, as in WP3, is being investigated.
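
As a generic illustration of the technique named here (not the project's code), a single predict-update-resample step of Monte Carlo localisation that fuses an odometry increment with range observations of known objects might look as follows; the function name, noise parameters and measurement model are all assumptions.

```python
import numpy as np


def monte_carlo_localisation_step(particles, weights, odom_delta, observed_objects,
                                  object_map, motion_noise=(0.02, 0.02, 0.01), sigma=0.5):
    """One particle-filter step fusing odometry with object landmark observations.

    particles:        (N, 3) array of [x, y, theta] pose hypotheses in the map frame
    odom_delta:       (dx, dy, dtheta) motion since the last step, in the robot frame
    observed_objects: list of (object_id, measured_range) detections
    object_map:       dict object_id -> (x, y) known object position
    """
    n = len(particles)

    # Predict: apply the odometry increment in each particle's own frame, plus noise.
    dx, dy, dth = odom_delta
    cos_t, sin_t = np.cos(particles[:, 2]), np.sin(particles[:, 2])
    particles[:, 0] += cos_t * dx - sin_t * dy + np.random.normal(0, motion_noise[0], n)
    particles[:, 1] += sin_t * dx + cos_t * dy + np.random.normal(0, motion_noise[1], n)
    particles[:, 2] += dth + np.random.normal(0, motion_noise[2], n)

    # Update: weight particles by how well predicted object ranges match the detections.
    for obj_id, measured_range in observed_objects:
        lx, ly = object_map[obj_id]
        predicted_range = np.hypot(lx - particles[:, 0], ly - particles[:, 1])
        weights *= np.exp(-0.5 * ((measured_range - predicted_range) / sigma) ** 2)
    weights += 1e-300                 # avoid an all-zero weight vector
    weights /= weights.sum()

    # Resample: plain multinomial resampling keeps the sketch short.
    idx = np.random.choice(n, size=n, p=weights)
    return particles[idx], np.full(n, 1.0 / n)
```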

WP5 (Planning and scheduling) developed a model predictive control (MPC) system that leverages deep reinforcement learning to improve navigation in crowded environments. The method learns from experience and predicts human motion, taking it into account to navigate in a socially acceptable manner. A second line of work is high-level planning that optimally allocates a fleet of robots to a set of delivery tasks. The allocation considers human preferences and prior knowledge about the environment, such as congestion data. Work has also started on efficiently combining human supervision with task automation.
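
For the fleet-allocation aspect, one standard way to obtain an optimal robot-to-task assignment under congestion-aware costs is the Hungarian algorithm; the sketch below uses SciPy's linear_sum_assignment and made-up cost figures, and is an illustration of the general idea rather than the Harmony planner.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment


def allocate_deliveries(travel_time, congestion_penalty):
    """Assign robots to delivery tasks by minimising a combined cost.

    travel_time:        (n_robots, n_tasks) estimated travel times
    congestion_penalty: (n_robots, n_tasks) extra cost for routes through busy corridors
    """
    cost = travel_time + congestion_penalty
    robot_idx, task_idx = linear_sum_assignment(cost)  # Hungarian algorithm
    return list(zip(robot_idx.tolist(), task_idx.tolist()))


# Example: 3 robots, 3 tasks; congestion data discourages robot 0 from taking task 1.
travel = np.array([[4.0, 2.0, 7.0],
                   [3.0, 5.0, 6.0],
                   [8.0, 4.0, 2.0]])
congestion = np.array([[0.0, 5.0, 0.0],
                       [0.0, 0.0, 0.0],
                       [0.0, 0.0, 0.0]])
print(allocate_deliveries(travel, congestion))  # [(0, 0), (1, 1), (2, 2)]
```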

WP6 (Grasping and manipulation) has developed dexterous, tactile grippers enabling delicate manipulation of objects. In parallel with this hardware work, software facilitating the development of immersive control interfaces was created. These interfaces allow collecting datasets of human demonstrations for learning-based control methods. Methods that interweave higher-level reasoning with control have also been explored to optimise long-horizon manipulation tasks.
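
As a minimal, hypothetical illustration of learning from such demonstration datasets (not the project's method), a linear policy can be fitted by least squares to the recorded state-action pairs collected through the interface.

```python
import numpy as np


def fit_linear_policy(states, actions):
    """Fit a linear policy a = W s + b by least squares from human demonstrations.

    states:  (N, d_s) recorded robot/scene states from the immersive interface
    actions: (N, d_a) corresponding operator commands
    """
    # Append a constant 1 to each state so the bias term is learned jointly with W.
    X = np.hstack([states, np.ones((states.shape[0], 1))])
    W_aug, *_ = np.linalg.lstsq(X, actions, rcond=None)  # shape (d_s + 1, d_a)
    W, b = W_aug[:-1].T, W_aug[-1]
    return lambda s: W @ s + b


# Example with synthetic "demonstrations": the policy should recover a ≈ 2*s + 0.5.
rng = np.random.default_rng(0)
s = rng.normal(size=(200, 1))
a = 2.0 * s + 0.5 + rng.normal(scale=0.01, size=(200, 1))
policy = fit_linear_policy(s, a)
print(policy(np.array([1.0])))  # close to [2.5]
```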

WP7 (Whole-body control) worked on identifying a full-body dynamic model of the ABB mobile YuMi system, which is used for subsequent whole-body control. As part of this work, low-level torque control interfaces were developed, enabling compliant control and thereby improving robustness to perception inaccuracies. Non-prehensile manipulation received considerable attention; it enables manipulating objects without a controlling grasp on them. Finally, work has started on exploiting human demonstrations and transferring them onto the robot’s morphology.
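
A common way to realise compliant behaviour on a torque-controlled arm is a joint-space impedance law built on top of an identified dynamic model. The sketch below is a generic illustration under that assumption, not the Harmony controller, and all gains and values are invented.

```python
import numpy as np


def compliant_torque_command(q, dq, q_des, model_gravity, stiffness, damping):
    """Joint-space impedance law: tau = K (q_des - q) - D dq + g(q).

    q, dq:              current joint positions and velocities
    q_des:              desired joint positions
    model_gravity:      callable returning the gravity torque g(q) from the identified model
    stiffness, damping: per-joint gains K and D (lower K gives softer behaviour,
                        which tolerates perception inaccuracies during contact)
    """
    return stiffness * (q_des - q) - damping * dq + model_gravity(q)


# Toy 2-joint example with a zero-gravity model stand-in.
q = np.array([0.10, -0.20])
dq = np.array([0.00, 0.05])
q_des = np.array([0.00, 0.00])
tau = compliant_torque_command(q, dq, q_des, lambda q: np.zeros_like(q),
                               stiffness=np.array([50.0, 50.0]),
                               damping=np.array([5.0, 5.0]))
print(tau)  # [-5.    9.75]
```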

WP8 (Safety and acceptability) aims to better understand the context of use and the users, and to set the stage for conducting user studies in a safe and ethical manner. This involved interviews with hospital staff about their current work practices and how they envision their future work, as well as a risk evaluation of the Harmony use cases. The use of semantic-free speech is being explored to give a “voice” to the robot. These efforts fed into the design of a novel hospital-assistance robot, which will be used for further studies.
Harmony will develop robust, flexible and safe autonomous mobile manipulation technology for use in human-centred environments. We will bring together a broad spectrum of robotics research, blending the latest work in areas such as object-based perception and learning for manipulation (currently demonstrated at TRLs 1-2) with established frameworks such as feature-based SLAM (demonstrated at TRL 5). Furthermore, the industry and end-user expertise present in our consortium will enable us to create and demonstrate use-inspired mobile manipulation tools at TRLs 5-6 (i.e. technology validated and demonstrated in a relevant environment), with a clear route to higher TRLs.

Specifically, Harmony is extending the state of the art by:
- Formalising object-based (as opposed to classical feature-based) world representations for robotic perception,
- Developing novel robot localisation and mapping algorithms that exploit object-based world representations,
- Providing new algorithms for coupling task scheduling and motion planning that provide adaptive, congestion-free plans in shared human spaces,
- Developing an immersive control interface that provides a natural and intuitive way for medical staff to deliver and verify new robotic manipulation capabilities at aided, semi-autonomous and autonomous levels,
- Developing robust and compliant whole-body motion planning and control for interacting with unknown objects while in close proximity or direct collaboration with human co-workers,
- Delivering strict guarantees on aspects such as safety, accuracy and patient privacy during navigation, handover, carrying and co-working tasks.

Together, the capabilities that we develop throughout Harmony will lead to faster industry take-up of assistive robotic technologies in healthcare environments and beyond.
Harmony use case 2: Automation of bioassay sample flow
Harmony use case 1: Automation of just-in-time delivery