Periodic Reporting for period 2 - APRIL (multipurpose robotics for mAniPulation of defoRmable materIaLs in manufacturing processes)
Reporting period: 2021-06-01 to 2022-11-30
APRIL aims at providing a technological infrastructure and interoperable methods, tools and services that will support multipurpose, and easy to repurpose, autonomous dexterous robots able to manipulate, assemble and process different soft, deformable and flexible materials in a production-line environment, thus enabling industrial innovation, cobotics, and growth of business value and capabilities in the manufacturing sector, especially in manufacturing SMEs. The following seven specific objectives support the overall strategic goal:
• To provide a scalable, beyond-state-of-the-art modular robot prototype with high dexterity for manipulation of flexible materials by Y3 of the project. This prototype will be scalable in functions and connected as a plug-in to an existing knowledge base in the cloud, achieving high-level reasoning capabilities.
• To improve robot Grasping and Manipulation (G&M) and move towards a multipurpose approach, by acquiring different skills to manipulate at least 3 types of flexible materials (food, plastics, paper, etc.) and 5 different characteristics of the flexible materials (texture, size, shape, weight, colour, material composition, etc.).
• To develop a Knowledge Reasoning Engine Module (KREM) to enable grasp planning in complex scenes.
• To foster a ground-breaking standardized perceptual system.
• To design a proactive safety preservation and ergonomic optimization approach.
• To test and validate the APRIL prototype under a federated machine learning approach through six use-case demonstrators in five countries.
• To underpin Business Model Innovation (BMI) for robotics, creating new paths for sustainability of actions.
In APRIL, the combination of fine grasping provided by the APRIL robots, sensor data and computer vision technology, coupled with a set of modular middleware layers and interfaces, provides the perceptual and contextual information that allows robots to sense and understand the production environment. This allows them to successfully manipulate a wide range of soft objects and to learn, plan and execute ergonomic motions, making human-robot collaboration simpler and more efficient. A federated approach connects the robots to a cloud-based knowledge base that contains the information needed to support the robots in performing their different jobs.
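The connection between a robot and the cloud knowledge base can be pictured with a small sketch. The Python snippet below is an illustrative assumption only, not the APRIL implementation: the endpoint URL, query parameters and response fields are hypothetical. It shows a robot-side client asking a cloud knowledge base for grasp and material knowledge about the object it has to handle.

```python
# Minimal sketch (assumption, not the APRIL implementation): a robot-side client
# querying a hypothetical cloud knowledge base for manipulation knowledge.
import requests

KNOWLEDGE_BASE_URL = "https://kb.example.org/api/v1/objects"  # hypothetical endpoint


def fetch_object_knowledge(object_class: str, material: str) -> dict:
    """Ask the cloud knowledge base how to handle a given deformable object."""
    response = requests.get(
        KNOWLEDGE_BASE_URL,
        params={"class": object_class, "material": material},
        timeout=5.0,
    )
    response.raise_for_status()
    # Assumed response contents: grasp candidates plus material properties the
    # robot needs to plan a manipulation (stiffness, friction, handling limits).
    return response.json()


if __name__ == "__main__":
    knowledge = fetch_object_knowledge("insole", "viscoelastic_textile")
    for grasp in knowledge.get("grasp_candidates", []):
        print(grasp)
```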
The course of action within the APRIL project is organized in three overlapping, agile cycles that comprise the development of all project processes towards accomplishing the project objectives: a foundational phase that gathers all requirements, followed by two iterative phases that implement incremental versions of the robotic solutions (i.e. versions α and β). The APRIL robotic solutions will be validated and tested through 6 high-impact demonstration use cases manipulating deformable objects of different types (e.g. paper, chicken breast, shoe insoles, viscoelastic textile materials, cables, etc.), showing the real value of the produced outputs.
The APRIL project modules have advanced in their development in the areas of manipulation of soft objects as well as the software that tracks and plans actions based on ergonomic motions. In summary, the main milestones and advances over the previous period comprise:
• Implementation and testing of the Low-level Control Engine Module, completion of the Mia Hand ROS packages, and implementation of the Physical Interface Module (PIM) and the Interactive and Sensory Interface Module (ISIM).
• Testing and adaptation of the hardware components (gripper) within the use case scenarios.
• Definition of a new release plan to complete the low-level control architecture, initiating the following components: Collision-Aware Inverse Kinematics, Online Trajectory Generation, the Non-Prehensile Learned Policies component, the Robot/Human Handover module and the Grasp pipeline (which includes a Grasp dictionary; see the illustrative sketch after this list).
• Implementation of a reinforcement learning (RL) framework (see the sketch after this list) and progress in the development of the Perceptual Engine Module (PEM), enabling 3D reconstruction in the different APRIL use cases.
• Advances in the High-level Control Engine Module (HICEM), including a core library, the definition and implementation of the interfaces to interact with the required modules, initial integration of the grasp dictionary, mapping of human and hand motions, and the creation of the task planning modules for most of the APRIL use cases.
• Advances in the Social Interaction Manager (SIM), allowing the robotic system to understand the actors present in the work environment, and progress in the experiments with neuromorphic cameras and the gestures module.
• An initial high-level integration of all the APRIL modules was performed to test communication among the modules and to prepare for full integration during the next period.
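As a rough illustration of the Grasp dictionary mentioned in the list above, the Python sketch below shows one possible data structure mapping object classes to candidate grasps. The classes, fields and values are hypothetical assumptions for illustration, not the structure actually used in the APRIL grasp pipeline.

```python
# Minimal sketch (assumption, not the project's data model): one way a grasp
# dictionary could map object classes to candidate grasps for a grasp pipeline.
from dataclasses import dataclass
from typing import Dict, List, Tuple


@dataclass
class GraspEntry:
    preshape: str                               # hand preshape identifier
    approach_axis: Tuple[float, float, float]   # approach direction in the object frame
    max_force_n: float                          # force cap for soft/deformable objects


# Illustrative entries only; real grasps would come from learning and experiments.
GRASP_DICTIONARY: Dict[str, List[GraspEntry]] = {
    "chicken_breast": [GraspEntry("power", (0.0, 0.0, -1.0), 8.0)],
    "paper_sheet": [GraspEntry("pinch", (0.0, 0.0, -1.0), 2.0)],
    "cable": [GraspEntry("pinch", (1.0, 0.0, 0.0), 4.0)],
}


def select_grasp(object_class: str) -> GraspEntry:
    """Return the first known grasp candidate for an object class."""
    candidates = GRASP_DICTIONARY.get(object_class)
    if not candidates:
        raise KeyError(f"No grasp knowledge for '{object_class}'")
    return candidates[0]
```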
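The reinforcement learning framework mentioned above is not detailed in this summary. As a generic, self-contained illustration of the kind of learning loop such a framework builds on, the sketch below runs tabular Q-learning on a toy, invented grip-force task; every state, action and reward here is a hypothetical placeholder, not APRIL's actual training setup.

```python
# Generic tabular Q-learning sketch on a toy task (illustrative assumption only).
import random

# Toy discretization: 5 grip-force levels, 3 actions (decrease / keep / increase).
N_STATES, N_ACTIONS = 5, 3
ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.2   # learning rate, discount, exploration rate
TARGET_STATE = 3                         # hypothetical "good grip" force level

q_table = [[0.0] * N_ACTIONS for _ in range(N_STATES)]


def step(state: int, action: int):
    """Shift the force level by -1, 0 or +1 and return (next_state, reward)."""
    next_state = min(max(state + (action - 1), 0), N_STATES - 1)
    reward = 1.0 if next_state == TARGET_STATE else -0.1
    return next_state, reward


for episode in range(500):
    state = random.randrange(N_STATES)
    for _ in range(20):
        # Epsilon-greedy action selection.
        if random.random() < EPSILON:
            action = random.randrange(N_ACTIONS)
        else:
            action = max(range(N_ACTIONS), key=lambda a: q_table[state][a])
        next_state, reward = step(state, action)
        # Standard Q-learning update.
        td_target = reward + GAMMA * max(q_table[next_state])
        q_table[state][action] += ALPHA * (td_target - q_table[state][action])
        state = next_state

# After training, the greedy policy steers every force level towards the target.
print([max(range(N_ACTIONS), key=lambda a: q_table[s][a]) for s in range(N_STATES)])
```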
APRIL has also worked on dissemination and communication, aiming for its project results to impact important manipulation operations dealing with deformable objects, such as whole-body product manipulation, shape changing or biomanipulation (e.g. food). APRIL aims to impact society by effectively supporting collaborative approaches that relieve workers of labour-intensive, repetitive and/or physically demanding work, thereby empowering them. See progress on the APRIL project website ( www.aprilproject.eu )