Smart Assembly Robot with Advanced FUNctionalities

Periodic Reporting for period 2 - SARAFun (Smart Assembly Robot with Advanced FUNctionalities)

Reporting period: 2016-09-01 to 2018-02-28

Assembly, although considered a promising robotic application, has proven challenging to automate. Thus, even expensive, mass-produced products, such as cell phones and tablets, are still assembled manually under harsh conditions. In recent years there have been several attempts at designing robots that are inherently safe and can thus work together with humans in mixed assembly lines. However, even in these setups, when a traditional programming approach is employed, requiring hard automation performed by experienced software engineers, several months are still needed to accomplish a new assembly setup. Such long integration times make the payback calculation challenging for many potential customers and preclude the desired dynamic production setting. Hence additional research was needed towards programming methods that enable the complete integration of a new assembly task in a much shorter period, e.g. a single day. The SARAFun project was formed to enable a non-expert user to integrate a new bi-manual assembly task on a robot in less than a day. This is accomplished by augmenting the robot with cutting-edge sensory and cognitive abilities required to plan and execute an assembly task demonstrated by the instructor. For the SARAFun project to successfully reach its goals, several prerequisites were set in the form of major Scientific and Technological Objectives for the duration of the project:
Objective 1: To develop a bi-manual robot capable of learning the assembly of two parts by human demonstration.
Objective 2: To develop a bi-manual robot that enables teaching of assembly with advanced physical human-robot interaction.
Objective 3: To develop an integrated planning framework to plan grasps and optimize the finger design for industrial grippers, facilitating the clamping and mating of parts.
Objective 4: To develop strategies to improve and maintain grasp stability for industrial grippers.
Objective 5: To transfer to the robot knowledge about human sensorimotor performance during assembly.
Objective 6: To develop effective multi-modal control assembly strategies under uncertainties (advanced two-part assembly operations are considered, such as folding and insertion by deformation).
Objective 7: To validate SARAFun project results in real assembly scenarios.
During the 36 months of the SARAFun project, the consortium made significant progress against the project's overall goals and work plan. The work started with the analysis of the user requirements and the definition of the use cases, as well as the overall SARAFun system specification and design. Based on these results, development efforts advanced in all key areas of the SARAFun project, including but not limited to the development of:
1. an intuitive Human-Robot Interaction interface deployed as a web application
2. key-frame extraction for automatically generating an assembly program for the robot based on visual feedback from human demonstrations
3. physical Human Robot Interaction (pHRI) control
4. online motion generation with self-collision/obstacle avoidance and manipulability optimisation for bi-manual robots
5. automatic grasp planning and finger design for 3D printed robot fingers
6. slippage detection based on tactile feedback
7. sensorless force estimation (see the sketch after this list)
8. contact graphs for establishing contact between the two assembly parts
9. contact evaluation based on force/torque measurements
10. learning robot trajectories using pHRI and Dynamic Movement Primitives (DMPs)
11. functionality for teaching assembly forces
12. robot controller addressing folding
13. robot controller addressing insertion by deformation assembly
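The list above names sensorless force estimation (item 7) without detailing the method. A common approach on torque-controlled robots is the generalized momentum observer of De Luca and Mattone, whose residual converges to the external joint torque without a dedicated force sensor. The Python sketch below illustrates that idea only; the class name, observer gain and input interfaces are assumptions, not SARAFun code.

import numpy as np

class MomentumObserver:
    """Residual-based external torque estimator (illustrative sketch)."""

    def __init__(self, n_joints, gain, dt):
        self.K = gain * np.eye(n_joints)    # observer gain (assumed diagonal)
        self.dt = dt
        self.r = np.zeros(n_joints)         # residual ~ external joint torque
        self.integral = np.zeros(n_joints)
        self.p0 = None                      # initial generalized momentum

    def update(self, M, C, g, tau, dq):
        """M: inertia matrix, C: Coriolis matrix, g: gravity vector,
        tau: commanded joint torque, dq: joint velocities."""
        p = M @ dq                           # generalized momentum p = M(q) dq
        if self.p0 is None:
            self.p0 = p.copy()
        # dp/dt = tau + C^T dq - g + tau_ext, so the residual dynamics
        # satisfy dr/dt = K (tau_ext - r) and r tracks tau_ext.
        self.integral += (tau + C.T @ dq - g + self.r) * self.dt
        self.r = self.K @ (p - self.p0 - self.integral)
        return self.r

An external wrench at the wrist can then be recovered from the residual through the manipulator Jacobian, e.g. as the least-squares solution of J^T F = r.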
These developments have been coupled with the corresponding build-up of the SARAFun hardware infrastructure, incorporating an ABB IRB 14000 (YuMi) robot equipped for the purposes of SARAFun with an RGBD camera, load cells on the robot fingers, and a force/torque sensor on the robot wrist. The integrated system's advanced functionalities have been demonstrated in real-world industrial scenarios using the YuMi robotic platform. Moreover, experiments have been performed within SARAFun examining how humans perform assembly tasks such as sliding or folding insertion. The developed SARAFun components will be exploited both individually by each partner, through publications and/or IPR claims, and in collaboration with ABB for possible commercialisation of certain components or subsystems.
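The load cells on the robot fingers are the sensing basis for the slippage detection listed as item 6 above. The summary does not describe the detector itself; a minimal illustrative heuristic, assuming access to recent tangential and normal fingertip force samples (function name, friction coefficient and threshold are hypothetical), is:

import numpy as np

def detect_slip(f_tangential, f_normal, mu=0.5, vib_thresh=0.05):
    """Return True if incipient slip is likely (illustrative heuristic).

    f_tangential, f_normal: recent force samples as 1-D numpy arrays."""
    # Friction-cone check: a tangential/normal ratio near mu means
    # the grasp is close to the slip limit.
    ratio = abs(float(f_tangential[-1])) / max(float(f_normal[-1]), 1e-6)
    # Vibration check: incipient slip shows up as high-frequency
    # energy in the tangential force signal.
    vibration = np.std(np.diff(f_tangential))
    return ratio > 0.8 * mu or vibration > vib_thresh

A detector of this kind would typically trigger a grip-force increase or a regrasp before the part is lost.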
Several horizontal activities for disseminating project objectives, concepts and achievements to key stakeholders and the general public have also taken place, including publications in scientific journals, presentations at scientific conferences, organising workshops on collaborative robotics, designing posters and leaflets describing the project and its progress, and demonstrating the SARAFun technology at fairs and exhibitions.
The SARAFun approach to teaching a robot the assembly of two parts by human demonstration is divided into three overall phases: (1) Task Definition (Teaching by demonstration), (2) Task Synthesis/Design, and (3) Task Training (Learning by doing). Substantial progress has been made in the implementation of the SARAFun system addressing all phases. The project shows particularly strong results related to:
- Efficient algorithms for combined tracking of the components to be assembled and of the human hand during demonstration; namely, a hand-object detection and tracking framework that addresses deformable objects.
- A novel approach for automatic identification of key events (key-frames) in the recorded frames of the 3D camera that utilizes not only 2D data but also 3D information (from the tracking framework).
- An advanced pHRI methodology proposing a novel control scheme that enables pHRI under strict spatial limits.
- An online motion generation methodology for controlling the movement of a bimanual robot during execution.
- A framework that allows the human operator to adjust DMPs in an intuitive way (see the sketch after this list).
- An automatic grasp planner that produces a rated set of grasps based on the geometry of the part and assembly constraints.
- A system for automatic generation of fingertips that can be 3D printed and create a secure grasp of the part.
- Strategies to improve and maintain grasp stability for industrial grippers by detecting slippage of parts during manipulation.
- A contact graph framework for contact establishment and evaluation during assembly.
- A sensorless force estimation framework.
- Analysis of human execution of assembly tasks.
- Two bimanual robot controllers for folding and insertion by deformation assemblies, respectively.
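The report does not give the DMP formulation used for the trajectory adjustment mentioned above; the standard discrete DMP of Ijspeert et al. is the usual starting point. The following sketch (function name, gains and step size are illustrative assumptions, not SARAFun code) rolls out one DMP dimension; pHRI-based adaptation then amounts to changing the goal g or the learned weights w:

import numpy as np

def dmp_rollout(y0, g, w, centers, widths, tau=1.0, dt=0.002,
                alpha_z=25.0, beta_z=6.25, alpha_x=1.0):
    """Integrate one scalar DMP from start y0 to goal g (sketch)."""
    y, z, x = y0, 0.0, 1.0                   # position, scaled velocity, phase
    traj = []
    while x > 1e-3:                          # run until the phase decays
        psi = np.exp(-widths * (x - centers) ** 2)           # RBF activations
        f = (psi @ w) * x * (g - y0) / (psi.sum() + 1e-10)   # forcing term
        z += dt / tau * (alpha_z * (beta_z * (g - y) - z) + f)
        y += dt / tau * z
        x += dt / tau * (-alpha_x * x)       # canonical system: x' = -alpha_x x
        traj.append(y)
    return np.array(traj)

Because the forcing term is scaled by (g - y0) and gated by the phase x, an adjusted goal reshapes the motion while preserving the demonstrated profile, which is what makes intuitive operator adjustment possible.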
Apart from the individual impact of the above results, their integration into the SARAFun system, demonstrating advanced perception and cognitive abilities, presents new potential for enabling safe and efficient collaboration between human workers and robots in assembly lines. The SARAFun "new assembly in a single day" paradigm leads to drastically reduced lead times for setting up or reconfiguring a production line or cell.
Figure: SARAFun project vision