Periodic Reporting for period 2 - SARAFun (Smart Assembly Robot with Advanced FUNctionalities)
Reporting period: 2016-09-01 to 2018-02-28
Objective 1: To develop a bi-manual robot capable of learning the assembly of two parts from human demonstration.
Objective 2: To develop a bi-manual robot that enables teaching of assembly with advanced physical
human-robot interaction.
Objective 3: To develop an integrated planning framework to plan grasps and optimize the finger
design for industrial grippers to facilitate the clamping and mating of parts.
Objective 4: To develop strategies to improve and maintain grasp stability for industrial grippers.
Objective 5: To transfer knowledge about human sensorimotor performance during assembly to the robot.
Objective 6: To develop effective multi-modal control assembly strategies under uncertainties
(Advanced two-part assembly operations are considered, such as folding and insertion by deformation).
Objective 7: To validate SARAFun project results in real assembly scenarios.
The main technical developments include:
1. intuitive Human-Robot Interaction interface deployed as a web application
2. key-frame extraction for automatically generating an assembly program for the robot based on visual feedback from human demonstrations
3. physical Human-Robot Interaction (pHRI) control
4. online motion generation with self-collision/obstacle avoidance and manipulability optimisation for bi-manual robots
5. automatic grasp planning and finger design for 3D printed robot fingers
6. slippage detection based on tactile feedback
7. sensorless force estimation
8. contact graphs for establishing contact between the two assembly parts
9. contact evaluation based on force/torque measurements
10. learning robot trajectories using pHRI and Dynamic Movement Primitives (DMPs) (see the sketch after this list)
11. functionality for teaching assembly forces
12. robot controller addressing folding
13. robot controller addressing insertion by deformation assembly
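As a rough illustration of the trajectory-learning component in item 10, the sketch below implements a basic discrete Dynamic Movement Primitive in Python. It assumes a single demonstrated degree of freedom and textbook parameter values (alpha_z, beta_z, number of basis functions); the class and function names (DiscreteDMP, fit, rollout) are illustrative. It shows only the generic DMP formulation, not the SARAFun implementation itself.

```python
# Minimal sketch of a discrete Dynamic Movement Primitive (DMP), assuming a
# single demonstrated degree of freedom. Parameter values are textbook
# defaults, not taken from the SARAFun deliverables.
import numpy as np

class DiscreteDMP:
    def __init__(self, n_basis=30, alpha_z=25.0, beta_z=6.25, alpha_x=8.0):
        self.n_basis, self.alpha_z, self.beta_z, self.alpha_x = n_basis, alpha_z, beta_z, alpha_x
        # Basis function centres spread along the canonical phase x in (0, 1]
        self.c = np.exp(-alpha_x * np.linspace(0, 1, n_basis))
        diffs = np.diff(self.c)
        self.h = np.append(1.0 / diffs ** 2, 1.0 / diffs[-1] ** 2)
        self.w = np.zeros(n_basis)

    def _psi(self, x):
        return np.exp(-self.h * (x - self.c) ** 2)

    def fit(self, y_demo, dt):
        """Learn forcing-term weights from one demonstrated trajectory."""
        self.y0, self.g, self.tau = y_demo[0], y_demo[-1], len(y_demo) * dt
        yd = np.gradient(y_demo, dt)
        ydd = np.gradient(yd, dt)
        x = np.exp(-self.alpha_x * np.arange(len(y_demo)) * dt / self.tau)
        f_target = self.tau ** 2 * ydd - self.alpha_z * (self.beta_z * (self.g - y_demo) - self.tau * yd)
        # Locally weighted regression, one weight per basis function
        s = x * (self.g - self.y0)
        for i in range(self.n_basis):
            psi = np.exp(-self.h[i] * (x - self.c[i]) ** 2)
            self.w[i] = (s * psi) @ f_target / ((s * psi) @ s + 1e-10)

    def rollout(self, dt, goal=None):
        """Reproduce the motion, optionally toward a new goal position."""
        g = self.g if goal is None else goal
        y, z, x = self.y0, 0.0, 1.0
        traj = []
        for _ in range(int(self.tau / dt)):
            psi = self._psi(x)
            f = (psi @ self.w) * x * (g - self.y0) / (psi.sum() + 1e-10)
            zd = (self.alpha_z * (self.beta_z * (g - y) - z) + f) / self.tau
            z += zd * dt
            y += (z / self.tau) * dt
            x += (-self.alpha_x * x / self.tau) * dt
            traj.append(y)
        return np.array(traj)

# Example: learn a smooth 0 -> 1 demonstration and replay it toward a new goal
t = np.linspace(0, 1, 200)
demo = 10 * t ** 3 - 15 * t ** 4 + 6 * t ** 5
dmp = DiscreteDMP()
dmp.fit(demo, dt=1.0 / 200)
replay = dmp.rollout(dt=1.0 / 200, goal=1.5)
```

Once fitted, the primitive can be replayed toward a different goal position, which is what allows a kinesthetically taught motion to generalise to new part placements.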
These developments have been coupled with the SARAFun hardware infrastructure, built around an ABB IRB 14000 (YuMi) robot equipped for the purposes of SARAFun with an RGBD camera, load cells on the robot fingers, and a force/torque sensor on the robot wrist. The integrated system’s advanced functionalities have been demonstrated in real-world industrial scenarios using the YuMi robotic platform. Moreover, experiments have been performed within SARAFun examining how humans perform assembly tasks such as sliding or folding insertion. The developed SARAFun components will be exploited both individually by each partner, through publishing results and/or claiming IPR rights, and in collaboration with ABB for the possible commercialisation of certain components or subsystems.
Several horizontal activities for disseminating project objectives, concepts and achievements to key stakeholders and the general public have also taken place, including publications in scientific journals, presentations at scientific conferences, the organisation of workshops on collaborative robotics, the design of posters and leaflets describing the project and its progress, and participation in fairs and exhibitions demonstrating the SARAFun technology.
The main results advancing the state of the art include:
- Efficient algorithms for combined tracking of the components to be assembled and the human hand during demonstration, namely a hand-object detection and tracking framework that also addresses deformable objects.
- A novel approach for automatic identification of key events (key-frames) in the recorded frames of the 3D camera that uses not only 2D data but also 3D information from the tracking framework.
- An advanced pHRI methodology proposing a novel control scheme that enables pHRI under strict spatial limits.
- An online motion generation methodology for controlling the movement of a bimanual robot during execution.
- A framework that allows the human operator to adjust DMPs in an intuitive way.
- An automatic grasp planner that produces a rated set of grasps based on the geometry of the part and assembly constraints.
- A system for automatic generation of fingertips that can be 3D printed and creates a secure grasp of the part.
- Strategies to improve and maintain grasp stability for industrial grippers by detecting slippage of parts during manipulation.
- A contact graph framework for contact establishment and evaluation during assembly.
- A sensorless force estimation framework (a minimal illustrative sketch follows this list).
- Analysis of human execution of assembly tasks.
- Two bimanual robot controllers for folding and insertion by deformation assemblies, respectively.
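The sensorless force estimation result listed above can be illustrated with a minimal, quasi-static example: when the manipulator's gravity model and Jacobian are known, an external end-effector force can be recovered from the residual between measured and model-predicted joint torques. The 2-link planar arm, its link parameters, and the helper functions (jacobian, gravity_torque, estimate_external_force) below are hypothetical and purely illustrative; they are not the estimator actually implemented on the YuMi robot in SARAFun.

```python
# Minimal sketch of sensorless end-effector force estimation for an assumed
# 2-link planar arm, using the quasi-static relation tau = g(q) + J(q)^T F_ext.
import numpy as np

L1, L2 = 0.3, 0.25     # link lengths [m] (assumed)
M1, M2 = 1.0, 0.8      # link masses [kg] (assumed, lumped at link ends)
G = 9.81

def jacobian(q):
    """Planar geometric Jacobian mapping joint velocities to end-effector velocity."""
    q1, q2 = q
    return np.array([
        [-L1 * np.sin(q1) - L2 * np.sin(q1 + q2), -L2 * np.sin(q1 + q2)],
        [ L1 * np.cos(q1) + L2 * np.cos(q1 + q2),  L2 * np.cos(q1 + q2)],
    ])

def gravity_torque(q):
    """Model-based gravity torques for the static case (zero velocity/acceleration)."""
    q1, q2 = q
    t2 = M2 * G * L2 * np.cos(q1 + q2)
    t1 = (M1 + M2) * G * L1 * np.cos(q1) + t2
    return np.array([t1, t2])

def estimate_external_force(q, tau_measured):
    """Solve tau_measured = g(q) + J(q)^T F_ext for the external force F_ext."""
    tau_ext = tau_measured - gravity_torque(q)
    return np.linalg.pinv(jacobian(q).T) @ tau_ext

# Example: simulate a 2 N downward contact force and recover it from joint torques
q = np.array([0.4, 0.7])
F_true = np.array([0.0, -2.0])
tau_meas = gravity_torque(q) + jacobian(q).T @ F_true
print(estimate_external_force(q, tau_meas))   # approximately [0.0, -2.0]
```

The quasi-static residual between measured and modelled joint torques captures the core idea of estimating contact forces without a dedicated force sensor; a full implementation would also account for dynamics, friction and measurement noise.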
Apart from the individual impact of the above results, their integration into the SARAFun system, which demonstrates advanced perception and cognitive abilities, presents new potential for enabling safe and efficient collaboration between human workers and robots in assembly lines. The SARAFun “new assembly in a single day” paradigm leads to drastically reduced lead time for setting up or reconfiguring a production line/cell.