CORDIS - EU research results

Mobile dual arm robotic workers with embedded cognition for hybrid and dynamically reconfigurable manufacturing systems


Why your next work colleague might be a robot

By leveraging AI and advanced automation, a team of EU research and industrial partners has developed a dual-arm robot that can move freely around a factory floor, perform complex tasks, and even interact with its human co-workers.

Factories have long been defined by the assembly line model. But with manufacturing becoming increasingly customer-centric, there is a growing need for a more flexible factory. For the EU-funded project THOMAS (Mobile dual arm robotic workers with embedded cognition for hybrid and dynamically reconfigurable manufacturing systems), the answer is smarter machines. “Cognition-embedded robots allow factories to quickly adapt to varying customer demands while minimising the costs of maintaining large inventories,” says Niki Kousi, a research engineer at the University of Patras Laboratory for Manufacturing Systems and Automation (LMS) in Greece, who coordinated the project. “By taking on many of the strenuous, repetitive and dangerous tasks, intelligent robots can also improve the health, safety and well-being of human workers,” adds Sotiris Makris, head of the Robots, Automation and Virtual Reality in Manufacturing group at LMS.

A reconfigurable factory

With the goal of creating reconfigurable factories based on autonomous, mobile robot workers, the project started by creating an innovative mobile dual-arm robot. “Thanks to their dual arm configuration and ability to move freely around a shop floor, these robots are capable of performing advanced tasks, thus creating a new production paradigm,” notes Makris.

Next, researchers digitalised the factory floor. “As a digital twin of the physical factory, these models contain everything from human operators to robots, parts and processes,” explains Kousi. “The model is also dynamic and uses 2D and 3D sensor data to provide a real-time capture of the shop floor status.”

These models serve as a roadmap for the robots, allowing them to autonomously – and safely – navigate the shop floor and perform multiple operations, including screwing, handling and drilling. They also allow the robots to assist and interact with their human counterparts. “Capable of advanced reasoning, the robots can cooperate with each other and other production resources – including human operators,” says Makris. “Each robot is equipped with certified sensing devices that allow it to safely move and interact with humans within a fenceless environment.”

Behind these advanced capabilities is an innovative technology called robot perception libraries. These libraries allow the robots to navigate without colliding, properly align themselves with and virtually dock at different workstations, and detect the positioning of the various tools used during the assembly process. All the project’s technologies have been integrated into the THOMAS Open Production Station (OPS).
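To give a flavour of the digital-twin idea described above, the following is a minimal Python sketch of a shop-floor model whose resource poses are refreshed from incoming sensor observations. All class and field names here are illustrative assumptions, not the project's actual software.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of a dynamic digital twin: robots, human
# operators and parts are tracked as named resources whose poses
# are updated from 2D/3D sensor observations in real time.

@dataclass
class Resource:
    name: str    # e.g. "mobile_robot_1" or "operator_1" (illustrative)
    kind: str    # "robot", "human" or "part"
    pose: tuple  # (x, y, heading) on the shop floor, in metres/radians

@dataclass
class ShopFloorTwin:
    resources: dict = field(default_factory=dict)

    def register(self, res: Resource) -> None:
        self.resources[res.name] = res

    def update_from_sensors(self, observations: dict) -> None:
        # Merge the latest sensor-derived poses into the model.
        for name, pose in observations.items():
            if name in self.resources:
                self.resources[name].pose = pose

    def status(self) -> dict:
        # Real-time snapshot that planners and robots could query.
        return {n: (r.kind, r.pose) for n, r in self.resources.items()}

twin = ShopFloorTwin()
twin.register(Resource("mobile_robot_1", "robot", (0.0, 0.0, 0.0)))
twin.register(Resource("operator_1", "human", (5.0, 2.0, 1.57)))
twin.update_from_sensors({"mobile_robot_1": (1.2, 0.4, 0.3)})
print(twin.status()["mobile_robot_1"])  # ('robot', (1.2, 0.4, 0.3))
```

A real system would fuse noisy multi-sensor data and track many more attributes (task state, tool positions), but the core loop – register resources, ingest observations, expose a live status – is the same.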

Robots will assist – not replace

While some could view THOMAS as another example of jobs being lost to automation, Makris is adamant that this is not the case. “THOMAS is designed to act as an assistant to its human operators, taking over the most dangerous and strenuous tasks,” he adds. “Not only will this increase the safety of our factories, it will allow humans to focus their attention on the most advanced, high-level tasks – including robotic programming.” To illustrate this, Makris notes that the THOMAS OPS was fully demonstrated in real-life settings, including automotive and aeronautics factories. “The results show that the THOMAS OPS not only improves the human operators’ utilisation, but also opens up new job opportunities in a factory,” he concludes.

Keywords

THOMAS, robot, AI, automation, factory, manufacturing, digital twin, sensor data
