My teacher is a robot
EU-funded scientists working on the EASEL project have developed autonomous robots that can carry out teaching tasks. ‘If society is not willing to invest in putting enough humans in front of classrooms to get a high level of individualised teaching that caters to the different ways a child learns, the only answer is technology,’ says project coordinator Paul Verschure, professor of cognitive science and neurorobotics at Pompeu Fabra University in Barcelona, Spain.

The team of almost 20 researchers developed ‘an integrated control system for a robot-based tutoring system, and we can now deploy it in classrooms,’ says Professor Verschure. The robot can read and respond to the behaviour and emotional state of the learner, adapting its responses accordingly. ‘This is unique because the robot is autonomous. It learns from the learner,’ he says.

Communicating like a teacher

The challenge of the three-year project was not just to build a robot that children could accept as a teaching assistant. ‘The robot has to tune its communication so the learner can deal with it; only then can it exchange the knowledge that you build into the teaching system,’ says Professor Verschure, adding that there is little point in a robot giving lessons if it does not engage its pupils. But what sets the EASEL robot apart is that its teaching system is grounded in scientific theories of the mind and brain, and in work carried out by the project on the principles of how children learn. ‘It’s not just putting a robot in the classroom and seeing what happens.
There is very little operational knowledge in this field of pedagogy that you can just translate into a robot,’ explains Professor Verschure. ‘If you don’t understand how children learn and their individual variability, technology is not going to solve the problem.’

Primary school experiments

The project carried out robot-teaching experiments in six primary schools in the Netherlands, Spain and the UK, involving some 200 8–9-year-olds who were learning the physics of the balance beam. The pupils used a physical balance beam or a tablet running virtual and augmented reality. The robot acted as a coach, instructing the pupils to carry out certain tasks. ‘We developed validated virtual reality and augmented reality protocols to provide additional teaching content — to explain, to question and to be a tool the robot can use to teach,’ Professor Verschure explains.

The project also devised what Professor Verschure describes as a new type of mechatronic educational tool — an instrumented balance beam, part of an integrated architecture that measures what a child does with it so the robot can give precise feedback. The learning impact was carefully analysed, measuring levels of communication, knowledge gained or lost, individual pupil variability and, crucially, pupils’ confidence in their own learning. The latter proved key, and more important for learning than the researchers had originally anticipated, Professor Verschure says.

In other experiments, at science museums in Barcelona and Sheffield, UK, the robot instructed 14–15-year-olds in fitness exercises, explaining the amount of energy used. ‘The idea was to convey core notions of healthy living such as exercise and its effects on the body,’ says Professor Verschure.

The challenge now is to generalise the results, currently limited to specific tasks with specific age groups, in order to devise autonomous robots that can tackle more areas of teaching.
Keywords
EASEL, robots, machine learning, artificial intelligence, pedagogy, teaching, virtual reality, augmented reality