Embedded brain reading enables better human-robot interaction

Technology News
By Christoph Hammerschmidt

Scientists at the Robotics Innovation Center of the German Research Center for Artificial Intelligence (DFKI) and the Robotics Research Group at the University of Bremen are investigating how robots can be controlled by thought impulses. Together, they are developing key technologies for real-time, adaptive embedded brain reading. This not only makes robots intuitively and effectively controllable on the basis of human brain activity; the systems can also interpret human thoughts and learn from them.

Robots can be controlled by human brain activity via brain-computer interfaces (BCIs). These typically rely on electroencephalography (EEG), in which electrodes attached to the head measure changes in the brain's electrical potential. In contrast to classical BCIs, the holistic "embedded brain reading" (eBR) approach developed by the project researchers goes a step further: brain activity is not only measured but also interpreted, so that a person's intended actions and cognitive workload can be identified. eBR relies exclusively on passive observation of natural brain activity and thus avoids the additional burden that actively operating a BCI places on the user.

In addition to EEG, the approach incorporates electromyography (EMG) to measure muscle activity, eye tracking, and motion analysis, enabling a complete and fault-tolerant integration of brain activity into the control of technical systems. Possible applications range from the teleoperation of space robots to the EEG-based control of robotic exoskeletons.
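One way to picture this multimodal, fault-tolerant integration is confidence-weighted fusion of independent per-modality estimates. The sketch below is purely illustrative: the modality names, confidence values, and fusion rule are assumptions, not DFKI's actual implementation.

```python
# Hedged sketch: confidence-weighted fusion of multimodal intent estimates.
# Modality names, confidences, and the fusion rule are illustrative
# assumptions, not the actual eBR implementation.

def fuse_intent(estimates):
    """Fuse per-modality (probability, confidence) pairs into a single
    'intends to move' probability. Modalities with zero confidence
    (e.g. a dropped sensor) are ignored, which gives fault tolerance."""
    num = sum(p * c for p, c in estimates.values())
    den = sum(c for _, c in estimates.values())
    if den == 0:
        return None  # no usable modality left
    return num / den

estimates = {
    "eeg": (0.80, 0.9),  # movement-related EEG activity, high confidence
    "emg": (0.70, 0.5),  # weak muscle pre-activation
    "eye": (0.00, 0.0),  # eye tracker dropped out -> ignored
}
print(round(fuse_intent(estimates), 3))
```

Because a failed sensor simply contributes zero weight, the fused estimate degrades gracefully instead of breaking, which is the essence of the fault tolerance described above.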

In embedded brain reading, event-related potentials (ERPs) in the EEG serve as input sources; they arise in response to an internal change of state or an external stimulus. At DFKI, these potentials are used to improve the interaction between humans and robots. The scientists Dr. Elsa Andrea Kirchner and Dr. Su Kyoung Kim investigated how ERPs can be detected on a single-trial basis in the EEG and what influence different training modalities have. They showed that single-trial detection also succeeds under "dual-task" conditions, i.e. when a person is engaged in several activities at once. The rarer and more task-relevant the stimulus, the higher the detection performance. Single-trial ERP recognition is particularly suitable for real-time online EEG analysis, for example for controlling an exoskeleton. In rehabilitative therapy, ERP recognition can provide information not only about planned movements but also, for example, about a patient's attentional state.
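The core idea of single-trial detection can be sketched with simple template matching: a single EEG epoch is flagged as containing an ERP if it correlates strongly with an average-ERP template. Real systems train classifiers on filtered multi-channel EEG; the template, threshold, and synthetic signals below are assumptions for illustration only.

```python
# Hedged sketch: single-trial ERP detection by template matching.
# The template, threshold, and synthetic trials are illustrative
# assumptions, not the DFKI detection pipeline.
import math

def correlate(a, b):
    """Pearson correlation between two equal-length signals."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb)

def detect_erp(epoch, template, threshold=0.5):
    """Flag a single trial as containing an ERP if it correlates
    strongly enough with the average-ERP template."""
    return correlate(epoch, template) > threshold

# Idealized P300-like template: a positive deflection mid-epoch.
template = [0, 0, 1, 3, 5, 3, 1, 0, 0, 0]
target_trial = [0.1, -0.2, 1.2, 2.8, 4.9, 3.1, 0.8, 0.2, -0.1, 0.0]
noise_trial = [0.3, -0.5, 0.2, -0.1, 0.4, -0.3, 0.1, 0.2, -0.4, 0.3]

print(detect_erp(target_trial, template))  # target-like trial
print(detect_erp(noise_trial, template))   # noise-only trial
```

The decision here rests on a single epoch rather than an average over many repetitions, which is what makes the approach suitable for real-time online analysis.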

Robots learn from mistakes thanks to human negative feedback

Another paper by the team focuses on the productive use of the so-called error-related potential. How it can be exploited for human-robot interaction is the subject of the paper "Intrinsic interactive reinforcement learning – Using error-related potentials for real world human-robot interaction", published in the Nature journal Scientific Reports. The scientists of the Robotics Innovation Center and the University of Bremen describe a machine-learning method developed at DFKI by which a robot can learn from its own misbehaviour in gesture-controlled interaction with humans. At the same time, the robot learns to distinguish human gestures and to map them to the actions it can perform.

Whether a mapping was correct or incorrect, the robot learns from the human's EEG: a faulty action elicits the error-related potential, which serves as negative feedback. This relieves the human in the interaction, since the feedback does not have to be given consciously; thanks to eBR it is picked up at the subconscious level. The Bremen researchers were able to apply this method, which is based on intrinsic feedback, for the first time in interaction with a real robot system, and to show that it improves the interaction between humans and robots. In rehabilitation with exoskeletons, the error potential could be used, for example, to gain insights into user acceptance.
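The learning loop described above can be sketched as tabular reinforcement learning in which the only reward signal is a (here simulated) error-related potential. The gesture and action names, learning rate, and the ErrP stand-in are assumptions; the published method is more sophisticated.

```python
# Hedged sketch of intrinsic interactive reinforcement learning:
# a tabular Q-learner maps gestures to actions and receives a
# negative reward whenever a (simulated) error-related potential
# appears in the observer's EEG. All names and parameters are
# illustrative assumptions, not the published system.
import random

random.seed(0)

GESTURES = ["wave", "point", "stop"]
ACTIONS = ["approach", "fetch", "halt"]
CORRECT = {"wave": "approach", "point": "fetch", "stop": "halt"}

def errp_detected(gesture, action):
    """Stand-in for an EEG classifier: an error-related potential
    appears when the robot's action does not match the gesture."""
    return action != CORRECT[gesture]

def train(episodes=500, alpha=0.2, epsilon=0.2):
    q = {(g, a): 0.0 for g in GESTURES for a in ACTIONS}
    for _ in range(episodes):
        g = random.choice(GESTURES)
        # epsilon-greedy action selection
        if random.random() < epsilon:
            a = random.choice(ACTIONS)
        else:
            a = max(ACTIONS, key=lambda x: q[(g, x)])
        # the ErrP is the only feedback: -1 on error, 0 otherwise
        reward = -1.0 if errp_detected(g, a) else 0.0
        q[(g, a)] += alpha * (reward - q[(g, a)])
    return q

q = train()
policy = {g: max(ACTIONS, key=lambda a: q[(g, a)]) for g in GESTURES}
print(policy)
```

Note that the human never issues an explicit "wrong" signal: the negative reward is derived entirely from the involuntary brain response, which is what makes the feedback intrinsic.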

However, using physiological data to improve the functionality and user-friendliness of technical rehabilitation systems requires that the data can be processed in real time, so that movements are supported as naturally as possible. In addition, mobile, miniaturized processing systems are needed that can be embedded in the rehabilitation device. In their work, the scientists of DFKI and the University of Bremen – Dr. Hendrik Wöhrle, Marc Tabie, Dr. Su Kyoung Kim, Prof. Frank Kirchner and Dr. Elsa Andrea Kirchner – developed a compact brain-reading system for real-time movement prediction. They rely on field-programmable gate arrays (FPGAs), reprogrammable circuits that allow processing operations to run in parallel and can therefore handle large amounts of data in very little time.

The researchers also developed the software framework reSPACE to make this usable for robotics. It defines application-specific computing operations that are combined into a dataflow accelerator according to a modular principle and deployed on the FPGA. By evaluating EEG data in real time, the system can, for example, support the control of an exoskeleton. The FPGAs handle the large amounts of data within a few nanoseconds – only in this way can the exoskeleton support an arm movement at exactly the right moment.
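The modular dataflow principle can be illustrated in software: fixed processing stages (filter, feature extraction, classification) are composed into one pipeline, much as hardware blocks are chained on the FPGA. The stage names and parameters below are assumptions; this Python analogue only illustrates the composition idea, not the reSPACE framework itself.

```python
# Hedged sketch of a modular dataflow pipeline: filter -> feature ->
# classify, composed into one detector. On an FPGA these stages run
# as parallel hardware blocks; this software analogue illustrates
# only the modular composition principle, not reSPACE itself.

def moving_average(window=3):
    """Smoothing stage (a crude stand-in for EEG band-pass filtering)."""
    def stage(samples):
        return [sum(samples[max(0, i - window + 1):i + 1]) /
                len(samples[max(0, i - window + 1):i + 1])
                for i in range(len(samples))]
    return stage

def peak_amplitude(samples):
    """Feature stage: largest absolute amplitude in the window."""
    return max(abs(s) for s in samples)

def threshold_classifier(limit=1.0):
    """Decision stage: flag a movement-preparation event."""
    def stage(feature):
        return feature > limit
    return stage

def pipeline(*stages):
    """Chain stages so each consumes the previous stage's output."""
    def run(data):
        for stage in stages:
            data = stage(data)
        return data
    return run

detect_movement = pipeline(moving_average(3), peak_amplitude,
                           threshold_classifier(1.0))
print(detect_movement([0.1, 0.2, 2.5, 2.8, 2.6, 0.3]))   # strong deflection
print(detect_movement([0.1, 0.0, 0.2, -0.1, 0.1, 0.0]))  # rest
```

Because each stage has a fixed input/output contract, stages can be swapped or reconfigured per application without touching the rest of the pipeline, which mirrors the modular principle described above.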

At the forthcoming CEBIT trade show, from 12 to 15 June 2018 in Hanover, DFKI will present a mobile exoskeleton based on this research, developed for stroke patients. The exoskeleton can be controlled on the basis of EEG data, applying the findings and technologies of embedded brain reading.

Further Information:

Project Recupera REHA
Software-Framework reSPACE

DFKI at CEBIT

