Current autonomous robots and interfaces are far from exhibiting the adaptability of biological beings to changes in their environment or during interaction. They cannot always provide humans with the best, situation-specific support. Giving a robot or its interface insight into the human mind opens up new possibilities for integrating human cognitive resources into robots and interfaces, i.e., into their intelligent control systems, and can particularly improve human-machine interaction. In this thesis, embedded Brain Reading (eBR) is developed. It enables a human-machine interface (HMI), which can be a robotic system, to infer the human's intention, and hence her/his upcoming interaction behavior, from the context of the interaction and the human's brain state. eBR requires automatic context recognition or generation as well as online, single-trial brain signal decoding, i.e., Brain Reading (BR) for the detection of specific brain states. In this work, the human's electroencephalogram (EEG), recorded from the surface of the head, is used as a measure of brain activity. Experiments are conducted in controlled setups in which subjects execute single- and dual-task behavior of varying complexity and demand, as performed during human-machine interaction. These experiments confirm the applicability and reliability of BR and are used to improve training procedures for BR. Furthermore, a formal model of eBR is developed and shown to be applicable to different implementations of eBR. The formal model allows implementations of eBR to be proven correct and complete. By means of robotic applications for tele-manipulation and rehabilitation, it is further shown that eBR can be applied either to adapt or to drive HMIs, i.e., it can be used to implement predictive HMIs for passive or active support.
When eBR is applied for passive support, it is shown that malfunction of the overall system can be avoided. When eBR is applied for active support, i.e., to actively drive an HMI, it is shown that multimodal signal analysis in eBR facilitates individual adaptation of the support to the requirements of different users. Finally, it is shown that even in the case of passive support, eBR can measurably improve human-machine interaction.
Embedded Brain Reading
Venue: Room A 1.03, Robert-Hooke-Str. 1, Bremen
As a rule, the talks are part of lecture series of the University of Bremen and are not open to the public. If you are interested, please contact the secretariat at sek-ric(at)dfki.de.