IMMI

Intelligent Man-Machine Interface - Adaptive Brain-reading for assistive robotics

IMMI in combination with an exoskeleton for rehabilitation or telemanipulation (Photo: DFKI GmbH)

The aim of the project "IMMI" is the development of key technologies that allow adaptive real-time brain reading (BR). Essentially, BR in humans comprises the estimation of mental states and the prediction of behavior, based on analyses of brain activity.

Duration: 15.05.2010 to 30.04.2015
Partner: University of Bremen
Sponsor: Federal Ministry of Economics and Technology
German Aerospace Center e.V.
Grant number: This project is funded by the Space Agency (DLR Agentur), acting on a mandate from the Federal Government, grant no. 50 RA 1011 (University of Bremen) and 50 RA 1012 (DFKI).
Grantee: DFKI GmbH & University of Bremen
Team: Team VII - Sustained Interaction & Learning
Application Field: Space Robotics
Related Projects: VI-Bot
Virtual Immersion for holistic feedback control of semi-autonomous robots (01.2008 - 12.2010)
Labyrinth 1
Development of Learning Architectures and Experiments in Sensory Motor Learning (06.2007 - 12.2007)
Labyrinth 2
Setup as testbed for learning architectures and EEG/fMRI analysis (01.2008 - 01.2009)
Related Software: MARS
Machina Arte Robotum Simulans
pySPACE
Signal Processing and Classification Environment written in Python
reSPACE
Reconfigurable Signal Processing and Classification Environment

Project details

An operator uses IMMI to control a multi-robot scenario (Photo: DFKI GmbH)
A mobile device is used for signal analysis (Source: DFKI GmbH)

In contrast to ordinary Brain-Computer Interface (BCI) approaches, Brain Reading (BR) is based on the concept of observing an operator without distracting him from his actual task. Hence, the focus lies on passive supervision of the operator rather than on actively steering robots or prostheses through thoughts. The BR system reads the brain by detecting and interpreting specific changes in the brain waves. These variations can, for example, give insight into how information presented to the operator is processed. The control system can make use of this knowledge in order to prepare proactive, situation-specific actions. Thus, telemanipulation tasks become more effective and more intuitive for the operator. At the same time, a misinterpretation of the operator's intentions can never jeopardize the overall scenario, as no systems are controlled directly.
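The following minimal Python sketch illustrates this principle; all function names, windows, and thresholds are hypothetical assumptions for illustration, not the IMMI implementation. The passively detected brain response only adjusts the level of support and never issues a command to the robot itself.

import numpy as np

# Hypothetical illustration (not the IMMI implementation): a passively detected
# brain response only adjusts the level of user support; it never commands the
# robot directly, so a misclassification cannot trigger an unintended action.

def warning_was_processed(epoch, threshold=2.0):
    # Crude stand-in for a trained single-trial classifier: check for a
    # positive deflection in a post-stimulus window relative to baseline.
    baseline = epoch[:50].mean()           # pre-stimulus baseline (samples)
    window = epoch[150:300]                # assumed post-stimulus window
    return (window.max() - baseline) > threshold

def adapt_interface(epoch):
    if warning_was_processed(epoch):
        return "no_change"                 # operator appears to have noticed
    return "repeat_and_highlight_warning"  # proactive support only, no robot command

# Example with synthetic data (one EEG channel, arbitrary units)
rng = np.random.default_rng(0)
epoch = rng.normal(0.0, 0.5, 500)
print(adapt_interface(epoch))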

Brain reading systems are considered adaptive (aBR) if they are able to autonomously compensate for varying environmental conditions, different users, and different scenarios. The development of real-time aBR systems would allow maximum flexibility for all kinds of applications and can hence be seen as a new generation of man-machine interfaces. Applications of such interfaces related to space missions include the administration of semi-autonomous exploration missions, maintenance tasks on space stations, and manipulation jobs in general, such as setting up and conducting experiments inside and outside the ISS.

Mobile brain reading system (MBRS)

In order to analyze the brain activity, a variety of complex mathematical procedures have to be applied. Substantial computing power is therefore necessary if the information obtained from the brain activity is to be available in real time. At the same time, it is essential that the operator is not limited in his freedom of movement; accordingly, large stationary computing systems cannot be used in this application case.

Hence, a mobile brain reading system (MBRS) is developed in IMMI. The MBRS combines a general-purpose CPU with an FPGA. Application-specific hardware accelerators, which are implemented in the FPGA, process the EEG data in real time. With these hardware accelerators, more computing power can be provided for this task than with a normal CPU alone.

Framework reSPACE (reconfigurable Signal Processing And Classification Environment)

In order to implement these hardware accelerators, a new framework called reSPACE (reconfigurable Signal Processing And Classification Environment) is developed in IMMI. reSPACE uses a simple model-based development process to provide facilities for the rapid implementation of a wide range of different hardware accelerators. The hardware accelerators are based on the dataflow computing paradigm and can process the data independently of the CPU.
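The dataflow principle behind such accelerators can be illustrated with a short, purely conceptual Python sketch; it is neither reSPACE code nor generated hardware, and the stage names and coefficients are illustrative assumptions.

from collections import deque

# Conceptual sketch of the dataflow idea: each stage consumes a stream of
# samples and emits results as soon as its inputs are available, independently
# of a central control loop.

def moving_average(stream, taps=4):
    # FIR-like smoothing stage operating on a sample stream.
    window = deque(maxlen=taps)
    for sample in stream:
        window.append(sample)
        if len(window) == taps:
            yield sum(window) / taps

def downsample(stream, factor=2):
    # Keep every 'factor'-th sample.
    for i, sample in enumerate(stream):
        if i % factor == 0:
            yield sample

def threshold_detector(stream, threshold=0.8):
    # Emit a flag whenever the smoothed, downsampled signal crosses the threshold.
    for sample in stream:
        yield sample > threshold

# Compose the stages into a pipeline, as a dataflow graph would be wired up.
raw = (x / 10.0 for x in range(100))          # stand-in for one EEG channel
pipeline = threshold_detector(downsample(moving_average(raw)))
print(sum(pipeline), "samples above threshold")

In the actual MBRS, each such stage would correspond to a hardware block on the FPGA that streams its results directly to the next block, which is what allows the data to be processed independently of the CPU.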

The realization of the project IMMI requires intensive cooperation between researchers from the fields of neuroscience, computer science, mathematics, physics, and electrical engineering. The project is divided into five work packages, which can be summarized as follows:
WP1000: Project management;
WP2000: Neuroscientific methods and investigations on state prediction in humans;
WP3000: Analyses of EEG data by means of machine-learning approaches;
WP4000: Development of a mobile BR system;
WP5000: Integration and test of the mobile BR system in an application scenario.

For the analysis of EEG data by means of machine-learning approaches (WP3000), new processing methods are developed and compared to existing algorithms. In the process, the hyperparameters of these methods have to be optimized and several datasets have to be processed. To simplify this processing and enable parallelization, the open-source software pySPACE is developed further. With this software, specifications of complex processing chains can be exchanged between scientists, and it enables distributed processing on a high-performance cluster.
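The following sketch illustrates the kind of processing chain and hyperparameter search involved; it uses scikit-learn on synthetic data for illustration only and does not show pySPACE's actual interfaces.

import numpy as np
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.model_selection import GridSearchCV

# Synthetic stand-in data: 200 epochs with 64 features each (e.g., flattened
# EEG segments) and binary labels (e.g., target vs. non-target).
rng = np.random.default_rng(42)
X = rng.normal(size=(200, 64))
y = rng.integers(0, 2, size=200)

# A simple processing chain: normalization followed by classification.
chain = Pipeline([
    ("scale", StandardScaler()),
    ("clf", SVC(kernel="linear")),
])

# Hyperparameters of the chain are optimized via cross-validated grid search;
# n_jobs=-1 parallelizes the evaluations, analogous to distributing many
# chain configurations over a cluster.
search = GridSearchCV(chain, {"clf__C": [0.01, 0.1, 1.0, 10.0]}, cv=5, n_jobs=-1)
search.fit(X, y)
print("best C:", search.best_params_["clf__C"], "accuracy:", round(search.best_score_, 2))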

Videos

Adaptation of a man-machine interface for multi-robot control with respect to task engagement using embedded Brain Reading

The video shows a man-machine interface for multi-robot control that is adapted online with respect to task load and task engagement to improve user support and the efficiency of interaction by means of embedded Brain Reading (eBR). The level of task engagement is inferred from the single-trial detectability of P300-related brain activity without the need for a secondary task to measure task load.
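As a purely illustrative sketch (the thresholds and the adaptation rule are assumptions, not the eBR system), the adaptation idea can be summarized as follows: a running detection rate of P300-related activity serves as an engagement estimate, which in turn determines how much work is routed to the operator.

from collections import deque

class EngagementMonitor:
    # Tracks the recent single-trial detection rate as an engagement estimate.
    def __init__(self, window_size=20):
        self.detections = deque(maxlen=window_size)

    def update(self, p300_detected: bool) -> float:
        self.detections.append(p300_detected)
        return sum(self.detections) / len(self.detections)

def robots_to_assign(engagement: float) -> int:
    # Illustrative adaptation rule: route more robots to an engaged operator.
    if engagement > 0.8:
        return 4
    if engagement > 0.5:
        return 2
    return 1

monitor = EngagementMonitor()
for detected in [True, True, False, True, False, False, True]:
    level = monitor.update(detected)
print(f"assign {robots_to_assign(level)} robots (engagement = {level:.2f})")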

Exoskeleton control via biosignals

Demonstration of the Capio exoskeleton control via biosignals: The intended movement of the human operator is detected by the biosignal data processing, which triggers the execution of the targeted movement by the exoskeleton. The desired interaction is detected by means of an eye tracker (focusing on a virtual bottle), while the intended movement and the performing limb are determined from electroencephalographic (EEG) signals. Furthermore, the intended movements are verified by means of electromyographic (EMG) signals.
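The described fusion of the three modalities can be sketched as follows; the function names and signal formats are assumptions for illustration and do not reflect the actual Capio control software.

from dataclasses import dataclass
from typing import Optional

@dataclass
class FusionInputs:
    gaze_target: Optional[str]     # e.g., "virtual_bottle" from the eye tracker
    eeg_movement: Optional[str]    # e.g., "reach" decoded from EEG
    eeg_limb: Optional[str]        # e.g., "right_arm" decoded from EEG
    emg_confirms: bool             # EMG onset consistent with the decoded movement

def exoskeleton_command(inputs: FusionInputs) -> Optional[str]:
    # Only act when all three modalities agree; otherwise the exoskeleton does nothing.
    if inputs.gaze_target and inputs.eeg_movement and inputs.eeg_limb and inputs.emg_confirms:
        return f"{inputs.eeg_movement} {inputs.gaze_target} with {inputs.eeg_limb}"
    return None

print(exoskeleton_command(FusionInputs("virtual_bottle", "reach", "right_arm", True)))
print(exoskeleton_command(FusionInputs("virtual_bottle", "reach", "right_arm", False)))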

Outlook: Future Brain Reading Scenario
Outlook: Future Brain Reading Scenario (Photo: DFKI GmbH)