EXPECT

Exploring the Potential of Pervasive Embedded Brain Reading in Human Robot Collaborations

The main goal of the EXPECT project is the development of an adaptive, self-learning interaction platform for human-robot collaboration, which not only enables explicit multimodal interaction, but is also able to derive human intentions that can be used for implicit interaction or to optimize explicit interaction. Methods for automated labelling and joint analysis of multimodal human data are developed and evaluated in test scenarios. The systematic experiments also serve to investigate how fundamentally important brain data are for predicting human intentions. Results are presented in application-oriented demonstration scenarios.

Duration: 01.06.2020 until 31.05.2024
Grantee: German Research Center for Artificial Intelligence GmbH
Sponsor: Federal Ministry of Education and Research
Application Fields: Assistance and Rehabilitation Systems
Logistics, Production and Consumer
Space Robotics
Related Projects: TransFIT
Flexible interaction for infrastructure establishment by means of teleoperation and direct collaboration; transfer into Industry 4.0 (07.2017-12.2021)
BesMan
Behaviors for Mobile Manipulation (05.2012-07.2016)
IMMI
Intelligent Man-Machine Interface - Adaptive Brain-reading for assistive robotics (05.2010-04.2015)
Recupera REHA
Full-body exoskeleton for upper body robotic assistance (09.2014-12.2017)
TransTerrA
Semi-autonomous cooperative exploration of planetary surfaces, including the installation of a logistic chain and consideration of the terrestrial applicability of individual aspects (05.2013-12.2017)
Related Robots: Full Body Exoskeleton
Exoskeleton for upper body robotic assistance
Related Software: pySPACE
Signal Processing and Classification Environment written in Python
reSPACE
Reconfigurable Signal Processing and Classification Environment
BOLeRo
Behavior Optimization and Learning for Robots

Project details

Image 1: Multimodal input from a human in a real-world HRC scenario will be transferred into explicit and implicit interaction concepts for a robot by machine learning algorithms. (Image: Maurice Rekrut, DFKI)
Image 2: Application scenario rehabilitation: serious gaming for rehabilitation after stroke. (Image: DFKI)
Image 3: Space application scenario: multimodal human-robot collaboration for joint infrastructure development. (Image: DFKI)

The need to integrate robotic systems into working environments across various industries will grow steadily over the next few years, making cooperation between humans and robots inevitable. The EXPECT project will provide basic requirements and solutions for the integration of brain activity to improve human-robot interaction in real working environments, and will thus have a direct positive impact on the increasing collaboration between humans and robots at work. The developed technology is intended to make this cooperation between human and machine more natural, and to increase both the human partner's acceptance of their robotic colleague and the productivity of the collaborating team.

In EXPECT, multimodal input, e.g. biosignals including brain activity as well as contextual information from speech and behavior, will be combined to find correlations between EEG data and openly observable behavior. This serves two purposes:

1. to create context from openly observable behavior and thus automatically label EEG data into training sets, in order to train models for intention recognition (a minimal sketch of this labelling idea follows below).

2. to evaluate whether openly observable data alone are sufficient to realize intention recognition, based on these automatically generated models of EEG data.
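
To make the first point concrete, the following is a minimal sketch of the automatic-labelling idea, assuming the EEG has already been preprocessed and that overt behavioral events (e.g. detected grasps or recognized speech commands) arrive with timestamps and intention labels. The sampling rate, window length, data shapes, and the choice of an LDA classifier are illustrative assumptions, not the EXPECT implementation.

import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

FS = 128        # EEG sampling rate in Hz (assumed)
WINDOW_S = 1.0  # epoch length in seconds after each behavioral event

def epochs_from_events(eeg, event_samples, labels):
    """Cut fixed-length EEG epochs at behaviorally detected events.

    eeg           : array of shape (n_channels, n_samples)
    event_samples : sample indices at which overt behavior was observed
    labels        : intention labels derived from that behavior (e.g. 0/1)
    """
    n = int(FS * WINDOW_S)
    X, y = [], []
    for start, label in zip(event_samples, labels):
        if start + n <= eeg.shape[1]:
            X.append(eeg[:, start:start + n].ravel())  # channels x time, flattened
            y.append(label)
    return np.array(X), np.array(y)

# Hypothetical data: one minute of 8-channel EEG and 40 observed events.
rng = np.random.default_rng(0)
eeg = rng.standard_normal((8, 60 * FS))
event_samples = rng.integers(0, 59 * FS, size=40)
labels = rng.integers(0, 2, size=40)

X, y = epochs_from_events(eeg, event_samples, labels)
clf = LinearDiscriminantAnalysis()       # a common classifier for EEG epochs
print(cross_val_score(clf, X, y, cv=5))  # can EEG predict the observed intention?

The cross-validation score on such automatically labelled epochs is also one way to approach the second point: if models trained this way perform no better than models built from the observable data alone, EEG would add little for that task.
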
In summary, the main goal of the EXPECT project is to create an adaptive, self-learning interaction platform for HRC, which not only enables explicit multimodal interaction but is also able to derive human intentions that can be used for implicit interaction or for the optimization of explicit interaction (see image 1).

EXPECT will contribute to the scientific community by:

- Leveraging brain signal analysis in real-world applications to enhance the usability of brain activity under less restricted conditions, i.e. in industrial, aerospace, or rehabilitation settings.
- Designing guidelines for brain-signal-supported HRC for future integration in those real-world application scenarios.
- Gaining deeper knowledge of the contribution of multimodal human data to intention recognition, in order to learn about the exchangeability of EEG data with overt behavioral data.
- Integrating EEG data into multimodal dialog platforms as input for explicit and implicit interaction (a minimal sketch follows this list).
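
As a rough illustration of the last point, the sketch below shows one simple way an EEG-derived intention estimate could enter a dialog platform alongside explicit input. The data structures, the fusion rule, and the confidence threshold are illustrative assumptions, not the EXPECT design.

from dataclasses import dataclass
from typing import Optional

@dataclass
class ImplicitInput:
    intention: str     # e.g. "handover", predicted from EEG
    confidence: float  # classifier probability in [0, 1]

@dataclass
class ExplicitInput:
    utterance: Optional[str]  # recognized speech command, if any

def fuse(explicit: ExplicitInput, implicit: ImplicitInput) -> str:
    # Explicit commands take precedence; implicit input only triggers a
    # proactive action when the brain-reading classifier is confident enough.
    if explicit.utterance is not None:
        return "execute: " + explicit.utterance
    if implicit.confidence > 0.8:
        return "proactively offer: " + implicit.intention
    return "wait for further input"

# The robot offers a handover although the human said nothing.
print(fuse(ExplicitInput(None), ImplicitInput("handover", 0.9)))

A rule of this kind keeps explicit interaction authoritative while letting implicit input shape the robot's behavior when no explicit command is given.
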
To develop and test the approaches investigated in EXPECT in test scenarios as well as application-close scenarios, the Cognitive Assistants (COS) group at DFKI in Saarbrücken and the Robotics Innovation Center (RIC) in Bremen have a unique starting point for research on brain signals and robotic systems, since they have easy access to existing living labs, test facilities, and robots in their immediate neighborhood:

(1) With the HRC laboratory MRK4.0, DFKI has an industry-oriented test field with over 20 robots of different brands, capabilities, form factors, and operating systems. Six main demonstrators illustrate different concepts of HRC in this field, with a focus on natural interaction and safety. The lab is located in a former factory building in Saarbrücken and is part of the Power4Production research center, a collaborative effort of DFKI and the Center for Mechatronics and Automation (ZeMA).

(2) DFKI RIC focuses on the design, construction, and programming of autonomous robots for many application fields, such as space, maritime, production, search and rescue, and health. Although the focus is on autonomous behavior, interaction with humans is always required. For natural HRI, machine learning approaches are applied; these new and innovative approaches are developed by the team "Interactive Machine Learning" (https://robotik.dfki-bremen.de/de/forschung/teams/interactive-machine-learning.html). To record psychophysiological data from humans, the well-equipped research facility "Brain Behavioral Integration Rooms" provides an EEG shielding cabin, a mini-CAVE installation for virtual scenarios, a specialized analysis room, and the Recupera integration room with different exoskeleton systems for teleoperation and motion recording (https://robotik.dfki-bremen.de/en/research/research-facilities/brain-behavioral-integration-rooms.html).

The results of the project will finally be demonstrated in the application-oriented demonstration scenarios Industry 4.0, rehabilitation (image 2), and space (image 3).

Publications

2022

Cross-lingual Voice Activity Detection for Human-Robot Interaction
Nils Höfling, Su-Kyoung Kim, Elsa Andrea Kirchner
In Proceedings of the Third Neuroadaptive Technology Conference (NAT-2022), 09.10.-12.10.2022, Lübbenau, pages 100-103, Nov/2022.
Weight perception in exoskeleton-supported teleoperation
Mareike Förster, Su-Kyoung Kim, Michael Maurus, Shivesh Kumar, Bigna Lenggenhager, Elsa Andrea Kirchner
In Proceedings of the Third Neuroadaptive Technology Conference (NAT-2022), 09.10.-12.10.2022, Lübbenau, pages 59-61, Oct/2022.
Bidirectional and Coadaptive Robotic Exoskeletons for Neuromotor Rehabilitation and Assisted Daily Living: a Review
Elsa Andrea Kirchner, Judith Bütefür
In Current Robotics Reports, Springer Nature, Apr/2022.
