VI-Bot

Virtual Immersion for holistic feedback control of semi-autonomous robots

VI-Bot integrates approaches from the areas of robotics, neurosciences and human-machine interaction into an innovative system designed for remote control of robotic systems. A novel exoskeleton with integrated passive safety, adaptive and behaviour-predicting operator monitoring by means of online EEG analysis, and comprehensive virtual immersion with situational presentation of information and operational options will convey an “on-site feeling” to the telemanipulating operator.

Duration: 01.01.2008 till 31.12.2010
Grant recipient: German Research Center for Artificial Intelligence GmbH
Sponsor: Federal Ministry of Education and Research
Grant number: 01IW07003, funded by the Federal Ministry of Education and Research
Application Fields: Underwater Robotics, Assistance- and Rehabilitation Systems
Related Projects: CManipulator
An Autonomous Dual Manipulator for Deep-Sea Inspection and Maintenance (09.2006 - 09.2009)
Labyrinth 1
Development of Learning Architectures and Experiments in Sensory Motor Learning (06.2007 - 12.2007)
Labyrinth 2
Setup as a testbed for learning architectures and EEG/fMRI analysis (01.2008 - 01.2009)
Related Robots: Dual Arm Exoskeleton
Exoskeleton for upper body robotic assistance (Recupera REHA)
Full Body Exoskeleton
Exoskeleton for upper body robotic assistance
BRIO Labyrinth
Testbed for the development of learning architectures
Exoskeleton Passive (VI-Bot)
Upper body exoskeleton (right arm) for motion capturing
Exoskeleton Active (VI-Bot)
Upper body Exoskeleton (right arm)
Related Software: pySPACE
Signal Processing and Classification Environment written in Python
CAD-2-SIM
Computer Aided Design To Simulation

Project details

Overview of the interaction of the components in the VI-Bot project: shown is the interaction between the exoskeleton, the visualization, the aBRI, and the two demonstration scenarios. (Source: DFKI GmbH)
Concept of the mapping system: operator-transparent adaptation of the exoskeleton's degrees of freedom to the movement control of the target system. (Source: DFKI GmbH)

The complexity of both mobile robotic systems and their fields of application is continuously increasing and is slowly reaching a point where direct user control, or control by state-of-the-art AI, is no longer economical. VI-Bot aims to enable an individual user to control such a complex robotic system. A safe exoskeleton, adaptive user observation, and a robust multi-modal user interface will act together, giving the user of a semi-autonomous robotic system the impression of being directly on site. By means of this virtual immersion, remote control of robotic systems will reach the next level: the separation between robot and user is virtually dissolved and, as a consequence, human cognitive abilities and the robotic system's robustness are brought together. The efficiency of this approach will be evaluated by means of a complex manipulation task.

Experience with current tele-operation environments has demonstrated that both the perceptual and the motor strain on operators is very high. As a result, operators often do not register system warnings in time, and the error rate increases with the length of operation. The intended mutual control between operator and VI-Bot interface must therefore be direct, dependable, fast, and closely coordinated. An adaptive “brain reading” interface (aBRI), yet to be developed, will enable the VI-Bot interface both to determine whether the operator has noticed a presented warning and to predict the operator's actions in order to prepare the system accordingly.

VI-Bot is the first project of its type which integrates approaches from the areas of robotics, neurosciences, and human-machine interaction into a complete system and thus takes on the challenge of applying as yet mostly theoretical approaches to extremely realistic and application-oriented scenarios. The elements which stand out in particular in this regard are:

  • A novel exoskeleton, based on intelligent joint modules with integrated passive safety.
  • A novel adaptive and behaviour-predicting operator monitoring by means of online EEG analysis.
  • Comprehensive virtual immersion and situational presentation of information and operational options.

Videos

VI-Bot: Virtual immersion

Virtual Immersion for holistic feedback control of semi-autonomous robots

VI-Bot: Active Exoskeleton 2010

Teleoperation with the Active Exoskeleton

VI-Bot: Final active exoskeleton

Teleoperation with the final Active Exoskeleton

VI-Bot: Passive exoskeleton

Teleoperation with the Passive Exoskeleton.

Exoskeleton and Tele-operation Control

The VI-Bot project takes into account and advances the current state of the art in haptic interfaces, from both the hardware and the software point of view. The haptic device, namely the exoskeleton and its control system, allows complex interaction with the user, who can thereby perform a teleoperation task with a target robotic system. The exoskeleton is designed on the basis of the anatomy of the human arm. The device is intended to be wearable, lightweight, and adaptable to different user sizes. Its kinematic structure is configured to constrain the movements of the user as little as possible, while offering a high level of comfort within the overall arm workspace.

The control strategy, based on a combination of classical and bio-inspired techniques, harmonizes better with the neuromuscular control of the human arm and additionally implements different safety mechanisms. A general position/force mapping algorithm is being developed to provide intuitive and effective teleoperation of complex robotic systems of arbitrary morphology, as sketched below.
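The following is a minimal sketch of what such a position/force mapping could look like, assuming a simple workspace-scaling approach. The centres, scale factors, and function names are illustrative assumptions and do not describe the actual algorithm developed in the project.

```python
import numpy as np

# Hypothetical illustration of a workspace mapping between the operator's
# hand pose (measured by the exoskeleton) and the target robot's tool pose.
# This is a minimal sketch of one possible position/force mapping, not the
# algorithm developed in VI-Bot; all names and parameters are assumptions.

OPERATOR_CENTRE = np.array([0.0, 0.0, 0.0])   # centre of operator workspace (m)
ROBOT_CENTRE = np.array([0.6, 0.0, 0.4])      # centre of robot workspace (m)
POSITION_SCALE = 1.5      # operator motion is amplified in robot space
FORCE_SCALE = 0.3         # robot contact forces are attenuated for haptics
FORCE_LIMIT = 20.0        # safety clamp on feedback force (N)


def map_position(operator_hand_pos: np.ndarray) -> np.ndarray:
    """Map the operator's hand position into the robot's workspace."""
    return ROBOT_CENTRE + POSITION_SCALE * (operator_hand_pos - OPERATOR_CENTRE)


def map_force(robot_contact_force: np.ndarray) -> np.ndarray:
    """Map contact forces measured at the robot back to the exoskeleton,
    scaled down and clamped so the haptic feedback stays within safe limits."""
    feedback = FORCE_SCALE * robot_contact_force
    norm = np.linalg.norm(feedback)
    if norm > FORCE_LIMIT:
        feedback *= FORCE_LIMIT / norm
    return feedback


if __name__ == "__main__":
    hand = np.array([0.05, -0.10, 0.02])              # operator hand displacement
    print("robot target position:", map_position(hand))
    print("haptic feedback force:", map_force(np.array([0.0, 0.0, -90.0])))
```

A full implementation would additionally have to respect the joint limits and singularities of both the exoskeleton and the target robot.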

Direct links to the robot systems:
Active Exoskeleton
Passive Exoskeleton

Here you can read more about Exoskeleton and Tele-operation Control.

aBRI

The aBR Interface (aBRI) is part of the VI-Bot system. It is a highly integrated control environment that monitors operator brain signals in real time. This allows the system to anticipate impending movements and to verify that alert signals have been consciously processed by the operator. It is thus a vital component of the VI-Bot system that interfaces with the exoskeleton and the virtual immersion subsystems, thereby extending the domain of man-machine interaction to the realm of thought processes.

Recently we set up an online brain reading system that is able to detect certain EEG activity online and thereby allows us to infer whether environmental alerts have been processed. At this point in the project, our work focuses on the anticipation of impending movements and on the adaptiveness of the applied methods.
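The sketch below illustrates the general idea of such single-trial detection of event-related EEG activity (e.g. a response to a perceived warning) under simplified assumptions: a fixed sampling rate, a band-pass filter, epoching around warning onsets, and a linear classifier. The actual aBRI pipeline is built on pySPACE and is considerably more elaborate; all names and parameters here are illustrative.

```python
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Minimal illustration of single-trial detection of event-related EEG
# activity following a warning. Sampling rate, window length and
# classifier choice are assumptions, not the aBRI configuration.

FS = 128                  # sampling rate in Hz (assumed)
WINDOW = int(1.0 * FS)    # 1-second epoch after each warning onset


def bandpass(eeg: np.ndarray, low=0.5, high=12.0) -> np.ndarray:
    """Band-pass filter each channel; slow ERP components lie below ~12 Hz."""
    b, a = butter(4, [low / (FS / 2), high / (FS / 2)], btype="band")
    return filtfilt(b, a, eeg, axis=-1)


def epochs(eeg: np.ndarray, onsets) -> np.ndarray:
    """Cut (channels x time) epochs starting at each warning onset sample."""
    return np.stack([eeg[:, t:t + WINDOW] for t in onsets])


def train_detector(eeg, onsets, labels):
    """Fit a linear classifier separating 'warning perceived' epochs
    from 'warning missed' epochs (labels: 1 = perceived, 0 = missed)."""
    X = epochs(bandpass(eeg), onsets).reshape(len(onsets), -1)
    clf = LinearDiscriminantAnalysis()
    clf.fit(X, labels)
    return clf


def warning_perceived(clf, eeg, onset) -> bool:
    """Online check for one new warning: was the target response present?"""
    x = epochs(bandpass(eeg), [onset]).reshape(1, -1)
    return bool(clf.predict(x)[0])
```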

Here you can read more about the aBR Interface.

Demonstration Scenario

As demonstration and evaluation scenario, a typical tele-operation environment will be developed, which includes several typical elements of remote operation. The target platform will be a dexterous Mitsubishi PA-10 industrial robot, which will be tele-operated via the exoskeleton. The robot is mounted in a laboratory, and the operator has to grab a tool, follow a 3D contour with the tool, and release it at the end. She will do this using the exoskeleton to interact with the robot and the user interface, while wearing VR goggles that display a camera feed and a 3D model of the robot, and she will be equipped with the EEG system connected to the aBRI to monitor her reactions. During the complete run, the operator has to watch out for optical warnings displayed in the VR goggles and has to confirm them.

This scenario uses all core components of VI-Bot. The operator controls the robot intuitively via the exoskeleton and has an in-situ feeling via the VR goggles. She is monitored by the aBRI system, so it can be checked whether she perceived a warning or ignored it on purpose because she was in a critical tele-operation phase. Additionally, the haptic feedback of the exoskeleton is supported by the aBRI, allowing for smoother and more natural haptic feedback. All VI-Bot components can be switched off individually, which allows for a thorough evaluation of the system's performance by comparing it to the non-enhanced state.
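As a rough illustration of this ablation-style evaluation, the sketch below compares paired task completion times recorded with and without one component enabled. The metric, the statistical test, and the numbers are assumptions and placeholders only, not project results.

```python
import numpy as np
from scipy import stats

# Hypothetical sketch of the component-wise evaluation described above:
# the same tool-manipulation task is performed with a VI-Bot component
# enabled and disabled, and paired completion times are compared.
# The values below are placeholders, not measured project results.
with_component = np.array([62.0, 58.5, 71.2, 64.8, 60.3])     # seconds
without_component = np.array([66.1, 63.0, 74.5, 70.2, 61.9])  # seconds

t_stat, p_value = stats.ttest_rel(with_component, without_component)
print(f"mean time difference: {np.mean(without_component - with_component):.1f} s")
print(f"paired t-test: t = {t_stat:.2f}, p = {p_value:.3f}")
```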

The demonstration scenario will be implemented in October 2010, and testing and evaluation will be carried out in the last quarter of the project.

Publications

2019

Transfer approach for the detection of missed task-relevant events in P300-based brain-computer interfaces
Elsa Andrea Kirchner, Su-Kyoung Kim
In Proceedings of the 9th International IEEE EMBS Conference on Neural Engineering (NER-2019), 20.3.-23.3.2019, San Francisco, CA, IEEE Xplore, pages 134-138, 2019.
Embedded Multimodal Interfaces in Robotics: Applications, Future Trends, and Societal Implications
Elsa Andrea Kirchner, Stephen Fairclough, Frank Kirchner
Editors: S. Oviatt, B. Schuller, P. Cohen, D. Sonntag, G. Potamianos, A. Krueger
In The Handbook of Multimodal-Multisensor Interfaces, Morgan & Claypool Publishers, volume 3, chapter 13, pages 523-576, 2019. ISBN: e-book: 978-1-97000-173-0, hardcover: 978-1-97000-175-4, paperback: 978-1-97000-172-3, ePub: 978-1-97000-174-7.

2018

Multi-tasking and Choice of Training Data Influencing Parietal ERP Expression and Single-trial Detection - Relevance for Neuroscience and Clinical Applications
Elsa Andrea Kirchner, Su-Kyoung Kim
Editors: Mikhail Lebedev
In Frontiers in Neuroscience, n.a., volume 12, pages n.a., Mar/2018.

2014

Online Detection of P300 related Target Recognition Processes During a Demanding Teleoperation Task
Hendrik Wöhrle, Elsa Andrea Kirchner
In Proceedings of the International Conference on Physiological Computing Systems, (PHYCS-14), 07.1.-09.1.2014, Lisbon, Scitepress Digital Library, Jan/2014.

2013

Online Movement Prediction in a Robotic Application Scenario
Anett Seeland, Hendrik Wöhrle, Sirko Straube, Elsa Andrea Kirchner
In Proceedings of the 6th International IEEE EMBS Conference on Neural Engineering, (NER-2013), 06.11.-08.11.2013, San Diego, CA, o.A., pages 41-44, Nov/2013.

2012

Sheth-Uicker Convention Revisited -- A Normal Form for Specifying Mechanisms
Bertold Bongardt
series DFKI Research Reports, volume 12-01, Jul/2012. Robotics Innovation Center Bremen.
Measuring the Improvement of the Interaction Comfort of a Wearable Exoskeleton
Michele Folgheraiter, Mathias Jordan, Sirko Straube, Anett Seeland, Su-Kyoung Kim, Elsa Andrea Kirchner
In International Journal of Social Robotics, Springer Netherlands, volume 4, number 3, pages 285-302, Mar/2012.

2011

CAD-2-SIM - Kinematic Modeling of Mechanisms Based on the Sheth-Uicker Convention
Bertold Bongardt
Editors: Honghai Liu, Sabina Jeschke, Daniel Schilberg
In Proceedings of the 4th International Conference, (ICIRA-11), 06.12.-08.12.2011, Aachen, Springer, volume Part I, pages 465-477, Dec/2011. ISBN: 978-3-642-25485-7.
Choosing an Appropriate Performance Measure: Classification of EEG-Data with Varying Class Distribution
Sirko Straube, Jan Hendrik Metzen, Anett Seeland, Mario Michael Krell, Elsa Andrea Kirchner
In Proceedings of the 41st Meeting of the Society for Neuroscience 2011, (Neuroscience-2011), 12.11.-16.11.2011, Washington, DC, o.A., Nov/2011.
Minimizing Calibration Time for Brain Reading
Jan Hendrik Metzen, Su-Kyoung Kim, Elsa Andrea Kirchner
In Proceedings of the 33rd Annual Symposium of the German Association for Pattern Recognition, (DAGM-11), 30.8.-02.9.2011, Frankfurt / Main, o.A., pages 366-375, Sep/2011. ISBN: 978-3-642-23122-3.
On Transferring Spatial Filters in a Brain Reading Scenario
Jan Hendrik Metzen, Su-Kyoung Kim, Timo Duchrow, Elsa Andrea Kirchner, Frank Kirchner
In Proceedings of the 2011 IEEE Workshop on Statistical Signal Processing, (SSP-2011), 28.6.-30.6.2011, Nice, o.A., pages 797-800, Jun/2011. ISBN: 978-1-4577-0569-4.
A multimodal brain-arm interface for operation of complex robotic systems and upper limb motor recovery
Michele Folgheraiter, Elsa Andrea Kirchner, Anett Seeland, Su-Kyoung Kim, Mathias Jordan, Hendrik Wöhrle, Bertold Bongardt, Steffen Schmidt, Jan Albiez, Frank Kirchner
In Proceedings of the International Conference on Biomedical Electronics and Devices, (BIODEVICES-11), 26.1.-29.1.2011, Rome, o.A., pages 150-162, Jan/2011. ISBN: 978-989-8425-37-9.

2010

Towards Operator Monitoring via Brain Reading - An EEG-based Approach for Space Applications
Elsa Andrea Kirchner, Hendrik Wöhrle, Constantin Bergatt, Su-Kyoung Kim, Jan Hendrik Metzen, David Feess, Frank Kirchner
In Proceedings of the 10th International Symposium on Artificial Intelligence, Robotics and Automation in Space, (iSAIRAS-10), 29.8.-01.9.2010, Sapporo, o.A., pages 448-455, Sep/2010.

2009

Assisting Telemanipulation Operators via Real-Time Brain Reading
Elsa Andrea Kirchner, Jan Hendrik Metzen, Timo Duchrow, Su-Kyoung Kim, Frank Kirchner
Editors: Volker Lohweg, Oliver Niggemann
In KI 2009 Workshop, (KI-2009), 15.9.2009, Paderborn, o.A., series Lemgoer Schriftenreihe zur industriellen Informationstechnik, Sep/2009. ISSN: 1869-2087.

© DFKI GmbH
last updated 04.01.2024