Semi-autonomous cooperative exploration of planetary surfaces including the installation of a logistic chain as well as consideration of the terrestrial applicability of individual aspects
Robotic systems that are able to work autonomously on alien planets or moons are equally well suited for applications on Earth. Examples include the management of maritime resources, search and rescue, and medical rehabilitation. The goal of the project TransTerrA is to further develop the space technologies available at DFKI within a complex scenario and to make them available for terrestrial applications.
Scenario: a team of robots explores the lunar surface
Complex robotic missions are becoming increasingly important for the exploration of our solar system. Ever more sophisticated experiments, the retrieval of samples, or even the preparation of manned missions to bodies such as the Moon or Mars can no longer be achieved by individual systems and must instead be distributed across several missions. The scenario in TransTerrA demonstrates the (semi-)autonomous exploration of planetary surfaces using a cooperating robot team consisting of a rover and a shuttle. The shuttle's task is to supply the rover, which requires the installation of a logistic chain, i.e. the setup of reliable supply channels across several waypoints. Human operators on Earth will be able to control the mission using novel human-machine interfaces.
To build up the logistic chain, so-called base-camps are used to bridge the large distances between a lander and the rover. Depending on its task, which could be serving as a depot for energy or soil samples, or as a relay station for communication, a base-camp can be extended with functional modules. Base-camps, replaceable functional modules, rover, and shuttle all possess a compatible docking interface, so that both the shuttle and the rover can modify the base-camps using modules delivered to them. Additionally, modules can be exchanged between shuttle and rover.
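The exchange of functional modules over a common docking interface can be pictured as a small data model. The following sketch is purely illustrative: the class names, module types, and IDs are assumptions for this example and are not part of the project's actual software.

```python
from dataclasses import dataclass, field
from enum import Enum
from typing import List


class ModuleType(Enum):
    """Functional roles described for the base-camp modules (illustrative)."""
    ENERGY_DEPOT = "energy depot"
    SAMPLE_DEPOT = "soil-sample depot"
    COMM_RELAY = "communication relay"


@dataclass
class FunctionalModule:
    """A replaceable module carrying the common docking interface."""
    module_id: str
    module_type: ModuleType


@dataclass
class DockingAgent:
    """Anything with the compatible interface: rover, shuttle, or base-camp."""
    name: str
    modules: List[FunctionalModule] = field(default_factory=list)

    def dock(self, module: FunctionalModule) -> None:
        self.modules.append(module)

    def undock(self, module_id: str) -> FunctionalModule:
        for i, m in enumerate(self.modules):
            if m.module_id == module_id:
                return self.modules.pop(i)
        raise KeyError(module_id)


def transfer(src: DockingAgent, dst: DockingAgent, module_id: str) -> None:
    """Exchange a module between two agents, e.g. shuttle -> base-camp."""
    dst.dock(src.undock(module_id))


# Example: the shuttle extends a base-camp with a communication relay module.
shuttle = DockingAgent("shuttle")
camp = DockingAgent("base-camp 1")
shuttle.dock(FunctionalModule("M-01", ModuleType.COMM_RELAY))
transfer(shuttle, camp, "M-01")
```

Because every agent shares the same interface, the same `transfer` operation covers all the exchanges mentioned in the text: shuttle to base-camp, rover to base-camp, and shuttle to rover.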
The rover is based on the hybrid wheeled-leg robot Sherpa; the shuttle is based on Asguard, a result of the project iMoby. As part of the research agenda, the technology readiness of Sherpa will be increased, along with the space readiness of individual subcomponents. The mission control center, the interface between human operator and exploration robot, consists on the one hand of an upper-body exoskeleton for controlling the systems, as developed in the project Capio, and on the other hand of modern visualization tools such as three-dimensional multi-projection screens and head-mounted displays (HMDs). The experience from the project IMMI will be used to optimize the control-center technology using psycho-physiological data such as EEG and eye tracking (see the video on operator support by embedded Brain Reading as developed in IMMI).
Technology transfer to terrestrial applications
The robotic technologies of all involved systems developed within the space exploration scenario, including their cooperation, the installation of a logistic chain, and a suitable human-machine interface, will be transferred to the terrestrial application domains of search and rescue, management of maritime resources, and rehabilitation. This demonstrates the exchangeability and mutual applicability of technologies from space and terrestrial robotics. In each of the application domains an individual scenario will be defined, demonstrating the transferability of technologies and systems.
Coyote III: Demonstrates Robotic Search and Rescue Scenario
Coyote III was initially built for space exploration tasks, but has shown its multi-purpose character in many different scenarios. The rover stands out with its high mobility and flexibility, allowing it to cope with all kinds of situations.
Besides space, Coyote III can also be deployed for search and rescue (SAR) tasks on Earth. Using its camera and laser scanner, the operator gets a clear overview of the surroundings and can operate the rover safely. Thanks to its modular system architecture, various sensor and payload modules can be attached to the rover. This allows it to assist rescue teams in all kinds of situations and to increase the safety of their work. Coyote III can even operate fully autonomously and explore extensive areas.
In addition to mapping and visual awareness, the detection and mapping of hazardous materials is an important part of SAR applications. To demonstrate these capabilities, a representative environmental sensor unit was designed and integrated into a modular payload item. The sensor module is equipped with several gas sensors as well as temperature and humidity sensors.
The environmental sensor package can detect gas contamination and help locate gas leaks. This can warn rescue forces about dangerous areas, for example those with high concentrations of sludge gas or carbon monoxide. During its traverse, the rover automatically generates a map of its surroundings and highlights the detected gas concentrations.
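The idea of annotating a traverse map with gas readings can be sketched with a simple grid structure. This is a minimal illustration, not the rover's actual software; the cell size and the carbon monoxide threshold are assumed example values.

```python
from collections import defaultdict

CELL_SIZE = 1.0       # metres per grid cell (assumed resolution)
CO_ALARM_PPM = 35.0   # assumed warning threshold for carbon monoxide


class GasMap:
    """Grid map that stores the highest gas reading observed in each cell."""

    def __init__(self):
        self.cells = defaultdict(float)  # (cell_x, cell_y) -> peak ppm

    def add_reading(self, x: float, y: float, ppm: float) -> None:
        """Record a sensor reading taken at rover position (x, y)."""
        cell = (int(x // CELL_SIZE), int(y // CELL_SIZE))
        self.cells[cell] = max(self.cells[cell], ppm)

    def hazardous_cells(self, threshold: float = CO_ALARM_PPM):
        """Cells that should be highlighted as dangerous for rescue forces."""
        return [c for c, ppm in self.cells.items() if ppm >= threshold]


# Example traverse: a low background reading and one near a simulated leak.
gmap = GasMap()
gmap.add_reading(0.4, 0.2, 5.0)
gmap.add_reading(3.7, 1.1, 80.0)
print(gmap.hazardous_cells())  # -> [(3, 1)]
```

Keeping only the peak value per cell is one simple design choice; a real system might instead average readings or model sensor noise explicitly.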
Coyote III: Assembly of Subsystems
The video presents the assembly of Coyote III, featuring its dedicated subsystems and their core properties. Coyote III is a micro rover with high mobility performance in unstructured terrains. Equipped with its own power source, on-board sensor suite, and on-board computer, it is able to perform exploration tasks autonomously. Moreover, the communication subsystem enables the rover to cooperate with other systems. Coyote III is equipped with two standardized electro-mechanical interfaces, allowing it to dock additional payload elements such as standardized payload items or a manipulator. Due to its lightweight and robust structural design, Coyote III can carry several kilograms of additional payload. The modular design approach allows the rover structure to be adapted to specific payload requirements.
TransTerrA: Coyote III Crater Trials
Coyote III masters an artificial lunar crater wall with 45° inclination.
TransTerrA: The robot Coyote III in the snow
Watch Coyote III driving through deep snow in rough terrain.
Capio Exoskeleton: Control via biosignals
Demonstration of the Capio exoskeleton being controlled via biosignals: the intended movement of the human operator is detected by biosignal data processing, which triggers the execution of the targeted movement by the exoskeleton. An eye tracker detects the desired interaction (focusing on a virtual bottle); electroencephalographic (EEG) signals determine the intended movement and the performing limb; and electromyographic (EMG) signals verify the intended movement.
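The fusion order described above, gaze selects the target, EEG supplies intent and limb, EMG verifies before anything is executed, can be sketched as a simple decision function. All names and signal classes here are hypothetical placeholders for whatever the real classifiers output.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class BiosignalFrame:
    """One synchronized snapshot of the three modalities (values illustrative)."""
    gaze_target: Optional[str]  # object currently fixated, from the eye tracker
    eeg_intent: Optional[str]   # movement class decoded from EEG, e.g. "reach"
    eeg_limb: Optional[str]     # limb predicted to perform it, e.g. "right arm"
    emg_active: bool            # EMG confirms muscle activity in that limb


def decide_command(frame: BiosignalFrame) -> Optional[str]:
    """Fuse the modalities in the order the text describes; any missing or
    unverified stage keeps the exoskeleton idle."""
    if frame.gaze_target is None:
        return None  # no object in focus: no interaction desired
    if frame.eeg_intent is None or frame.eeg_limb is None:
        return None  # no decoded movement intention yet
    if not frame.emg_active:
        return None  # intention not verified by EMG
    return f"{frame.eeg_intent} {frame.gaze_target} with {frame.eeg_limb}"


# Example: the operator fixates a virtual bottle and intends a verified reach.
cmd = decide_command(BiosignalFrame("virtual bottle", "reach", "right arm", True))
print(cmd)  # -> reach virtual bottle with right arm
```

Requiring all three stages to agree is a conservative choice: a single misclassified EEG epoch cannot trigger an unintended exoskeleton movement on its own.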
SherpaTT during outdoor runs
SherpaTT demonstrating its ability to keep its body level while driving through rough terrain.
Intrinsic interactive reinforcement learning: Using error-related potentials
Through negative feedback from the human operator, the robot learns from its own mistakes.
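The core idea, an error-related potential in the operator's EEG acting as an implicit negative reward, can be illustrated with a tiny value-learning loop. This is a didactic sketch under strong simplifications (one state, a perfect stand-in ERP classifier, made-up action names), not the method used in the project.

```python
ACTIONS = ["grasp", "release", "move_left", "move_right"]
ALPHA = 0.5        # learning rate (assumed)
NEG_REWARD = -1.0  # implicit punishment when an ErrP is detected

# Value estimate per action in a single illustrative state.
q = {a: 0.0 for a in ACTIONS}


def errp_detected(action: str) -> bool:
    """Stand-in for the EEG classifier that spots error-related potentials;
    here the operator implicitly disapproves of 'release'."""
    return action == "release"


def update(action: str) -> None:
    """One learning step: an ErrP yields negative reward, otherwise zero."""
    reward = NEG_REWARD if errp_detected(action) else 0.0
    q[action] += ALPHA * (reward - q[action])


# The robot tries each action repeatedly and learns from the implicit feedback.
for _ in range(5):
    for a in ACTIONS:
        update(a)

best = max(q, key=q.get)
print(best, round(q["release"], 5))  # -> grasp -0.96875
```

The disapproved action's value converges towards the negative reward while the others stay at zero, so the robot stops selecting it, which is the "learning from its own mistakes" behaviour the caption describes.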