Learning Intelligent Motions for Kinematically Complex Robots for Exploration in Space

Example morphology for a legged robot with manipulation capabilities (Source: Marc Manz, DFKI GmbH)

In the LIMES project, a highly mobile multi-legged walking robot is being developed that can straighten up its upper body in order to use the front extremities as manipulation devices. In future extraterrestrial missions, such a system will allow taking soil samples from difficult-to-access regions or assembling and maintaining infrastructure on the rough, unstructured surfaces of celestial bodies. Besides the mechatronic development of the robot, the project focuses on generating and optimizing different locomotion behaviors for traversing varying surface structures and subsoils with the aid of a simulation environment and machine learning methods.

Duration: 01.05.2012 to 30.04.2016
Grant recipient: DFKI GmbH & University of Bremen
Grant number: This project is funded by the Space Agency of the German Aerospace Center with federal funds of the Federal Ministry of Economics and Technology (BMWi) in accordance with the parliamentary resolution of the German Parliament, grant no. 50 RA 1218 (DFKI) and 50 RA 1219 (University of Bremen).
Partner: University of Bremen
Team: Team I - System Design
Team II - Hardware Architectures
Team V - Behavior Control & Simulation
Team VII - Sustained Interaction & Learning
Application Field: Space Robotics
Related Projects: BesMan
Behaviors for Mobile Manipulation (05.2012 - 07.2016)
A Semi-Autonomous Free-Climbing Robot for the Exploration of Crater Walls and Bottoms (07.2007 - 11.2010)
Reconfigurable Integrated Multi Robot Exploration System (09.2009 - 12.2012)
Intelligent Structures for Mobile Robots (05.2010 - 08.2013)
Virtual Crater
Development of a Virtual Simulation and Demonstration Environment for Planetary Exploration with a Focus on Extraterrestrial Craters (05.2009 - 08.2012)
Virtual State Prediction for Groups of Reactive Autonomous Robots (04.2011 - 06.2014)
Related Robots: MANTIS
Multi-legged Manipulation and Locomotion System
Related Software: MARS
Machina Arte Robotum Simulans
An add-on for Blender allowing editing and exporting of robots for the MARS simulation
reSPACE - Reconfigurable Signal Processing and Classification Environment
Behavior Optimization and Learning for Robots
Node Level Data Link Communication

Project details

The legged robot Mantis standing in an upright position to exploit its dual-arm manipulation capabilities. (Photo: Marc Manz, DFKI GmbH)
Mantis in a six-legged walking pose for stable locomotion in unstructured environments. (Photo: Marc Manz, DFKI GmbH)
Morphology and degrees of freedom of the robot Mantis. (Photo: Marc Manz, DFKI GmbH)
ZynqBrain: Central control unit of the robot, capable of controlling the entire locomotor system by utilizing an FPGA with dedicated logic to interface with all peripherals. The required IP cores and associated Linux drivers were generated using the reSPACE framework. (Source: DFKI GmbH)

Due to their large number of degrees of freedom, distributed over several extremities, walking robots are able to perform a multitude of different walking patterns and to adapt their posture to the surface structure in order to maneuver safely and efficiently on rough terrain. In addition, their manifold sensory equipment enables both visual and tactile perception of the environment, allowing them to gain information about the conditions of the substrate they are walking on. Based on this knowledge, the best locomotion behavior for the current situation can be selected from a set of previously optimized behaviors.
The flexible locomotor system furthermore makes it possible to use the legs for manipulating objects, provided they are equipped with appropriate gripping devices. Here, too, a multi-modal sensory infrastructure is essential for coping with such tasks.
In the project LIMES, a multi-legged robot will be developed that provides the mechatronic capabilities to manage the tasks described above. In parallel to the hardware development, precise simulation models of the robot's subsystems will be created so that the behavior of the overall system can be simulated with high accuracy.
Building on this, the virtual system can be used to generate and optimize different locomotion behaviors for various terrains with regard to diverse criteria (e.g., energy consumption, speed) by means of machine learning methods.
Once a behavior has reached sufficient quality, it can be transferred to the real system, where the optimization continues.
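The optimization loop described above can be sketched as a simple black-box search over gait parameters against a weighted cost. This is a minimal illustration only: the cost function below is a toy surrogate standing in for a real physics-simulation rollout, and the parameter names and bounds are assumptions, not the project's actual data.

```python
import random

# Hypothetical parameter bounds: step length [m], step frequency [Hz].
BOUNDS = [(0.05, 0.5), (0.5, 3.0)]

def cost(params, w_energy=0.5, w_speed=0.5):
    """Toy surrogate for a simulated rollout: weighted trade-off between
    energy consumption and forward speed (lower cost is better)."""
    step_length, step_freq = params
    speed = step_length * step_freq                    # simplistic speed model
    energy = 0.2 * step_freq ** 2 + 0.5 * step_length  # rough effort proxy
    return w_energy * energy - w_speed * speed

def optimize(initial, iterations=500, sigma=0.05, seed=0):
    """(1+1)-style stochastic hill climbing: perturb the parameters,
    keep the candidate only if it improves the cost."""
    rng = random.Random(seed)
    best, best_cost = list(initial), cost(initial)
    for _ in range(iterations):
        cand = [min(hi, max(lo, p + rng.gauss(0.0, sigma)))
                for p, (lo, hi) in zip(best, BOUNDS)]
        cand_cost = cost(cand)
        if cand_cost < best_cost:
            best, best_cost = cand, cand_cost
    return best, best_cost

params, final_cost = optimize([0.2, 1.0])
```

In practice, each cost evaluation would be an entire simulated locomotion episode, which is why the simulation models must be accurate enough for the result to transfer to the real robot.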
Afterwards, the learned behaviors are stored in a "behavior library", from which the system can select and activate the behavior best suited to its current situation.
In addition, already learned behaviors can serve as starting points for further learning procedures that generate new behaviors for other terrains or optimization criteria.
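The behavior-library idea, selection by terrain class and warm-starting new learning from an existing behavior, can be sketched as follows. Class, behavior, and terrain names are purely illustrative assumptions, not the project's actual interfaces.

```python
class BehaviorLibrary:
    """Sketch of a library of optimized locomotion behaviors,
    keyed by a terrain classification."""

    def __init__(self):
        self._behaviors = {}  # terrain class -> (behavior name, parameters)

    def store(self, terrain, name, params):
        self._behaviors[terrain] = (name, params)

    def select(self, terrain):
        """Return the behavior previously optimized for this terrain class."""
        return self._behaviors[terrain]

    def warm_start(self, similar_terrain):
        """Seed learning for a new terrain with a copy of the parameters
        learned on a similar one, so that later optimization does not
        overwrite the stored original."""
        _, params = self._behaviors[similar_terrain]
        return list(params)

lib = BehaviorLibrary()
lib.store("flat_regolith", "tripod_gait", [0.30, 1.5])
lib.store("crater_slope", "wave_gait", [0.15, 0.8])

name, params = lib.select("crater_slope")   # pick for the current situation
seed = lib.warm_start("crater_slope")       # starting point for new learning
```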
Moreover, the simulation makes it possible to analyze the learned locomotion behaviors under low gravity, such as on the Moon, and to optimize them for such conditions.
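Why re-optimization under reduced gravity matters can be illustrated with basic ballistic kinematics: for the same takeoff velocity, the flight phase of a hop or leap lasts roughly six times longer under lunar gravity, so step timing and footfall placement change substantially. The velocities below are illustrative values, not measured robot data.

```python
G_EARTH = 9.81  # m/s^2
G_MOON = 1.62   # m/s^2, lunar surface gravity

def flight_phase(v_vertical, v_horizontal, g):
    """Ballistic flight of a hop: time aloft and horizontal distance
    covered for given takeoff velocities and gravitational acceleration."""
    t = 2.0 * v_vertical / g  # time until touchdown at takeoff height
    return t, v_horizontal * t

t_earth, d_earth = flight_phase(0.5, 0.4, G_EARTH)
t_moon, d_moon = flight_phase(0.5, 0.4, G_MOON)
# Time aloft and distance both scale with 1/g, i.e. by a factor of
# G_EARTH / G_MOON (about 6) on the Moon.
```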



Animation of Mantis climbing in a crater environment to maintain infrastructure.

LIMES – Learning Intelligent Motions

Animation of Mantis performing an extravehicular activity (EVA) outside a lunar habitat for humans.






last updated 15.01.2018