A Versatile Stair-Climbing Robot for Search and Rescue Applications
Markus Eich, Felix Grimminger, Frank Kirchner
In Proceedings of the 2008 IEEE International Workshop on Safety, Security and Rescue Robotics (SSRR-2008), October 21-24, 2008, Sendai, Japan, IEEE, pages 35-40, October 2008. ISBN: 978-1-4244-2032-2.

Abstract:

For disaster mitigation as well as for urban search and rescue (USAR) missions, it is often necessary to place sensors or cameras in dangerous or inaccessible areas to give rescue personnel better situational awareness before they enter a potentially dangerous area. Robots are well suited for this task, but the requirements for such mobile systems are demanding. They should be quick and agile and, at the same time, be able to deal with rough terrain and even climb stairs. The latter is required whenever rescue personnel need access to the upper floors of a building. A rugged, waterproof and dustproof body and, if possible, the ability to swim are only a few of the many requirements for such robots. With these requirements in mind, the hybrid legged-wheeled robot ASGUARD was developed. The robot can cope with stairs and very rough terrain, and it can move fast on flat ground. We describe a versatile adaptive controller based solely on proprioceptive data. Additional inclination feedback makes the controller suitable for flat ground as well as for steep slopes and stairs. Fitted with an attachable float, the robot can also swim using the same locomotion approach. By using twenty compliant legs, mounted around four individually rotating hip shafts, we employ an abstract model of quadruped locomotion. The control design uses four independent pattern generators. In contrast to many other hybrid legged-wheeled robots, we use direct proprioceptive feedback to modify the internal control loop, thereby adapting the model of the motion pattern. For difficult terrain, such as slopes and stairs, we use a phase-adaptive approach that draws directly on the proprioceptive data from the legs.
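The abstract outlines a controller built from four independent pattern generators, one per hip shaft, whose phase is adapted by proprioceptive torque feedback and by an inclination signal. The Python sketch below illustrates one plausible reading of that scheme as simple phase oscillators. The class name, the parameters torque_threshold and slope_gain, and the specific adaptation law are illustrative assumptions and are not taken from the paper.

    import math

    class PatternGenerator:
        """One phase oscillator driving a single rotating hip shaft.

        A minimal sketch of one of the four independent pattern
        generators mentioned in the abstract; the adaptation law
        below is an assumption, not the published controller.
        """

        def __init__(self, base_frequency_hz=1.0):
            self.phase = 0.0                      # shaft phase in [0, 2*pi)
            self.base_frequency_hz = base_frequency_hz

        def step(self, dt, leg_torque, inclination_rad,
                 torque_threshold=2.0, slope_gain=0.5):
            # Inclination feedback (assumed form): slow the nominal
            # rotation on steep slopes and stairs so the compliant
            # legs can engage each step.
            freq = self.base_frequency_hz / (1.0 + slope_gain * abs(inclination_rad))

            # Phase-adaptive term (assumed form): when proprioceptive
            # torque indicates the legs are heavily loaded, e.g. when
            # pushing against a stair edge, hold the phase back rather
            # than forcing the nominal pattern.
            if leg_torque > torque_threshold:
                freq *= 0.5

            self.phase = (self.phase + 2.0 * math.pi * freq * dt) % (2.0 * math.pi)
            return self.phase

    # One generator per hip shaft, matching the four-shaft layout
    # described in the abstract.
    generators = [PatternGenerator() for _ in range(4)]
    for gen in generators:
        gen.step(dt=0.01, leg_torque=0.0, inclination_rad=0.0)

Keeping the four oscillators independent, rather than coupling them into a fixed gait, is consistent with the abstract's emphasis on letting direct proprioceptive feedback modify the internal control loop per leg.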

