Embodied Data for Localisation and Mapping

Robot localisation and mapping is currently dominated by optical sensors combined with probabilistic sensor-fusion methods. While this approach works well in structured environments, especially indoors, it does not always yield the desired results in unstructured outdoor environments and places high demands on sensor accuracy, density, and data processing. Information from the robot's own body has the potential to augment visual information, improving localisation and mapping accuracy or reducing the requirements on the sensors. This presentation gives an overview of the subject, followed by results for localisation and mapping on a leg-wheel hybrid robot. Further, some work in progress on visual SLAM is presented, including an outlook on the integration of the two modalities.
Venue: Seminarraum 117, Robert-Hooke-Str. 5, Bremen
As a rule, the talks are part of lecture series at the University of Bremen and are not open to the public. If you are interested in attending, please contact the secretariat at sek-ric(at)dfki.de.