Talk Details

PhD Talk: Learning Coordination and Reactivity of Movement Primitives from Demonstration

Learning from human demonstrations is a promising way to teach complex motor systems novel skills in an intuitive manner without requiring robotics experts. This will be important for future robots cooperating with us in dynamic environments originally designed for human use, where they will have to learn and perform a wide range of challenging tasks.

While humans naturally use both arms either to speed up sequential single-arm tasks or to enable typical dual-arm tasks in the first place, humanoid robots are not yet capable of learning the temporal and spatial coordination of the two arms from human demonstrations to solve complex dual-arm manipulation tasks autonomously. Instead, current approaches to dual-arm manipulation mainly originate from the robot control domain and typically require a robotics expert to program behaviors with well-modeled controllers and hand-designed task constraints to solve a set of pre-defined tasks. State-of-the-art methods in learning from demonstration, however, focus on teaching single-arm tasks, while learning of coordinated dual-arm manipulation tasks remains a little-researched topic.

In addition, research in learning from demonstration has mostly investigated the representation, learning, and generalization of the demonstrated trajectories, while the integration of sensory information into movement primitives, i.e., a close coupling of action and perception, remains little studied. Since human demonstrations already provide the sensory information needed to solve challenging manipulation tasks that involve interaction with the environment, closing the action-perception loop would allow imitation learning to produce reactive behaviors that adapt movement primitives to the inferred requirements of the demonstrated task. This is also beneficial for bimanual tasks where, for example, forces must be exerted on an object.

The proposed dissertation aims, first, to develop a framework for learning the temporal and spatial coordination of movement primitives for dual-arm tasks from human demonstrations. Second, novel imitation learning methods will be developed that learn movement primitives integrating both the trajectories and the relevant sensory information.
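To make the idea of temporally coordinated, reactive movement primitives concrete, the following minimal Python sketch (an illustration under common assumptions, not the method presented in the talk) couples two discrete dynamic movement primitives through one shared canonical phase variable and lets a sensed force error slow that phase down. All class names, gains, and the force-error placeholder are hypothetical.

    # Minimal sketch (hypothetical, illustrative only): two discrete dynamic
    # movement primitives (DMPs) whose transformation systems share a single
    # canonical phase, keeping the two arms temporally coordinated, plus a
    # simple feedback term that stalls the phase when a sensed error is large.
    import numpy as np

    class SharedPhase:
        """Canonical system x' = -alpha_x * x / tau, shared by both arms."""
        def __init__(self, alpha_x=4.0, tau=1.0):
            self.alpha_x, self.tau, self.x = alpha_x, tau, 1.0

        def step(self, dt, slowdown=0.0):
            # A large sensory error (slowdown > 0) stalls the phase and with
            # it both arms, preserving their spatial coordination.
            self.x += (-self.alpha_x * self.x / (self.tau * (1.0 + slowdown))) * dt
            return self.x

    class ArmDMP:
        """Transformation system: tau*z' = alpha_z(beta_z(g - y) - z) + f(x)."""
        def __init__(self, y0, g, weights, centers, widths,
                     alpha_z=25.0, beta_z=6.25, tau=1.0):
            self.y, self.z, self.y0, self.g = y0, 0.0, y0, g
            self.w, self.c, self.h = weights, centers, widths
            self.alpha_z, self.beta_z, self.tau = alpha_z, beta_z, tau

        def forcing(self, x):
            psi = np.exp(-self.h * (x - self.c) ** 2)
            return (psi @ self.w) / (psi.sum() + 1e-10) * x * (self.g - self.y0)

        def step(self, x, dt):
            dz = (self.alpha_z * (self.beta_z * (self.g - self.y) - self.z)
                  + self.forcing(x)) / self.tau
            self.z += dz * dt
            self.y += self.z / self.tau * dt
            return self.y

    # Usage: both arms are driven by the same phase; a force deviation sensed
    # at the end effector (here a placeholder) feeds back into that phase.
    phase = SharedPhase()
    left = ArmDMP(y0=0.0, g=0.5, weights=np.zeros(10),
                  centers=np.linspace(1, 0, 10), widths=np.full(10, 50.0))
    right = ArmDMP(y0=0.0, g=-0.5, weights=np.zeros(10),
                   centers=np.linspace(1, 0, 10), widths=np.full(10, 50.0))
    dt = 0.01
    for _ in range(100):
        force_error = 0.0            # placeholder for a sensed force deviation
        x = phase.step(dt, slowdown=5.0 * abs(force_error))
        y_left, y_right = left.step(x, dt), right.step(x, dt)

In such a sketch, the weights of the forcing terms would be fitted to the demonstrated trajectories, while the feedback coupling is what would be learned from the sensory traces of the demonstrations.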

As a result, more meaningful motion representations could be provided for robot control, and the extensive task-specific manual programming effort for new tasks could be reduced.

Venue

Room A 1.03, Robert-Hooke-Str. 1, Bremen

As a rule, the talks are part of lecture series of the University of Bremen and are not open to the public. If you are interested in attending, please contact the secretariat at sek-ric(at)dfki.de.
