This thesis investigates the state estimation of a multi-joint object such as an excavator by predicting its joint angles from synthetic RGB images. The aim is to use an RGB camera as an economical alternative to expensive retrofitted sensors for the state estimation of a manipulator. Determining the state of the joints from images has long been a challenge in computer vision and robotics, and estimating the joint angles is a prerequisite for automating manipulators when a camera is the only input source. Because real images of the excavator are not readily available, the thesis relies on synthetically generated RGB images of the excavator (manipulator) rendered with varied backgrounds and camera angles. Two deep learning approaches are investigated: keypoint estimation and segmentation mask prediction. Both approaches are trained and evaluated, and their performance in predicting the excavator's joint angles is compared for the given scenario.
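To illustrate the keypoint-based approach at a high level, the sketch below shows how planar joint angles could be derived from 2D pivot keypoints once a keypoint-estimation network has predicted them. This is a minimal illustration only, not the thesis implementation: the keypoint names, the kinematic chain ordering, and the example coordinates are hypothetical assumptions.

```python
# Minimal sketch (not the thesis implementation): deriving planar joint angles
# from 2D keypoints that a keypoint-estimation network is assumed to output
# for the excavator's pivot points. Keypoint names, chain order, and the
# example coordinates are hypothetical.
import numpy as np

# Hypothetical pivot keypoints in image coordinates (x, y), e.g. as predicted
# by a heatmap-based keypoint network for one synthetic RGB frame.
keypoints = {
    "cab_pivot":    np.array([120.0, 300.0]),
    "boom_joint":   np.array([220.0, 180.0]),
    "arm_joint":    np.array([340.0, 210.0]),
    "bucket_joint": np.array([400.0, 290.0]),
}

def link_angle(p_from: np.ndarray, p_to: np.ndarray) -> float:
    """Absolute orientation of the link p_from -> p_to in radians."""
    d = p_to - p_from
    return float(np.arctan2(d[1], d[0]))

def relative_joint_angles(kp: dict) -> dict:
    """Relative angle at each joint = difference of adjacent link orientations."""
    chain = ["cab_pivot", "boom_joint", "arm_joint", "bucket_joint"]
    link_angles = np.array(
        [link_angle(kp[a], kp[b]) for a, b in zip(chain[:-1], chain[1:])]
    )
    # Wrap differences to (-pi, pi] so angles stay comparable across frames.
    rel = np.diff(link_angles)
    rel = (rel + np.pi) % (2 * np.pi) - np.pi
    return {"boom_to_arm": float(rel[0]), "arm_to_bucket": float(rel[1])}

print(relative_joint_angles(keypoints))
```

A segmentation-based variant would instead predict per-link masks and recover link orientations from the mask geometry before applying the same angle-difference step.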
Robot state estimation using synthetic RGB images
As a rule, the talks are part of lecture series at the University of Bremen and are not publicly accessible. If you are interested, please contact the secretariat at sek-ric(at)dfki.de.