Talk details

In-hand object localisation using multi-sensor fusion

One focus in robotics and automation engineering is the development of autonomous robot manipulation, and a key challenge within it is the execution of a robust grasp. Since the gripper's mechanics never execute a grasp as perfectly as in simulation, an in-hand localization is necessary before the grasped object can be manipulated further. Making robot systems more reliable at grasping will widen their application area, which is why in-hand localization of grasped objects has attracted considerable interest.
Over the years, different approaches have been used for in-hand localization: the earliest rely on a vision system, others on force or force-torque sensors, and more recently tactile sensors have been employed for contact sensing. Used individually, each of these sensors has limited accuracy and applicability. For instance, a vision system is highly affected by the lighting in the working environment, and the robot arm itself can obscure the camera's view, leading to faulty results. A robot system that can detect the object's position and orientation without depending entirely on one sensor is therefore preferable. This thesis presents a method that fuses vision, force/torque, and tactile sensing for in-hand object localization. The fusion increases accuracy and provides redundancy if one of the sensors fails.
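To make the per-sensor output concrete, the following is a minimal Python sketch of the kind of pose hypothesis each sensor could emit. The type and field names are illustrative assumptions, not the thesis's actual data structures.

    from dataclasses import dataclass
    import numpy as np

    @dataclass
    class PoseHypothesis:
        # One sensor's estimate of the grasped object's pose in the
        # world frame; fields are illustrative, not from the thesis.
        source: str              # "vision", "force_torque", or "tactile"
        position: np.ndarray     # (3,) translation in metres
        orientation: np.ndarray  # (4,) unit quaternion (x, y, z, w)
        confidence: float        # in [0, 1]; lets the estimator
                                 # down-weight or skip a failed sensor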
The thesis is divided into two parts. In the first part, each sensor's output is analyzed independently and a pose hypothesis of the grasped object is generated. The hypotheses are then dispatched to the in-hand estimator, which estimates the most plausible object position and orientation with respect to the world coordinate frame. In particular, the position of the grasped object is estimated from the vision and force/torque hypotheses, while the orientation is estimated from the vision and tactile hypotheses.
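As a rough illustration of this split, the sketch below fuses the vision and force/torque position hypotheses by inverse-variance weighting and blends the vision and tactile orientation hypotheses by normalized quaternion interpolation. The weighting scheme and noise model are assumptions made for illustration only, not the estimator developed in the thesis.

    import numpy as np

    def fuse_positions(p_vision, var_vision, p_ft, var_ft):
        # Inverse-variance weighted mean of the vision and force/torque
        # position hypotheses (isotropic noise assumed for simplicity).
        w_v, w_f = 1.0 / var_vision, 1.0 / var_ft
        return (w_v * np.asarray(p_vision) + w_f * np.asarray(p_ft)) / (w_v + w_f)

    def fuse_orientations(q_vision, q_tactile, w_tactile=0.5):
        # Normalized linear interpolation between the vision and tactile
        # orientation hypotheses (unit quaternions, x-y-z-w convention).
        q_v = np.asarray(q_vision, float)
        q_t = np.asarray(q_tactile, float)
        if np.dot(q_v, q_t) < 0.0:  # keep both in the same hemisphere
            q_t = -q_t
        q = (1.0 - w_tactile) * q_v + w_tactile * q_t
        return q / np.linalg.norm(q)

    # Hypothetical readings: positions in metres, orientations as quaternions.
    position = fuse_positions([0.40, 0.02, 0.31], 1e-4, [0.41, 0.01, 0.30], 4e-4)
    orientation = fuse_orientations([0.0, 0.0, 0.0, 1.0], [0.0, 0.0, 0.09, 0.996])
    print(position, orientation)

The hemisphere check before interpolation is needed because a quaternion and its negation encode the same rotation; blending across hemispheres would otherwise pass through an unrelated orientation.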
The underlying work in this thesis can be extended to multi-finger hands, such as the RH5 hand, and can further be used to enable in-hand manipulation with such hands.
The hardware needed for the experiments in this thesis is a UR5 Universal Robots arm with 6 degrees of freedom, equipped with a 2-finger adaptive ROBOTIQ gripper, a ROBOTIQ FT 300 force-torque sensor, a DFKI tactile sensor, and an Xbox 360 Kinect camera.

Venue

Seminar room 117, Robert-Hooke-Str. 5, Bremen

As a rule, the talks are part of lecture series at the University of Bremen and are not open to the public. If you are interested, please contact the secretariat at sek-ric(at)dfki.de.
