Evaluation of metrics for human-like grips

Human hands effortlessly handle a diverse range of objects, adjusting grasp strength, finger placement, and overall hand configuration. In contrast, achieving a comparable level of dexterity in robot hands demands a precise understanding not only of the physical mechanics involved but also of how to interpret and respond to a multitude of sensory inputs in real time. Such an understanding can come from an analysis based on metrics that quantify grasp quality from that sensory input. In this work, the human grasping process is analyzed to determine which grasp metrics are optimized during grasping. Constrained by the limited information in monocular RGB images of hands grasping YCB objects, we opted for a minimalist approach: using the kypt transformer neural network, we estimated hand poses and translations, addressing the reduced data scope of RGB images compared to motion capture data.
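The abstract does not spell out how the estimated parameters are turned into geometry, so the following is only a minimal sketch of that step: a hypothetical `estimate_mano_params` wrapper stands in for the kypt transformer, and the hand mesh is recovered via the manopth `ManoLayer` interface (an assumption; the actual pipeline may use a different MANO implementation).

```python
# Minimal sketch: monocular RGB frame -> MANO pose/translation -> hand surface mesh.
# `estimate_mano_params` is a hypothetical wrapper around a pretrained
# keypoint-transformer-style estimator; the ManoLayer usage assumes the
# manopth package (https://github.com/hassony2/manopth).
import torch
from manopth.manolayer import ManoLayer


def hand_mesh_from_rgb(image, estimate_mano_params, mano_root="mano/models"):
    # Hypothetical estimator: 48-dim axis-angle pose, 10-dim shape, and a
    # 3-dim global translation in metres for the right hand.
    pose, shape, trans = estimate_mano_params(image)

    mano = ManoLayer(mano_root=mano_root, use_pca=False, flat_hand_mean=True)
    verts, joints = mano(pose.unsqueeze(0), shape.unsqueeze(0))

    # manopth typically reports vertices in millimetres; convert to metres
    # and apply the estimated global translation (adjust if your version differs).
    verts_m = verts[0] / 1000.0 + trans
    return verts_m, mano.th_faces  # (778, 3) vertices, (1538, 3) triangle indices
```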
Seven metrics were systematically evaluated under these constraints to identify which grasp aspects are optimized in this minimal setup. To determine the forces arising during grasping, we applied the MANO model for the hands and used preexisting surface meshes for the objects. Specifically, volume meshes within the hydroelastic contact model enable the estimation of forces and torques by analyzing the intersections between the interacting hand and object meshes. These estimated forces were then subjected to the metric analysis to identify which parameters are optimized during the human grasping process. Additionally, we introduce a new metric, the volume of grasp polygon (VGP), which assesses the volumetric share of a grasped object that is covered by the grasp.
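Since the abstract only describes VGP qualitatively, here is a small sketch of one plausible formulation: the convex-hull volume of the hand-object contact points expressed as a share of the object mesh volume. Function and variable names are illustrative and not taken from the work.

```python
# Illustrative sketch of a volume-of-grasp-polygon (VGP) style metric,
# assuming it is the volume enclosed by the contact points (their convex
# hull) divided by the object's volume.  The exact definition used in the
# talk may differ.
import numpy as np
import trimesh
from scipy.spatial import ConvexHull


def volume_of_grasp_polygon(contact_points: np.ndarray, object_mesh: trimesh.Trimesh) -> float:
    """contact_points: (N, 3) hand-object contact locations, N >= 4 and not coplanar."""
    hull = ConvexHull(contact_points)         # convex hull spanned by the grasp
    return hull.volume / object_mesh.volume   # volumetric share of the object covered


# Hypothetical usage with a YCB object mesh and contact points extracted
# from the hydroelastic contact analysis:
# obj = trimesh.load("ycb/003_cracker_box/google_16k/nontextured.ply")
# vgp = volume_of_grasp_polygon(contact_points, obj)
```

Under this reading, a value near 1 would indicate a grasp that envelops most of the object's volume, while small values correspond to fingertip-style grasps that cover only a thin slice of it.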
As a rule, the talks are part of lecture series at the University of Bremen and are not publicly accessible. If you are interested, please contact the secretariat at sek-ric(at)dfki.de.