Deep Hand
Deep sensing and deep learning for myocontrol of the upper limb
Within the project Deep Hand, self-powered hand prostheses are developed with the aim of helping amputees regain some of the lost upper-limb functionality. These prostheses should ideally be operated using the activity of the remaining muscles. To achieve this, the Robotics Research Group of the University of Bremen is developing an A-mode ultrasound scanning system for the detection of deep muscle activity. The acquired ultrasound data will be fused with data from surface electromyography (sEMG), tactile sensing, strain sensing and electrical impedance tomography (EIT). Deep learning methods are then used to improve the reliability of prosthesis control.
Duration: 16.03.2020 to 15.03.2022
Grantee: University of Bremen
Sponsor: DFG German Research Foundation
Grant number: DFG Project Number 272314643
Partners: DLR German Aerospace Center, University of Bielefeld (CITEC), University of Siegen
Application field: Assistance and Rehabilitation Systems
Project details
Presently, most robotic prosthetic hands are controlled using surface EMG. These signals vary with environmental and bodily conditions, which can be disruptive and lead to unreliable control.
To read the amputee's intent more precisely, the Deep Hand project combines several sensing approaches: surface muscle activity is detected with sEMG and tactile sensors, while deep muscle activity is detected with EIT and A-mode ultrasound scanning.
The Robotics Research Group of the University of Bremen will focus on the development of a wearable system that detects deep muscle activity using a set of A-mode ultrasound sensors.
While B-mode scanning has proven effective for monitoring muscle activity, its hardware is bulky; thanks to its compact size, the A-mode ultrasound sensor is better suited for a wearable system. The data from the different sensing technologies will be fused and analyzed with machine learning approaches, as sketched below. As a result, the amputee's intent can be read more precisely and the corresponding control of the robotic prosthesis becomes more reliable.
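The project description does not specify the fusion architecture, so the following Python sketch is only a rough illustration of one common scheme: per-window features (e.g. the RMS of each sEMG channel and segment-wise echo amplitudes of each A-mode ultrasound scan line) are concatenated into a single feature vector and fed to a small neural-network classifier that predicts the intended hand gesture. The array shapes, feature choices, synthetic data and use of scikit-learn are illustrative assumptions, not the project's actual implementation.

```python
# Illustrative sketch only: feature-level fusion of several sensing modalities
# for gesture-intent classification. Shapes, features and the classifier are
# assumptions for demonstration and do not reflect the Deep Hand implementation.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Synthetic stand-ins for windowed sensor data (one row per time window).
n_windows = 600
semg = rng.normal(size=(n_windows, 8, 200))    # 8 sEMG channels, 200 samples/window
aus = rng.normal(size=(n_windows, 4, 1000))    # 4 A-mode ultrasound echo lines
labels = rng.integers(0, 5, size=n_windows)    # 5 hypothetical hand gestures

def semg_features(x):
    """Root-mean-square amplitude per sEMG channel (a standard EMG feature)."""
    return np.sqrt((x ** 2).mean(axis=-1))

def ultrasound_features(x, n_segments=10):
    """Mean echo amplitude over depth segments of each A-mode scan line."""
    segments = x.reshape(x.shape[0], x.shape[1], n_segments, -1)
    return np.abs(segments).mean(axis=-1).reshape(x.shape[0], -1)

# Feature-level fusion: concatenate per-modality features into one vector.
features = np.hstack([semg_features(semg), ultrasound_features(aus)])

X_train, X_test, y_train, y_test = train_test_split(
    features, labels, test_size=0.25, random_state=0)

scaler = StandardScaler().fit(X_train)
clf = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500, random_state=0)
clf.fit(scaler.transform(X_train), y_train)
print("held-out accuracy:", clf.score(scaler.transform(X_test), y_test))
```

With random synthetic data the accuracy will of course be near chance; the point of the sketch is only the structure: per-modality feature extraction, fusion into a joint representation, and a learned classifier that maps it to the intended movement.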