Transfer Learning Enhanced Cross-Subject Hand Gesture Recognition with sEMG

Authors: Zhang, Shenyilang; Fang, Yinfeng*; Wan, Jiacheng; Jiang, Guozhang; Li, Gongfa
Source: Journal of Medical and Biological Engineering, 2023, 43(6): 672-688.
DOI: 10.1007/s40846-023-00837-5

Abstract

Purpose: This study explores the emerging field of human physical action classification within human-machine interaction (HMI), with potential applications in assisting individuals with disabilities and in robotics. The research focuses on addressing the challenges posed by diverse sEMG signals, aiming for improved cross-subject hand gesture recognition.

Methods: The proposed approach utilizes deep transfer learning technology, employing multi-feature images (MFI) generated through grayscale conversion and RGB mapping of numerical feature matrices. These MFIs are fed as input into a fine-tuned AlexNet model. Two databases, ISRMyo-I and Ninapro DB1, are employed for experimentation. Rigorous testing is conducted to identify optimal parameters and feature combinations, and data augmentation techniques are applied to double the MFI dataset. Cross-subject experiments encompass six wrist gestures from Ninapro DB1 and thirteen gestures from ISRMyo-I.

Results: The study demonstrates substantial performance enhancements. On Ninapro DB1, the mean accuracy reaches 86.16%, a 13.25% improvement over the best-performing traditional decoding method. Similarly, on ISRMyo-I, a mean accuracy of 70.41% is attained, a 7.4% increase over traditional methods.

Conclusion: This research establishes a robust framework capable of mitigating cross-user differences in sEMG-based hand gesture recognition. By employing deep transfer learning techniques and multi-feature image processing, the study significantly enhances the accuracy of cross-subject hand gesture recognition. This advancement holds promise for enriching human-machine interaction and extending the practical applications of this technology in assisting disabled individuals and in robotics.
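The multi-feature image construction described in the Methods can be illustrated with a minimal sketch: compute several time-domain sEMG features per channel and window, scale each feature matrix to 0-255 grayscale, and stack three of them as the RGB planes of an image. The specific features (MAV, RMS, waveform length) and the channel-by-window layout are illustrative assumptions, not the paper's exact recipe.

```python
# Hypothetical sketch of multi-feature image (MFI) generation from sEMG.
# Feature choices and layout are assumptions for illustration only.
import numpy as np

def mav(x):
    """Mean absolute value over the sample axis."""
    return np.mean(np.abs(x), axis=-1)

def rms(x):
    """Root mean square over the sample axis."""
    return np.sqrt(np.mean(x ** 2, axis=-1))

def wl(x):
    """Waveform length: summed absolute first differences."""
    return np.sum(np.abs(np.diff(x, axis=-1)), axis=-1)

def to_gray(m):
    """Scale a feature matrix to the 0-255 grayscale range."""
    m = m - m.min()
    rng = m.max() if m.max() > 0 else 1.0
    return np.uint8(255 * m / rng)

def multi_feature_image(emg):
    """emg: (channels, windows, samples) -> (channels, windows, 3) uint8 RGB image,
    one feature per color plane."""
    planes = [to_gray(f(emg)) for f in (mav, rms, wl)]
    return np.stack(planes, axis=-1)

# Example: 10 sEMG channels, 6 analysis windows of 200 samples each
emg = np.random.randn(10, 6, 200)
img = multi_feature_image(emg)
print(img.shape)  # (10, 6, 3)
```

An image of this form can then be resized to the 227x227 input expected by AlexNet for fine-tuning.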

Full Text