TY - JOUR
T1 - A Deep Learning Sequential Decoder for Transient High-Density Electromyography in Hand Gesture Recognition Using Subject-Embedded Transfer Learning
AU - Azar, Golara Ahmadi
AU - Hu, Qin
AU - Emami, Melika
AU - Fletcher, Alyson
AU - Rangan, Sundeep
AU - Atashzar, S. Farokh
N1 - Publisher Copyright:
© 2024 IEEE.
PY - 2024/5/1
Y1 - 2024/5/1
N2 - Hand gesture recognition (HGR) has gained significant attention due to the increasing use of AI-powered human-computer interfaces (HCIs) that can interpret the deep spatiotemporal dynamics of biosignals from the peripheral nervous system, such as surface electromyography (sEMG). These interfaces have a range of applications, including the control of extended reality, agile prosthetics, and exoskeletons. However, the natural variability of sEMG among individuals has led researchers to focus on subject-specific solutions. Deep learning methods, which often have complex structures, are particularly data-hungry and can be time-consuming to train, making them less practical for subject-specific applications. The main contribution of this article is to propose and develop a generalizable, sequential decoder of transient high-density sEMG (HD-sEMG) that achieves 73% average accuracy on 65 gestures for partially observed subjects through subject-embedded transfer learning (TL), leveraging pre-knowledge of HGR acquired during pretraining. The use of transient HD-sEMG before gesture stabilization allows gestures to be predicted early, with the ultimate goal of counterbalancing system control delays. The results show that the proposed generalized models significantly outperform subject-specific approaches, especially when training data are limited and the number of gesture classes is large. By building on pre-knowledge and incorporating a multiplicative subject-embedded structure, our method achieves a more than 13% improvement in average accuracy across partially observed subjects with minimal data availability. This work highlights the potential of HD-sEMG and demonstrates the benefits of modeling patterns common across users, reducing the amount of data needed for new users and enhancing practicality.
AB - Hand gesture recognition (HGR) has gained significant attention due to the increasing use of AI-powered human-computer interfaces (HCIs) that can interpret the deep spatiotemporal dynamics of biosignals from the peripheral nervous system, such as surface electromyography (sEMG). These interfaces have a range of applications, including the control of extended reality, agile prosthetics, and exoskeletons. However, the natural variability of sEMG among individuals has led researchers to focus on subject-specific solutions. Deep learning methods, which often have complex structures, are particularly data-hungry and can be time-consuming to train, making them less practical for subject-specific applications. The main contribution of this article is to propose and develop a generalizable, sequential decoder of transient high-density sEMG (HD-sEMG) that achieves 73% average accuracy on 65 gestures for partially observed subjects through subject-embedded transfer learning (TL), leveraging pre-knowledge of HGR acquired during pretraining. The use of transient HD-sEMG before gesture stabilization allows gestures to be predicted early, with the ultimate goal of counterbalancing system control delays. The results show that the proposed generalized models significantly outperform subject-specific approaches, especially when training data are limited and the number of gesture classes is large. By building on pre-knowledge and incorporating a multiplicative subject-embedded structure, our method achieves a more than 13% improvement in average accuracy across partially observed subjects with minimal data availability. This work highlights the potential of HD-sEMG and demonstrates the benefits of modeling patterns common across users, reducing the amount of data needed for new users and enhancing practicality.
KW - Gesture recognition
KW - high-density EMG
KW - human-computer interface (HCI)
KW - transfer learning (TL)
UR - http://www.scopus.com/inward/record.url?scp=85188527713&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85188527713&partnerID=8YFLogxK
U2 - 10.1109/JSEN.2024.3377247
DO - 10.1109/JSEN.2024.3377247
M3 - Article
AN - SCOPUS:85188527713
SN - 1530-437X
VL - 24
SP - 14778
EP - 14791
JO - IEEE Sensors Journal
JF - IEEE Sensors Journal
IS - 9
ER -