TY - JOUR
T1 - Image Transformation and CNNs: A Strategy for Encoding Human Locomotor Intent for Autonomous Wearable Robots
AU - Lee, Ung Hee
AU - Bi, Justin
AU - Patel, Rishi
AU - Fouhey, David
AU - Rouse, Elliott
N1 - Funding Information:
Manuscript received February 24, 2020; accepted June 16, 2020. Date of publication July 7, 2020; date of current version July 17, 2020. This letter was recommended for publication by Associate Editor L. De Michieli and Editor P. Valdastri upon evaluation of the reviewers’ comments. This work was supported by the D. Dan and Betty Kahn Foundation. (Corresponding author: Elliott Rouse.) Ung Hee Lee and Elliott Rouse are with the Department of Mechanical Engineering and the Robotics Institute, University of Michigan, Ann Arbor, MI 48109 USA, and also with the Neurobionics Lab, University of Michigan, Ann Arbor, MI 48109 USA (e-mail: unghee@umich.edu; ejrouse@umich.edu).
Publisher Copyright:
© 2020 IEEE.
PY - 2020/10
Y1 - 2020/10
AB - Wearable robots have the potential to improve the lives of countless individuals; however, challenges associated with controlling these systems must be addressed before they can reach their full potential. Modern control strategies for wearable robots are predicated on activity-specific implementations, and testing is usually limited to a single, fixed activity within the laboratory (e.g., level-ground walking). To accommodate various activities in real-world scenarios, control strategies must include the ability to safely and seamlessly transition between activity-specific controllers. One potential solution to this challenge is to infer the wearer's intent using pattern recognition of locomotion sensor data. To this end, we developed an intent recognition framework that implements convolutional neural networks with image encoding (i.e., spectrograms) to predict the wearer's locomotor activity for the upcoming step. In this letter, we describe our intent recognition system, which comprises a mel-spectrogram encoding and a subsequent neural network architecture. In addition, we analyzed the effect of sensor locations and modalities on the recognition system and compared our proposed system to state-of-the-art locomotor intent recognition strategies. We attained high classification performance (error rate: 1.1%), comparable to or better than that of previous systems.
KW - deep learning
KW - intent recognition
KW - prosthetics and exoskeletons
KW - sensor fusion
KW - wearable robots
UR - http://www.scopus.com/inward/record.url?scp=85088693537&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85088693537&partnerID=8YFLogxK
U2 - 10.1109/LRA.2020.3007455
DO - 10.1109/LRA.2020.3007455
M3 - Article
AN - SCOPUS:85088693537
SN - 2377-3766
VL - 5
SP - 5440
EP - 5447
JO - IEEE Robotics and Automation Letters
JF - IEEE Robotics and Automation Letters
IS - 4
M1 - 9134897
ER -
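
The abstract above describes a pipeline that encodes windows of locomotion sensor data as mel-spectrogram images and classifies the wearer's upcoming activity with a convolutional neural network. The Python sketch below illustrates that general idea only, assuming PyTorch and torchaudio; the sampling rate, window length, channel count, layer sizes, and class set are all illustrative assumptions, not the authors' implementation (which is detailed in the letter itself).

# Minimal sketch of a mel-spectrogram + CNN intent classifier, assuming
# PyTorch and torchaudio. All sizes below are illustrative assumptions,
# not the architecture reported in the letter.
import torch
import torch.nn as nn
import torchaudio

SAMPLE_RATE = 500   # assumed sensor sampling rate (Hz)
WINDOW = 500        # assumed one-second analysis window per step
N_CLASSES = 5       # e.g., level walking, ramp ascent/descent, stair ascent/descent

# 1) Image encoding: one mel-spectrogram per sensor channel.
to_mel = torchaudio.transforms.MelSpectrogram(
    sample_rate=SAMPLE_RATE, n_fft=128, hop_length=16, n_mels=32)

# 2) A small CNN over the stacked per-channel spectrograms.
class IntentCNN(nn.Module):
    def __init__(self, in_channels, n_classes):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(in_channels, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(4))
        self.classifier = nn.Linear(32 * 4 * 4, n_classes)

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

# Fake batch: 8 windows x 6 sensor channels (e.g., 3-axis accel + 3-axis gyro).
signals = torch.randn(8, 6, WINDOW)
images = to_mel(signals)                  # -> (8, 6, 32 mel bins, time frames)
logits = IntentCNN(6, N_CLASSES)(images)  # -> (8, N_CLASSES)
predicted = logits.argmax(dim=1)          # predicted activity for each window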