TY - JOUR
T1 - A training platform for many-dimensional prosthetic devices using a virtual reality environment
AU - Putrino, David
AU - Wong, Yan T.
AU - Weiss, Adam
AU - Pesaran, Bijan
N1 - Funding Information:
The authors would like to acknowledge the work of Dustin Hatfield and Phil Hagerman for help with motion capture, Andrew Schwartz for sharing the macaque mesh, and Peter Loan for help with development of the rhesus macaque kinematic model. This work was sponsored by the Defense Advanced Research Projects Agency (DARPA) MTO under the auspices of Dr. Jack Judy through the Space and Naval Warfare Systems Center, Pacific Grant/Contract No. N66001-11-1-4205. BP was supported by a Career Award in the Biomedical Sciences from the Burroughs-Wellcome Fund, a Watson Program Investigator Award from NYSTAR, a Sloan Research Fellowship and a McKnight Scholar Award.
Publisher Copyright:
© 2014 Elsevier B.V.
PY - 2015/4/15
Y1 - 2015/4/15
N2 - Brain machine interfaces (BMIs) have the potential to assist in the rehabilitation of millions of patients worldwide. Despite recent advancements in BMI technology for the restoration of lost motor function, a training environment to restore full control of the anatomical segments of an upper limb has not yet been presented. Here, we develop a virtual upper limb prosthesis with 27 independent dimensions, the anatomical dimensions of the human arm and hand, and deploy the virtual prosthesis as an avatar in a virtual reality environment (VRE) that can be controlled in real time. The prosthesis avatar accepts kinematic control inputs that can be captured from movements of the arm and hand as well as neural control inputs derived from processed neural signals. We characterize the system performance under kinematic control using a commercially available motion capture system. We also present the performance under kinematic control achieved by two non-human primates (Macaca mulatta) trained to use the prosthetic avatar to perform reaching and grasping tasks. This is the first virtual prosthetic device that is capable of emulating all the anatomical movements of a healthy upper limb in real time. Since the system accepts both neural and kinematic inputs for a variety of many-dimensional skeletons, we propose it provides a customizable training platform for the acquisition of many-dimensional neural prosthetic control.
AB - Brain machine interfaces (BMIs) have the potential to assist in the rehabilitation of millions of patients worldwide. Despite recent advancements in BMI technology for the restoration of lost motor function, a training environment to restore full control of the anatomical segments of an upper limb has not yet been presented. Here, we develop a virtual upper limb prosthesis with 27 independent dimensions, the anatomical dimensions of the human arm and hand, and deploy the virtual prosthesis as an avatar in a virtual reality environment (VRE) that can be controlled in real time. The prosthesis avatar accepts kinematic control inputs that can be captured from movements of the arm and hand as well as neural control inputs derived from processed neural signals. We characterize the system performance under kinematic control using a commercially available motion capture system. We also present the performance under kinematic control achieved by two non-human primates (Macaca mulatta) trained to use the prosthetic avatar to perform reaching and grasping tasks. This is the first virtual prosthetic device that is capable of emulating all the anatomical movements of a healthy upper limb in real time. Since the system accepts both neural and kinematic inputs for a variety of many-dimensional skeletons, we propose it provides a customizable training platform for the acquisition of many-dimensional neural prosthetic control.
KW - Brain machine interface
KW - Virtual reality environment
UR - http://www.scopus.com/inward/record.url?scp=84979859105&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84979859105&partnerID=8YFLogxK
U2 - 10.1016/j.jneumeth.2014.03.010
DO - 10.1016/j.jneumeth.2014.03.010
M3 - Article
C2 - 24726625
AN - SCOPUS:84979859105
SN - 0165-0270
VL - 244
SP - 68
EP - 77
JO - Journal of Neuroscience Methods
JF - Journal of Neuroscience Methods
ER -