Optimizing recording depth to decode movement goals from cortical field potentials

David A. Markowitz, Yan T. Wong, Charles M. Gray, Bijan Pesaran

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Brain-machine interfaces decode movement goals and trajectories from neural activity that is recorded using chronically implanted microelectrode arrays. Fixed-geometry arrays are limited for this purpose because electrodes cannot be moved after implantation, so optimizing the electrode recording configuration requires implanting a new array. Here, we optimize local field potential (LFP) recordings using a chronically implanted microelectrode array with electrodes that can be moved after implantation. In a series of recordings, we systematically vary the depth of each electrode in the frontal eye field of a monkey performing eye movements. We find that a decoder predicting movement goals from LFP activity on 32 electrodes provides information rates as high as 5.0 bits/s and that performance varies significantly with recording depth. These results indicate that recording depth is a critical parameter for the performance of LFP-based brain-machine interfaces that decode movement goals.
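The abstract reports decoder performance as an information rate in bits/s. The paper's exact calculation is not given here, but a standard way to convert discrete-goal decoding accuracy into such a rate is the Wolpaw information transfer rate. The sketch below is illustrative only; the target count, accuracy, and decision rate are hypothetical values, not figures from the study.

```python
import math

def wolpaw_itr_bits_per_trial(n_targets: int, accuracy: float) -> float:
    """Wolpaw information transfer rate, in bits per decoding trial.

    n_targets: number of equally likely movement goals
    accuracy:  probability that the decoder selects the correct goal
    """
    if accuracy <= 1.0 / n_targets:
        return 0.0  # at or below chance, the rate is conventionally zero
    bits = math.log2(n_targets) + accuracy * math.log2(accuracy)
    if accuracy < 1.0:
        # misclassifications are assumed spread evenly over the other targets
        bits += (1.0 - accuracy) * math.log2((1.0 - accuracy) / (n_targets - 1))
    return bits

# Hypothetical example: 8 possible goals decoded at 90% accuracy,
# with 2 decoding decisions per second
bits_per_trial = wolpaw_itr_bits_per_trial(8, 0.90)
itr_bits_per_second = bits_per_trial * 2.0
print(f"{bits_per_trial:.2f} bits/trial, {itr_bits_per_second:.2f} bits/s")
```

Multiplying bits per trial by the decision rate gives bits/s, so a rate like the reported 5.0 bits/s depends jointly on the number of goals, decoding accuracy, and how quickly decisions are made.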

Original language: English (US)
Title of host publication: 2011 5th International IEEE/EMBS Conference on Neural Engineering, NER 2011
Pages: 593-596
Number of pages: 4
DOIs
State: Published - 2011
Event: 2011 5th International IEEE/EMBS Conference on Neural Engineering, NER 2011 - Cancun, Mexico
Duration: Apr 27 2011 - May 1 2011

Publication series

Name: 2011 5th International IEEE/EMBS Conference on Neural Engineering, NER 2011


ASJC Scopus subject areas

  • General Neuroscience
