NE-Motion: Visual analysis of stroke patients using motion sensor networks

Rodrigo Colnago Contreras, Avinash Parnandi, Bruno Gomes Coelho, Claudio Silva, Heidi Schambra, Luis Gustavo Nonato

Research output: Contribution to journal › Article › peer-review


Many stroke survivors suffer a significant decrease in upper extremity (UE) function and require rehabilitation therapy to boost recovery of UE motion. Assessing the efficacy of treatment strategies is a challenging problem in this context, and is typically accomplished by observing the performance of patients as they execute daily activities. A more detailed assessment of UE impairment can be undertaken with a clinical bedside test, the UE Fugl–Meyer Assessment, but it fails to examine compensatory movements of functioning body segments that are used to bypass impairment. In this work, we use a graph learning method to build a visualization tool tailored to support the analysis of stroke patients. Called NE-Motion, or Network Environment for Motion Capture Data Analysis, the proposed analytic tool processes a set of time series captured by motion sensors worn by patients, enabling visual analytic resources to identify abnormalities in movement patterns. Developed in close collaboration with domain experts, NE-Motion is capable of uncovering important phenomena, such as compensation, while revealing differences between stroke patients and healthy individuals. The effectiveness of NE-Motion is shown in two case studies designed to analyze particular patients and to compare groups of subjects.
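The abstract does not detail how a graph is learned from the wearable-sensor time series, but a common generic approach is to connect sensors whose signals are strongly correlated. The sketch below illustrates that idea only; it is a hypothetical, minimal example (thresholded Pearson correlation on synthetic signals), not the method used in NE-Motion.

```python
import numpy as np

def correlation_graph(series, threshold=0.5):
    """Build an undirected graph over sensors by thresholding the
    absolute pairwise Pearson correlation of their time series.

    series: array of shape (n_sensors, n_samples).
    Returns a boolean adjacency matrix with no self-loops.
    """
    corr = np.corrcoef(series)          # pairwise Pearson correlations
    adj = np.abs(corr) >= threshold     # keep strongly related sensor pairs
    np.fill_diagonal(adj, False)        # drop self-loops
    return adj

# Synthetic example: three "sensor" channels; the first two move together.
rng = np.random.default_rng(0)
t = np.linspace(0, 4 * np.pi, 200)
s0 = np.sin(t) + 0.05 * rng.standard_normal(t.size)
s1 = np.sin(t + 0.1) + 0.05 * rng.standard_normal(t.size)
s2 = rng.standard_normal(t.size)        # unrelated noise channel
adj = correlation_graph(np.stack([s0, s1, s2]))
```

In a setting like the one described above, the resulting adjacency structure could then feed visual-analytic views that contrast movement patterns across subjects; the threshold value here is an arbitrary illustrative choice.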

Original language: English (US)
Article number: 4482
Issue number: 13
State: Published - Jul 1 2021


Keywords

  • Graph learning
  • Set theory
  • Stroke
  • Visual analytics
  • Visualization

ASJC Scopus subject areas

  • Analytical Chemistry
  • Information Systems
  • Atomic and Molecular Physics, and Optics
  • Biochemistry
  • Instrumentation
  • Electrical and Electronic Engineering

