TY - GEN
T1 - A virtual reality interface to test wearable electronic travel aids for the visually impaired
AU - Boldini, Alain
AU - Ma, Xinda
AU - Rizzo, John Ross
AU - Porfiri, Maurizio
N1 - Funding Information:
This research was supported by the National Science Foundation under Grant Nos. ECCS-1928614 and CNS-1952180.
Publisher Copyright:
© 2021 SPIE.
PY - 2021
Y1 - 2021
AB - Visual impairment represents a critical challenge for our society, with 285 million people affected worldwide; alarmingly, the prevalence is expected to triple by 2050. Supporting mobility is a chief priority for assistive technologies. In recent years, the integration of computer vision and haptic technologies has led to a number of wearable electronic travel aids (ETAs). Previously, we proposed an ETA comprising a computer vision system and a wearable haptic device in the form of a belt. The belt encompasses a two-by-five array of piezoelectric-based macro-fiber composite (MFC) actuators, which can generate vibrations on the abdomen when an oscillating voltage is applied across their electrodes. The computer vision system identifies the position and distance of surrounding obstacles egocentrically and drives the actuators according to the salience of the potential hazard(s). Despite promising pilots, the design, control, and optimization of the ETA require substantial, potentially high-risk, and tedious training and testing to accommodate patient-specific behavioral idiosyncrasies and a variety of visual impairments. To address these issues, we employ a virtual reality (VR) platform that offers simulations of visual impairment by disease type and severity with front-end control. We review our early work on the first three visual impairments piloted in the platform, each with three levels of severity: mild, moderate, and severe. The VR environment is interfaced with the ETA, which provides feedback to the user based on the position of virtual obstacles. These simulations allow safe, controlled, repeatable experiments with ETAs that can be performed with varying degrees of visual perception. Our framework can become a paradigm for the development and testing of ETAs, with other potential applications in disability awareness, education, and training.
KW - Assistive technologies
KW - Disability simulation
KW - Electronic travel aids
KW - Virtual reality
KW - Visual impairment
UR - http://www.scopus.com/inward/record.url?scp=85107220025&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85107220025&partnerID=8YFLogxK
U2 - 10.1117/12.2581441
DO - 10.1117/12.2581441
M3 - Conference contribution
AN - SCOPUS:85107220025
T3 - Proceedings of SPIE - The International Society for Optical Engineering
BT - Nano-, Bio-, Info-Tech Sensors and Wearable Systems
A2 - Kim, Jaehwan
PB - SPIE
T2 - Nano-, Bio-, Info-Tech Sensors and Wearable Systems 2021
Y2 - 22 March 2021 through 26 March 2021
ER -