TY - GEN
T1 - A Wearable Assistive Technology for the Visually Impaired with Door Knob Detection and Real-Time Feedback for Hand-to-Handle Manipulation
AU - Niu, Liang
AU - Qian, Cheng
AU - Rizzo, John Ross
AU - Hudson, Todd
AU - Li, Zichen
AU - Enright, Shane
AU - Sperling, Eliot
AU - Conti, Kyle
AU - Wong, Edward
AU - Fang, Yi
N1 - Publisher Copyright:
© 2017 IEEE.
PY - 2017/7/1
Y1 - 2017/7/1
AB - The visually impaired are consistently faced with mobility restrictions due to the lack of truly accessible environments. Even in structured settings, people with low vision may still have trouble navigating efficiently and safely due to hallway and threshold ambiguity. Currently available assistive technologies do not provide door and door-handle object detection, nor do they concretely help the visually impaired reach toward such objects. In this paper, we propose an AI-driven wearable assistive technology that integrates door-handle detection, the user's real-time hand position relative to the targeted object, and 'joystick-like' audio commands for acquiring the target and performing the subsequent hand-to-handle manipulation. When fully realized, this platform will help end users locate doors and door handles and reach them under guided feedback, enabling them to travel safely and efficiently through environments with thresholds. Compared to typical computer vision models, the model proposed in this paper requires significantly fewer computational resources, allowing it to pair with a stereoscopic camera and run on a small graphics processing unit (GPU), keeping the system conveniently portable. We also introduce a dataset containing different types of door handles and door knobs with bounding-box annotations, which can be used for training and testing in future research.
UR - http://www.scopus.com/inward/record.url?scp=85046288720&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85046288720&partnerID=8YFLogxK
DO - 10.1109/ICCVW.2017.177
M3 - Conference contribution
AN - SCOPUS:85046288720
T3 - Proceedings - 2017 IEEE International Conference on Computer Vision Workshops, ICCVW 2017
SP - 1500
EP - 1508
BT - Proceedings - 2017 IEEE International Conference on Computer Vision Workshops, ICCVW 2017
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 16th IEEE International Conference on Computer Vision Workshops, ICCVW 2017
Y2 - 22 October 2017 through 29 October 2017
ER -