Abstract
In virtual reality (VR), virtual scenes are pre-designed by creators. Our physical surroundings, however, vary significantly in size, layout, and composition. To bridge this gap and enable natural navigation, recent solutions redirect users or recreate the virtual content; however, they suffer from either an interrupted experience or distorted appearances. We present a novel VR-oriented algorithm that automatically restructures a given virtual scene to fit a user's physical environment. Unlike previous methods, ours introduces neither interruptions to the walking experience nor warped appearances. Instead, a perception-aware function optimizes our retargeting technique to preserve the fidelity of the virtual scene as it appears in VR head-mounted displays. Beyond geometric and topological properties, it emphasizes the perceptual factors unique to first-person viewing in VR, such as dynamic visibility and objectwise relationships. We conduct both analytical experiments and subjective studies. The results demonstrate our system's versatile capability and practicality for natural navigation in VR: it reduces the required virtual space by 40% without a statistically significant loss of perceptual identicality.
| Original language | English (US) |
| --- | --- |
| Article number | 3470847 |
| Journal | ACM Transactions on Graphics |
| Volume | 40 |
| Issue number | 5 |
| DOIs | |
| State | Published - Oct 2021 |
Keywords
- Virtual reality
- perception
- retargeting
ASJC Scopus subject areas
- Computer Graphics and Computer-Aided Design