Abstract
Real walking offers higher immersive presence for virtual reality (VR) applications than alternative locomotion methods such as walking-in-place and external control gadgets, but it must account for differing room sizes, wall shapes, and surrounding objects in the virtual and real worlds. Despite perceptual studies of impossible spaces and redirected walking, there are no general methods to match a given pair of virtual and real scenes. We propose a system that matches a given pair of virtual and physical worlds for immersive VR navigation. We first compute a planar map between the virtual and physical floor plans that minimizes angular and distal distortions while conforming to the virtual environment goals and physical environment constraints. Our key idea is to design maps that are globally surjective, allowing large virtual scenes to fold properly into smaller real scenes, but locally injective, avoiding locomotion ambiguity and intersecting virtual objects. From these maps we derive altered rendering that guides user navigation within the physical environment while retaining visual fidelity to the virtual environment; the key here is to warp the virtual world appearance onto the real world geometry with sufficient quality and performance. We evaluate our method through a formative user study, and demonstrate applications in gaming, architecture walkthrough, and medical imaging.
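As an illustration of the local-injectivity and distortion criteria mentioned in the abstract (not the authors' implementation), the following minimal Python sketch treats the floor-plan map as a piecewise-affine map over a triangulation and checks, per triangle, that the Jacobian determinant is positive (locally injective, no fold within the triangle) while reporting angular and distance distortion from the Jacobian's singular values. The helper names and the toy two-triangle mesh are illustrative assumptions.

```python
# Minimal sketch (assumed names, toy data): evaluate a candidate planar map
# between a virtual and a physical floor plan on a triangulated domain.
import numpy as np

def triangle_jacobian(src_tri, dst_tri):
    """2x2 Jacobian of the affine map taking a source triangle to its mapped image."""
    S = np.column_stack([src_tri[1] - src_tri[0], src_tri[2] - src_tri[0]])
    D = np.column_stack([dst_tri[1] - dst_tri[0], dst_tri[2] - dst_tri[0]])
    return D @ np.linalg.inv(S)

def map_quality(src_pts, dst_pts, tris):
    """Per triangle: (locally injective?, angular distortion, distance distortion)."""
    report = []
    for tri in tris:
        J = triangle_jacobian(src_pts[tri], dst_pts[tri])
        det = np.linalg.det(J)                          # > 0: orientation preserved, no local fold
        s = np.linalg.svd(J, compute_uv=False)          # singular values, s[0] >= s[1]
        angular = s[0] / s[1] if s[1] > 1e-12 else np.inf    # 1.0 means conformal (angles preserved)
        distance = max(s[0], 1.0 / max(s[1], 1e-12))          # 1.0 means isometric (lengths preserved)
        report.append((det > 0.0, angular, distance))
    return report

# Toy example: a unit square split into two triangles, mapped by a mild shear/squeeze.
src = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
dst = src @ np.array([[1.0, 0.2], [0.0, 0.9]])
tris = [[0, 1, 2], [0, 2, 3]]
for ok, ang, dist in map_quality(src, dst, tris):
    print(f"locally injective: {ok}, angular distortion: {ang:.2f}, distance distortion: {dist:.2f}")
```

In this framing, a globally surjective but locally injective map may cover the physical domain several times, yet every triangle individually passes the positive-determinant test; the distortion measures are the kind of quantities a planar-map optimizer would drive toward 1.0.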
| Original language | English (US) |
| --- | --- |
| Article number | a64 |
| Journal | ACM Transactions on Graphics |
| Volume | 35 |
| Issue number | 4 |
| DOIs | |
| State | Published - Jul 11 2016 |
| Event | ACM SIGGRAPH 2016, Anaheim, United States; Jul 24 2016 → Jul 28 2016 |
Keywords
- Camera projection
- Geometry morphing
- Head-mounted display
- Human perception
- Planar map
- Real-time rendering
- Redirected walking
- Virtual reality
- Warped space
ASJC Scopus subject areas
- Computer Graphics and Computer-Aided Design