Tailored Reality: Perception-aware Scene Restructuring for Adaptive VR Navigation

Zhi-Chao Dong, Wenming Wu, Zenghao Xu, Qi Sun, Guanjie Yuan, Ligang Liu, Xiao-Ming Fu

    Research output: Contribution to journal › Article › peer-review


    In virtual reality (VR), virtual scenes are pre-designed by creators, whereas our physical surroundings vary significantly in size, layout, and components. To bridge this gap and enable natural navigation, recent solutions redirect users or recreate the virtual content; however, they suffer from either an interrupted experience or distorted appearances. We present a novel VR-oriented algorithm that automatically restructures a given virtual scene for a user's physical environment. Unlike previous methods, we introduce neither an interrupted walking experience nor curved appearances. Instead, a perception-aware function optimizes our retargeting technique to preserve the fidelity of the virtual scene as it appears in VR head-mounted displays. Beyond geometric and topological properties, it emphasizes perceptual factors unique to the first-person view in VR, such as dynamic visibility and object-wise relationships. We conduct both analytical experiments and subjective studies. The results demonstrate our system's versatility and practicality for natural navigation in VR: it reduces the virtual space by 40% without statistically significant loss of perceptual identicality.

    Original language: English (US)
    Article number: 3470847
    Journal: ACM Transactions on Graphics
    Issue number: 5
    State: Published - Oct 2021


    Keywords

    • Virtual reality
    • perception
    • retargeting

    ASJC Scopus subject areas

    • Computer Graphics and Computer-Aided Design

