Wearable computing systems, such as virtual and augmented reality (VR/AR), are widely expected to be the next major computing platform. These systems strive to generate perceptually realistic user experiences that seamlessly blend physical and digital content to unlock unprecedented user interfaces and applications. Because the primary interface between a wearable computer and its user is typically a near-eye display, it is crucial that these displays deliver perceptually realistic and visually comfortable experiences. Current-generation near-eye displays, however, fall short in several respects: they offer limited resolution and color fidelity, they induce the vergence–accommodation conflict that impairs visual comfort, they do not support all of the depth cues the human visual system relies on, and AR displays typically cannot render mutually consistent occlusions between physical and digital imagery. In this chapter, we review the state of the art of perceptually driven computational near-eye displays that address these and other challenges.