Real-time occlusion handling for dynamic augmented reality using geometric sensing and graphical shading

Suyang Dong, Chen Feng, Vineet R. Kamat

Research output: Contribution to journal › Article › peer-review


The primary challenge in generating convincing augmented reality (AR) graphics is to project three-dimensional (3D) models onto a user's view of the real world and create a temporally and spatially sustained illusion that the virtual and real objects coexist. Regardless of the spatial relationship between the real and virtual objects, traditional AR graphical engines break the illusion of coexistence by displaying the real world merely as a background and superimposing virtual objects in the foreground. This research proposes a robust depth-sensing and frame buffer algorithm for handling occlusion problems in ubiquitous AR applications. A high-accuracy time-of-flight (TOF) camera is used to capture the depth map of the real world in real time. The distance information is processed in parallel using the OpenGL Shading Language (GLSL) and render-to-texture (RTT) techniques. The final processing results are written to the graphics frame buffers, allowing accurate depth resolution and hidden surface removal in composite AR scenes. The designed algorithm is validated in several indoor and outdoor experiments using the scalable and modular augmented reality template (SMART) AR framework.
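The core of the occlusion handling described above is a per-pixel depth comparison: wherever the real scene (as measured by the TOF camera) is closer to the viewer than the rendered virtual geometry, the virtual fragment is hidden and the camera image shows through. The following is a minimal CPU-side sketch of that idea in Python/NumPy; the array names, the compositing rule, and the use of infinity for "no virtual geometry" are illustrative assumptions, not the paper's GLSL implementation.

```python
import numpy as np

def composite_with_occlusion(real_rgb, real_depth, virt_rgb, virt_depth):
    """Composite an AR frame with occlusion handling (illustrative sketch).

    real_rgb, virt_rgb   : (H, W, 3) uint8 images
    real_depth, virt_depth : (H, W) float32 depths in meters;
                             virt_depth is np.inf where no virtual
                             geometry covers the pixel.
    """
    # A virtual fragment survives only where it is nearer than the real
    # surface measured at the same pixel (hidden surface removal).
    virt_visible = virt_depth < real_depth
    out = real_rgb.copy()
    out[virt_visible] = virt_rgb[virt_visible]
    return out

# Toy 2x2 example: the virtual object is in front of the real surface at
# one pixel, behind it at another, and absent elsewhere.
real_rgb = np.zeros((2, 2, 3), dtype=np.uint8)        # black camera image
virt_rgb = np.full((2, 2, 3), 255, dtype=np.uint8)    # white virtual object
real_depth = np.full((2, 2), 2.0, dtype=np.float32)   # real wall at 2 m
virt_depth = np.array([[1.0, 3.0],
                       [np.inf, np.inf]], dtype=np.float32)

frame = composite_with_occlusion(real_rgb, real_depth, virt_rgb, virt_depth)
# Only the pixel where the virtual object is nearer (1 m < 2 m) shows it.
```

In the paper's GPU pipeline this comparison would happen per fragment in a GLSL shader against a depth texture written via RTT; the sketch above only mirrors the logic of that depth test.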

Original language: English (US)
Pages (from-to): 607-621
Number of pages: 15
Journal: Journal of Computing in Civil Engineering
Issue number: 6
State: Published - Nov 2013


Keywords

  • Graphical shading
  • Homography mapping
  • Render to texture
  • Stereo mapping
  • Time-of-flight camera
  • Visual simulation

ASJC Scopus subject areas

  • Civil and Structural Engineering
  • Computer Science Applications


