Abstract
The primary challenge in generating convincing augmented reality (AR) graphics is to project three-dimensional (3D) models onto a user's view of the real world and create a temporally and spatially sustained illusion that the virtual and real objects coexist. Regardless of the spatial relationship between the real and virtual objects, traditional AR graphical engines break the illusion of coexistence by displaying the real world merely as a background and superimposing virtual objects in the foreground. This research proposes a robust depth-sensing and frame buffer algorithm for handling occlusion problems in ubiquitous AR applications. A high-accuracy time-of-flight (TOF) camera captures a depth map of the real world in real time. The distance information is processed in parallel using the OpenGL shading language (GLSL) and render-to-texture (RTT) techniques. The final processing results are written to the graphics frame buffers, allowing accurate depth resolution and hidden surface removal in composite AR scenes. The designed algorithm is validated in several indoor and outdoor experiments using the scalable and modular augmented reality template (SMART) AR framework.
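The occlusion handling the abstract describes ultimately reduces to a per-pixel depth comparison: a virtual fragment is drawn only where it lies closer to the viewer than the real-world surface measured by the TOF camera. The sketch below illustrates that comparison in pure Python; the `composite` helper and its image layout are hypothetical conveniences for exposition, not the paper's GLSL/RTT implementation, which performs the same test on the GPU frame buffer.

```python
# Illustrative per-pixel occlusion resolution (hypothetical helper, not the
# paper's GPU pipeline). Images are row-major lists of pixel values; depth
# maps hold per-pixel distances from the viewer (e.g., meters from the TOF
# camera for the real scene, rendered z-values for the virtual scene).

def composite(real_rgb, real_depth, virt_rgb, virt_depth):
    """Return the composited image: at each pixel, the nearer surface wins."""
    out = []
    for rr, rd, vr, vd in zip(real_rgb, real_depth, virt_rgb, virt_depth):
        row = []
        for r, d_real, v, d_virt in zip(rr, rd, vr, vd):
            # The virtual fragment passes the depth test only if it is
            # closer than the real-world surface at this pixel.
            row.append(v if d_virt < d_real else r)
        out.append(row)
    return out

# 2x2 example: "R" marks real-scene pixels, "V" virtual-object pixels.
real_rgb   = [["R", "R"], ["R", "R"]]
real_depth = [[1.0, 3.0], [2.0, 2.0]]
virt_rgb   = [["V", "V"], ["V", "V"]]
virt_depth = [[2.0, 2.0], [2.0, 1.0]]
print(composite(real_rgb, real_depth, virt_rgb, virt_depth))
# → [['R', 'V'], ['R', 'V']]  (virtual object occluded where the real scene is nearer)
```

In the paper's GPU setting, writing the TOF depth map into the depth buffer before rendering the virtual geometry lets the standard hardware depth test perform this comparison automatically.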
| Original language | English (US) |
| --- | --- |
| Pages (from-to) | 607-621 |
| Number of pages | 15 |
| Journal | Journal of Computing in Civil Engineering |
| Volume | 27 |
| Issue number | 6 |
| DOIs | |
| State | Published - Nov 2013 |
Keywords
- Graphical shading
- Homography mapping
- Render to texture
- Stereo mapping
- Time-of-flight camera
- Visual simulation
ASJC Scopus subject areas
- Civil and Structural Engineering
- Computer Science Applications