TY - GEN
T1 - Resynthesizing reality
T2 - ACM SIGGRAPH 2017 Talks - International Conference on Computer Graphics and Interactive Techniques, SIGGRAPH 2017
AU - Haddad, Don Derek
AU - Dublon, Gershon
AU - Mayton, Brian
AU - Russell, Spencer
AU - Xiao, Xiao
AU - Perlin, Ken
AU - Paradiso, Joseph A.
N1 - Publisher Copyright:
© Copyright 2017 Authors.
PY - 2017/7/30
Y1 - 2017/7/30
N2 - The rise of ubiquitous sensing enables the harvesting of massive amounts of data from the physical world. This data is often used to drive the behavior of devices, and when presented to users, it is most commonly visualized quantitatively, as graphs and charts. Another approach for representing sensor network data presents the data within a rich virtual environment. These scenes can be generated based on the physical environment, and their appearance can change based on the state of sensor nodes. By freely exploring these environments, users gain a vivid, multi-modal, and experiential perspective into large, multi-dimensional datasets. This paper presents the concept of "Resynthesizing Reality" through a case study we have created based on a network of environmental sensors deployed at a large-scale wetland restoration site. We describe the technical implementation of our system, present techniques to visualize sensor data within the virtual environment, and discuss potential applications for such Resynthesized Realities.
AB - The rise of ubiquitous sensing enables the harvesting of massive amounts of data from the physical world. This data is often used to drive the behavior of devices, and when presented to users, it is most commonly visualized quantitatively, as graphs and charts. Another approach for representing sensor network data presents the data within a rich virtual environment. These scenes can be generated based on the physical environment, and their appearance can change based on the state of sensor nodes. By freely exploring these environments, users gain a vivid, multi-modal, and experiential perspective into large, multi-dimensional datasets. This paper presents the concept of "Resynthesizing Reality" through a case study we have created based on a network of environmental sensors deployed at a large-scale wetland restoration site. We describe the technical implementation of our system, present techniques to visualize sensor data within the virtual environment, and discuss potential applications for such Resynthesized Realities.
KW - Ubiquitous Sensing
KW - Virtual Environments
UR - http://www.scopus.com/inward/record.url?scp=85033385086&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85033385086&partnerID=8YFLogxK
U2 - 10.1145/3084363.3085027
DO - 10.1145/3084363.3085027
M3 - Conference contribution
AN - SCOPUS:85033385086
T3 - ACM SIGGRAPH 2017 Talks, SIGGRAPH 2017
BT - ACM SIGGRAPH 2017 Talks, SIGGRAPH 2017
PB - Association for Computing Machinery, Inc
Y2 - 30 July 2017 through 3 August 2017
ER -