Indoor trajectory identification: Snapping with uncertainty

Richard Wang, Ravi Shroff, Yilong Zha, Srinivasan Seshan, Manuela Veloso

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


We consider the problem of indoor human trajectory identification using odometry data from smartphone sensors. Given a segmented trajectory, a simplified map of the environment, and a set of error thresholds, we implement a map-matching algorithm in an urban setting and analyze the accuracy of the resulting path. We also discuss the aggregation of user step data into a segmented trajectory. Besides providing an interesting application of learning human motion in a constrained environment, we examine how the uncertainty of the snapped trajectory varies with path length. We demonstrate that as new segments are added to a path, the number of possibilities for earlier segments is monotonically non-increasing. Applications of this work in an urban setting are discussed, as well as future plans to develop a formal theory of odometry-based map-matching.
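The snapping idea described in the abstract can be sketched as a candidate-pruning search over a corridor map: each incoming odometry segment filters the set of corridor sequences that remain consistent within the error thresholds, so the ambiguity about earlier segments never grows. The toy map, adjacency relation, and threshold values below are hypothetical illustrations, not the paper's implementation.

```python
import math

# Toy map (hypothetical, not from the paper): corridor -> (length m, heading rad)
CORRIDORS = {"A": (10.0, 0.0), "B": (10.0, math.pi / 2), "C": (12.0, 0.0)}
# Hypothetical connectivity: which corridor can follow which.
ADJ = {"A": {"B"}, "B": set(), "C": set()}

def consistent(corridor, seg, len_tol=2.0, ang_tol=0.3):
    """Check one odometry segment (length, heading) against a corridor
    using the given length and heading error thresholds."""
    length, heading = CORRIDORS[corridor]
    return abs(length - seg[0]) <= len_tol and abs(heading - seg[1]) <= ang_tol

def snap(segments):
    """Return all corridor sequences consistent with every segment so far."""
    candidates = [[]]
    for seg in segments:
        # Extend each surviving hypothesis; inconsistent or disconnected
        # extensions are dropped, so the hypothesis set for earlier
        # segments is monotonically non-increasing.
        candidates = [
            path + [c]
            for path in candidates
            for c in CORRIDORS
            if consistent(c, seg) and (not path or c in ADJ[path[-1]])
        ]
    return candidates

# One segment of length 10 heading 0 is ambiguous (corridors A and C both fit),
# but a second segment reachable only from A retroactively eliminates C.
print(snap([(10.0, 0.0)]))                       # -> [['A'], ['C']]
print(snap([(10.0, 0.0), (10.0, math.pi / 2)]))  # -> [['A', 'B']]
```

In this sketch the map-constrained pruning is what drives the monotonicity property from the abstract: once a hypothesis for an early segment is eliminated, no later evidence can revive it.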

Original language: English (US)
Title of host publication: IROS Hamburg 2015 - Conference Digest
Subtitle of host publication: IEEE/RSJ International Conference on Intelligent Robots and Systems
Publisher: Institute of Electrical and Electronics Engineers Inc.
Number of pages: 6
ISBN (Electronic): 9781479999941
State: Published - Dec 11 2015
Event: IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2015 - Hamburg, Germany
Duration: Sep 28 2015 - Oct 2 2015

Publication series

Name: IEEE International Conference on Intelligent Robots and Systems
ISSN (Print): 2153-0858
ISSN (Electronic): 2153-0866


Other: IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2015

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Software
  • Computer Vision and Pattern Recognition
  • Computer Science Applications


