Publications
ISMAR 2016 - Full paper
Leveraging the User's Face for Absolute Scale Estimation in Handheld Monocular SLAM
Knorr, S., and Kurz, D.
In Proc. IEEE International Symposium on Mixed and Augmented Reality (ISMAR 2016), pp. 11–17, Mérida, Mexico, 2016.
Abstract
We present an approach to estimate absolute scale in handheld monocular SLAM by simultaneously tracking the user's face with a user-facing camera while a world-facing camera captures the scene for localization and mapping. Given face tracking at absolute scale, two images of a face taken from two different viewpoints enable estimating the translational distance between the two viewpoints in absolute units, such as millimeters. Under the assumption that the face itself stayed stationary in the scene while taking the two images, the motion of the user-facing camera relative to the face can be transferred to the motion of the rigidly connected world-facing camera relative to the scene. This also allows determining the latter motion in absolute units and enables reconstructing and tracking the scene at absolute scale.
As the faces of different adults vary only moderately in size, statistics can provide a reasonable prior on the absolute dimensions of a face. For improved accuracy, the dimensions of the particular user's face can be calibrated.
Based on sequences of world-facing and user-facing images captured by a mobile phone, we show for different scenes how our approach enables reconstruction and tracking at absolute scale using a proof-of-concept implementation. Quantitative evaluations against ground truth data confirm that our approach provides absolute scale at an accuracy well suited for different applications. Particularly, we show how our method enables various use cases in handheld Augmented Reality applications that superimpose virtual objects at absolute scale or feature interactive distance measurements.
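The core scale transfer described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the translation vectors and numeric values are hypothetical, and it assumes the two cameras are rigidly connected so the metric length of the device's translation measured via the face equals that of the world-facing camera.

```python
import math

def absolute_scale(t_face_mm, t_slam):
    """Ratio of the metric baseline (from face tracking, in mm) to the
    baseline reported by SLAM in its arbitrary map units."""
    norm = lambda v: math.sqrt(sum(x * x for x in v))
    return norm(t_face_mm) / norm(t_slam)

# Hypothetical measurements: the user-facing camera moved 120 mm relative
# to the stationary face, while SLAM reports the same device motion as
# 0.4 map units.
s = absolute_scale((120.0, 0.0, 0.0), (0.4, 0.0, 0.0))

# Multiplying SLAM map coordinates by s expresses them in millimeters,
# enabling, e.g., interactive distance measurements at absolute scale.
point_mm = [s * c for c in (0.1, 0.2, 0.05)]
```

In practice the scale estimate would be computed from many viewpoint pairs and filtered over time, but the ratio of the two baseline lengths is the essential quantity.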
Copyright © 2008–2021 Daniel Kurz. All rights reserved.