Publications


Computers & Graphics - Journal paper


Handheld Augmented Reality involving gravity measurements

Kurz, D., and Benhimane, S.

In Computers & Graphics, Special Section on Augmented Reality, pp. 866–883, November 2012 (Volume 36, Issue 7).


Abstract

This article is a revised version of an earlier work on Gravity-Aware Handheld Augmented Reality (AR) and investigates how different stages in handheld AR applications can benefit from knowing the direction of gravity measured with inertial sensors. It presents approaches that incorporate the gravity vector to improve the description and matching of feature points, the detection and tracking of planar templates, and the visual quality of the rendering of virtual 3D objects. In handheld AR, both the camera and the display are located in the user's hand and can therefore be moved freely. The pose of the camera is generally determined with respect to piecewise planar objects that have a static and known orientation with respect to gravity.

In the presence of (close to) vertical surfaces, we show how Gravity-Aligned Feature Descriptors (GAFD) improve the quality and performance of initializing tracking algorithms that rely on feature point descriptors. For (close to) horizontal surfaces, we propose to use the gravity vector to rectify the camera image and to detect and describe features in the rectified image. The resulting Gravity-Rectified Feature Descriptors (GREFD) provide an improved precision-recall characteristic and enable faster initialization, in particular under steep viewing angles. Gravity-rectified camera images also allow for real-time 6 DoF pose estimation using an edge-based object detection algorithm that only needs to handle 4 DoF similarity transforms. Finally, the rendering of virtual 3D objects can be made more realistic and plausible by taking into account the orientation of the gravitational force in addition to the relative pose between the handheld device and a real object.
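
The gravity rectification step can be illustrated with a short sketch. The following Python snippet is not from the paper; the function names, example intrinsics, accelerometer values, and the use of ORB are placeholder assumptions. It warps a camera frame into a virtual view whose optical axis is aligned with the measured gravity vector, so features on a horizontal surface appear as if seen fronto-parallel before they are detected and described:

# Hedged sketch, not the authors' implementation: gravity rectification of a
# camera frame given the gravity direction in camera coordinates (e.g. from the
# accelerometer) and known intrinsics K.
import numpy as np
import cv2

def gravity_rectification_homography(K, g_cam):
    # Virtual camera whose optical axis points along gravity (straight down).
    z = g_cam / np.linalg.norm(g_cam)
    e_x = np.array([1.0, 0.0, 0.0])
    x = e_x - np.dot(e_x, z) * z        # reuse the camera x-axis to keep the rotation small
    x /= np.linalg.norm(x)
    y = np.cross(z, x)
    R_cam_from_virtual = np.column_stack((x, y, z))
    R_virtual_from_cam = R_cam_from_virtual.T
    # Pure-rotation homography mapping original image points to rectified ones.
    return K @ R_virtual_from_cam @ np.linalg.inv(K)

def rectify_with_gravity(image, K, g_cam):
    # Under steep viewing angles the warped content may leave the frame;
    # in practice an extra translation/scale is composed into H.
    H = gravity_rectification_homography(K, g_cam)
    h, w = image.shape[:2]
    return cv2.warpPerspective(image, H, (w, h))

if __name__ == "__main__":
    K = np.array([[800.0, 0.0, 320.0],
                  [0.0, 800.0, 240.0],
                  [0.0, 0.0, 1.0]])
    g_cam = np.array([0.05, 0.70, 0.71])        # example gravity reading in camera coordinates
    frame = np.zeros((480, 640, 3), np.uint8)   # stands in for a real camera image
    rectified = rectify_with_gravity(frame, K, g_cam)
    # Any rotation-variant descriptor benefits; ORB is used here only as a stand-in.
    gray = cv2.cvtColor(rectified, cv2.COLOR_BGR2GRAY)
    keypoints, descriptors = cv2.ORB_create().detectAndCompute(gray, None)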

In comparison to the original paper, this work provides a more elaborate evaluation of the presented algorithms. We propose a method that enables the evaluation of inertial sensor-aided visual tracking methods without real inertial sensor data. By synthesizing gravity measurements from ground truth camera poses, we benchmark our algorithms on a large existing dataset. Based on this approach, we also develop and evaluate a gravity-adaptive approach that performs image rectification only when it is beneficial.
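
Synthesizing a gravity measurement from a ground truth camera pose amounts to rotating the world-frame "down" direction into camera coordinates. Below is a minimal sketch of that idea; it assumes the ground truth provides a camera-to-world rotation and that the dataset's world frame has its z-axis pointing up, both of which are conventions that vary per dataset:

# Hedged sketch, not the paper's code: synthesizing an ideal gravity measurement
# for a dataset frame from its ground-truth camera pose. Assumes R_wc maps camera
# coordinates to world coordinates and that "down" is the negative z-axis of the
# dataset's world frame.
import numpy as np

def synthesize_gravity(R_wc, g_world=np.array([0.0, 0.0, -1.0])):
    # Express the world-frame gravity direction in camera coordinates; the result
    # plays the role of a noise-free accelerometer-derived gravity vector.
    g_cam = R_wc.T @ g_world
    return g_cam / np.linalg.norm(g_cam)

if __name__ == "__main__":
    a = np.deg2rad(45.0)                       # camera pitched 45 degrees down about its x-axis
    R_wc = np.array([[1.0, 0.0, 0.0],
                     [0.0, np.cos(a), -np.sin(a)],
                     [0.0, np.sin(a),  np.cos(a)]])
    print(synthesize_gravity(R_wc))

With such synthetic measurements, any dataset that provides ground truth camera poses can serve as a benchmark for gravity-aware methods.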


[BibTeX (bib)]  [@ sciencedirect.com]


Video


Related publications

Gravity-Aware Handheld Augmented Reality

Kurz, D., and Benhimane, S.
In Proc. IEEE and ACM International Symposium on Mixed and Augmented Reality (ISMAR 2011), pp. 111–120, Basel, Switzerland, 2011. (Best Paper Award Nominee)

  [Preprint (pdf)]  [Further information]


Inertial sensor-aligned visual feature descriptors

Kurz, D., and Benhimane, S.
In Proc. IEEE Conference on Computer Vision and Pattern Recognition (CVPR 2011), pp. 161–166, Colorado Springs, USA, 2011.

  [Preprint (pdf)]  [Further information]


