Open Access
A mobile indoor navigation system interface adapted to vision-based localization
Author(s) -
Andreas Möller,
Matthias Kranz,
Robert Huitl,
Stefan Diewald,
Luis Roalter
Publication year - 2012
Publication title -
KTH Publication Database DiVA (KTH Royal Institute of Technology)
Language(s) - English
Resource type - Conference proceedings
DOI - 10.1145/2406367.2406372
Subject(s) - computer science , scalability , computer vision , human–computer interaction , mobile device , mobile computing , navigation system , real time computing , artificial intelligence
Vision-based approaches to mobile indoor localization do not rely on dedicated infrastructure and are therefore scalable and cheap. The particular requirements for a navigation user interface of a vision-based system, however, have not been investigated so far. Such an interface should adapt to localization accuracy, which strongly depends on distinctive reference images, and to other factors, such as the phone's pose. If necessary, the system should motivate the user to point the smartphone at distinctive regions to improve localization quality. We present a combined interface of Virtual Reality (VR) and Augmented Reality (AR) elements with indicators that communicate and help ensure localization accuracy. In an evaluation with 81 participants, we found that AR was preferred when localization was reliable, but with VR, navigation instructions were perceived as more accurate in case of localization and orientation errors. The additional indicators showed potential for making users choose […]
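The adaptive behavior the abstract describes — showing AR when localization is reliable, falling back to VR when it degrades, and prompting the user to aim at distinctive regions when matching fails — can be sketched as a simple decision rule. This is a minimal illustration, not the authors' implementation; the threshold values, the `LocalizationEstimate` type, and the `choose_interface` function are all hypothetical.

```python
from dataclasses import dataclass

# Hypothetical thresholds -- the paper does not publish concrete values.
AR_CONFIDENCE_THRESHOLD = 0.8
PROMPT_CONFIDENCE_THRESHOLD = 0.4

@dataclass
class LocalizationEstimate:
    confidence: float    # 0.0 (no match) .. 1.0 (distinctive reference image matched)
    phone_upright: bool  # camera pointing at the scene, not at the floor

def choose_interface(estimate: LocalizationEstimate) -> str:
    """Pick the navigation view for the current localization quality.

    AR overlays need an accurate pose *and* a usable camera view;
    otherwise fall back to the more error-tolerant VR view, and
    prompt the user to aim at a distinctive region when matching fails.
    """
    if estimate.confidence >= AR_CONFIDENCE_THRESHOLD and estimate.phone_upright:
        return "AR"         # reliable localization: AR was preferred by users
    if estimate.confidence >= PROMPT_CONFIDENCE_THRESHOLD:
        return "VR"         # degraded accuracy: VR instructions stay plausible
    return "VR+prompt"      # ask the user to point at a distinctive region
```

In this sketch the phone's pose gates AR directly, mirroring the abstract's point that accuracy depends both on distinctive reference images and on how the device is held.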
