World-stabilized annotations and virtual scene navigation for remote collaboration
Author(s) -
Steffen Gauglitz,
Benjamin Nuernberger,
Matthew Turk,
Tobias Höllerer
Publication year - 2014
Publication title -
CiteSeerX (The Pennsylvania State University)
Language(s) - English
Resource type - Conference proceedings
DOI - 10.1145/2642918.2647372
Subject(s) - computer science , usability , augmented reality , human–computer interaction , user interface , mobile device , multimedia , visualization , virtual reality , artificial intelligence
We present a system that supports an augmented shared visual space for live mobile remote collaboration on physical tasks. The remote user can explore the scene independently of the local user's current camera position and can communicate via spatial annotations that are immediately visible to the local user in augmented reality. Our system operates on off-the-shelf hardware and uses real-time visual tracking and modeling, thus requiring no preparation or instrumentation of the environment. It creates a synergy between video conferencing and remote scene exploration within a single coherent interface. To evaluate collaboration with our system, we conducted an extensive outdoor user study with 60 participants, comparing our system with two baseline interfaces. Our results indicate an overwhelming user preference (80%) for our system and a high level of usability, as well as performance benefits compared with one of the two baselines.
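The abstract's core mechanism, "world-stabilized" spatial annotations, can be illustrated with a standard pinhole-camera round trip: a 2D annotation is lifted to a 3D world point using depth from the live scene model, stored, and then re-projected into each new camera view so it stays attached to the physical scene. The sketch below is not the authors' implementation; the intrinsics `K`, poses `(R, t)`, and the depth value are illustrative assumptions.

```python
import numpy as np

def unproject(pixel, depth, K, R, t):
    """Lift a 2D annotation to a 3D world point using scene depth."""
    u, v = pixel
    ray_cam = np.linalg.inv(K) @ np.array([u, v, 1.0])  # camera-frame ray
    p_cam = ray_cam * depth                             # point in camera frame
    return R.T @ (p_cam - t)                            # transform to world frame

def project(p_world, K, R, t):
    """Re-project the stored 3D anchor into the current camera view."""
    p_cam = R @ p_world + t
    uv = K @ p_cam
    return uv[:2] / uv[2]                               # perspective divide

# Illustrative intrinsics and an identity starting pose (assumptions).
K = np.array([[500.0,   0.0, 320.0],
              [  0.0, 500.0, 240.0],
              [  0.0,   0.0,   1.0]])
R, t = np.eye(3), np.zeros(3)

# Remote user clicks pixel (400, 300); the scene model reports depth 2 m.
anchor = unproject((400.0, 300.0), 2.0, K, R, t)

# As the local user's camera moves, the annotation stays world-stabilized:
R2, t2 = np.eye(3), np.array([0.1, 0.0, 0.0])  # camera translated 10 cm
pixel_now = project(anchor, K, R2, t2)
print(pixel_now)
```

Re-projecting the stored 3D anchor (rather than a screen-space overlay) is what keeps the annotation fixed to the object even as the local user's camera moves.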