DroneSAR
Author(s) -
Rajkumar Darbar,
Joan Sol Roo,
Thibault Lainé,
Martin Hachet
Publication year - 2019
Publication title -
HAL (Centre pour la communication scientifique directe — the French open archive for direct scientific communication)
Language(s) - English
Resource type - Conference proceedings
ISBN - 978-1-4503-7624-2
DOI - 10.1145/3365610.3365631
Subject(s) - computer science, augmented reality, drone, human–computer interaction, computer graphics (images), user interface, proof of concept, multimedia
Spatial Augmented Reality (SAR) transforms real-world objects into interactive displays by projecting digital content onto them using video projectors. SAR enables immediate co-located collaboration between multiple viewers without the need to wear any special glasses. Unfortunately, one major limitation of SAR is that visual content can only be projected onto its physical supports. As a result, displaying User Interface (UI) widgets such as menus and pop-up windows in SAR is very challenging. We address this limitation by extending the SAR space into mid-air. In this paper, we propose DroneSAR, which extends the physical space of SAR by dynamically projecting digital information onto tracked panels mounted on a drone. DroneSAR is a proof of concept of a novel SAR user interface that provides support for 2D widgets (e.g., labels, menus, interactive tools) to enrich the SAR interactive experience. We also describe the implementation details of our proposed approach.