Open Access
Tangible displays for the masses: spatial interaction with handheld displays by using consumer depth cameras
Author(s) - Martin Spindler, Wolfgang Büschel, Charlotte Winkler, Raimund Dachselt
Publication year - 2013
Publication title - Personal and Ubiquitous Computing
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.416
H-Index - 88
eISSN - 1617-4917
pISSN - 1617-4909
DOI - 10.1007/s00779-013-0730-7
Subject(s) - computer science, human–computer interaction, mobile device, multimedia, mobile interaction, world wide web
Spatially aware handheld displays are a promising approach to interact with complex information spaces in a more natural way by extending the interaction space from the 2D surface to the 3D physical space around them. This is achieved by utilizing their spatial position and orientation for interaction purposes. Technical solutions for spatially tracked displays already exist in research laboratories, e.g., embedded in a tabletop environment. Along with a large stationary screen, such multi-display systems provide a rich design space with a variety of benefits to users, e.g., the explicit support of co-located parallel work and collaboration. As we see a great future in the underlying interaction principles, the question is how the technology can be made accessible to the public. With our work, we want to address this issue. In the long term, we envision a low-cost tangible display ecosystem that is suitable for everyday usage and supports both active displays (e.g., the iPad) and passive projection media (e.g., paper screens and everyday objects such as a mug). The two major contributions of this article are a presentation of an exciting design space and a requirement analysis regarding its technical realization with special focus on a broad adoption by the public. In addition, we present a proof of concept system that addresses one technical aspect of this ecosystem: the spatial tracking of tangible displays with a consumer depth camera (Kinect).
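The abstract's proof of concept tracks tangible displays with a consumer depth camera (Kinect). As a rough illustration of one common ingredient of such tracking, and not the authors' actual pipeline, the position and orientation of a flat display can be estimated from the depth points lying on its surface by least-squares plane fitting: the centroid gives the position, and the singular vector with the smallest singular value gives the surface normal. The sketch below runs on synthetic points; the function name `estimate_display_pose` and all numeric values are illustrative assumptions.

```python
import numpy as np

def estimate_display_pose(depth_points):
    """Estimate the 3D position (centroid) and orientation (unit normal)
    of a flat display from depth-camera points lying on its surface."""
    centroid = depth_points.mean(axis=0)
    centered = depth_points - centroid
    # The right singular vector for the smallest singular value is the
    # direction of least variance, i.e. the plane normal.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    normal = vt[-1]
    if normal[2] < 0:  # pick the sign that faces the camera (+z)
        normal = -normal
    return centroid, normal

# Synthetic example: a 20 cm "display" tilted 30 degrees about the x-axis,
# centered 1.5 m in front of the camera, with ~2 mm simulated sensor noise.
rng = np.random.default_rng(42)
theta = np.radians(30)
u = np.array([1.0, 0.0, 0.0])                      # in-plane x-axis
v = np.array([0.0, np.cos(theta), np.sin(theta)])  # in-plane y-axis, tilted
true_center = np.array([0.0, 0.0, 1.5])
coords = rng.uniform(-0.1, 0.1, size=(500, 2))
points = true_center + coords[:, :1] * u + coords[:, 1:] * v
points += rng.normal(scale=0.002, size=points.shape)

center, normal = estimate_display_pose(points)
print("center:", np.round(center, 3), "normal:", np.round(normal, 3))
```

A real system would first segment the display's points out of the full depth frame (e.g., by color markers or tracking a known rectangle) before fitting; the plane fit alone also leaves the in-plane rotation undetermined, which is why marker- or contour-based cues are typically combined with it.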
