Open Access
Multimodal multiplayer tabletop gaming
Author(s) - Edward Tse, Saul Greenberg, Chia Shen, Clifton Forlines
Publication year - 2007
Publication title - Computers in Entertainment
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.172
H-Index - 30
eISSN - 1544-3574
pISSN - 1544-3981
DOI - 10.1145/1279540.1279552
Subject(s) - gesture , table (database) , computer science , human–computer interaction , utterance , multimedia , space (punctuation) , multimodal interaction , speech recognition , artificial intelligence , database , operating system
There is a large disparity between the rich physical interfaces of co-located arcade games and the generic input devices seen in most home console systems. In this article we argue that a digital table is a conducive form factor for general co-located home gaming, as it affords: (a) seating in collaboratively relevant positions that gives everyone an equal opportunity to reach onto the surface and share a common view; (b) rich whole-handed gesture input usually seen only when handling physical objects; (c) the ability to monitor how others use space and access objects on the surface; and (d) the ability to communicate with each other and interact on top of the surface via gestures and verbal utterances. Our thesis is that multimodal gesture and speech input benefits collaborative interaction over such a digital table. To investigate this thesis, we designed a multimodal, multiplayer gaming environment that allows players to interact directly atop a digital table via speech and rich whole-hand gestures. We transform two commercial single-player computer games, representing the strategy and simulation game genres, to work within this setting.
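The abstract does not detail how speech and gesture input are combined. A minimal sketch of one common approach to such multimodal fusion is time-windowed pairing: a recognized spoken command (e.g. "move here") is matched with the nearest-in-time touch on the table, which resolves the deictic word "here" into a surface coordinate. All names and the window size below are hypothetical, not taken from the article.

```python
from dataclasses import dataclass

@dataclass
class SpeechEvent:
    utterance: str   # recognized command word, e.g. "move"
    time: float      # seconds since game start

@dataclass
class GestureEvent:
    x: float         # touch position on the table surface
    y: float
    time: float

def fuse(speech, gestures, window=1.0):
    """Pair each spoken command with the nearest-in-time gesture.

    The gesture resolves deictic speech: "move" spoken near a touch
    at (x, y) becomes the command ("move", x, y). Speech with no
    gesture within `window` seconds produces no fused command.
    """
    commands = []
    for s in speech:
        nearby = [g for g in gestures if abs(g.time - s.time) <= window]
        if nearby:
            g = min(nearby, key=lambda g: abs(g.time - s.time))
            commands.append((s.utterance, g.x, g.y))
    return commands
```

For example, `fuse([SpeechEvent("move", 2.0)], [GestureEvent(10.0, 20.0, 2.3)])` pairs the utterance with the touch 0.3 s later, while a touch three seconds away would be ignored. Real systems add per-player channels and richer gesture vocabularies, but the windowed-pairing core is the same.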
