Open Access
Affective Interaction in Smart Environments
Author(s) - Maurizio Caon, Leonardo Angelini, Omar Abou Khaled, Denis Lalanne, Yong Yue, Elena Mugellini
Publication year - 2014
Publication title - Procedia Computer Science
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.334
H-Index - 76
ISSN - 1877-0509
DOI - 10.1016/j.procs.2014.05.527
Subject(s) - computer science, human–computer interaction, modality, facial expression, smart environment, RGB color model, multimedia, artificial intelligence, internet of things
We present a concept in which the smart environments of the future will provide ubiquitous affective communication: every surface will become interactive and the furniture will display emotions. In particular, we present a first prototype that allows people to share their emotional states in a natural way. Input is given through facial expressions, and output is displayed in a context-aware, multimodal way. Two novel output modalities are presented: a robotic painting that applies the concept of affective communication to informative art, and an RGB lamp that represents emotions while remaining in the user's peripheral attention. An observation study was conducted during an interactive event, and we report our preliminary findings in this paper.
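As an illustration of the kind of ambient display the abstract describes, the sketch below maps a recognised emotion label to an RGB lamp colour. The emotion labels and colour assignments here are illustrative assumptions for the sketch, not the mapping used in the paper's prototype.

```python
# Hypothetical emotion-to-colour mapping for an ambient RGB lamp.
# Labels and colour choices are assumptions, not the paper's actual design.
EMOTION_COLOURS = {
    "happiness": (255, 200, 0),   # warm yellow
    "sadness": (0, 80, 200),      # cool blue
    "anger": (220, 30, 30),       # red
    "neutral": (255, 255, 255),   # white
}

def lamp_colour(emotion: str) -> tuple:
    """Return the RGB triple for a recognised emotion, defaulting to neutral."""
    return EMOTION_COLOURS.get(emotion, EMOTION_COLOURS["neutral"])
```

Because the lamp sits in the user's peripheral attention, a default (neutral white) for unrecognised labels keeps the display calm rather than flashing an error state.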
