Affect Detection from Text-Based Virtual Improvisation and Emotional Gesture Recognition
Author(s) - Li Zhang, Bryan Yap
Publication year - 2012
Publication title - Advances in Human-Computer Interaction
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.429
H-Index - 21
eISSN - 1687-5907
pISSN - 1687-5893
DOI - 10.1155/2012/461247
Subject(s) - affect (linguistics), improvisation, gesture, computer science, dialog box, interpretation (philosophy), human–computer interaction, semantic interpretation, sentence, cognitive psychology, natural language processing, psychology, artificial intelligence, communication, art, world wide web, visual arts, programming language
We previously developed an intelligent agent that engages users in virtual drama improvisation. This agent performs sentence-level affect detection on user inputs that contain strong emotional indicators. However, many inputs with weak or no affect indicators still carry emotional implications, yet were treated as neutral expressions by the earlier interpretation. In this paper, we employ latent semantic analysis to perform topic theme detection and to identify the target audiences of such inputs. We also discuss how this semantic interpretation of dialog context is used to interpret affect more appropriately during virtual improvisation. In addition, building a reliable affect analyser requires detecting and combining weak affect indicators from other channels, such as body language. Emotional body-language detection also provides a nonintrusive channel for gauging users' experience without interfering with the primary task. We therefore also make an initial exploration of affect detection from several universally recognized emotional gestures.
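The abstract names latent semantic analysis (LSA) as the mechanism for topic theme detection on weakly emotional inputs. As a rough illustration only, and not the authors' implementation, the Python sketch below projects improvisation utterances and hypothetical theme descriptions into a shared latent space via TF-IDF plus truncated SVD, then assigns each utterance its nearest theme by cosine similarity. The themes, example utterances, component count, and use of scikit-learn are all assumptions made for this sketch.

# Illustrative LSA-based topic theme detection (hypothetical data and
# parameters; not the system described in the paper).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical topic themes for a drama-improvisation scenario.
themes = {
    "bullying": "threaten hit push scare mock victim bully",
    "support": "help comfort friend protect defend stay",
    "family": "mum dad home parents brother sister",
}

# Inputs with weak affect indicators whose theme we want to identify.
utterances = [
    "leave him alone or you will regret it",
    "do not worry I will stay and help you",
]

# Build a TF-IDF space over theme descriptions plus utterances, then reduce
# dimensionality with truncated SVD (the standard LSA step).
docs = list(themes.values()) + utterances
tfidf = TfidfVectorizer().fit_transform(docs)
lsa = TruncatedSVD(n_components=2, random_state=0).fit_transform(tfidf)

# Compare each utterance to every theme in the latent space.
theme_vecs, utter_vecs = lsa[: len(themes)], lsa[len(themes):]
names = list(themes)
for text, vec in zip(utterances, utter_vecs):
    sims = cosine_similarity([vec], theme_vecs)[0]
    print(f"{text!r} -> theme: {names[sims.argmax()]} (similarity {sims.max():.2f})")

In a real system the theme descriptions would come from the improvisation scenario script, and the latent space would be fitted on a much larger corpus so that the SVD dimensions capture meaningful topic structure.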