Open Access
An Interactive System with Facial Expression Recognition
Author(s) -
Yuyi Shang,
Mie Sato,
Masao Kasuga
Publication year - 2005
Publication title -
Journal of Advanced Computational Intelligence and Intelligent Informatics
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.172
H-Index - 20
eISSN - 1343-0130
pISSN - 1883-8014
DOI - 10.20965/jaciii.2005.p0637
Subject(s) - computer science, sadness, surprise, facial expression, focus (optics), speech recognition, expression (computer science), emotion classification, anger, speaker recognition, motion (physics), artificial intelligence, psychology, social psychology, physics, psychiatry, optics, programming language
To make communication between users and machines more comfortable, we focus on facial expressions and automatically classify them into four expression candidates: “joy,” “anger,” “sadness,” and “surprise.” The classification uses features that correspond to expression-motion patterns, and voice data is then output based on the classification results. When outputting voice data, possible misclassification is taken into account: we choose the first and second expression candidates from the classification results. To realize interactive communication between users and machines, information on both candidates is used when accessing a voice database, which contains voice data corresponding to emotions.
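The candidate-selection step described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the class scores are assumed to come from some upstream expression-motion classifier, and the voice database is reduced to a hypothetical emotion-to-clip mapping.

```python
# Hypothetical sketch of the first/second-candidate selection and
# voice-database lookup described in the abstract.

EMOTIONS = ["joy", "anger", "sadness", "surprise"]

# Hypothetical voice database: emotion -> voice clip identifier.
VOICE_DB = {
    "joy": "voice_joy.wav",
    "anger": "voice_anger.wav",
    "sadness": "voice_sadness.wav",
    "surprise": "voice_surprise.wav",
}

def top_two_candidates(scores):
    """Rank the four expressions by classifier score and return
    the first and second candidates."""
    ranked = sorted(EMOTIONS, key=lambda e: scores[e], reverse=True)
    return ranked[0], ranked[1]

def select_voice(scores):
    """Look up voice data for both candidates, so the second
    candidate can be used when classification is uncertain."""
    first, second = top_two_candidates(scores)
    return VOICE_DB[first], VOICE_DB[second]

# Example: the classifier is fairly confident in "joy", with
# "surprise" as the runner-up.
scores = {"joy": 0.55, "anger": 0.10, "sadness": 0.05, "surprise": 0.30}
print(select_voice(scores))  # ('voice_joy.wav', 'voice_surprise.wav')
```

Keeping the second candidate mirrors the abstract's point that the output stage compensates for insufficient classification accuracy rather than trusting a single label.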
