Open Access
Multimodal Indices to Japanese and French Prosodically Expressed Social Affects
Author(s) -
Albert Rilliard,
Takaaki Shochi,
Jean-Claude Martin,
Donna Erickson,
Véronique Aubergé
Publication year - 2009
Publication title -
Language and Speech
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.713
H-Index - 52
eISSN - 1756-6053
pISSN - 0023-8309
DOI - 10.1177/0023830909103171
Subject(s) - psychology
Whereas several studies have explored the expression of emotions, little is known about how the visual and audio channels are combined during the production of what we call the more controlled social affects, for example, "attitudinal" expressions. This article presents a perception study of the audiovisual expression of 12 Japanese and 6 French attitudes in order to understand the contribution of the audio and visual modalities to affective communication. The relative importance of each modality in the perceptual decoding of the expressions of four speakers is analyzed as a first step towards a deeper comprehension of their influence on the expression of social affects. Then, the audiovisual productions of two speakers (one for each language) are analyzed acoustically (F0, duration, and intensity) and visually (in terms of Action Units), in order to relate objective parameters to listeners' perception of these social affects. The most pertinent objective features, either acoustic or visual, are then discussed from a bilingual perspective: for example, the relative influence of fundamental frequency on attitudinal expression in both languages is discussed, and the importance of a certain aspect of the voice quality dimension in Japanese is underlined.
