Emotional Speech of Mentally and Physically Disabled Individuals: Introducing the EmotAsS Database and First Findings
Author(s) -
Simone Hantke,
Hesam Sagha,
Nicholas Cummins,
Björn W. Schuller
Publication year - 2017
Publication title -
Interspeech 2017
Language(s) - English
Resource type - Conference proceedings
DOI - 10.21437/interspeech.2017-409
Subject(s) - mentally retarded , computer science , speech recognition , psychology , developmental psychology
The automatic recognition of emotion from speech is a mature research field with a large number of publicly available corpora. However, to the best of the authors' knowledge, none of these datasets consists solely of emotional speech samples from individuals with mental, neurological and/or physical disabilities. Yet, such individuals could benefit from speech-based assistive technologies to enhance their communication with their environment and to manage their daily work processes. With the aim of advancing these technologies, we fill this void in emotional speech resources by introducing the EmotAsS (Emotional Sensitivity Assistance System for People with Disabilities) corpus, consisting of spontaneous emotional German speech recorded from 17 mentally, neurologically and/or physically disabled participants in their daily work environment. The recordings amount to just under 11 hours of total speech time and approximately 12.7k utterances after segmentation. The data were transcribed and labelled in seven emotional categories, as well as for speaker intelligibility. We present a set of baseline results for arousal and valence emotion recognition based on standard acoustic and linguistic features.