Open Access
Speech emotion recognition based on brain and mind emotional learning model
Author(s) - Sara Motamed, Saeed Setayeshi, Azam Rabiee
Publication year - 2018
Publication title - Journal of Integrative Neuroscience
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.336
H-Index - 33
eISSN - 1757-448X
pISSN - 0219-6352
DOI - 10.3233/jin-180088
Subject(s) - computer science, speech recognition, emotion recognition, artificial intelligence, artificial neural network, recurrent neural network, long short term memory, noise, cognitive psychology, psychology
Speech emotion recognition is a challenging problem in enabling communication between humans and machines. The present study introduces a new model of speech emotion recognition based on the relationship between the human brain and mind. Following this relationship, the proposed model consists of two parts: a brain short-term memory (BSTM) and a mind long-term memory (MLTM). The BSTM receives emotional speech signals as input and forwards a copy of the information to the MLTM, because the brain needs to store information as knowledge in a larger and safer place, analogous to the human mind. The proposed model not only provides a computational model of speech emotion recognition based on the relationship between the BSTM and MLTM, but also illustrates a new relationship between brain and mind. The proposed model has been compared with other recognition models. To demonstrate the robustness of the suggested model, the effect of noise at different noise rates on the input signals is analyzed in the experiments. Experimental results show that the proposed algorithm can identify human emotion even in noisy environments.
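The paper does not publish code, but the two-part architecture described above can be illustrated with a minimal sketch. Assuming frame-level feature vectors (e.g., MFCC-like features) stand in for the emotional speech signals, the BSTM below acts as a bounded buffer of recent frames that forwards a copy of each frame to the MLTM, which consolidates per-emotion prototypes and recalls the nearest one. Only the BSTM/MLTM names and the copy-forward behavior come from the abstract; the buffer size, running-mean prototypes, and nearest-prototype recall are illustrative assumptions, not the authors' method.

```python
import numpy as np

class MLTM:
    """Mind long-term memory: keeps a running mean prototype per emotion label
    and recalls the nearest prototype (an assumed stand-in for the paper's
    learning rule)."""
    def __init__(self):
        self.prototypes = {}   # label -> running mean feature vector
        self.counts = {}

    def store(self, frame, label):
        if label is None:
            return             # unlabeled frames are not consolidated here
        if label not in self.prototypes:
            self.prototypes[label] = np.zeros_like(frame, dtype=float)
            self.counts[label] = 0
        self.counts[label] += 1
        # incremental mean update of the stored prototype
        self.prototypes[label] += (frame - self.prototypes[label]) / self.counts[label]

    def recall(self, frame):
        return min(self.prototypes,
                   key=lambda lab: np.linalg.norm(frame - self.prototypes[lab]))

class BSTM:
    """Brain short-term memory: a bounded buffer of recent feature frames.
    Each observed frame is copied forward to the MLTM, as the abstract
    describes; the fixed capacity is an illustrative assumption."""
    def __init__(self, mltm, capacity=32):
        self.mltm = mltm
        self.capacity = capacity
        self.buffer = []

    def observe(self, frame, label=None):
        self.buffer.append(frame)
        if len(self.buffer) > self.capacity:
            self.buffer.pop(0)                  # forget the oldest frame
        self.mltm.store(frame.copy(), label)    # hand a copy to long-term memory
        return frame

# Toy usage: synthetic 13-dim "feature" frames for two emotions, then a
# noisy query to mimic the paper's noise-robustness experiments.
rng = np.random.default_rng(0)
mltm = MLTM()
bstm = BSTM(mltm)

for _ in range(200):
    bstm.observe(rng.normal(loc=0.0, scale=1.0, size=13), label="neutral")
    bstm.observe(rng.normal(loc=3.0, scale=1.0, size=13), label="anger")

clean_query = np.full(13, 3.0)
noisy_query = clean_query + rng.normal(scale=2.0, size=13)  # additive noise
print(mltm.recall(clean_query))   # -> "anger"
print(mltm.recall(noisy_query))   # typically still "anger" despite the noise
```

The prototype averaging gives the long-term store some tolerance to additive noise, which loosely mirrors the noise-rate experiments reported in the abstract; the actual model would replace the recall rule with its learned brain-mind mapping.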
