Open Access
Speech Emotion Recognition
Author(s) - A Swethashree
Publication year - 2021
Publication title - International Journal for Research in Applied Science and Engineering Technology
Language(s) - English
Resource type - Journals
ISSN - 2321-9653
DOI - 10.22214/ijraset.2021.37375
Subject(s) - disgust , mel frequency cepstrum , computer science , speech recognition , classifier (uml) , feeling , python (programming language) , emotion classification , perceptron , happiness , artificial intelligence , feature extraction , artificial neural network , psychology , anger , social psychology , operating system
Abstract: Speech Emotion Recognition, abbreviated as SER, is the task of identifying a speaker's emotional state from speech. This is possible because emotion is often reflected in the tone and pitch of the voice. Emotion recognition has become a fast-growing field of research in recent years. Unlike humans, machines cannot inherently comprehend and express emotions, but human-computer interaction can be improved by automatic emotion recognition, reducing the need for human intervention. In this project, basic emotions such as calm, happiness, fear and disgust are analysed from speech signals. We use machine learning techniques such as the Multilayer Perceptron classifier (MLPClassifier), which classifies the given data into its respective categories. Mel-frequency cepstral coefficients (MFCC), chroma and mel features are extracted from the speech signals and used to train the MLP classifier. To accomplish this, we use Python libraries such as librosa, sklearn, pyaudio, numpy and soundfile to analyse the speech signals and recognise the emotion. Keywords: speech emotion recognition, mel-frequency cepstral coefficients, artificial neural network, multilayer perceptron, MLP classifier, Python.
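The pipeline the abstract describes can be sketched in a few lines of Python. The following is a minimal illustration only, not the authors' exact configuration: the .wav file paths and emotion labels are hypothetical, and the feature choices (40 time-averaged MFCCs plus chroma and mel-spectrogram features via librosa) and the scikit-learn MLPClassifier settings are plausible defaults assumed for the sketch.

import numpy as np
import librosa
from sklearn.neural_network import MLPClassifier

def extract_features(path):
    # Load the audio file (librosa resamples to 22050 Hz by default)
    y, sr = librosa.load(path)
    # 40 MFCCs, averaged over all frames to get a fixed-length vector
    mfccs = np.mean(librosa.feature.mfcc(y=y, sr=sr, n_mfcc=40).T, axis=0)
    # Chroma features computed from the short-time Fourier transform
    stft = np.abs(librosa.stft(y))
    chroma = np.mean(librosa.feature.chroma_stft(S=stft, sr=sr).T, axis=0)
    # Mel-scaled spectrogram features, also averaged over time
    mel = np.mean(librosa.feature.melspectrogram(y=y, sr=sr).T, axis=0)
    # Concatenate MFCC, chroma and mel features into one feature vector
    return np.hstack([mfccs, chroma, mel])

# Hypothetical labelled recordings; a real run would use a full emotion corpus
files = [("calm_01.wav", "calm"), ("happy_01.wav", "happy")]
X = np.array([extract_features(path) for path, _ in files])
labels = np.array([label for _, label in files])

# Multilayer perceptron classifier trained on the extracted features
clf = MLPClassifier(hidden_layer_sizes=(300,), max_iter=500)
clf.fit(X, labels)
print(clf.predict(X))

In practice the classifier would be trained on a labelled emotion corpus and evaluated on a held-out test split rather than on the training files themselves.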
