Open Access
The Deep Neural Network and Content Transcription-based Speech Recognition Algorithm in Keyword Detection
Author(s) - Lei Zheng
Publication year - 2020
Publication title - Journal of Physics: Conference Series
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.21
H-Index - 85
eISSN - 1742-6596
pISSN - 1742-6588
DOI - 10.1088/1742-6596/1544/1/012150
Subject(s) - hidden Markov model, speech recognition, computer science, artificial neural network, artificial intelligence, pattern recognition (psychology), acoustic model, mixture model, speech processing
Objective: This study aims to improve the accuracy of speech recognition in keyword detection using a deep neural network-Hidden Markov Model (DNN-HMM) acoustic model and a long short-term memory neural network-Hidden Markov Model (LSTM-HMM) acoustic model. Method: First, the principles of speech recognition and the related algorithms are introduced. Then, the DNN algorithm is applied to the speech recognition system, and a keyword detection system is built on the DNN-HMM acoustic model. For the experimental comparison, the proposed DNN model is used for acoustic modeling, and the effect of the DNN algorithm on the performance of the recognition system is analyzed experimentally. Results: The proposed LSTM model has 436,570 trainable parameters, the DNN has 698,100, and the Gaussian mixture model (GMM) has 1,226,700. The speech recognition accuracy of the LSTM-HMM and DNN-HMM models is 96.5% and 91.6%, respectively, significantly higher than the 78.5% of the traditional GMM-HMM speech recognition model. Conclusion: Speech recognition based on the LSTM-HMM and DNN-HMM models achieves higher accuracy and is better suited to speech keyword detection.
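
For readers unfamiliar with the hybrid setup the abstract refers to, below is a minimal PyTorch sketch of a DNN acoustic model as used in a DNN-HMM system: a feed-forward network maps each frame of acoustic features to log-posteriors over HMM states, which are then converted to scaled likelihoods for HMM (Viterbi) decoding of the keyword models. All sizes (FEATURE_DIM, HIDDEN_DIM, NUM_STATES) and the layer layout are illustrative assumptions; the paper's actual architectures, including the LSTM-HMM variant, are not specified in the abstract.

import math
import torch
import torch.nn as nn

# Hypothetical dimensions -- the abstract does not give the feature
# dimension, layer widths, or number of tied HMM states.
FEATURE_DIM = 39   # e.g. 13 MFCCs + deltas + delta-deltas
HIDDEN_DIM = 256
NUM_STATES = 128   # tied HMM states (senones)

class DNNAcousticModel(nn.Module):
    """Feed-forward acoustic model for a hybrid DNN-HMM system:
    maps each frame of acoustic features to log-posteriors over HMM states."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(FEATURE_DIM, HIDDEN_DIM), nn.ReLU(),
            nn.Linear(HIDDEN_DIM, HIDDEN_DIM), nn.ReLU(),
            nn.Linear(HIDDEN_DIM, NUM_STATES),
        )

    def forward(self, frames):  # frames: (T, FEATURE_DIM)
        return torch.log_softmax(self.net(frames), dim=-1)

model = DNNAcousticModel()
log_posteriors = model(torch.randn(100, FEATURE_DIM))  # 100 dummy frames

# In the hybrid scheme, the HMM decoder needs scaled likelihoods
# p(x|s) proportional to p(s|x) / p(s), so subtract the log state priors
# (uniform here as a placeholder) before Viterbi decoding.
log_priors = torch.full((NUM_STATES,), -math.log(NUM_STATES))
log_likelihoods = log_posteriors - log_priors
print(log_likelihoods.shape)  # torch.Size([100, 128])

An LSTM-HMM variant would replace the feed-forward stack with an nn.LSTM run over the whole frame sequence, keeping the same posterior-to-likelihood conversion; the recurrence lets the model exploit temporal context, which is consistent with the higher accuracy the paper reports for LSTM-HMM.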
