Open Access
Sentiment analysis based on BiGRU information enhancement
Author(s) -
Xing Yin,
Changhui Liu,
Fang Xiao-dong
Publication year - 2021
Publication title -
Journal of Physics: Conference Series
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.21
H-Index - 85
eISSN - 1742-6596
pISSN - 1742-6588
DOI - 10.1088/1742-6596/1748/3/032054
Subject(s) - computer science , layer (electronics) , feature (linguistics) , set (abstract data type) , artificial intelligence , stacking , linear subspace , artificial neural network , field (mathematics) , data set , data mining , training set , pattern recognition (psychology) , mathematics , philosophy , linguistics , chemistry , physics , geometry , organic chemistry , nuclear magnetic resonance , pure mathematics , programming language
Existing sentiment analysis models are often built by combining different network architectures, a design that does not always exploit the strengths of each network. This paper proposes pre-training the data with BERT and feeding the resulting representations into stacked BiGRU layers for feature enhancement, where the enhancement is achieved by stacking and reuse: the output of the first layer is the input of the second, and the output of the second is the input of the third. The enhanced features are then computed multiple times in different subspaces to learn the relevant information. Experiments on a Twitter dataset show that, under this design, the two-layer BiGRU classifier performs best, reaching an accuracy of 82.63%, converging faster, and outperforming the other classification models compared, so it completes the classification task well.
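
The following is a minimal PyTorch sketch of the architecture the abstract describes: BERT token features fed into two stacked BiGRU layers, followed by multi-head self-attention as one plausible reading of the "different subspaces" computation. The layer sizes, number of attention heads, pooling step, and classifier head are illustrative assumptions, not details taken from the paper.

    # Sketch only: hidden size, head count, pooling, and output dimension are assumed.
    import torch
    import torch.nn as nn

    class StackedBiGRUClassifier(nn.Module):
        def __init__(self, bert_dim=768, hidden=128, heads=4, num_classes=2):
            super().__init__()
            # First BiGRU layer consumes the BERT token representations.
            self.bigru1 = nn.GRU(bert_dim, hidden, batch_first=True, bidirectional=True)
            # Second BiGRU layer re-encodes the first layer's output (feature enhancement by stacking).
            self.bigru2 = nn.GRU(2 * hidden, hidden, batch_first=True, bidirectional=True)
            # Multi-head self-attention: the sequence is attended to in several subspaces.
            self.attn = nn.MultiheadAttention(2 * hidden, heads, batch_first=True)
            self.classifier = nn.Linear(2 * hidden, num_classes)

        def forward(self, bert_features):
            # bert_features: (batch, seq_len, bert_dim), e.g. BERT's last_hidden_state.
            h1, _ = self.bigru1(bert_features)
            h2, _ = self.bigru2(h1)            # layer 2 input is layer 1 output
            attended, _ = self.attn(h2, h2, h2)
            pooled = attended.mean(dim=1)      # simple mean pooling over tokens (assumed)
            return self.classifier(pooled)

    if __name__ == "__main__":
        # Stand-in for real BERT output: 8 sentences, 32 tokens, 768-dim features.
        dummy = torch.randn(8, 32, 768)
        logits = StackedBiGRUClassifier()(dummy)
        print(logits.shape)  # torch.Size([8, 2])

In practice the dummy tensor would be replaced by the last hidden state of a pre-trained BERT encoder over the tweet text; the sketch only fixes the shape of that interface.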