Open Access
A Hybrid GCN and RNN Structure Based on Attention Mechanism for Text Classification
Author(s) -
Lingchao Gao,
Jiakai Wang,
Zhixian Pi,
Huaixun Zhang,
Xiao Yang,
Peizhuo Huang,
Jing Sun
Publication year - 2020
Publication title -
Journal of Physics: Conference Series
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.21
H-Index - 85
eISSN - 1742-6596
pISSN - 1742-6588
DOI - 10.1088/1742-6596/1575/1/012130
Subject(s) - recurrent neural network , computer science , artificial intelligence , graph , convolutional neural network , embedding , network structure , artificial neural network , deep learning , time delay neural network , layer (electronics) , field (mathematics) , pattern recognition (psychology) , machine learning , theoretical computer science , chemistry , mathematics , organic chemistry , pure mathematics
In the field of deep learning, recurrent neural networks are usually the better fit for tasks that are sensitive to time series, such as natural language processing or speech recognition. Long short-term memory (LSTM) is a representative recurrent architecture: it captures temporal dependencies and enables a global representation of features. However, issues such as the large number of network parameters in an LSTM limit its applicability. This paper proposes an improved hybrid structure combining a graph convolutional neural network and a recurrent neural network. In the input layer, a two-dimensional convolutional neural network is used to build a text corpus graph. Graph embedding is used to preserve the global structure of the text graph. An LSTM layer and an attention mechanism then carry out the text classification and improve computational efficiency. Test results show that the hybrid network structure runs faster on the IMDb dataset.
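The abstract outlines a pipeline of graph convolution over a text graph, followed by an LSTM layer and an attention mechanism. Below is a minimal PyTorch sketch of that kind of hybrid, not the authors' exact model: the class names, layer sizes, and the identity-adjacency placeholder are illustrative assumptions, and the paper's 2D-CNN graph-construction step is replaced here by a precomputed adjacency matrix passed in as input.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class GraphConvolution(nn.Module):
    """One GCN layer: H' = ReLU(A_hat . H . W), applied per document graph."""

    def __init__(self, in_features, out_features):
        super().__init__()
        self.linear = nn.Linear(in_features, out_features)

    def forward(self, x, adj):
        # x: (batch, nodes, in_features); adj: (batch, nodes, nodes) normalized adjacency
        return F.relu(torch.bmm(adj, self.linear(x)))


class HybridGCNRNN(nn.Module):
    """Sketch of a GCN -> BiLSTM -> attention -> classifier pipeline."""

    def __init__(self, vocab_size, embed_dim=128, gcn_dim=128,
                 lstm_dim=128, num_classes=2):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        self.gcn = GraphConvolution(embed_dim, gcn_dim)
        self.lstm = nn.LSTM(gcn_dim, lstm_dim, batch_first=True,
                            bidirectional=True)
        self.attn = nn.Linear(2 * lstm_dim, 1)          # scores each time step
        self.classifier = nn.Linear(2 * lstm_dim, num_classes)

    def forward(self, token_ids, adj):
        # token_ids: (batch, seq_len) word indices; adj: per-document word graph
        h = self.embedding(token_ids)                   # (batch, seq_len, embed_dim)
        h = self.gcn(h, adj)                            # propagate over the text graph
        out, _ = self.lstm(h)                           # (batch, seq_len, 2*lstm_dim)
        weights = F.softmax(self.attn(out).squeeze(-1), dim=-1)    # attention weights
        context = torch.bmm(weights.unsqueeze(1), out).squeeze(1)  # weighted sum
        return self.classifier(context)                 # (batch, num_classes) logits


# Smoke test with an identity graph as a stand-in adjacency; a real pipeline
# would build, e.g., a word co-occurrence graph from the corpus.
model = HybridGCNRNN(vocab_size=10_000)
ids = torch.randint(0, 10_000, (4, 50))
adj = torch.eye(50).unsqueeze(0).repeat(4, 1, 1)
print(model(ids, adj).shape)                            # torch.Size([4, 2])
```

In this sketch the attention layer replaces the final LSTM hidden state with a learned weighted sum over all time steps, which is one common way such a mechanism is combined with an LSTM for classification.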
