Open Access
Deep Pyramid Convolutional Neural Network Integrated with Self-attention Mechanism and Highway Network for Text Classification
Author(s) - Xuewei Li, Hongyun Ning
Publication year - 2020
Publication title - Journal of Physics: Conference Series
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.21
H-Index - 85
eISSN - 1742-6596
pISSN - 1742-6588
DOI - 10.1088/1742-6596/1642/1/012008
Subject(s) - computer science, convolutional neural network, artificial intelligence, pyramid (geometry), deep learning, feature (linguistics), mechanism (biology), artificial neural network, feature extraction, pattern recognition (psychology), machine learning, natural language processing, linguistics, philosophy, physics, epistemology, optics
Text classification is one of the basic tasks of natural language processing, and in recent years deep learning, most notably the convolutional neural network, has been widely applied to it. DPCNN is a deep convolutional text classification model that can capture long-distance dependencies in text, but it focuses on extracting global features and neglects local ones. Local feature information can be highly informative and plays an important role in text classification. This paper therefore introduces a self-attention mechanism to extract the local text features that DPCNN misses. In addition, although a deep convolutional network can extract deeper features, it is prone to vanishing gradients during training; a highway network is therefore introduced to mitigate this problem and improve model performance. Experimental results show that the proposed model outperforms a single DPCNN model, further improving the accuracy of text classification.
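The page does not include the authors' code, but the three building blocks the abstract names are standard and can be illustrated concretely. Below is a minimal PyTorch sketch, assuming a single-head scaled dot-product self-attention layer, a classic highway layer (gated mix of a transformed path and an identity path), and a DPCNN-style residual convolution block with stride-2 pooling; the class names, kernel size 3, pooling window, and tensor shapes are illustrative assumptions, not the paper's exact configuration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SelfAttention(nn.Module):
    """Single-head scaled dot-product self-attention over the token axis,
    used here to re-weight local features before the convolution stack."""
    def __init__(self, dim):
        super().__init__()
        self.q = nn.Linear(dim, dim)
        self.k = nn.Linear(dim, dim)
        self.v = nn.Linear(dim, dim)

    def forward(self, x):                        # x: (batch, seq_len, dim)
        q, k, v = self.q(x), self.k(x), self.v(x)
        scores = q @ k.transpose(1, 2) / (x.size(-1) ** 0.5)
        return torch.softmax(scores, dim=-1) @ v  # (batch, seq_len, dim)

class Highway(nn.Module):
    """Highway layer: y = t * H(x) + (1 - t) * x with a learned gate t.
    The identity path lets gradients flow through deep stacks, which is
    the vanishing-gradient remedy the abstract describes."""
    def __init__(self, dim):
        super().__init__()
        self.transform = nn.Linear(dim, dim)
        self.gate = nn.Linear(dim, dim)

    def forward(self, x):
        h = F.relu(self.transform(x))
        t = torch.sigmoid(self.gate(x))
        return t * h + (1.0 - t) * x

class PyramidBlock(nn.Module):
    """One DPCNN-style block: two equal-width convolutions with a residual
    connection, then max pooling with stride 2, which halves the sequence
    length at every block (the 'pyramid' shape)."""
    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv1d(channels, channels, kernel_size=3, padding=1)
        self.conv2 = nn.Conv1d(channels, channels, kernel_size=3, padding=1)

    def forward(self, x):                         # x: (batch, channels, seq_len)
        h = self.conv2(F.relu(self.conv1(F.relu(x)))) + x  # pre-activation residual
        return F.max_pool1d(h, kernel_size=3, stride=2, padding=1)

# Hypothetical end-to-end shapes: batch 8, sequence 64, feature dim 128.
x = torch.randn(8, 64, 128)
attended = SelfAttention(128)(x)                  # local-feature re-weighting
gated = Highway(128)(attended)                    # gated identity path
pooled = PyramidBlock(128)(gated.transpose(1, 2)) # (8, 128, 32): seq halved
```

Stacking several `PyramidBlock` layers and finishing with global pooling plus a linear classifier would complete the model; how the paper orders attention relative to the convolution stack is not specified on this page, so the ordering above is one plausible reading.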
