Open Access
A Scalable Attention Mechanism Based Neural Network for Text Classification
Author(s) - Jianyun Zheng, Jianmin Pang, Xiaochuan Zhang, Di Sun, Xin Zhou, Kai Zhang, Dong Wang, MingLiang Li, Jun Wang
Publication year - 2020
Publication title - Journal of Physics: Conference Series
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.21
H-Index - 85
eISSN - 1742-6596
pISSN - 1742-6588
DOI - 10.1088/1742-6596/1486/2/022019
Subject(s) - computer science , text classification , scalability , attention mechanism , artificial intelligence , artificial neural network , machine learning , deep learning
In general, deep learning based text classification methods are considered effective but tend to be slow, especially during model training. In this work, we present a powerful, so-called “scalable attention mechanism”, which outperforms the conventional attention mechanism in both accuracy and model training speed. Based on the scalable attention mechanism, we propose a neural network for text classification. Experimental results on eight representative datasets show that our method achieves accuracy comparable to state-of-the-art methods while training in under 4 minutes on an NVIDIA GTX 1080 Ti GPU. To the best of our knowledge, our method is at least twice as fast as all published deep learning classifiers.
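
The abstract does not spell out how the scalable attention mechanism works, so for orientation only, the sketch below shows the conventional attention pooling that such text classifiers build on (the baseline the paper claims to improve). The class name, hyperparameters, and toy data are illustrative assumptions, not the authors' method.

# A minimal PyTorch sketch of conventional additive attention pooling
# for text classification. NOTE: this is a generic baseline, not the
# paper's "scalable attention mechanism"; all names and sizes here
# (AttentionTextClassifier, embed_dim, vocab_size, ...) are assumptions.
import torch
import torch.nn as nn

class AttentionTextClassifier(nn.Module):
    def __init__(self, vocab_size, embed_dim=128, num_classes=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        # Score each token, softmax over the sequence, then pool by weighted sum.
        self.score = nn.Linear(embed_dim, 1)
        self.classify = nn.Linear(embed_dim, num_classes)

    def forward(self, token_ids):                      # (batch, seq_len)
        x = self.embed(token_ids)                      # (batch, seq_len, embed_dim)
        weights = torch.softmax(self.score(x), dim=1)  # (batch, seq_len, 1)
        pooled = (weights * x).sum(dim=1)              # (batch, embed_dim)
        return self.classify(pooled)                   # (batch, num_classes)

# Usage on a toy batch of padded token-id sequences (padding masking omitted
# for brevity).
model = AttentionTextClassifier(vocab_size=10000)
batch = torch.randint(1, 10000, (4, 32))               # 4 sequences, 32 tokens each
logits = model(batch)                                  # (4, 2) class scores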
