
Residual Fusion Network Based Attention for Aspect Level Sentiment Analysis
Author(s) -
Long Wang,
Lijiang Chen,
Linghan Cai,
Xia Mao
Publication year - 2020
Publication title -
Journal of Physics: Conference Series
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.21
H-Index - 85
eISSN - 1742-6596
pISSN - 1742-6588
DOI - 10.1088/1742-6596/1673/1/012064
Subject(s) - computer science, residual, sentence, sentiment analysis, artificial intelligence, natural language processing, classifier (UML), CRFs, laptop, transformer, encode, conditional random field, algorithm, physics, quantum mechanics, voltage, operating system, biochemistry, chemistry, gene
Aspect-level sentiment analysis is a finer-grained task than document-level and sentence-level sentiment analysis. Its aim is to identify the sentiment polarity expressed toward a given aspect in the text. In this paper, we propose a residual aspect fusion network with attention for aspect-level sentiment classification. In this network, a bidirectional GRU collects semantic information from both sentence words and aspect words, and an attention mechanism then constructs an aspect vector for each word in the sentence. In the residual fusion module, the model fuses the aspect information and the sentence information through a residual connection structure, in which position weights exploit the position information of the aspect words. At the end of the model, a Transformer encoder layer extracts global features, and fully connected layers serve as the classifier. We train and test the network on the SemEval-2014 Task 4 (Restaurant and Laptop) and Twitter datasets, and the experiments demonstrate that our model outperforms previous methods.
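The core of the residual fusion module described above can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: it assumes scaled dot-product attention from sentence states to aspect states, a linear distance-based position weight, and a simple additive residual; the function name `residual_aspect_fusion` and the exact weighting form are hypothetical.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def residual_aspect_fusion(H, A, aspect_start, aspect_end):
    """Fuse aspect information into sentence states via attention + residual.

    H : (n, d) hidden states of sentence words (e.g. from a BiGRU)
    A : (m, d) hidden states of aspect words
    aspect_start, aspect_end : index span of the aspect in the sentence
    Returns (n, d) fused representations.
    """
    n, d = H.shape
    # Each sentence word attends over the aspect words (assumed attention form).
    scores = H @ A.T / np.sqrt(d)            # (n, m)
    alpha = softmax(scores, axis=-1)         # attention weights
    aspect_vec = alpha @ A                   # (n, d) per-word aspect vectors
    # Position weight: words closer to the aspect span get larger weight
    # (one plausible linear form; the paper's exact formula may differ).
    idx = np.arange(n)
    dist = np.where(idx < aspect_start, aspect_start - idx,
                    np.where(idx > aspect_end, idx - aspect_end, 0))
    w = (1.0 - dist / n)[:, None]            # (n, 1), equals 1 inside the span
    # Residual connection: sentence states plus position-weighted aspect info.
    return H + w * aspect_vec

# Toy usage with random states standing in for BiGRU outputs.
rng = np.random.default_rng(0)
H = rng.normal(size=(6, 4))                  # 6-word sentence, dim 4
A = rng.normal(size=(2, 4))                  # 2-word aspect
fused = residual_aspect_fusion(H, A, aspect_start=2, aspect_end=3)
```

In the full model, `fused` would then pass through the Transformer encoder layer and the fully connected classifier; the residual form keeps the original sentence semantics intact while injecting aspect-conditioned information.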