
Single Document Extractive Summarization Model Based on Heterogeneous Graph Transformer
Author(s) -
Gan Liu,
Peng He
Publication year - 2022
Publication title -
Journal of Physics: Conference Series
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.21
H-Index - 85
eISSN - 1742-6596
pISSN - 1742-6588
DOI - 10.1088/1742-6596/2171/1/012012
Subject(s) - computer science , automatic summarization , graph , transformer , generality , artificial intelligence , data mining , theoretical computer science
At present, graph-based summarization models suffer from insufficient semantic fusion between nodes and a lack of positional information. This paper therefore proposes a single-document extractive summarization model based on a heterogeneous graph attention network: a Heterogeneous Graph Transformer (HGT) is used to address the insufficient deep semantic fusion between nodes, and trainable positional encodings are used to supply the missing positional information. Experiments show that the model improves on the three evaluation metrics ROUGE-1 (R_1), ROUGE-2 (R_2), and ROUGE-L (R_L), and that the extracted summaries have better generality.
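The abstract only outlines the architecture, so the following is a minimal, hedged sketch of the general idea rather than the authors' implementation: word nodes and sentence nodes form a heterogeneous graph, sentence nodes receive trainable position embeddings, sentence nodes aggregate their words through attention (here a single-head simplification of HGT's type-specific multi-head attention), and a linear scorer produces per-sentence extraction logits. All names (SentenceUpdateLayer, ExtractiveSummarizer, d_model, max_sentences, the word-in-sentence incidence matrix) are illustrative assumptions.

```python
# Illustrative sketch only (not the paper's code): simplified heterogeneous
# word->sentence attention plus trainable sentence-position embeddings,
# followed by a binary "extract this sentence" scorer.
import torch
import torch.nn as nn


class SentenceUpdateLayer(nn.Module):
    """One simplified heterogeneous attention step: each sentence node
    aggregates the word nodes it contains via type-specific projections
    and scaled dot-product attention (a rough stand-in for HGT)."""

    def __init__(self, d_model: int):
        super().__init__()
        self.q_sent = nn.Linear(d_model, d_model)   # query projection for sentence nodes
        self.k_word = nn.Linear(d_model, d_model)   # key projection for word nodes
        self.v_word = nn.Linear(d_model, d_model)   # value projection for word nodes
        self.out = nn.Linear(d_model, d_model)
        self.scale = d_model ** -0.5

    def forward(self, sent_h, word_h, incidence):
        # sent_h:    (num_sents, d_model)  sentence-node features
        # word_h:    (num_words, d_model)  word-node features
        # incidence: (num_sents, num_words) 1 if the word occurs in the sentence
        q = self.q_sent(sent_h)                      # (S, d)
        k = self.k_word(word_h)                      # (W, d)
        v = self.v_word(word_h)                      # (W, d)
        scores = q @ k.t() * self.scale              # (S, W)
        scores = scores.masked_fill(incidence == 0, float("-inf"))
        attn = torch.softmax(scores, dim=-1)
        attn = torch.nan_to_num(attn)                # sentences with no words -> zeros
        msg = attn @ v                               # (S, d) aggregated word messages
        return sent_h + self.out(msg)                # residual update of sentence nodes


class ExtractiveSummarizer(nn.Module):
    """Scores each sentence for inclusion in the extractive summary."""

    def __init__(self, d_model: int = 256, num_layers: int = 2, max_sentences: int = 128):
        super().__init__()
        # Trainable positional encoding for sentence order (the "missing
        # position information" the abstract refers to).
        self.pos_emb = nn.Embedding(max_sentences, d_model)
        self.layers = nn.ModuleList(
            [SentenceUpdateLayer(d_model) for _ in range(num_layers)]
        )
        self.scorer = nn.Linear(d_model, 1)

    def forward(self, sent_h, word_h, incidence):
        positions = torch.arange(sent_h.size(0), device=sent_h.device)
        sent_h = sent_h + self.pos_emb(positions)    # inject sentence positions
        for layer in self.layers:
            sent_h = layer(sent_h, word_h, incidence)
        return self.scorer(sent_h).squeeze(-1)       # (S,) extraction logits


if __name__ == "__main__":
    S, W, d = 5, 40, 256                             # toy sizes
    sent_feats = torch.randn(S, d)                   # e.g. pooled sentence encodings
    word_feats = torch.randn(W, d)                   # e.g. word embeddings
    incidence = (torch.rand(S, W) > 0.7).float()     # toy word-in-sentence graph
    model = ExtractiveSummarizer(d_model=d)
    logits = model(sent_feats, word_feats, incidence)
    print(logits.shape)                              # torch.Size([5])
```

In a real pipeline the top-scoring sentences (after a sigmoid over the logits) would be selected as the summary; the paper's model additionally uses HGT's edge-type-aware, multi-head message passing, which this sketch deliberately omits.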