
Hierarchical graph attention network for relation extraction
Author(s) -
Hongru Sun,
Wancheng Ni,
Yiping Yang
Publication year - 2021
Publication title -
Journal of Physics: Conference Series
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.21
H-Index - 85
eISSN - 1742-6596
pISSN - 1742-6588
DOI - 10.1088/1742-6596/1780/1/012030
Subject(s) - computer science , dependency (uml) , dependency graph , relationship extraction , graph , artificial intelligence , tree (set theory) , pooling , relation (database) , security token , theoretical computer science , data mining , mathematics , mathematical analysis , computer security
Previous research on relation extraction has proved the effectiveness of using dependency trees, which build non-local connections between tokens. However, existing dependency-based models treat the dependency tree as an inherently flat graph, which loses dependency information when representing sentences. They also fail to account for the fact that the importance of tokens in the dependency tree varies with the relation to be extracted. In this paper, we propose a novel hierarchical graph attention network (HierGAT), which generates multi-level dependency trees and extracts key information from them to improve relation extraction. Specifically, it contains multiple dependency-based graph attention layers, each of which takes as input a different dependency tree generated by an adaptive subtree pruning strategy and distinguishes the importance of different tokens in that tree. Finally, HierGAT integrates the output token representations of each layer with a multi-head attention mechanism and learns a sentence representation for relation extraction through a pooling layer. The experimental results demonstrate that our method outperforms state-of-the-art baselines on the benchmark datasets.
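The sketch below illustrates the architecture the abstract describes: several graph attention layers, each attending over a differently pruned dependency tree, whose token representations are fused with multi-head attention and pooled into a sentence vector for relation classification. It is a minimal PyTorch interpretation of the abstract only; the layer count, hidden size, relation count, pruning inputs, and fusion details are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of the HierGAT idea from the abstract (PyTorch).
# All hyperparameters and the fusion/pooling choices are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class DependencyGraphAttentionLayer(nn.Module):
    """One graph attention layer over a (pruned) dependency tree."""

    def __init__(self, dim):
        super().__init__()
        self.proj = nn.Linear(dim, dim)
        self.attn = nn.Linear(2 * dim, 1)

    def forward(self, tokens, adjacency):
        # tokens: (batch, seq, dim); adjacency: (batch, seq, seq) 0/1 mask
        h = self.proj(tokens)
        n = h.size(1)
        # Score every token pair, then mask pairs not linked in this tree.
        hi = h.unsqueeze(2).expand(-1, -1, n, -1)
        hj = h.unsqueeze(1).expand(-1, n, -1, -1)
        scores = self.attn(torch.cat([hi, hj], dim=-1)).squeeze(-1)
        scores = scores.masked_fill(adjacency == 0, float("-inf"))
        weights = torch.nan_to_num(F.softmax(scores, dim=-1))  # rows with no neighbours
        return F.relu(weights @ h)


class HierGATSketch(nn.Module):
    """Stack of dependency GAT layers, one per pruned-tree level, fused by
    multi-head attention and max-pooled into a sentence representation."""

    def __init__(self, dim, num_levels=3, num_relations=19, heads=4):
        super().__init__()
        self.layers = nn.ModuleList(
            DependencyGraphAttentionLayer(dim) for _ in range(num_levels)
        )
        self.fuse = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.classifier = nn.Linear(dim, num_relations)

    def forward(self, tokens, pruned_adjacencies):
        # pruned_adjacencies: one adjacency mask per subtree-pruning level
        level_outputs = [
            layer(tokens, adj) for layer, adj in zip(self.layers, pruned_adjacencies)
        ]
        stacked = torch.cat(level_outputs, dim=1)        # token outputs of all levels
        fused, _ = self.fuse(stacked, stacked, stacked)  # multi-head integration
        sentence = fused.max(dim=1).values               # pooling layer
        return self.classifier(sentence)


# Example with random stand-ins for encoded tokens and pruned dependency trees.
model = HierGATSketch(dim=128)
tokens = torch.randn(2, 10, 128)
adjs = [torch.randint(0, 2, (2, 10, 10)).float() for _ in range(3)]
logits = model(tokens, adjs)  # (2, num_relations)
```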