
D‐BERT: Incorporating dependency‐based attention into BERT for relation extraction
Author(s) - Huang Yuan, Li Zhixing, Deng Wei, Wang Guoyin, Lin Zhimin
Publication year - 2021
Publication title - CAAI Transactions on Intelligence Technology
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.613
H-Index - 15
ISSN - 2468-2322
DOI - 10.1049/cit2.12033
Subject(s) - relation extraction , computer science , transformer , relation (database) , natural language processing , dependency grammar , artificial intelligence , encoder , information extraction , granularity , representation , data mining
Relation extraction between entity pairs is an increasingly important task in natural language processing. Recently, the pre‐trained bidirectional encoder representations from transformers (BERT) model has performed excellently on text classification and sequence labelling tasks. Here, high‐level syntactic features that capture the dependency between each word and the target entities are incorporated into the pre‐trained language model. The model also exploits the intermediate layers of BERT to acquire different levels of semantic information and designs multi‐granularity features for the final relation classification. The model offers a significant improvement over previously published methods for relation extraction on widely used data sets.
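The abstract describes an architecture rather than giving code, so the following is a minimal sketch of the idea as stated: token states from several intermediate BERT layers are pooled with attention weights biased by each word's dependency distance to the target entities, and the pooled multi-granularity features feed a relation classifier. It is not the authors' released implementation; the layer choices, the distance-to-weight mapping, and all module and variable names below are illustrative assumptions.

```python
import torch
import torch.nn as nn


class DependencyAttentionPooling(nn.Module):
    """Pool token states with weights biased by dependency distance to the entities."""

    def __init__(self, hidden_size: int):
        super().__init__()
        self.score = nn.Linear(hidden_size, 1)

    def forward(self, hidden, dep_distance, mask):
        # hidden:       (batch, seq_len, hidden)  token states from one BERT layer
        # dep_distance: (batch, seq_len)           hops in the dependency tree to the nearest entity
        # mask:         (batch, seq_len)           1 for real tokens, 0 for padding
        logits = self.score(hidden).squeeze(-1)          # content-based score per token
        logits = logits - dep_distance.float()           # penalise syntactically distant tokens (assumed mapping)
        logits = logits.masked_fill(mask == 0, -1e9)     # ignore padding
        attn = torch.softmax(logits, dim=-1)             # (batch, seq_len)
        return torch.einsum("bs,bsh->bh", attn, hidden)  # weighted sum over tokens


class MultiGranularityRelationHead(nn.Module):
    """Combine pooled features from several intermediate layers for relation classification."""

    def __init__(self, hidden_size: int, num_layers_used: int, num_relations: int):
        super().__init__()
        self.poolers = nn.ModuleList(
            DependencyAttentionPooling(hidden_size) for _ in range(num_layers_used)
        )
        self.classifier = nn.Linear(hidden_size * num_layers_used, num_relations)

    def forward(self, layer_states, dep_distance, mask):
        # layer_states: list of (batch, seq_len, hidden) tensors, one per chosen BERT layer
        pooled = [p(h, dep_distance, mask) for p, h in zip(self.poolers, layer_states)]
        return self.classifier(torch.cat(pooled, dim=-1))  # (batch, num_relations)


# Toy usage with random tensors standing in for BERT hidden states
# (e.g. the outputs of layers 4, 8, and 12 obtained with output_hidden_states=True).
batch, seq_len, hidden_size = 2, 16, 768
layer_states = [torch.randn(batch, seq_len, hidden_size) for _ in range(3)]
dep_distance = torch.randint(0, 6, (batch, seq_len))
mask = torch.ones(batch, seq_len, dtype=torch.long)
head = MultiGranularityRelationHead(hidden_size, num_layers_used=3, num_relations=19)
print(head(layer_states, dep_distance, mask).shape)  # torch.Size([2, 19])
```

In this sketch the dependency information enters only as an additive bias on the attention logits; the paper's actual mechanism for injecting dependency-based attention into BERT may differ in detail.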