Open Access
Hybrid attention mechanism for few‐shot relational learning of knowledge graphs
Author(s) - Ma Ruixin, Li Zeyang, Guo Fangqing, Zhao Liang
Publication year - 2021
Publication title - IET Computer Vision
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.38
H-Index - 37
eISSN - 1751-9640
pISSN - 1751-9632
DOI - 10.1049/cvi2.12066
Subject(s) - computer science , knowledge graph , artificial intelligence , embedding , graph , machine learning , matching (statistics) , statistical relational learning , theoretical computer science , data mining , relational database , mathematics , statistics
Few‐shot knowledge graph (KG) reasoning is a central problem in the field of knowledge graph reasoning. To broaden the application areas of knowledge graphs, most existing studies rely on large numbers of training samples. In practice, however, many relations and entities are missing from a knowledge graph, and a newly added relation typically comes with only a handful of training instances. To tackle this, the authors aim to predict a new entity given only a few reference instances, or even a single training instance. A few‐shot learning framework based on a hybrid attention mechanism is proposed. The framework employs traditional embedding models to extract knowledge, and uses an attenuated attention network together with a self‐attention mechanism to capture the hidden attributes of entities. It can therefore learn a matching metric that considers both the learnt embeddings and one‐hop graph structures. Experimental results show that the model achieves significant performance improvements on the NELL‐One and Wiki‐One datasets.
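The abstract describes a pipeline of three parts: encoding each entity from its pretrained embedding plus its one‐hop neighbourhood with attenuated attention, applying self‐attention over the few reference pairs of a new relation, and scoring a query pair with a matching metric. The PyTorch sketch below illustrates one way such a pipeline could be wired up; the module names, tensor layouts, decay weights and cosine score are assumptions made for illustration, not the authors' implementation.

```python
# Minimal sketch (not the authors' code) of a hybrid-attention matcher for
# few-shot KG completion: a neighbour encoder with attenuated attention over
# one-hop (relation, neighbour) pairs, self-attention over the reference set,
# and a cosine matching score between the reference prototype and the query.
import torch
import torch.nn as nn
import torch.nn.functional as F


class NeighborEncoder(nn.Module):
    """Encode an entity from its one-hop neighbours, with attention logits
    attenuated by a per-neighbour decay weight (an assumed attenuation form)."""

    def __init__(self, dim):
        super().__init__()
        self.proj = nn.Linear(2 * dim, dim)
        self.att = nn.Linear(dim, 1, bias=False)

    def forward(self, ent_emb, rel_embs, nbr_embs, decay):
        # rel_embs, nbr_embs: (num_neighbours, dim); decay: (num_neighbours,)
        h = torch.tanh(self.proj(torch.cat([rel_embs, nbr_embs], dim=-1)))
        logits = self.att(h).squeeze(-1) * decay        # attenuated attention
        alpha = F.softmax(logits, dim=-1)
        ctx = (alpha.unsqueeze(-1) * h).sum(dim=0)
        return ent_emb + ctx                            # embedding + one-hop structure


class HybridAttentionMatcher(nn.Module):
    """Score a query (head, tail) pair against a few reference pairs."""

    def __init__(self, dim, heads=4):
        super().__init__()
        self.encoder = NeighborEncoder(dim)
        # self-attention over the reference set to weight informative examples
        self.self_att = nn.MultiheadAttention(2 * dim, heads, batch_first=True)

    def encode_pair(self, head, tail):
        # head/tail: dicts with 'emb', 'rel_embs', 'nbr_embs', 'decay' tensors
        h = self.encoder(head["emb"], head["rel_embs"], head["nbr_embs"], head["decay"])
        t = self.encoder(tail["emb"], tail["rel_embs"], tail["nbr_embs"], tail["decay"])
        return torch.cat([h, t], dim=-1)

    def forward(self, references, query):
        # references: list of (head, tail) dicts; query: one (head, tail) pair
        ref = torch.stack([self.encode_pair(h, t) for h, t in references]).unsqueeze(0)
        ref, _ = self.self_att(ref, ref, ref)           # (1, K, 2*dim)
        proto = ref.mean(dim=1).squeeze(0)              # reference prototype
        q = self.encode_pair(*query)
        return F.cosine_similarity(proto, q, dim=-1)    # matching score
```

In a few‐shot episode, the K reference pairs of a new relation would be encoded once, and each candidate tail entity for a query head would then be ranked by the resulting matching score.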
