Open Access
Learning to Complete Knowledge Graphs with Deep Sequential Models
Author(s) -
Lingbing Guo,
Qinghe Zhang,
Wei Hu,
Zequn Sun,
Yuzhong Qu
Publication year - 2019
Publication title -
Data Intelligence
Language(s) - English
Resource type - Journals
eISSN - 2096-7004
pISSN - 2641-435X
DOI - 10.1162/dint_a_00016
Subject(s) - knowledge graph , computer science , artificial intelligence , deep learning , artificial neural network , recurrent neural network , machine learning , natural language processing , data mining , relation (database) , benchmark
Knowledge graph (KG) completion aims to fill in the missing facts in a KG, where a fact is typically represented as a triple of the form (head, relation, tail). Traditional KG completion methods require two of the three elements of a triple (e.g., head and relation) to be given in order to predict the remaining one. In this paper, we propose a new method that extends multi-layer recurrent neural networks (RNNs) to model the triples in a KG as sequences. On two benchmark data sets, it achieves state-of-the-art performance on the common entity prediction task, i.e., predicting the tail (or head) given the head (or tail) and the relation. Furthermore, the deep sequential nature of our method enables it to predict relations given only the head (or tail), and even to predict whole triples. Our experiments on these two new KG completion tasks demonstrate that our method achieves superior performance compared with several alternative methods.
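The core idea of the abstract can be illustrated with a minimal sketch: encode the triple prefix (head, relation) as a token sequence with an RNN, then score every entity as a candidate tail. This is a toy single-layer numpy illustration under assumed names (`rnn_encode`, `predict_tail`, shared embedding table), not the paper's actual multi-layer architecture or training procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy vocabulary: entities and relations share one embedding table,
# mirroring the idea of treating a triple (head, relation, tail) as a sequence.
n_entities, n_relations, dim = 5, 3, 8
emb = rng.normal(scale=0.1, size=(n_entities + n_relations, dim))

# A simple single-layer RNN cell (the paper uses deeper, multi-layer RNNs).
W_x = rng.normal(scale=0.1, size=(dim, dim))
W_h = rng.normal(scale=0.1, size=(dim, dim))

def rnn_encode(token_ids):
    """Run the RNN over a token sequence and return the final hidden state."""
    h = np.zeros(dim)
    for t in token_ids:
        h = np.tanh(emb[t] @ W_x + h @ W_h)
    return h

def predict_tail(head, relation):
    """Score every entity as the tail of (head, relation, ?) and rank them."""
    h = rnn_encode([head, n_entities + relation])  # relation ids offset past entities
    scores = emb[:n_entities] @ h                  # dot-product scoring over all entities
    return np.argsort(-scores)                     # entity ids ranked best-first

ranking = predict_tail(head=0, relation=1)
print(ranking)
```

Because the model consumes an arbitrary-length prefix, the same encoder could, in principle, score relations from the head alone, or generate a whole triple token by token, which is what makes the two new completion tasks possible.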
