Open Access
Entity Linking Based on Sentence Representation
Author(s) - Bingjing Jia, Zhongli Wu, Pengpeng Zhou, Bin Wu
Publication year - 2021
Publication title - Complexity
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.447
H-Index - 61
eISSN - 1099-0526
pISSN - 1076-2787
DOI - 10.1155/2021/8895742
Subject(s) - computer science , sentence , natural language processing , representation (politics) , artificial intelligence , similarity (geometry) , meaning (existential) , mechanism (biology) , knowledge base , image (mathematics) , psychology , philosophy , epistemology , politics , political science , law , psychotherapist
Entity linking maps ambiguous mentions in documents to the correct entities in a given knowledge base. Most existing methods fail when a mention appears multiple times in a document, because its conflicting contexts at different locations make linking difficult. Sentence representation, which has recently been studied with deep learning approaches, can be used to resolve this issue. In this paper, an effective entity linking model is proposed to capture the semantic meaning of sentences and reduce the noise introduced by the different contexts of the same mention within a document. The model first exploits the symmetry of a Siamese network to learn sentence similarity, and then adds an attention mechanism to improve the interaction between the input sentences. To show the effectiveness of this sentence representation model combined with attention, named ELSR, extensive experiments are conducted on two public datasets. The results show that ELSR outperforms the baselines and achieves superior performance.
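
For illustration, the following is a minimal sketch of a Siamese sentence-similarity model with an attention-based interaction between the two inputs, as the abstract describes. The choice of a weight-shared BiLSTM encoder, the soft-alignment attention, the layer sizes, and the scoring head are assumptions for this sketch and are not the paper's exact ELSR configuration.

# Minimal sketch (PyTorch): a weight-shared (Siamese) encoder scores the
# similarity between a mention's sentence and a candidate entity's sentence,
# with attention over the interaction between the two inputs.
# All architectural details here are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SiameseAttentionMatcher(nn.Module):
    def __init__(self, vocab_size, embed_dim=128, hidden_dim=128):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        # One encoder applied to both sentences: the "symmetry" of the Siamese network.
        self.encoder = nn.LSTM(embed_dim, hidden_dim, batch_first=True,
                               bidirectional=True)
        self.scorer = nn.Sequential(
            nn.Linear(8 * hidden_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, 1),
        )

    def encode(self, tokens):
        # tokens: (batch, seq_len) integer ids
        emb = self.embedding(tokens)
        out, _ = self.encoder(emb)                 # (batch, seq_len, 2*hidden)
        return out

    def forward(self, sent_a, sent_b):
        h_a = self.encode(sent_a)                  # shared weights for both inputs
        h_b = self.encode(sent_b)
        # Attention: soft alignment between tokens of the two sentences.
        attn = torch.bmm(h_a, h_b.transpose(1, 2))                     # (batch, len_a, len_b)
        a_aligned = torch.bmm(F.softmax(attn, dim=2), h_b)             # h_b attended for each a-token
        b_aligned = torch.bmm(F.softmax(attn, dim=1).transpose(1, 2), h_a)
        # Pool each sentence together with its attended counterpart.
        v_a = torch.cat([h_a.mean(1), a_aligned.mean(1)], dim=-1)
        v_b = torch.cat([h_b.mean(1), b_aligned.mean(1)], dim=-1)
        # Similarity score in [0, 1] between the two sentences.
        return torch.sigmoid(self.scorer(torch.cat([v_a, v_b], dim=-1))).squeeze(-1)

# Toy usage: score two batches of padded token-id sequences.
model = SiameseAttentionMatcher(vocab_size=10000)
a = torch.randint(1, 10000, (2, 12))
b = torch.randint(1, 10000, (2, 15))
print(model(a, b).shape)   # torch.Size([2])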
