Open Access
Faster Zero-shot Multi-modal Entity Linking via Visual-Linguistic Representation
Author(s) -
Qiushuo Zheng,
Hao Wen,
Meng Wang,
Guilin Qi,
Chaoyu Bai
Publication year - 2022
Publication title -
Data Intelligence
Language(s) - English
Resource type - Journals
eISSN - 2096-7004
pISSN - 2641-435X
DOI - 10.1162/dint_a_00146
Subject(s) - computer science , natural language processing , artificial intelligence , information retrieval , knowledge graph , multi-modal learning , entity linking
Multi-modal entity linking plays a crucial role in a wide range of knowledge-based modal-fusion tasks, e.g., multi-modal retrieval and multi-modal event extraction. We introduce the new ZEro-shot Multi-modal Entity Linking (ZEMEL) task. Its format is similar to multi-modal entity linking, but multi-modal mentions are linked to unseen entities in the knowledge graph; the purpose of the zero-shot setting is to achieve robust linking in highly specialized domains. Moreover, the inference efficiency of existing models is low when there are many candidate entities. We therefore propose a novel model that leverages visual-linguistic representation through a co-attentional mechanism to address the ZEMEL task, balancing the trade-off between model performance and efficiency. We also build a dataset named ZEMELD for the new task, which contains multi-modal data resources collected from Wikipedia, annotated with ground-truth entities. Extensive experiments on the dataset show that our proposed model is effective, significantly improving precision from 68.93% to 82.62% compared with baselines on the ZEMEL task.
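The abstract does not spell out the model's architecture, but the core idea of a co-attentional mechanism over visual and linguistic features can be sketched generically. The example below is a minimal, hypothetical illustration (not the authors' implementation): token embeddings and image-region embeddings attend over each other via a shared affinity matrix, yielding visually-grounded text features and text-grounded visual features. All names and dimensions are illustrative assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def co_attention(text_feats, vis_feats):
    """Symmetric co-attention: each modality attends over the other.

    text_feats: (T, d) token embeddings; vis_feats: (R, d) region embeddings.
    Returns visually-grounded text features and text-grounded visual features.
    This is a generic sketch, not the architecture from the paper.
    """
    d = text_feats.shape[-1]
    # Affinity between every token and every image region (scaled dot product).
    affinity = text_feats @ vis_feats.T / np.sqrt(d)      # (T, R)
    # Text attends to regions; regions attend to tokens.
    text_ctx = softmax(affinity, axis=1) @ vis_feats      # (T, d)
    vis_ctx = softmax(affinity.T, axis=1) @ text_feats    # (R, d)
    return text_ctx, vis_ctx

rng = np.random.default_rng(0)
text = rng.standard_normal((5, 8))   # 5 tokens, embedding dim 8
vis = rng.standard_normal((3, 8))    # 3 image regions, embedding dim 8
t_ctx, v_ctx = co_attention(text, vis)
print(t_ctx.shape, v_ctx.shape)      # (5, 8) (3, 8)
```

In a zero-shot setting such as ZEMEL, fused representations like these would be scored against candidate-entity representations, so unseen entities can be ranked without retraining.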
