TWE‐WSD: An effective topical word embedding based word sense disambiguation
Author(s) - Jia Lianyin, Tang Jilin, Li Mengjuan, You Jinguo, Ding Jiaman, Chen Yig
Publication year - 2021
Publication title - CAAI Transactions on Intelligence Technology
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.613
H-Index - 15
ISSN - 2468-2322
DOI - 10.1049/cit2.12006
Subject(s) - polysemy, SemEval, computer science, word embedding, word sense disambiguation, natural language processing, artificial intelligence, noun, embedding, semantics, linguistics
Word embedding has been widely used in word sense disambiguation (WSD) and many other tasks in recent years because it represents the semantics of words well. However, most existing word embedding methods represent each word as a single vector, ignoring homonymy and polysemy; their performance is therefore limited. To address this problem, an effective topical word embedding (TWE)‐based WSD method, named TWE‐WSD, is proposed, which integrates Latent Dirichlet Allocation (LDA) and word embedding. Instead of generating a single word vector (WV) for each word, TWE‐WSD generates a topical WV for each word under each topic. Effective integration strategies are designed to obtain high‐quality contextual vectors. Extensive experiments on the SemEval‐2013 and SemEval‐2015 English all‐words tasks show that TWE‐WSD outperforms other state‐of‐the‐art WSD methods, especially on nouns.
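The core idea the abstract describes can be sketched in a few lines: keep one vector per (word, topic) pair rather than one per word, build a contextual vector by weighting each word's topical vectors with the document's LDA topic distribution, and pick the sense whose vector is closest to that context. The snippet below is a minimal illustration, not the paper's implementation: the topical vectors are random stand-ins for trained TWE vectors, the sense vectors are hypothetical averaged gloss embeddings, and the integration strategy shown (topic-weighted averaging) is only one plausible variant of the strategies the paper mentions.

```python
import numpy as np

# Hypothetical topical word vectors: one vector per (word, topic) pair,
# standing in for vectors a TWE model would learn from LDA-annotated text.
dim, n_topics = 50, 2
rng = np.random.default_rng(0)
vocab = ["bank", "river", "money", "deposit", "water"]
topical_wv = {(w, t): rng.normal(size=dim) for w in vocab for t in range(n_topics)}

def contextual_vector(context_words, topic_dist):
    """Build a contextual vector: for each context word, mix its per-topic
    vectors by the document's topic distribution, then average over words."""
    vecs = [
        sum(topic_dist[t] * topical_wv[(w, t)] for t in range(n_topics))
        for w in context_words
    ]
    return np.mean(vecs, axis=0)

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical sense vectors for "bank" (e.g. averaged gloss embeddings).
sense_vectors = {
    "bank%finance": topical_wv[("money", 0)] + topical_wv[("deposit", 0)],
    "bank%river": topical_wv[("river", 1)] + topical_wv[("water", 1)],
}

# A finance-dominated context (topic 0 weighted 0.9) should select the
# finance sense, since its vector shares the topic-0 components.
ctx = contextual_vector(["money", "deposit"], topic_dist=[0.9, 0.1])
best = max(sense_vectors, key=lambda s: cosine(ctx, sense_vectors[s]))
print(best)  # the finance sense wins for this finance-heavy context
```

Because the contextual vector is topic-aware, the same surface word "bank" would map to a different region of the space under a river-dominated topic distribution, which is precisely what a single-vector embedding cannot do.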
