Combining and learning word embedding with WordNet for semantic relatedness and similarity measurement
Author(s) -
Lee Yang-Yin,
Ke Hao,
Yen Ting-Yu,
Huang Hen-Hsen,
Chen Hsin-Hsi
Publication year - 2020
Publication title -
Journal of the Association for Information Science and Technology
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.903
H-Index - 145
eISSN - 2330-1643
pISSN - 2330-1635
DOI - 10.1002/asi.24289
Subject(s) - wordnet , word embedding , benchmark (surveying) , computer science , semantic similarity , artificial intelligence , word (group theory) , natural language processing , embedding , similarity (geometry) , measure (data warehouse) , information retrieval , mathematics , data mining , image (mathematics) , geometry , geodesy , geography
In this research, we propose 3 different approaches to measure the semantic relatedness between 2 words: (i) boost the performance of the GloVe word embedding model by removing or transforming abnormal dimensions; (ii) linearly combine the information extracted from WordNet and word embeddings; and (iii) utilize word embeddings together with 12 types of linguistic information extracted from WordNet as features for Support Vector Regression. We conducted our experiments on 8 benchmark data sets and computed Spearman correlations between the outputs of our methods and the ground truth. We report our results alongside those of 3 state‐of‐the‐art approaches. The experimental results show that our methods outperform the state‐of‐the‐art approaches on all the selected English benchmark data sets.
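The following is a minimal Python sketch of the general idea behind approaches (ii) and (iii), not the authors' actual implementation. It assumes pre-trained GloVe vectors loaded through gensim, substitutes a single Wu-Palmer similarity score for the 12 types of WordNet-derived information described in the abstract, and uses scikit-learn's SVR with Spearman correlation for evaluation. The file path, the alpha weight, and the reduced feature set are illustrative assumptions.

```python
# Sketch of (ii) linear combination of WordNet and embedding signals and
# (iii) SVR over embedding + WordNet features, evaluated with Spearman
# correlation. The feature set here is deliberately simplified.

import numpy as np
from nltk.corpus import wordnet as wn
from scipy.stats import spearmanr
from sklearn.svm import SVR
from gensim.models import KeyedVectors

# Hypothetical path: GloVe vectors converted to word2vec text format.
glove = KeyedVectors.load_word2vec_format("glove.6B.300d.w2v.txt")

def embedding_similarity(w1, w2):
    """Cosine similarity between the two word vectors (0 if a word is OOV)."""
    return float(glove.similarity(w1, w2)) if w1 in glove and w2 in glove else 0.0

def wordnet_similarity(w1, w2):
    """Maximum Wu-Palmer similarity over all synset pairs (one WordNet signal)."""
    scores = [s1.wup_similarity(s2) or 0.0
              for s1 in wn.synsets(w1) for s2 in wn.synsets(w2)]
    return max(scores, default=0.0)

# (ii) Linear combination of the two signals; alpha is a tunable weight.
def combined_similarity(w1, w2, alpha=0.5):
    return alpha * embedding_similarity(w1, w2) + (1 - alpha) * wordnet_similarity(w1, w2)

# (iii) Feature-based regression: embedding and WordNet similarities feed a
# Support Vector Regressor trained on human relatedness scores.
def features(w1, w2):
    return [embedding_similarity(w1, w2), wordnet_similarity(w1, w2)]

def train_and_evaluate(train_pairs, train_scores, test_pairs, test_scores):
    svr = SVR(kernel="rbf")
    svr.fit([features(a, b) for a, b in train_pairs], train_scores)
    predictions = svr.predict([features(a, b) for a, b in test_pairs])
    # Evaluation: Spearman correlation between predictions and the ground truth.
    rho, _ = spearmanr(predictions, test_scores)
    return rho
```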
