Open Access
A Word Selection Method for Producing Interpretable Distributional Semantic Word Vectors
Author(s) -
Atefe Pakzad,
Morteza Analoui
Publication year - 2021
Publication title -
Journal of Artificial Intelligence Research
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.79
H-Index - 123
eISSN - 1943-5037
pISSN - 1076-9757
DOI - 10.1613/jair.1.13353
Subject(s) - interpretability , word , word embedding , computer science , artificial intelligence , selection (genetic algorithm) , semantic similarity , natural language processing , basis (linear algebra) , similarity , feature vector , embedding , pattern recognition , mathematics
Distributional semantic models represent the meaning of words as vectors. We introduce a selection method that learns a vector space in which each dimension corresponds to a natural word. The selection method starts from the most frequent words and selects the subset that performs best as basis words. Because every dimension of the resulting space is itself a word, the space is directly interpretable; this is the main advantage of the method over fusion methods such as NMF and over neural embedding models, whose dimensions have no inherent meaning. We apply the method to the ukWaC corpus and train a vector space with N = 1500 basis words. We report test results on word similarity tasks for the MEN, RG-65, SimLex-999, and WordSim353 gold datasets. The results also show that reducing the number of basis vectors from 5000 to 1500 lowers accuracy by only about 1.5-2%, so good interpretability is achieved without a large performance penalty. Interpretability evaluation indicates that the word vectors obtained by the proposed method with N = 1500 are more interpretable than those of word embedding models and the baseline method. We report the top 15 words of the 1500 selected basis words in this paper.
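The paper's exact selection procedure is not reproduced here; as a rough illustration of the idea, the Python sketch below builds a count-based space whose dimensions are natural words, using a simple frequency cutoff in place of the paper's performance-driven subset selection. All names (`build_interpretable_vectors`, `n_basis`, `window`) are illustrative, and PPMI weighting is a common choice for such count-based models rather than a detail taken from the paper.

```python
import math
from collections import Counter, defaultdict

def build_interpretable_vectors(corpus_sentences, n_basis=1500, window=2):
    """Build PPMI-weighted count vectors whose dimensions are the
    n_basis most frequent corpus words, so each dimension is a
    readable word rather than an opaque latent feature."""
    # Choose basis words by raw frequency (the paper instead searches
    # for the best-performing subset of frequent words).
    freq = Counter(w for sent in corpus_sentences for w in sent)
    basis = [w for w, _ in freq.most_common(n_basis)]
    basis_idx = {w: i for i, w in enumerate(basis)}

    # Count target/context co-occurrences within a symmetric window.
    cooc = defaultdict(Counter)
    for sent in corpus_sentences:
        for i, target in enumerate(sent):
            lo, hi = max(0, i - window), min(len(sent), i + window + 1)
            for j in range(lo, hi):
                if j != i and sent[j] in basis_idx:
                    cooc[target][sent[j]] += 1

    # PPMI weighting: max(0, log p(t, c) / (p(t) * p(c))).
    total = sum(sum(ctr.values()) for ctr in cooc.values())
    t_sum = {t: sum(ctr.values()) for t, ctr in cooc.items()}
    c_sum = Counter()
    for ctr in cooc.values():
        c_sum.update(ctr)

    vectors = {}
    for t, ctr in cooc.items():
        vec = [0.0] * n_basis
        for c, n in ctr.items():
            vec[basis_idx[c]] = max(0.0, math.log(n * total / (t_sum[t] * c_sum[c])))
        vectors[t] = vec
    return basis, vectors
```

Under this construction, interpretability is immediate: for any word, the largest entries of its vector name the basis words it co-occurs with most distinctively, and similarity against gold datasets such as WordSim353 can be scored with ordinary cosine similarity.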
