LDA filter: A Latent Dirichlet Allocation preprocess method for Weka
Author(s) - P. Celard, A. Seara Vieira, Eva Iglesias, L. Borrajo
Publication year - 2020
Publication title - PLOS ONE
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.99
H-Index - 332
ISSN - 1932-6203
DOI - 10.1371/journal.pone.0241701
Subject(s) - latent dirichlet allocation , computer science , artificial intelligence , filter (signal processing) , naive bayes classifier , support vector machine , representation (politics) , topic model , dirichlet distribution , pattern recognition (psychology) , bag of words model , machine learning , set (abstract data type) , data mining , mathematics , mathematical analysis , programming language , politics , political science , law , computer vision , boundary value problem
This work presents an alternative method for representing documents based on LDA (Latent Dirichlet Allocation) and examines how it affects classification algorithms in comparison with common text representations. LDA assumes that each document deals with a set of predefined topics, which are distributions over an entire vocabulary. Our main objective is to use the probability of a document belonging to each topic to implement a new text representation model. The proposed technique is deployed as a new filter extending the Weka software. To demonstrate its performance, the filter is tested with different classifiers, such as a Support Vector Machine (SVM), k-Nearest Neighbors (k-NN), and Naive Bayes, on several document corpora (OHSUMED, Reuters-21578, 20 Newsgroups, Yahoo! Answers, YELP Polarity, and TREC Genomics 2015). It is then compared with the Bag of Words (BoW) representation technique. Results suggest that applying the proposed filter achieves accuracy similar to BoW while greatly improving classification processing times.
