Forest optimization algorithm‐based feature selection using classifier ensemble
Author(s) -
Moorthy Usha,
Gandhi Usha Devi
Publication year - 2020
Publication title -
Computational Intelligence
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.353
H-Index - 52
eISSN - 1467-8640
pISSN - 0824-7935
DOI - 10.1111/coin.12265
Subject(s) - computer science , feature selection , artificial intelligence , naive bayes classifier , classifier (uml) , support vector machine , pattern recognition (psychology) , decision tree , ensemble learning , k nearest neighbors algorithm , machine learning , statistical classification , algorithm , data mining
Feature selection is the process of choosing a relevant subset of features from a high‐dimensional dataset to enhance the performance of a classifier. Much research has been carried out on the feature selection process. Algorithms such as Naïve Bayes (NB), decision trees, and genetic algorithms are applied to high‐dimensional datasets to select relevant features and to increase computational speed. The proposed model presents a solution for feature selection using ensemble classifier algorithms. The proposed algorithm combines minimum redundancy and maximum relevance (mRMR) with the forest optimization algorithm (FOA). Ensemble‐based algorithms such as support vector machine (SVM), K‐nearest neighbor (KNN), and NB are further used to enhance the performance of the classifier. The mRMR‐FOA is used to select relevant features from various datasets, and a 21% to 24% improvement is recorded in the feature selection. The ensemble classifier algorithms further improve the performance of the algorithm and provide an accuracy of 96%.
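The mRMR‐then‐ensemble pipeline described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the FOA search step is omitted, mRMR is approximated by a simple greedy score (mutual information with the label minus mean absolute correlation with already‐selected features), and the toy dataset, feature count `k`, and helper name `mrmr_select` are all assumptions for the example.

```python
# Hedged sketch of mRMR feature selection followed by an SVM/KNN/NB
# voting ensemble. The paper's forest optimization algorithm (FOA)
# refinement step is NOT reproduced here.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import mutual_info_classif
from sklearn.ensemble import VotingClassifier
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import train_test_split

def mrmr_select(X, y, k):
    """Greedily pick k features: maximize relevance (mutual information
    with y) minus redundancy (mean |correlation| with chosen features)."""
    relevance = mutual_info_classif(X, y, random_state=0)
    selected = [int(np.argmax(relevance))]
    while len(selected) < k:
        best, best_score = None, -np.inf
        for j in range(X.shape[1]):
            if j in selected:
                continue
            redundancy = np.mean(
                [abs(np.corrcoef(X[:, j], X[:, s])[0, 1]) for s in selected]
            )
            score = relevance[j] - redundancy
            if score > best_score:
                best, best_score = j, score
        selected.append(best)
    return selected

# Toy high-dimensional-ish data standing in for the paper's datasets.
X, y = make_classification(n_samples=300, n_features=20,
                           n_informative=5, random_state=0)
features = mrmr_select(X, y, k=5)
Xtr, Xte, ytr, yte = train_test_split(X[:, features], y, random_state=0)

# Soft-voting ensemble over the three classifiers named in the abstract.
ensemble = VotingClassifier([
    ("svm", SVC(probability=True, random_state=0)),
    ("knn", KNeighborsClassifier()),
    ("nb", GaussianNB()),
], voting="soft")
ensemble.fit(Xtr, ytr)
accuracy = ensemble.score(Xte, yte)
```

Soft voting averages the three classifiers' predicted probabilities, which is one common way to combine heterogeneous base learners; the abstract does not specify the exact combination rule used.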