Open Access
A PSO-based multi-objective multi-label feature selection method in classification
Author(s) - Yong Zhang, Dunwei Gong, Xiaoyan Sun, Yinan Guo
Publication year - 2017
Publication title - Scientific Reports
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.24
H-Index - 213
ISSN - 2045-2322
DOI - 10.1038/s41598-017-00416-0
Subject(s) - particle swarm optimization , feature selection , computer science , preprocessor , multi label classification , pareto principle , feature (linguistics) , artificial intelligence , swarm behaviour , selection (genetic algorithm) , pattern recognition (psychology) , set (abstract data type) , range (aeronautics) , exploit , data mining , machine learning , mathematical optimization , mathematics , engineering , linguistics , philosophy , programming language , computer security , aerospace engineering
Feature selection is an important data preprocessing technique in multi-label classification. Although a large number of studies have addressed the feature selection problem, few of them deal with multi-label data. This paper studies a multi-label feature selection algorithm based on an improved multi-objective particle swarm optimization (PSO), with the purpose of searching for a Pareto set of non-dominated solutions (feature subsets). Two new operators are employed to improve the performance of the proposed PSO-based algorithm. One is an adaptive uniform mutation with an action range that varies over time, used to extend the exploration capability of the swarm; the other is a local learning strategy, designed to exploit areas with sparse solutions in the search space. Moreover, the ideas of an external archive and crowding distance are applied to PSO for finding the Pareto set. Finally, experiments verify that the proposed algorithm is a useful feature selection approach for multi-label classification problems.
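The following is a minimal sketch, not the authors' reference implementation, of the ingredients the abstract names: a multi-objective PSO over binary feature subsets, an external archive pruned by crowding distance, and a uniform mutation whose action range shrinks over time. The two objectives (subset size and an error estimate), the placeholder `evaluate_error` function, the random leader selection from the archive, and all parameter values are illustrative assumptions, not details taken from the paper.

```python
# Illustrative multi-objective PSO for feature selection (assumptions noted in comments).
import numpy as np

rng = np.random.default_rng(0)
N_FEATURES, SWARM, ITERS, ARCHIVE_MAX = 20, 30, 50, 25

def decode(position):
    """Threshold a continuous position into a binary feature mask."""
    return position > 0.5

def evaluate_error(mask):
    """Placeholder for a multi-label classification error estimated on validation data.
    A synthetic score stands in for, e.g., a cross-validated loss of a real classifier."""
    if not mask.any():
        return 1.0
    ideal = np.arange(N_FEATURES) < N_FEATURES // 2   # pretend half the features are useful
    return 1.0 - (mask & ideal).sum() / ideal.sum() + 0.01 * mask.sum() / N_FEATURES

def objectives(position):
    mask = decode(position)
    return np.array([mask.sum() / N_FEATURES, evaluate_error(mask)])  # both minimized

def dominates(a, b):
    return np.all(a <= b) and np.any(a < b)

def crowding_distance(front):
    """Standard crowding distance over an (n, m) array of objective vectors."""
    n, m = front.shape
    dist = np.zeros(n)
    for k in range(m):
        order = np.argsort(front[:, k])
        dist[order[[0, -1]]] = np.inf
        span = front[order[-1], k] - front[order[0], k]
        if span > 0:
            dist[order[1:-1]] += (front[order[2:], k] - front[order[:-2], k]) / span
    return dist

def update_archive(archive, candidates):
    """Keep only non-dominated solutions; prune to ARCHIVE_MAX by crowding distance."""
    pool = archive + candidates
    objs = [o for _, o in pool]
    keep = [i for i, oi in enumerate(objs)
            if not any(dominates(oj, oi) for j, oj in enumerate(objs) if j != i)]
    pool = [pool[i] for i in keep]
    if len(pool) > ARCHIVE_MAX:
        dist = crowding_distance(np.array([o for _, o in pool]))
        pool = [pool[i] for i in np.argsort(-dist)[:ARCHIVE_MAX]]
    return pool

# --- main PSO loop ---
pos = rng.random((SWARM, N_FEATURES))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_obj = np.array([objectives(p) for p in pos])
archive = update_archive([], [(p.copy(), o) for p, o in zip(pos, pbest_obj)])

for t in range(ITERS):
    mut_range = 1.0 - t / ITERS                        # mutation action range shrinks over time
    for i in range(SWARM):
        leader = archive[rng.integers(len(archive))][0]  # global guide drawn from the archive
        r1, r2 = rng.random(N_FEATURES), rng.random(N_FEATURES)
        vel[i] = 0.7 * vel[i] + 1.5 * r1 * (pbest[i] - pos[i]) + 1.5 * r2 * (leader - pos[i])
        pos[i] = np.clip(pos[i] + vel[i], 0.0, 1.0)
        mutate = rng.random(N_FEATURES) < 0.1            # uniform mutation on a few dimensions
        pos[i, mutate] = np.clip(
            pos[i, mutate] + rng.uniform(-mut_range, mut_range, mutate.sum()), 0.0, 1.0)
        obj = objectives(pos[i])
        if dominates(obj, pbest_obj[i]):
            pbest[i], pbest_obj[i] = pos[i].copy(), obj
    archive = update_archive(archive, [(p.copy(), objectives(p)) for p in pos])

for position, obj in archive:                            # final Pareto-set approximation
    print(decode(position).astype(int), obj.round(3))
```

Running the script prints the archived non-dominated feature masks and their objective values; the paper's local learning strategy for sparse regions of the objective space is omitted here for brevity.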
