
Optimization of Irrelevant Features for Brain-Computer Interface (BCI) System
Author(s) -
Ong Zhi Ying,
Saidatul Ardeenawatie Awang,
A Vikneswaran,
L Vijean
Publication year - 2019
Publication title -
Journal of Physics: Conference Series
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.21
H-Index - 85
eISSN - 1742-6596
pISSN - 1742-6588
DOI - 10.1088/1742-6596/1372/1/012047
Subject(s) - sample entropy, brain–computer interface, computer science, detrended fluctuation analysis, pattern recognition (psychology), artificial intelligence, feature selection, entropy (arrow of time), approximate entropy, classifier (UML), speech recognition, mathematics, electroencephalography, psychology, physics, geometry, quantum mechanics, psychiatry, scaling
The brain is the most important organ of the human body. It controls all bodily activities, such as movement, imagination, and response. Therefore, signals collected from the human scalp are believed to contain much useful information. This information, known as features, can be extracted by applying advanced signal processing, and the extracted features can then be used in a brain-computer interface (BCI) system. However, the most suitable and relevant features for a BCI system have not yet been established. In this paper, ten healthy subjects were involved in data collection. A threshold method, a notch filter, and wavelet decomposition were applied during pre-processing, after which the signals were normalised. The Hilbert-Huang Transform (HHT) and Power Spectral Density (PSD) were then computed. Features such as statistical features, approximate entropy (ApEn), sample entropy (SampEn), fuzzy entropy (FuzEn), permutation entropy (PermEn), distribution entropy (DistEn), the Hjorth parameters, and the Hurst exponent (HE) were extracted from the PSD and HHT separately. A genetic algorithm (GA) and ReliefF were applied to select the most suitable and relevant features for the BCI system. The prediction rates before and after feature selection were compared; performance after feature selection improved in terms of both prediction rate and training time. The best classifier in this case was the bagged tree, which achieved a prediction rate of 99.30%.
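To give a flavour of one of the entropy features listed above, the following is a minimal Python sketch of sample entropy, SampEn(m, r). It is a generic textbook-style implementation, not the authors' exact code; the defaults m=2 and r=0.2·std are common conventions and are assumed here, as the paper does not state its parameter choices.

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """Sample entropy SampEn(m, r) of a 1-D signal.

    m : embedding dimension (commonly 2).
    r : tolerance, given as a fraction of the signal's
        standard deviation (commonly 0.2).
    Lower values indicate a more regular (predictable) signal.
    """
    x = np.asarray(x, dtype=float)
    tol = r * np.std(x)

    def count_matches(dim):
        # Build all overlapping templates of length `dim`.
        templates = np.array([x[i:i + dim]
                              for i in range(len(x) - dim + 1)])
        count = 0
        for i in range(len(templates) - 1):
            # Chebyshev distance from template i to all later
            # templates; self-matches are excluded by construction.
            dist = np.max(np.abs(templates[i + 1:] - templates[i]),
                          axis=1)
            count += int(np.sum(dist <= tol))
        return count

    b = count_matches(m)      # matching pairs of length-m templates
    a = count_matches(m + 1)  # matching pairs of length-(m+1) templates
    if a == 0 or b == 0:
        return float("inf")   # SampEn is undefined when no matches occur
    return -np.log(a / b)
```

As a sanity check, a smooth periodic signal should score lower than white noise, which is exactly the property that makes SampEn useful as a discriminative EEG feature.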