An Efficient Feature Selection based on Bayes Theorem, Self Information and Sequential Forward Selection
Author(s) -
K. Mani,
P. Kalpana
Publication year - 2016
Publication title -
International Journal of Information Engineering and Electronic Business
Language(s) - English
Resource type - Journals
eISSN - 2074-9023
pISSN - 2074-9031
DOI - 10.5815/ijieeb.2016.06.06
Subject(s) - feature selection, computer science, artificial intelligence, naive Bayes classifier, weighting, preprocessor, Bayes' theorem, machine learning, classifier (UML), feature (linguistics), relevance, pattern recognition, selection (genetic algorithm), Bayesian programming, Bayesian probability, data mining, support vector machine, Bayes factor
Feature selection is an indispensable preprocessing technique for selecting the most relevant features and eliminating redundant attributes. Identifying the features most relevant to the target is essential for improving the predictive accuracy of learning algorithms, because irrelevant features in the original feature space cause more classification errors and lengthen learning time. Many methods have been proposed for feature relevance analysis, but none has combined Bayes' Theorem with Self Information. This paper therefore introduces a novel integrated approach that weights features using two measures, Bayes' Theorem and Self Information, and then picks the highest-weighted attributes as the most relevant features via Sequential Forward Selection. The main objective of this approach is to enhance the predictive accuracy of the Naive Bayesian classifier.
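The abstract names three ingredients: class posteriors obtained through Bayes' theorem, self-information as the weighting measure, and Sequential Forward Selection driven by a Naive Bayes classifier. The sketch below is one plausible reading of that pipeline, not the authors' exact formulas: each feature is weighted by the reduction in expected self-information of the class once the feature is observed (with posteriors P(c|v) estimated from counts via Bayes' theorem), and SFS then visits features from highest to lowest weight, keeping each only if it improves Naive Bayes accuracy. All function names and the toy data are illustrative.

```python
import math
from collections import Counter, defaultdict

def feature_weights(X, y):
    """Weight feature j by H(C) - H(C|F_j): the drop in expected
    self-information of the class once the feature is observed.
    The posteriors P(c|v) come from counts via Bayes' theorem.
    One plausible reading of the paper's measure, not its exact formula."""
    n = len(y)
    class_counts = Counter(y)
    h_c = -sum((k / n) * math.log2(k / n) for k in class_counts.values())
    weights = []
    for j in range(len(X[0])):
        value_counts = Counter(row[j] for row in X)
        joint = Counter((row[j], c) for row, c in zip(X, y))
        # Expected self-information of the posterior, i.e. H(C | F_j)
        h_c_given_f = sum((cnt / n) * -math.log2(cnt / value_counts[v])
                          for (v, c), cnt in joint.items())
        weights.append(h_c - h_c_given_f)
    return weights

def nb_accuracy(X, y, feats):
    """Training-set accuracy of a categorical Naive Bayes classifier
    (Laplace smoothing) restricted to the given feature indices."""
    n = len(y)
    prior = Counter(y)
    vals = {j: {row[j] for row in X} for j in feats}
    cond = {j: defaultdict(Counter) for j in feats}  # cond[j][class][value]
    for row, c in zip(X, y):
        for j in feats:
            cond[j][c][row[j]] += 1
    correct = 0
    for row, c in zip(X, y):
        scores = {}
        for cl in prior:
            s = math.log(prior[cl] / n)
            for j in feats:
                s += math.log((cond[j][cl][row[j]] + 1)
                              / (prior[cl] + len(vals[j])))
            scores[cl] = s
        correct += max(scores, key=scores.get) == c
    return correct / n

def sfs(X, y, weights):
    """Sequential Forward Selection: visit features from highest to
    lowest weight, keeping each one only if it improves NB accuracy."""
    order = sorted(range(len(weights)), key=lambda j: -weights[j])
    selected, best = [], 0.0
    for j in order:
        acc = nb_accuracy(X, y, selected + [j])
        if acc > best:
            selected, best = selected + [j], acc
    return selected

# Toy data: feature 0 determines the class, feature 1 is pure noise.
X = [('a', 'x'), ('a', 'y'), ('b', 'x'), ('b', 'y')]
y = [0, 0, 1, 1]
w = feature_weights(X, y)   # → [1.0, 0.0]
print(sfs(X, y, w))         # → [0]
```

On this toy data the informative feature gets the maximum weight (one bit) and SFS keeps only that feature, since the noisy one adds no accuracy; larger datasets would normally use held-out rather than training accuracy inside SFS.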