Open Access
Fast Conditional Independence-based Bayesian Classifier
Author(s) -
Estevam R. Hruschka,
Sebastian D. C. de O. Galvão
Publication year - 2007
Publication title -
Journal of Computing Science and Engineering
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.172
H-Index - 16
eISSN - 2093-8020
pISSN - 1976-4677
DOI - 10.5626/jcse.2007.1.2.162
Subject(s) - computer science, Markov blanket, artificial intelligence, machine learning, naive Bayes classifier, conditional independence, Bayesian network, classifier (UML), data mining, Markov chain, Markov model, support vector machine, Markov property
Machine Learning (ML) has become very popular within Data Mining (KDD) and Artificial Intelligence (AI) research and their applications. In the ML and KDD contexts, two main approaches can be used to induce a Bayesian Network (BN) from data: Conditional Independence (CI) and Heuristic Search (HS). When a BN is induced for classification purposes (a Bayesian Classifier, BC), specific constraints can be imposed to increase computational efficiency. In this paper, a new CI-based approach to inducing BCs from data is proposed and two algorithms are presented. The approach uses the Markov Blanket concept to impose such constraints and optimize the traditional PC learning algorithm. Experiments performed on the ALARM domain, as well as on six UCI and three artificial domains, revealed that the proposed approach tends to execute fewer CI tests than the traditional PC. The experiments also show that the proposed algorithms produce classification rates competitive with both PC and Naive Bayes.
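The abstract's core idea, restricting PC-style conditional independence tests to candidates for the class variable's Markov blanket, can be illustrated with a minimal sketch. This is not the paper's actual algorithm; the function names, the conditional-mutual-information test, and the fixed threshold `eps` are all illustrative assumptions. The sketch screens variables against the class with order-0 tests, then tries to remove each surviving candidate by conditioning on small subsets of the other candidates, which is what keeps the number of CI tests low.

```python
from itertools import combinations
from collections import Counter
from math import log

def cond_mutual_info(data, x, y, z):
    """Estimate I(X; Y | Z) in nats from discrete records (list of dicts).
    An empty z gives plain mutual information I(X; Y)."""
    n = len(data)
    cxyz = Counter((tuple(r[v] for v in z), r[x], r[y]) for r in data)
    cxz = Counter((tuple(r[v] for v in z), r[x]) for r in data)
    cyz = Counter((tuple(r[v] for v in z), r[y]) for r in data)
    cz = Counter(tuple(r[v] for v in z) for r in data)
    mi = 0.0
    for (zv, xv, yv), nxyz in cxyz.items():
        mi += (nxyz / n) * log(nxyz * cz[zv] / (cxz[(zv, xv)] * cyz[(zv, yv)]))
    return mi

def independent(data, x, y, z, eps=0.01):
    """Crude CI test: declare independence when the estimated CMI is tiny.
    (Real implementations use a statistical test such as chi-square/G.)"""
    return cond_mutual_info(data, x, y, z) < eps

def markov_blanket_pc(data, variables, target, eps=0.01, max_cond=1):
    """PC-style CI testing constrained around the class variable:
    only the target's blanket candidates ever enter a conditioning set,
    so far fewer tests are run than in an unconstrained PC pass."""
    # Order-0 screen: drop variables marginally independent of the class.
    blanket = [v for v in variables if v != target
               and not independent(data, v, target, (), eps)]
    # Higher-order tests: try to remove each candidate by conditioning
    # on small subsets of the remaining candidates.
    for order in range(1, max_cond + 1):
        for v in list(blanket):
            others = [u for u in blanket if u != v]
            for cond in combinations(others, order):
                if independent(data, v, target, cond, eps):
                    blanket.remove(v)
                    break
    return blanket

# Tiny synthetic domain (an assumption for illustration): class C depends
# on A, D depends on C, and B is pure noise, so the blanket of C is {A, D}.
data = []
for i in range(20):
    a = i % 2
    c = a ^ (1 if i in (0, 10) else 0)   # C mostly copies A
    d = c ^ (1 if i in (1, 11) else 0)   # D mostly copies C
    b = (i // 2) % 2                     # B is independent of the rest
    data.append({"A": a, "B": b, "C": c, "D": d})

print(sorted(markov_blanket_pc(data, ["A", "B", "D"], "C")))
```

Note how the conditioning sets are drawn only from the shrinking candidate list, never from all variables; that pruning is the source of the test-count savings the abstract reports relative to the traditional PC algorithm.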
