Open Access
Integration of Decision Trees Using Distance to Centroid and to Decision Boundary
Author(s) - Jędrzej Biedrzycki, Robert Burduk
Publication year - 2020
Publication title - JUCS - Journal of Universal Computer Science
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.284
H-Index - 53
eISSN - 0948-695X
pISSN - 0948-6968
DOI - 10.3897/jucs.2020.038
Subject(s) - centroid , benchmarking , decision tree , artificial intelligence , decision boundary , computer science , pattern recognition (psychology) , classifier (uml) , voting , random subspace method , majority rule , homogeneous , pairwise comparison , data mining , machine learning , decision support system , mathematics , marketing , combinatorics , politics , political science , law , business
A plethora of ensemble techniques has been implemented and studied in order to achieve better classification results than those of base classifiers. In this paper, an algorithm for the integration of decision trees is proposed, meaning that homogeneous base classifiers are used. The novelty of the presented approach is the simultaneous use of an object's distance from the decision boundary and from the center of mass (centroid) of the objects belonging to a given class label to determine the score functions of the base classifiers. This means that the score assigned to a class label by each classifier depends on the distance of the classified object both from the decision boundary and from the centroid. The algorithm was evaluated on open-source benchmark datasets. The results indicate an improvement in classification quality compared to the reference method, majority voting.
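The abstract only outlines the idea, so the following is a minimal illustrative sketch, not the paper's actual formula: a bagged ensemble of decision trees in which each tree's per-class score combines a proxy for distance to the decision boundary (here, the tree's class-probability margin) with the object's Euclidean distance to each class centroid. All function names, the combination rule, and the dataset are assumptions for illustration.

```python
# Hypothetical sketch of integrating decision trees using distance to
# class centroids and a proxy for distance to the decision boundary.
# The combination rule below is illustrative, NOT the paper's method.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.RandomState(0)
X, y = make_classification(n_samples=400, n_features=8, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5, random_state=0)

# Homogeneous base classifiers: decision trees trained on bootstrap samples.
n_trees = 7
trees = []
for i in range(n_trees):
    idx = rng.choice(len(X_tr), len(X_tr), replace=True)  # bootstrap sample
    trees.append(DecisionTreeClassifier(max_depth=3, random_state=i)
                 .fit(X_tr[idx], y_tr[idx]))

# Centers of mass (centroids) of each class, computed on the training data.
classes = np.unique(y_tr)
centroids = np.array([X_tr[y_tr == c].mean(axis=0) for c in classes])

def score(tree, x):
    """Per-class score combining two signals:
    - tree.predict_proba as a proxy for distance to the decision boundary,
    - inverse distance to each class centroid (closer centroid -> higher score).
    """
    proba = tree.predict_proba(x.reshape(1, -1))[0]
    dist = np.linalg.norm(centroids - x, axis=1)
    return proba * (1.0 / (1.0 + dist))

def predict(x):
    # Integrate by summing the base classifiers' score functions.
    total = sum(score(t, x) for t in trees)
    return classes[np.argmax(total)]

pred = np.array([predict(x) for x in X_te])

# Reference method: plain majority voting over the same trees.
votes = np.array([t.predict(X_te) for t in trees])
maj = np.array([np.bincount(col).argmax() for col in votes.T])

print("integrated accuracy:   ", (pred == y_te).mean())
print("majority-vote accuracy:", (maj == y_te).mean())
```

The sketch reproduces only the structure of the approach: each base tree contributes a class-wise score shaped by the two distances, and the label with the largest summed score wins, with majority voting kept as the baseline for comparison.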
