Open Access
A New K-Ary Crisp Decision Tree Induction with Continuous Valued Attributes
Author(s) -
Song Yan,
Yao Shuang,
Yu Donghua,
Shen Yan,
Hu Yuzhen
Publication year - 2017
Publication title -
Chinese Journal of Electronics
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.267
H-Index - 25
eISSN - 2075-5597
pISSN - 1022-4653
DOI - 10.1049/cje.2017.07.015
Subject(s) - decision tree , tree (set theory) , mathematics , computer science , artificial intelligence , combinatorics
The simplicity and interpretability of decision tree induction make it one of the most widely used machine learning methods for data classification. However, for continuous valued (real and integer) attribute data, there is room for improvement in classification accuracy, complexity, and tree scale. We propose a new K-ary partition discretization method with no more than K−1 cut points, based on Gaussian membership functions and the expected number of classes. Combining this discretization method with the Gini index, we also propose a new K-ary crisp decision tree induction for continuous valued attributes. Experimental results and non-parametric statistical tests on 19 real-world datasets show that the proposed algorithm outperforms four conventional approaches in classification accuracy, tree scale, and particularly tree depth. In terms of the number of nodes, the proposed method's decision trees tend to be more balanced than those of the other four methods. The complexity of the proposed algorithm is relatively low.
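The abstract does not give the exact construction, but the two named ingredients, Gaussian membership functions and the Gini index, can be sketched as follows. This is a minimal illustration, not the authors' algorithm: it fits one Gaussian membership function per class on a continuous attribute and places candidate cut points where the class of maximum membership changes along the sorted axis, which for well-separated unimodal classes yields at most K−1 cuts for K classes. The function names and the midpoint cut-placement rule are illustrative assumptions.

```python
import numpy as np

def gini(labels):
    """Gini impurity of a 1-D label array: 1 - sum_c p_c^2."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p ** 2)

def gaussian_cut_points(values, labels):
    """Illustrative K-ary discretization of one continuous attribute.

    Fits a Gaussian membership function (class mean and std) per class,
    then returns midpoints between consecutive sorted samples where the
    class of maximal membership switches. With unimodal, well-separated
    classes this gives at most K-1 cut points; overlapping variances can
    produce more, which the paper's method presumably controls.
    """
    classes = np.unique(labels)
    mus = np.array([values[labels == c].mean() for c in classes])
    # Small epsilon guards against zero-variance (constant) classes.
    sigmas = np.array([values[labels == c].std() + 1e-9 for c in classes])

    v = np.sort(values)
    # Membership of each sample in each class's Gaussian (rows: samples).
    m = np.exp(-((v[:, None] - mus) ** 2) / (2.0 * sigmas ** 2))
    best = m.argmax(axis=1)

    # A cut point wherever the dominant class changes between neighbors.
    switches = np.nonzero(best[1:] != best[:-1])[0]
    return [(v[i] + v[i + 1]) / 2.0 for i in switches]
```

A tree inducer in the spirit of the paper would, at each node, discretize every continuous attribute this way and pick the attribute whose resulting K-ary partition gives the largest Gini-impurity reduction.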
