Improving evolutionary decision tree induction with multi‐interval discretization
Author(s) - Saremi Mehrin, Yaghmaee Farzin
Publication year - 2018
Publication title - Computational Intelligence
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.353
H-Index - 52
eISSN - 1467-8640
pISSN - 0824-7935
DOI - 10.1111/coin.12153
Subject(s) - decision tree, computer science, evolutionary algorithm, divide and conquer algorithms, machine learning, tree (set theory), interval (graph theory), incremental decision tree, artificial intelligence, discretization, algorithm, data mining, decision tree learning, pattern recognition (psychology), mathematics, mathematical analysis, combinatorics
Decision trees are a widely used tool for pattern recognition and data mining. Over the last four decades, many algorithms have been developed for the induction of decision trees. Most of the classic algorithms use a greedy, divide‐and‐conquer search to construct a tree, whereas more recently, evolutionary methods have been used to perform a global search in the space of possible trees. To the best of our knowledge, limited research has addressed the issue of multi‐interval decision trees. In this paper, we improve our previous work on multi‐interval trees and compare both our previous and current methods with a classic algorithm, i.e., chi‐squared automatic interaction detection (CHAID), and an evolutionary algorithm, i.e., evtree. The results show that the proposed method improves on our previous method in both accuracy and speed. It also outperforms CHAID and performs comparably to evtree. The trees generated by our method have more nodes but are shallower than those produced by evtree.
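The abstract contrasts multi‐interval splits with the binary (single-threshold) splits used by most classic greedy inducers and with the global search performed by evolutionary methods. As a rough illustration only, and not the authors' actual tree encoding, fitness function, or genetic operators, the Python sketch below shows what a multi‐interval split looks like: a sorted set of cut points partitions a numeric attribute into several intervals, each routing samples to a different child branch, and a simple mutation operator perturbs one cut point. All function names, parameters, and values here are hypothetical.

```python
# Illustrative sketch, not the paper's algorithm: a multi-interval split node
# sends a sample to one of k + 1 child branches according to which of the
# intervals defined by k sorted cut points its attribute value falls into.
import bisect
import random
from typing import List, Sequence


def interval_branch(value: float, cut_points: Sequence[float]) -> int:
    """Return the index of the child branch (interval) that `value` falls into.

    `cut_points` must be sorted; k cut points define k + 1 intervals.
    """
    return bisect.bisect_right(cut_points, value)


def mutate_cut_points(cut_points: List[float], lo: float, hi: float) -> List[float]:
    """One possible mutation operator: perturb a random cut point within [lo, hi]."""
    mutated = list(cut_points)
    i = random.randrange(len(mutated))
    mutated[i] = min(hi, max(lo, mutated[i] + random.gauss(0.0, 0.1 * (hi - lo))))
    return sorted(mutated)


if __name__ == "__main__":
    cuts = [2.5, 5.0, 7.5]  # 3 hypothetical cut points -> 4 intervals
    for v in (1.0, 3.0, 6.0, 9.0):
        print(v, "-> branch", interval_branch(v, cuts))
    print("mutated cuts:", mutate_cut_points(cuts, 0.0, 10.0))
```

In an evolutionary setting, such cut-point vectors (together with the tree structure) would form part of each individual's genome and be refined by selection, crossover, and mutation; the specific representation and objective used in the paper may differ.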