A Unified Framework for Decision Tree on Continuous Attributes

Cited by: 5
Authors
Yan, Jianjian [1 ]
Zhang, Zhongnan [1 ]
Xie, Lingwei [1 ]
Zhu, Zhantu [1 ]
Affiliations
[1] Xiamen Univ, Software Sch, Xiamen 361005, Peoples R China
Source
IEEE ACCESS | 2019, Vol. 7
Keywords
Decision tree; classification; unified framework; split criteria; CLASSIFICATION; NETWORKS; SVM;
DOI
10.1109/ACCESS.2019.2892083
Chinese Library Classification
TP [Automation technology, computer technology];
Subject Classification Code
0812
Abstract
Standard decision-tree algorithms and their derived methods are usually constructed from frequency information. However, for continuous attributes they still face a dilemma, the multichotomous question, which arises when two or more candidate cut points achieve the same or nearly the same splitting performance as the optimal value, such as the maximal information gain ratio or the minimal Gini index. In this paper, we propose a unified framework to deal with this question. Within it, we design two algorithms based on Splitting Performance and the number of Expected Segments, called SPES1 and SPES2, which determine the optimal cut point as follows. First, several candidate cut points are selected whose splitting performances are closest to the optimum. Second, the number of expected segments is computed for each candidate. Finally, the two measures are combined through a weighting factor alpha to choose the optimal cut point among the candidates. To validate the effectiveness of our methods, we evaluate them on 25 benchmark datasets. The experimental results demonstrate that the classification accuracies of the proposed algorithms exceed those of current state-of-the-art methods for the multichotomous question, by about 5% in some cases. Moreover, with the proposed methods the number of candidate cut points converges to a certain extent.
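The selection scheme summarized in the abstract can be illustrated with a short sketch: (1) keep the candidate cut points whose splitting performance is closest to the optimum, (2) score each survivor by a "number of expected segments" measure, and (3) mix the two scores with a weighting factor alpha. The gain-ratio criterion, the tolerance `tol`, the segment estimate, and the min-max normalization used below are illustrative assumptions, not the authors' exact definitions from the paper.

```python
# Hypothetical sketch of a SPES-style cut-point selection; the helper names
# and the expected-segments proxy are assumptions for illustration only.
import numpy as np

def entropy(labels):
    """Shannon entropy of a label vector."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def gain_ratio(values, labels, cut):
    """Information gain ratio of splitting `values` at `cut`."""
    left, right = labels[values <= cut], labels[values > cut]
    n, nl = len(labels), len(left)
    cond = (nl / n) * entropy(left) + ((n - nl) / n) * entropy(right)
    gain = entropy(labels) - cond
    p = np.array([nl / n, (n - nl) / n])
    split_info = -np.sum(p * np.log2(p))
    return gain / split_info if split_info > 0 else 0.0

def expected_segments(values, labels, cut):
    """Illustrative proxy: label changes remaining inside the two intervals
    (fewer changes means the cut yields purer, more settled segments)."""
    order = np.argsort(values)
    v, y = np.asarray(values)[order], np.asarray(labels)[order]
    changes = 0
    for side in (v <= cut, v > cut):
        ys = y[side]
        changes += int(np.sum(ys[1:] != ys[:-1]))
    return changes + 2  # each interval counts as at least one segment

def spes_cut(values, labels, alpha=0.5, tol=1e-3):
    """Pick a cut point by mixing splitting performance and expected segments."""
    values, labels = np.asarray(values, float), np.asarray(labels)
    v = np.sort(np.unique(values))
    cuts = (v[:-1] + v[1:]) / 2
    perf = np.array([gain_ratio(values, labels, c) for c in cuts])
    # Step 1: keep candidates whose performance is closest to the optimum.
    keep = perf >= perf.max() - tol
    cand, cand_perf = cuts[keep], perf[keep]
    # Step 2: expected-segments score for each surviving candidate.
    seg = np.array([expected_segments(values, labels, c) for c in cand])
    # Step 3: normalize and combine with the weighting factor alpha
    # (higher performance is better, fewer expected segments is better).
    def norm(x):
        rng = x.max() - x.min()
        return (x - x.min()) / rng if rng > 0 else np.zeros_like(x)
    score = alpha * norm(cand_perf) + (1 - alpha) * (1 - norm(seg))
    return cand[np.argmax(score)]

# Toy usage on a single continuous attribute.
x = np.array([1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0, 4.5])
y = np.array([0, 0, 0, 1, 1, 0, 1, 1])
print(spes_cut(x, y, alpha=0.7))
```

With `tol = 0` the sketch reduces to the usual single-best-score rule; a larger tolerance admits more near-optimal candidates, which is the tie-breaking situation the paper targets.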
Pages: 11924-11933
Page count: 10
Related Papers
(50 records in total)
  • [21] Rough Set Based Attributes Partition in Decision Tree
    Yu, Xingxing
    Xie, Jinli
    Hu, Haiqing
    2017 CHINESE AUTOMATION CONGRESS (CAC), 2017, : 5929 - 5932
  • [22] Tree-Based Kernel for Graphs With Continuous Attributes
    Martino, Giovanni Da San
    Navarin, Nicolo
    Sperduti, Alessandro
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2018, 29 (07) : 3270 - 3276
  • [23] Decision Tree Models of Continuous Systems
    Plambeck, Swantje
    Fey, Goerschwin
    2022 IEEE 27TH INTERNATIONAL CONFERENCE ON EMERGING TECHNOLOGIES AND FACTORY AUTOMATION (ETFA), 2022,
  • [24] A unified framework for addiction: Vulnerabilities in the decision process
    Redish, A. David
    Jensen, Steve
    Johnson, Adam
    BEHAVIORAL AND BRAIN SCIENCES, 2008, 31 (04) : 415+
  • [25] T3C: improving a decision tree classification algorithm's interval splits on continuous attributes
    Tzirakis, Panagiotis
    Tjortjis, Christos
    ADVANCES IN DATA ANALYSIS AND CLASSIFICATION, 2017, 11 (02) : 353 - 370
  • [27] Filtering Decision Rules with Continuous Attributes Governed by Discretisation
    Stanczyk, Urszula
    FOUNDATIONS OF INTELLIGENT SYSTEMS, ISMIS 2017, 2017, 10352 : 333 - 343
  • [28] Improvement of decision accuracy using discretization of continuous attributes
    Wu, QingXiang
    Bell, David
    McGinnity, Martin
    Prasad, Girijesh
    Qi, Guilin
    Huang, Xi
    FUZZY SYSTEMS AND KNOWLEDGE DISCOVERY, PROCEEDINGS, 2006, 4223 : 674 - 683
  • [29] Decision tree classification method based on correlation between attributes
    Wang, D.
    Yu, G.
    Bao, Y.
    Wang, G.
    2001, Northeastern University (22):
  • [30] Constructing X-of-N Attributes for Decision Tree Learning
    Zijian Zheng
    Machine Learning, 2000, 40 : 35 - 75