Attribute Selection Based on Constraint Gain and Depth Optimal for a Decision Tree

Citations: 2
Authors
Sun, Huaining [1 ]
Hu, Xuegang [2 ]
Zhang, Yuhong [2 ]
Institutions
[1] Huainan Normal Univ, Sch Comp Sci, Huainan 232038, Peoples R China
[2] Hefei Univ Technol, Sch Comp & Informat, Hefei 230009, Anhui, Peoples R China
Source
ENTROPY, 2019, Vol. 21, No. 2
Keywords
decision tree; attribute selection measure; entropy; constraint entropy; constraint gain; branch convergence and fan-out; APPROXIMATE ENTROPY; MACHINE;
DOI
10.3390/e21020198
Chinese Library Classification
O4 [Physics];
Discipline Code
0702 ;
Abstract
Uncertainty evaluation based on the information entropy of statistical probabilities is a commonly used mechanism for constructing heuristic attribute-selection methods in decision tree learning, and the entropy kernel potentially links the deviation of the entropy estimate to decision tree classification performance. This paper presents a decision tree learning algorithm based on constraint gain and depth-induction optimization. First, the uncertainty distributions of single- and multi-valued events under information entropy are calculated and analyzed, yielding an enhanced property of the single-valued event entropy kernel, the peaks of multi-valued event entropy, and a reciprocal relationship between peak location and the number of possible events. Second, an estimation method for information entropy is proposed in which the entropy kernel is replaced with a peak-shift sine function, and on this basis a constraint-gain decision tree (CGDT) learning algorithm is established. Finally, by combining branch-convergence and fan-out indices under the inductive depth of a decision tree, a constraint-gain and depth-induction improved decision tree (CGDIDT) learning algorithm is built. Experimental results show the benefits of the CGDT and CGDIDT algorithms.
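The abstract's core mechanism can be sketched concretely: standard entropy-based attribute selection computes the Shannon entropy of class labels and picks the attribute whose split yields the highest gain. The sketch below shows this, plus a sine-kernel entropy estimate standing in for the paper's constraint entropy; the exact peak-shift sine function is not given in this record, so `sine_entropy` (its `sin(pi * p)` kernel in particular) is an illustrative assumption, not the authors' formula.

```python
import math
from collections import Counter

def shannon_entropy(labels):
    """Standard Shannon entropy: H = -sum_i p_i * log2(p_i)."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def sine_entropy(labels):
    """Illustrative entropy estimate: the kernel -p*log2(p) is replaced by a
    sine kernel sin(pi * p). This is only a placeholder for the paper's
    peak-shift sine function, whose exact form the record does not state."""
    n = len(labels)
    return sum(math.sin(math.pi * c / n) for c in Counter(labels).values())

def gain(rows, labels, attr_index, entropy_fn):
    """Gain of splitting on one attribute:
    entropy(parent) - weighted sum of child entropies."""
    n = len(labels)
    groups = {}
    for row, y in zip(rows, labels):
        groups.setdefault(row[attr_index], []).append(y)
    return entropy_fn(labels) - sum(
        len(g) / n * entropy_fn(g) for g in groups.values()
    )

# Toy data: at the root, select the attribute with the highest gain.
rows = [("sunny", "hot"), ("sunny", "cool"), ("rain", "cool"), ("rain", "hot")]
labels = ["no", "no", "yes", "yes"]
best = max(range(2), key=lambda i: gain(rows, labels, i, shannon_entropy))
print(best)  # attribute 0 separates the two classes perfectly
```

Swapping `shannon_entropy` for `sine_entropy` in the `max(...)` call changes only the heuristic, not the tree-growing procedure, which mirrors how the CGDT algorithm replaces the entropy kernel while keeping the usual greedy induction loop.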
Pages: 17