Generalized conditional entropy and a metric splitting criterion for decision trees

Cited by: 0
Authors:
Simovici, Dan A. [1 ]
Jaroszewicz, Szymon
Affiliations:
[1] Univ Massachusetts, Dept Comp Sci, Boston, MA 02125 USA
[2] Tech Univ Szczecin, Fac Comp & Informat Syst, Szczecin, Poland
Source:
ADVANCES IN KNOWLEDGE DISCOVERY AND DATA MINING, PROCEEDINGS, 2006, Vol. 3918
Keywords:
decision tree; generalized conditional entropy; metric; metric betweenness;
DOI: not available
Chinese Library Classification (CLC):
TP18 [Artificial Intelligence Theory];
Discipline classification codes:
081104; 0812; 0835; 1405;
Abstract
We examine a new approach to building decision trees by introducing a geometric splitting criterion based on the properties of a family of metrics on the space of partitions of a finite set. The criterion can be adapted to the characteristics of the data set and the needs of the user, and it yields decision trees that are smaller and have fewer leaves than trees built with standard methods, while achieving comparable or better accuracy.
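The abstract's family of metrics is developed in the authors' companion paper (related paper [1] below). As an illustrative sketch only, not the paper's implementation: assuming the Daróczy β-entropy normalization H_β(π) = (Σ p_i^β − 1)/(2^(1−β) − 1), the distance between two partitions of the same finite set can be taken as d_β(π, σ) = H_β(π|σ) + H_β(σ|π), and a splitter would prefer the attribute whose induced partition is closest to the class partition. All function names here are illustrative.

```python
from collections import Counter

def beta_entropy(labels, beta=2.0):
    # Daroczy beta-entropy of the partition induced by `labels` (beta != 1).
    # beta = 2 corresponds to the Gini index up to scaling; beta -> 1
    # recovers Shannon entropy in the limit.
    n = len(labels)
    s = sum((c / n) ** beta for c in Counter(labels).values())
    return (s - 1.0) / (2.0 ** (1.0 - beta) - 1.0)

def conditional_beta_entropy(target, given, beta=2.0):
    # H_beta(target | given): beta-weighted entropy of `target`
    # restricted to each block of the partition induced by `given`.
    n = len(target)
    blocks = {}
    for t, g in zip(target, given):
        blocks.setdefault(g, []).append(t)
    return sum((len(b) / n) ** beta * beta_entropy(b, beta)
               for b in blocks.values())

def partition_distance(pi, sigma, beta=2.0):
    # d_beta(pi, sigma) = H_beta(pi|sigma) + H_beta(sigma|pi):
    # symmetric, zero iff the partitions coincide, and a metric on
    # partitions for suitable beta.
    return (conditional_beta_entropy(pi, sigma, beta)
            + conditional_beta_entropy(sigma, pi, beta))

# A metric splitter would pick the candidate attribute whose induced
# partition minimizes the distance to the class partition:
y = [0, 0, 1, 1]   # class labels over 4 examples
x = [0, 0, 0, 1]   # one candidate attribute's values
print(partition_distance(y, x))
```

Minimizing the distance to the class partition, rather than maximizing an impurity drop, is what lets the metric's triangle inequality constrain the tree's growth; varying β tunes the criterion between Gini-like and Shannon-like behavior, which is the adaptability the abstract refers to.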
Pages: 35-44 (10 pages)
Related papers (50 in total)
  • [1] A new metric splitting criterion for decision trees
    Simovici, Dan A.
    Jaroszewicz, Szymon
    INTERNATIONAL JOURNAL OF PARALLEL EMERGENT AND DISTRIBUTED SYSTEMS, 2006, 21 (04) : 239 - 256
  • [2] Decision Trees for Continuous Data and Conditional Mutual Information as a Criterion for Splitting Instances
    Drakakis, Georgios
    Moledina, Saadiq
    Chomenidis, Charalampos
    Doganis, Philip
    Sarimveis, Haralambos
    COMBINATORIAL CHEMISTRY & HIGH THROUGHPUT SCREENING, 2016, 19 (05) : 423 - 428
  • [3] A Discriminative Splitting Criterion for Phonetic Decision Trees
    Wiesler, Simon
    Heigold, Georg
    Nussbaum-Thom, Markus
    Schlueter, Ralf
    Ney, Hermann
    11TH ANNUAL CONFERENCE OF THE INTERNATIONAL SPEECH COMMUNICATION ASSOCIATION 2010 (INTERSPEECH 2010), VOLS 1-2, 2010, : 54 - 57
  • [5] Hybrid Splitting Criterion in Decision Trees for Data Stream Mining
    Jaworski, Maciej
    Rutkowski, Leszek
    Pawlak, Miroslaw
    ARTIFICIAL INTELLIGENCE AND SOFT COMPUTING, (ICAISC 2016), PT II, 2016, 9693 : 60 - 72
  • [6] A parallel tree node splitting criterion for fuzzy decision trees
    Mu, Yashuang
    Liu, Xiaodong
    Wang, Lidong
    Asghar, Aamer Bilal
    CONCURRENCY AND COMPUTATION-PRACTICE & EXPERIENCE, 2019, 31 (17):
  • [7] Testing Modified Confusion Entropy as Split Criterion for Decision Trees
    David Nunez-Gonzalez, J.
    Gonzalo de Sa, Alexander
    Grana, Manuel
    HYBRID ARTIFICIAL INTELLIGENT SYSTEMS, HAIS 2019, 2019, 11734 : 3 - 13
  • [8] Improving Decision Trees by Tsallis Entropy Information Metric Method
    Wang, Yisen
    Song, Chaobing
    Xia, Shu-Tao
    2016 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2016, : 4729 - 4734
  • [9] UNIFYING ATTRIBUTE SPLITTING CRITERIA OF DECISION TREES BY TSALLIS ENTROPY
    Wang, Yisen
    Xia, Shu-Tao
    2017 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2017, : 2507 - 2511
  • [10] A Novel Splitting Criterion Inspired by Geometric Mean Metric Learning for Decision Tree
    Li, Dan
    Chen, Songcan
    2022 26TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION (ICPR), 2022, : 4808 - 4814