Generalized Haar DWT and transformations between decision trees and neural networks

Cited by: 3
Authors
Mulvaney, R [1 ]
Phatak, DS [1 ]
Affiliation
[1] Univ Maryland Baltimore Cty, Dept Comp Sci & Elect Engn, Baltimore, MD 21250 USA
Source
IEEE TRANSACTIONS ON NEURAL NETWORKS, 2006, Vol. 17, No. 1
Funding
U.S. National Science Foundation
Keywords
decision tree rebalancing; fault tolerance; Haar wavelet; multiclass; rule generation; threshold network;
DOI
10.1109/TNN.2005.860830
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
The core contribution of this paper is a three-fold improvement of the Haar discrete wavelet transform (DWT). It is modified to efficiently transform a multiclass- (rather than numerical-) valued function over a multidimensional (rather than low-dimensional) domain, or to transform a multiclass-valued decision tree into another useful representation. We prove that this multidimensional, multiclass DWT uses dynamic programming to minimize (within its framework) the number of nontrivial wavelet coefficients needed to summarize a training set or decision tree. It is a spatially localized algorithm that, after a sort, runs in time linear in the number of training samples. In our tests, convergence of the DWT on benchmark training sets appears to degrade as the dimension rises, consistent with the reputation of high-dimensional wavelets as difficult to implement. This multiclass, multidimensional DWT has tightly coupled applications: learning "dyadic" decision trees directly from training data; rebalancing or converting preexisting decision trees into fixed-depth Boolean or threshold neural networks (in effect parallelizing the evaluation of the trees); and learning rule/exception sets represented as a new form of tree called an "E-tree," which could greatly aid interpretation and visualization of a dataset.
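For readers unfamiliar with the transform being generalized, the following is a minimal sketch of the classical one-dimensional Haar DWT (our own illustration; the function names and the recursive averaging/differencing layout are not taken from the paper). It shows the sparsity property the paper's multiclass, multidimensional variant exploits: a piecewise-constant signal collapses to a handful of nonzero coefficients, just as a shallow decision tree summarizes to few nontrivial wavelet terms.

```python
def haar_dwt(x):
    """Full Haar decomposition of a list of numbers of length 2**k.

    Returns the overall average followed by detail coefficients,
    coarsest level first.
    """
    coeffs = []
    while len(x) > 1:
        # Pairwise averages carry the smooth part; pairwise
        # half-differences carry the detail at this scale.
        avg = [(x[i] + x[i + 1]) / 2 for i in range(0, len(x), 2)]
        det = [(x[i] - x[i + 1]) / 2 for i in range(0, len(x), 2)]
        coeffs = det + coeffs  # prepend: coarser details come first
        x = avg
    return x + coeffs


def haar_idwt(c):
    """Invert haar_dwt, recovering the original signal exactly."""
    x = c[:1]  # start from the overall average
    pos = 1
    while pos < len(c):
        n = len(x)
        det = c[pos:pos + n]
        # Each (average, detail) pair expands to two samples.
        x = [v for a, d in zip(x, det) for v in (a + d, a - d)]
        pos += n
    return x


# A signal that is constant on two dyadic halves needs only
# two nonzero coefficients out of eight.
signal = [5, 5, 5, 5, 1, 1, 1, 1]
print(haar_dwt(signal))              # sparse: [3.0, 2.0, 0.0, ...]
print(haar_idwt(haar_dwt(signal)))   # perfect reconstruction
```

The paper's version replaces these numerical averages/differences with multiclass-valued analogues and uses dynamic programming to pick, among admissible dyadic splits, the one leaving the fewest nontrivial coefficients.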
Pages: 81-93 (13 pages)
Related papers (50 records)
  • [1] Jacobsen C, Zscherpel U, Perner P. A comparison between neural networks and decision trees. MACHINE LEARNING AND DATA MINING IN PATTERN RECOGNITION, 1999, 1715: 144-158.
  • [2] Setiono R, Leow WK. On mapping decision trees and neural networks. KNOWLEDGE-BASED SYSTEMS, 1999, 12(03): 95-99.
  • [3] Cid-Sueiro J, Ghattas J, Figueiras-Vidal AR. Decision trees based on neural networks. INTELLIGENT METHODS IN SIGNAL PROCESSING AND COMMUNICATIONS, 1997: 221-241.
  • [4] Prochazka M, Kouril L, Zelinka I. Classification and prediction by decision trees and neural networks. MENDELL 2009, 2009: 177-181.
  • [5] Ivanova I, Kubat M. Initialization of neural networks by means of decision trees. KNOWLEDGE-BASED SYSTEMS, 1995, 8(06): 333-344.
  • [6] Wang WJ, Jones P, Partridge D. Diversity between neural networks and decision trees for building multiple classifier systems. MULTIPLE CLASSIFIER SYSTEMS, 2000, 1857: 240-249.
  • [7] Alizadeh A, Singhal M, Behzadan V, Tavallali P, Ranganath A. Stochastic induction of decision trees with application to learning Haar trees. 2022 21ST IEEE INTERNATIONAL CONFERENCE ON MACHINE LEARNING AND APPLICATIONS (ICMLA), 2022: 825-830.
  • [8] Schmid H, Kempe A. Tagging of corpora with HMM, decision trees and neural networks. LEXICON AND TEXT: REUSABLE METHODS AND RESOURCES FOR THE LINGUISTIC DEVELOPMENT OF GERMAN, 1996, 73: 231-244.
  • [9] Hsu KW. Hybrid ensembles of decision trees and artificial neural networks. 2012 IEEE INTERNATIONAL CONFERENCE ON COMPUTATIONAL INTELLIGENCE AND CYBERNETICS (CYBERNETICSCOM), 2012: 25-29.
  • [10] Jia S, Lin P, Li Z, Zhang J, Liu S. Visualizing surrogate decision trees of convolutional neural networks. Journal of Visualization, 2020, 23: 141-156.