Data abstractions for decision tree induction

Cited by: 4
Authors:
Kudoh, Y [1]
Haraguchi, M [1]
Okubo, Y [1]
Affiliation:
[1] Hokkaido Univ, Div Elect & Informat Engn, Sapporo, Hokkaido 0608628, Japan
Keywords:
data mining; machine learning; abstraction; classification
DOI:
10.1016/S0304-3975(02)00178-0
Chinese Library Classification:
TP301 [Theory, Methods]
Subject Classification Code:
081202
Abstract:
摘要
When the descriptions of data values in a database are too concrete or too detailed, the computational cost of discovering useful knowledge from the database generally increases, and the discovered knowledge tends to become complicated. A notion of data abstraction seems useful for resolving this kind of problem: after abstraction we obtain a smaller and more general database, from which we can quickly extract more abstract knowledge that is expected to be easier to understand. In general, however, since several abstractions are possible, we must carefully select the one according to which the original database is generalized; an inadequate selection would worsen the accuracy of the extracted knowledge. From this point of view, we propose in this paper a method for selecting an appropriate abstraction from the possible ones, assuming that our task is to construct a decision tree from a relational database. Suppose that, for each attribute in a relational database, we have a class of possible abstractions for the attribute values. As an appropriate abstraction for each attribute, we prefer one such that, even after the abstraction, the distribution of target classes necessary to perform our classification task is preserved within an acceptable error range given by the user. Using the selected abstractions, the original database can be transformed into a small generalized database written in abstract values. It can therefore be expected that a decision tree constructed from the generalized database is much smaller than one constructed from the original database; moreover, such a size reduction can be justified under some theoretical assumptions. The appropriateness of an abstraction is precisely defined in terms of standard information theory, so we call our abstraction framework Information Theoretical Abstraction.
We present experimental results obtained with ITA, a system implementing our abstraction method. These results verify that our method is very effective in reducing the size of the constructed decision tree without significantly worsening classification accuracy. (C) 2002 Elsevier Science B.V. All rights reserved.
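The abstract's selection criterion — keep only abstractions that preserve the class distribution within a user-given error bound — can be illustrated with a small sketch. This is not the authors' ITA implementation; the use of mutual-information loss as the distribution-preservation measure, and all names and data below, are illustrative assumptions.

```python
import math
from collections import Counter

def mutual_information(xs, ys):
    """Empirical mutual information I(X; Y) in bits between two columns."""
    n = len(xs)
    px, py, pxy = Counter(xs), Counter(ys), Counter(zip(xs, ys))
    mi = 0.0
    for (x, y), c in pxy.items():
        p = c / n
        mi += p * math.log2(p / ((px[x] / n) * (py[y] / n)))
    return mi

def select_abstraction(values, classes, abstractions, epsilon):
    """Among candidate abstractions (dicts mapping concrete attribute
    values to abstract ones), pick the coarsest one (fewest abstract
    values) whose loss of mutual information with the target class
    stays within the user-given error bound epsilon."""
    base = mutual_information(values, classes)
    best = None
    for mapping in abstractions:
        abstracted = [mapping[v] for v in values]
        loss = base - mutual_information(abstracted, classes)
        size = len(set(mapping.values()))
        if loss <= epsilon and (best is None or size < best[0]):
            best = (size, mapping)
    return best[1] if best else None
```

For example, grouping {cat, dog} into "mammal" and {hawk, owl} into "bird" preserves a land/air class distinction perfectly (zero information loss) and would be preferred over both the identity mapping (larger) and a single "animal" value (too lossy).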
Pages: 387-416 (30 pages)