A NEAREST HYPERRECTANGLE LEARNING METHOD

Cited: 76
Author
SALZBERG, S
Keywords
EXEMPLAR; INDUCTION; GENERALIZATION; PREDICTION; INCREMENTAL LEARNING; EXCEPTIONS;
DOI
10.1007/BF00114779
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104; 0812; 0835; 1405;
Abstract
This paper presents a theory of learning called nested generalized exemplar (NGE) theory, in which learning is accomplished by storing objects in Euclidean n-space, E(n), as hyperrectangles. The hyperrectangles may be nested inside one another to arbitrary depth. In contrast to generalization processes that replace symbolic formulae by more general formulae, the NGE algorithm modifies hyperrectangles by growing and reshaping them in a well-defined fashion. The axes of these hyperrectangles are defined by the variables measured for each example. Each variable can have any range on the real line; thus the theory is not restricted to symbolic or binary values. This paper describes some advantages and disadvantages of NGE theory, positions it as a form of exemplar-based learning, and compares it to other inductive learning theories. An implementation has been tested in three different domains, for which results are presented below: prediction of breast cancer, classification of iris flowers, and prediction of survival times for heart attack patients. The results in these domains support the claim that NGE theory can be used to create compact representations with excellent predictive accuracy.
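The abstract describes the NGE learning cycle only at a high level. The following is a minimal Python sketch in that spirit: exemplars are stored as axis-aligned hyperrectangles, an example matching the class of the nearest rectangle grows that rectangle to cover it, and prediction returns the label of the nearest rectangle. The names (Hyperrectangle, NGEClassifier), the unweighted Euclidean point-to-rectangle distance, and the omission of feature/exemplar weights and nested exception rectangles are simplifying assumptions, not the algorithm as published by Salzberg.

```python
# Minimal sketch of a nearest-hyperrectangle (NGE-style) learner.
# Assumptions not taken from the abstract: unweighted Euclidean
# point-to-rectangle distance, no feature/exemplar weights, and no
# nested "exception" rectangles.
import math

class Hyperrectangle:
    def __init__(self, point, label):
        # A new exemplar starts as a degenerate rectangle (a single point).
        self.lower = list(point)
        self.upper = list(point)
        self.label = label

    def distance(self, point):
        # Per-dimension distance is 0 if the coordinate falls inside the
        # interval, otherwise the gap to the nearest face.
        d2 = 0.0
        for x, lo, hi in zip(point, self.lower, self.upper):
            if x < lo:
                d2 += (lo - x) ** 2
            elif x > hi:
                d2 += (x - hi) ** 2
        return math.sqrt(d2)

    def grow(self, point):
        # Generalization: stretch the rectangle just enough to cover the point.
        self.lower = [min(lo, x) for lo, x in zip(self.lower, point)]
        self.upper = [max(hi, x) for hi, x in zip(self.upper, point)]

class NGEClassifier:
    def __init__(self):
        self.rects = []

    def train_one(self, point, label):
        # Incremental learning: grow the nearest rectangle if it has the same
        # class, otherwise store the example as a new point rectangle.
        if self.rects:
            nearest = min(self.rects, key=lambda r: r.distance(point))
            if nearest.label == label:
                nearest.grow(point)
                return
        self.rects.append(Hyperrectangle(point, label))

    def predict(self, point):
        return min(self.rects, key=lambda r: r.distance(point)).label

# Toy usage on two 2-D classes.
clf = NGEClassifier()
for p, y in [((0.1, 0.2), "a"), ((0.3, 0.1), "a"),
             ((0.9, 0.8), "b"), ((0.8, 0.95), "b")]:
    clf.train_one(p, y)
print(clf.predict((0.2, 0.15)))  # expected "a"
print(clf.predict((0.85, 0.9)))  # expected "b"
```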
Pages: 251 - 276
Number of pages: 26
Related papers
50 records in total
  • [41] A new method of learning weighted similarity function to improve predictions of Nearest Neighbor rule
    Jahromi, M. Zolghadri
    Parvinnia, E.
    WORLD CONGRESS ON ENGINEERING 2008, VOLS I-II, 2008, : 54 - 57
  • [42] A novel hierarchical feature selection method based on large margin nearest neighbor learning
    Zheng, Jian
    Luo, Chuan
    Li, Tianrui
    Chen, Hongmei
    NEUROCOMPUTING, 2022, 497 : 1 - 12
  • [43] Real-time flaw detection on complex part: Study of SVM and hyperrectangle based method
    Bouillant, S
    Mitéran, J
    Bourennane, E
    Paindavoine, M
    2002 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH, AND SIGNAL PROCESSING, VOLS I-IV, PROCEEDINGS, 2002, : 3596 - 3599
  • [44] A LEARNING SCHEME FOR NEAREST NEIGHBOUR CLASSIFIER
    FORD, NL
    BATCHELOR, BG
    WILKINS, BR
    INFORMATION SCIENCES, 1970, 2 (02) : 139 - +
  • [45] Q-learning with Nearest Neighbors
    Shah, Devavrat
    Xie, Qiaomin
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 31 (NIPS 2018), 2018, 31
  • [46] Learning to Index for Nearest Neighbor Search
    Chiu, Chih-Yi
    Prayoonwong, Amorntip
    Liao, Yin-Chih
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2020, 42 (08) : 1942 - 1956
  • [47] A Symmetric Nearest Neighbor learning rule
    Nock, R
    Sebban, M
    Jappy, P
    ADVANCES IN CASE-BASED REASONING, PROCEEDINGS, 2001, 1898 : 222 - 233
  • [48] Real-time flaw detection on complex part: classification with SVM and Hyperrectangle based method
    Bouillant, S
    Mitéran, J
    Paindavoine, M
    Mériaudeau, F
    MACHINE VISION APPLICATIONS IN INDUSTRIAL INSPECTION XII, 2004, 5303 : 170 - 177
  • [49] Similarity Learning for Nearest Neighbor Classification
    Qamar, Ali Mustafa
    Gaussier, Eric
    Chevallet, Jean-Pierre
    Lim, Joo Hwee
    ICDM 2008: EIGHTH IEEE INTERNATIONAL CONFERENCE ON DATA MINING, PROCEEDINGS, 2008, : 983 - +
  • [50] Reliable Nearest Neighbors for Lazy Learning
    Ebert, Tobias
    Kampmann, Geritt
    Nelles, Oliver
    2011 AMERICAN CONTROL CONFERENCE, 2011, : 3041 - 3046