A NEAREST HYPERRECTANGLE LEARNING METHOD

Cited by: 76
Author
SALZBERG, S
Affiliation
Keywords
EXEMPLAR; INDUCTION; GENERALIZATION; PREDICTION; INCREMENTAL LEARNING; EXCEPTIONS;
DOI
10.1007/BF00114779
CLC Classification Number
TP18 [Artificial Intelligence Theory];
Discipline Code
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
This paper presents a theory of learning called nested generalized exemplar (NGE) theory, in which learning is accomplished by storing objects in Euclidean n-space, E(n), as hyperrectangles. The hyperrectangles may be nested inside one another to arbitrary depth. In contrast to generalization processes that replace symbolic formulae by more general formulae, the NGE algorithm modifies hyperrectangles by growing and reshaping them in a well-defined fashion. The axes of these hyperrectangles are defined by the variables measured for each example. Each variable can have any range on the real line; thus the theory is not restricted to symbolic or binary values. This paper describes some advantages and disadvantages of NGE theory, positions it as a form of exemplar-based learning, and compares it to other inductive learning theories. An implementation has been tested in three different domains, for which results are presented below: prediction of breast cancer, classification of iris flowers, and prediction of survival times for heart attack patients. The results in these domains support the claim that NGE theory can be used to create compact representations with excellent predictive accuracy.
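The abstract describes classification by distance to the nearest stored hyperrectangle, with rectangles grown to generalize over new examples. The sketch below is a minimal illustration of that idea only, assuming an unweighted distance that is zero inside a rectangle and the Euclidean distance to its closest face outside, plus a naive growth step. The class name Hyperrectangle and the functions are illustrative assumptions; the paper's EACH implementation additionally uses feature and exemplar weights and handles nested exceptions, which are omitted here.

```python
import numpy as np

# Minimal sketch of nearest-hyperrectangle classification (illustrative,
# not Salzberg's EACH system): distance is 0 inside a rectangle, otherwise
# the Euclidean distance to its nearest face.

class Hyperrectangle:
    def __init__(self, lower, upper, label):
        self.lower = np.asarray(lower, dtype=float)  # per-axis minimum
        self.upper = np.asarray(upper, dtype=float)  # per-axis maximum
        self.label = label

    def distance(self, x):
        x = np.asarray(x, dtype=float)
        # Per-axis gap is 0 when the coordinate lies within the rectangle's range.
        gap = np.maximum(self.lower - x, 0.0) + np.maximum(x - self.upper, 0.0)
        return np.linalg.norm(gap)

    def grow_to_include(self, x):
        # Generalization step: stretch the rectangle so it covers a new example.
        x = np.asarray(x, dtype=float)
        self.lower = np.minimum(self.lower, x)
        self.upper = np.maximum(self.upper, x)

def classify(rectangles, x):
    # Predict the label of the rectangle with the smallest distance to x.
    return min(rectangles, key=lambda r: r.distance(x)).label

# Example usage with two hypothetical exemplars in 2-D:
# r1 = Hyperrectangle([0, 0], [1, 1], "A")
# r2 = Hyperrectangle([2, 2], [3, 3], "B")
# classify([r1, r2], [0.5, 0.9])  -> "A"
```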
Pages: 251-276
Page count: 26