On the boosting algorithm for multiclass functions based on information-theoretic criterion for approximation

Cited by: 0
Authors
Takimoto, E [1 ]
Maruoka, A [1 ]
Affiliation
[1] Tohoku Univ, Grad Sch Informat Sci, Aoba Ku, Sendai, Miyagi 9808579, Japan
Source
DISCOVERY SCIENCE | 1998, Vol. 1532
Keywords
DOI
None available
Chinese Library Classification
TP18 [Theory of Artificial Intelligence];
Subject Classification Codes
081104; 0812; 0835; 1405;
Abstract
We consider a boosting technique that can be applied directly to the classification problem for multiclass functions. Although many boosting algorithms have been proposed so far, essentially all of them are developed for binary classification problems, and to handle multiclass classification problems they must somehow reduce them to binary ones. To avoid such reductions, we introduce the notion of a pseudo-entropy function G that gives an information-theoretic criterion, called the conditional G-entropy, for measuring the loss of a hypothesis. The conditional G-entropy turns out to be useful for defining the weakness of hypotheses that approximate a multiclass function in general, so that we can formulate the boosting problem without any reduction. We show that the top-down decision tree learning algorithm using G as its splitting criterion is an efficient boosting algorithm based on the conditional G-entropy. That is, the algorithm aims to minimize the conditional G-entropy rather than the classification error. In the binary case, our algorithm is identical to the error-based boosting algorithm proposed by Kearns and Mansour, and our analysis gives a simpler proof of their results.
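The splitting-criterion view of boosting sketched in the abstract can be illustrated concretely. The following is a minimal sketch, not the paper's algorithm: it instantiates the pseudo-entropy G as Shannon entropy of the empirical class distribution (one admissible choice among the family the paper permits), and grows a tree top-down by choosing, at each node, the candidate weak hypothesis that minimizes the conditional G-entropy of the labels. The function names, threshold splits, and toy data are all hypothetical.

```python
import math
from collections import Counter

def g_entropy(labels):
    """Pseudo-entropy G of a label sample, instantiated here as Shannon
    entropy of the empirical class distribution (one admissible choice
    of G; the paper allows a family of such functions)."""
    n = len(labels)
    if n == 0:
        return 0.0
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def conditional_g_entropy(split, X, y):
    """Conditional G-entropy of the labels y given the partition of the
    sample induced by a weak hypothesis `split` (a function on examples):
    the size-weighted average of G over the blocks of the partition."""
    blocks = {}
    for x, label in zip(X, y):
        blocks.setdefault(split(x), []).append(label)
    return sum(len(b) / len(y) * g_entropy(b) for b in blocks.values())

def grow_tree(X, y, candidate_splits, depth):
    """Top-down tree growth: at each node pick the candidate split that
    minimizes the conditional G-entropy (i.e., maximizes the drop in G);
    stop when the node is pure or the depth budget is exhausted."""
    if depth == 0 or g_entropy(y) == 0.0 or not candidate_splits:
        return Counter(y).most_common(1)[0][0]  # leaf: majority class
    best = min(candidate_splits, key=lambda s: conditional_g_entropy(s, X, y))
    children = {}
    for branch in {best(x) for x in X}:
        idx = [i for i, x in enumerate(X) if best(x) == branch]
        children[branch] = grow_tree([X[i] for i in idx],
                                     [y[i] for i in idx],
                                     candidate_splits, depth - 1)
    return (best, children)

def predict(tree, x):
    """Route an example down the tree to a leaf label; a branch value
    unseen during training falls back to an arbitrary child."""
    while isinstance(tree, tuple):
        split, children = tree
        tree = children.get(split(x), next(iter(children.values())))
    return tree

# Toy usage: examples are 1-D points and the weak hypotheses are
# threshold predicates (both hypothetical, for illustration only).
X = [0.1, 0.4, 0.6, 0.9]
y = ["a", "a", "b", "c"]
splits = [lambda x, t=t: x < t for t in (0.3, 0.5, 0.8)]
tree = grow_tree(X, y, splits, depth=3)
print([predict(tree, x) for x in X])  # -> ['a', 'a', 'b', 'c']
```

Note the design point the abstract emphasizes: the tree grower never tracks classification error directly; each split is chosen purely to reduce the conditional G-entropy, and in the binary case with the appropriate choice of G this recovers the Kearns-Mansour error-based boosting behavior.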
Pages: 256-267
Page count: 12