Boosting density function estimators

Cited by: 0
Authors
Thollard, F [1 ]
Sebban, M [1 ]
Ezequel, P [1 ]
Affiliation
[1] Univ St Etienne, Dept Comp Sci, EURISE, St Etienne, France
Source
MACHINE LEARNING: ECML 2002 | 2002 / Vol. 2430
Keywords
DOI
Not available
CLC Number
TP18 [Theory of Artificial Intelligence];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
In this paper, we focus on adapting boosting to density function estimation, which is useful in a number of fields including Natural Language Processing and Computational Biology. Boosting has previously been used to optimize classification algorithms, improving generalization accuracy by combining many classifiers. The core of the boosting strategy in the well-known ADABOOST algorithm [4] consists in updating the learning instance distribution: the weight of examples misclassified by the current classifier is increased, while the weight of correctly classified ones is decreased. Except in [17, 18], few works have attempted to exploit the interesting theoretical properties of boosting (such as margin maximization) independently of a classification task. Here, we do not use classification errors to optimize a classifier, but rather density estimation errors to optimize an estimator (here a probabilistic automaton) of a given target density. Experimental results are presented showing the merit of our approach.
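For context, the distribution update described in the abstract can be sketched as follows. This is a minimal illustration of the standard classification-oriented ADABOOST update of [4], not the authors' density-estimation variant (whose details are in the full paper); the function name and NumPy implementation are illustrative assumptions.

    import numpy as np

    def adaboost_weight_update(weights, misclassified, epsilon):
        """One ADABOOST distribution update: increase the weight of
        misclassified examples, decrease the weight of correctly
        classified ones, then renormalize.

        weights       : current distribution over training examples
        misclassified : boolean mask, True where the current classifier erred
        epsilon       : weighted error of the current classifier (0 < epsilon < 0.5)
        """
        alpha = 0.5 * np.log((1.0 - epsilon) / epsilon)  # classifier confidence
        # exp(+alpha) on errors (up-weight), exp(-alpha) on correct (down-weight)
        weights = weights * np.exp(np.where(misclassified, alpha, -alpha))
        return weights / weights.sum()  # keep it a probability distribution

    # Hypothetical usage: 5 examples, uniform start, 2 misclassified
    w = np.full(5, 0.2)
    errs = np.array([True, False, False, True, False])
    w = adaboost_weight_update(w, errs, epsilon=w[errs].sum())

The paper's contribution, per the abstract, is to drive this kind of reweighting with density estimation errors rather than classification errors.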
Pages: 431-443
Page count: 13