Error Awareness by Lower and Upper Bounds in Ensemble Learning

Cited: 0
Authors
Liu, Yong [1]
Zhao, Qiangfu [1]
Pei, Yan [1]
Affiliation
[1] Univ Aizu, Sch Comp Sci & Engn, Fukushima 9658580, Japan
Keywords
Decision boundary; ensemble learning; error bounds; neural networks; constructive algorithms
DOI
Not available
Chinese Library Classification
TP301 [Theory, Methods]
Discipline code
081202
Abstract
Ensemble learning can lower the risk of overfitting that often arises when a single model is trained by supervised learning. However, overfitting has still been observed in negative correlation learning, which trains a set of neural networks simultaneously with correlation-based penalties. In negative correlation learning, each subsystem sees all of the training data and focuses on the data points that have not yet been learned well by the other subsystems in the ensemble. One cost of learning every data point is that the learned decision boundary may move too close to some of those points. Such a decision boundary might not generalize better even if it yields higher accuracy on the training data. Two constraints are introduced into negative correlation learning to prevent overfitting: a lower bound on the error rate (LBER) and an upper bound on the error output (UBEO). Together, these two error bounds decide whether a given data point should be learned. Experimental results explore how LBER and UBEO lead negative correlation learning towards a better decision boundary.
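The bound-based gating described in the abstract can be sketched in a few lines. The exact decision procedure, the function name `should_learn`, and the bound values below are assumptions for illustration only; the abstract does not give the paper's precise rule.

```python
# Hypothetical sketch of the LBER/UBEO gating rule; the rule and the
# default bound values are illustrative assumptions, not the published algorithm.

def should_learn(target, ensemble_output, error_rate,
                 lber=0.05, ubeo=0.4):
    """Decide whether the ensemble should keep learning a data point.

    target          -- desired output y for this data point
    ensemble_output -- current (averaged) ensemble output F(x)
    error_rate      -- current error rate of the ensemble on the training set
    lber            -- lower bound of the error rate (assumed value)
    ubeo            -- upper bound of the error output (assumed value)
    """
    # Once the training error rate has fallen below LBER, stop pushing
    # the decision boundary closer to the remaining data points.
    if error_rate < lber:
        return False
    # Otherwise, learn a point only while its error output still exceeds
    # UBEO; nearly-fitted points are left alone.
    return abs(ensemble_output - target) > ubeo


# A point with a large residual error is still learned ...
print(should_learn(1.0, 0.3, error_rate=0.10))  # True
# ... while a nearly-fitted point is skipped, so the boundary
# does not creep too close to it.
print(should_learn(1.0, 0.9, error_rate=0.10))  # False
```

Under this reading, LBER acts as a global stopping criterion on the training error rate, while UBEO filters individual points out of the per-epoch update.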
Pages: 14-18
Page count: 5