The generalization performance of learning machine based on phi-mixing sequence

Cited by: 0
Authors
Zou, Bin [1 ]
Li, Luoqing [2 ]
Affiliations
[1] Hubei Univ, Fac Math & Comp Sci, Wuhan 430062, Peoples R China
[2] Hubei Univ, Fac Math & Comp Sci, Wuhan 430062, Peoples R China
Keywords
DOI
Not available
CLC number
TP18 [Artificial intelligence theory];
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Generalization performance is an important property of learning machines. Vapnik, Cucker, and Smale previously showed that the empirical risks of a learning machine based on an i.i.d. sequence converge uniformly to the corresponding expected risks as the number of samples approaches infinity. This paper extends these results to the case where the i.i.d. sequence is replaced by a phi-mixing sequence. We establish the rate of uniform convergence of the learning machine by using Bernstein's inequality for phi-mixing sequences, and we estimate the sample error of the learning machine. Finally, we compare the resulting bounds with known results.
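For orientation only: the classical Bernstein inequality for an i.i.d. sequence, on which this line of uniform-convergence analysis builds, has the standard form below (the notation m, \xi_i, M, \sigma^2, \varepsilon is ours and is not taken from this record). The paper's phi-mixing counterpart keeps the same exponential shape but with constants that depend on the mixing coefficients; its exact statement is not reproduced here.

\[
\Pr\left\{ \left| \frac{1}{m}\sum_{i=1}^{m}\xi_i - \mathbb{E}\xi_1 \right| \ge \varepsilon \right\}
\le 2\exp\!\left( -\frac{m\varepsilon^{2}}{2\left(\sigma^{2} + \tfrac{1}{3}M\varepsilon\right)} \right),
\]

where the \xi_i are i.i.d. with |\xi_i - \mathbb{E}\xi_i| \le M almost surely and \operatorname{Var}(\xi_i) \le \sigma^{2}.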
Pages: 548+
Number of pages: 2