New error bounds for Solomonoff prediction

Times Cited: 17
Authors
Hutter, M [1]
Affiliations
[1] IDSIA, CH-6928 Lugano, Switzerland
Keywords
induction; Solomonoff; Bayesian; deterministic prediction; algorithmic probability; Kolmogorov complexity;
DOI
10.1006/jcss.2000.1743
Chinese Library Classification (CLC)
TP3 [computing technology, computer technology];
Discipline Code
0812;
Abstract
Solomonoff sequence prediction is a scheme to predict digits of binary strings without knowing the underlying probability distribution. We call a prediction scheme informed when it knows the true probability distribution of the sequence. Several new relations between universal Solomonoff sequence prediction, informed prediction, and general probabilistic prediction schemes will be proved. Among others, they show that the number of errors in Solomonoff prediction is finite for computable distributions if it is finite in the informed case. Deterministic variants will also be studied. The most interesting result is that the deterministic variant of Solomonoff prediction is optimal compared to any other probabilistic or deterministic prediction scheme, apart from additive square root corrections only. This makes it well suited even for difficult prediction problems, where it does not suffice that the number of errors is minimal to within some factor greater than one. Solomonoff's original bound and the ones presented here complement each other in a useful way. (C) 2001 Academic Press.
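The comparison in the abstract between mixture (Solomonoff-style) prediction and informed prediction can be illustrated with a small simulation. The sketch below is not from the paper: it replaces the universal prior over all computable distributions with a uniform prior over a finite grid of Bernoulli biases, and compares the cumulative errors of the deterministic mixture predictor with those of an informed predictor that knows the true bias. The function name simulate and all parameter values are hypothetical choices made purely for illustration.

import random

def simulate(theta_true=0.7, n=2000, seed=0):
    # Hypothetical toy: a Bayes mixture over a finite class of Bernoulli
    # hypotheses stands in for the universal Solomonoff mixture.
    rng = random.Random(seed)
    hypotheses = [i / 10 for i in range(1, 10)]          # candidate biases for P(bit = 1)
    weights = [1.0 / len(hypotheses)] * len(hypotheses)  # uniform prior over the class
    errors_mixture = 0
    errors_informed = 0

    for _ in range(n):
        bit = 1 if rng.random() < theta_true else 0

        # Mixture predictor: posterior-weighted probability that the next digit
        # is 1; the deterministic variant predicts the more probable digit.
        p_one = sum(w * h for w, h in zip(weights, hypotheses))
        if (p_one >= 0.5) != (bit == 1):
            errors_mixture += 1

        # Informed predictor: knows theta_true and predicts the more probable digit.
        if (theta_true >= 0.5) != (bit == 1):
            errors_informed += 1

        # Bayesian update and renormalisation of the mixture weights.
        weights = [w * (h if bit == 1 else 1.0 - h) for w, h in zip(weights, hypotheses)]
        total = sum(weights)
        weights = [w / total for w in weights]

    return errors_mixture, errors_informed

if __name__ == "__main__":
    e_mix, e_inf = simulate()
    print(f"mixture errors:  {e_mix}")
    print(f"informed errors: {e_inf}")

In runs of this toy the mixture predictor's error count stays close to the informed predictor's; the additive square-root-type corrections discussed in the abstract concern the universal mixture over all computable distributions, which this finite hypothesis class only gestures at.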
Pages: 653-667
Page count: 15