"Calibeating": Beating forecasters at their own game
Cited by: 0
Authors:
Foster, Dean P. [1,2]
Hart, Sergiu [3,4]
Affiliations:
[1] Univ Penn, Dept Stat, Wharton, Philadelphia, PA 19104 USA
[2] Amazon, Seattle, WA 98109 USA
[3] Hebrew Univ Jerusalem, Inst Math, Dept Econ, Jerusalem, Israel
[4] Hebrew Univ Jerusalem, Federmann Ctr Study Rat, Jerusalem, Israel
Keywords:
Calibrated forecasts;
calibeating;
Brier score;
calibration score;
refinement score;
experts;
C1;
C7;
D8;
RELATIVE LOSS BOUNDS;
CALIBRATION;
DOI:
10.3982/TE5330
Classification (CLC): F [Economics];
Discipline code: 02;
Abstract:
To identify expertise, forecasters should not be tested by their calibration score, which can always be made arbitrarily small, but rather by their Brier score. The Brier score is the sum of the calibration score and the refinement score; the latter measures how good the sorting into bins with the same forecast is, and thus attests to "expertise." This raises the question of whether one can gain calibration without losing expertise, which we refer to as "calibeating." We provide an easy way to calibeat any forecast, by a deterministic online procedure. We moreover show that calibeating can be achieved by a stochastic procedure that is itself calibrated, and then extend the results to simultaneously calibeating multiple procedures, and to deterministic procedures that are continuously calibrated.
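The decomposition stated in the abstract, Brier score = calibration score + refinement score, can be illustrated with a minimal sketch. The function below is an assumption of how the standard decomposition for binary outcomes is computed (it is not code from the paper): periods are sorted into bins of identical forecast values, the calibration score penalizes the gap between each bin's forecast and its empirical frequency, and the refinement score measures how well the bins sort the outcomes.

```python
# Sketch of the Brier-score decomposition (binary outcomes):
#   Brier = calibration + refinement,
# where periods are binned by their (identical) forecast values.
# This is an illustrative reconstruction, not the paper's code.
from collections import defaultdict

def brier_decomposition(forecasts, outcomes):
    """forecasts: probabilities in [0, 1]; outcomes: 0/1 results."""
    T = len(forecasts)
    brier = sum((p - a) ** 2 for p, a in zip(forecasts, outcomes)) / T

    bins = defaultdict(list)  # sort periods into bins by forecast value
    for p, a in zip(forecasts, outcomes):
        bins[p].append(a)

    calibration = refinement = 0.0
    for p, bin_outcomes in bins.items():
        n = len(bin_outcomes)
        freq = sum(bin_outcomes) / n  # empirical frequency in the bin
        calibration += (n / T) * (p - freq) ** 2   # forecast vs. frequency
        refinement += (n / T) * freq * (1 - freq)  # purity of the sorting
    return brier, calibration, refinement
```

For binary outcomes the two scores sum exactly to the Brier score, which is why, as the abstract notes, driving the calibration score to zero says nothing by itself: expertise shows up in the refinement term.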
Pages: 1441-1474
Page count: 34