Precise Regret Bounds for Log-loss via a Truncated Bayesian Algorithm

Citations: 0
Authors
Wu, Changlong [1 ]
Heidari, Mohsen [1 ,2 ]
Grama, Ananth [1 ]
Szpankowski, Wojciech [1 ]
Affiliations
[1] Purdue Univ, CSoI, W Lafayette, IN 47907 USA
[2] Indiana Univ, Bloomington, IN USA
Keywords
REDUNDANCY; INFORMATION; PRINCIPLE;
DOI
N/A
CLC classification
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
We study sequential general online regression, also known as sequential probability assignment, under logarithmic loss when competing against a broad class of experts. We obtain tight, often matching, lower and upper bounds on the sequential minimax regret, defined as the excess loss incurred by the predictor over the best expert in the class. After proving a general upper bound, we consider specific classes of experts, from the Lipschitz class to the bounded-Hessian class, and derive matching lower and upper bounds with provably optimal constants. Our bounds hold for a wide range of values of the data dimension and the number of rounds. To derive lower bounds we use tools from information theory (e.g., the Shtarkov sum), and for upper bounds we resort to a new "smooth truncated covering" of the class of experts. This allows us to obtain constructive proofs by applying a simple and novel truncated Bayesian algorithm. Our proofs are substantially simpler than existing ones and yet yield tighter (and often optimal) bounds.
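For context, the sequential minimax regret and the Shtarkov sum mentioned in the abstract are conventionally written as follows; the notation below is standard in the log-loss prediction literature and is not taken from the paper itself:

```latex
% Sequential minimax regret of a predictor \hat{p} against an expert
% class \mathcal{H} over T rounds, under log loss -\log p(y_t \mid y^{t-1}):
r_T(\mathcal{H})
  = \inf_{\hat{p}} \; \sup_{y^T}
    \left[ \sum_{t=1}^{T} -\log \hat{p}\bigl(y_t \mid y^{t-1}\bigr)
         \;-\; \inf_{h \in \mathcal{H}} \sum_{t=1}^{T} -\log h\bigl(y_t \mid y^{t-1}\bigr) \right]

% The Shtarkov sum, the standard information-theoretic tool for
% lower-bounding such regret:
\log S_T(\mathcal{H})
  = \log \sum_{y^T} \; \sup_{h \in \mathcal{H}} \; \prod_{t=1}^{T} h\bigl(y_t \mid y^{t-1}\bigr)
```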
Pages: 12