Universal Online Convex Optimization Algorithm with Adaptivity to Gradient-Variation

Cited by: 0
Authors:
Liu, Lang-Qi [1 ]
Zhang, Li-Jun [1 ]
Affiliations:
[1] National Key Laboratory for Novel Software Technology, Nanjing University, Nanjing 210023; School of Artificial Intelligence, Nanjing University, Nanjing 210023
Keywords: Consensus algorithm; Convex optimization
DOI: 10.11897/SP.J.1016.2024.02629
Abstract
In contrast to online convex optimization algorithms designed for specific function types, universal algorithms automatically adapt to various loss functions. This capability eliminates the need for users to correctly classify the type of loss function, thereby lowering the barrier to employing online convex optimization techniques. Previous studies have proposed algorithms with minimax-optimal theoretical guarantees for several types of loss functions; however, for general convex functions they struggle to attain tighter, problem-dependent guarantees that reflect the structure of the problem. To address this issue, we introduce the Universal Online Convex Optimization Algorithm with Adaptivity to Gradient-Variation (UAGV). This novel algorithm automatically adapts to both general convex and strongly convex loss functions. Furthermore, under the smoothness condition, UAGV enjoys a gradient-variation bound for general convex loss functions, which is a problem-dependent bound. The algorithm adopts a two-layered structure, with a meta-algorithm in the upper layer and several expert-algorithms in the lower layer. We innovatively equip the meta-algorithm with an optimism term, which can be interpreted as a prediction of the loss vector; when the optimism term closely matches the loss vector, the meta-algorithm achieves small regret. We therefore carefully design the surrogate loss function and the optimism term according to the form of the gradient-variation bound, enhancing the meta-algorithm's ability to combine the decisions generated by the expert-algorithms and helping to obtain the corresponding theoretical guarantee. Experimental results on several datasets indicate that UAGV can effectively track the best expert-algorithm, and its optimization results for smooth general convex functions outperform those of existing universal algorithms. Specifically, the regret of UAGV is over 14% smaller than that of existing algorithms on certain datasets. © 2024 Science Press. All rights reserved.
Pages: 2629-2644
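The two-layered structure described in the abstract, a meta-algorithm that weights lower-layer experts using exponential weights with an optimism term, can be illustrated with a minimal sketch. This is not the paper's actual UAGV (whose surrogate losses and optimism term are designed to match the gradient-variation bound); the function name is illustrative, and setting the optimism to the previous round's loss vector is one common assumption for gradient-variation-type analyses.

```python
import numpy as np

def optimistic_hedge(expert_losses, eta):
    """Sketch of an optimistic exponential-weights meta-algorithm.

    expert_losses: (T, N) array, loss of each of N experts at each round.
    eta: learning rate of the meta-algorithm.
    Returns the meta-algorithm's regret against the best fixed expert.
    """
    T, N = expert_losses.shape
    cum_loss = np.zeros(N)   # cumulative expert losses up to round t-1
    meta_loss = 0.0
    for t in range(T):
        # Optimism term: predict this round's losses as last round's
        # (small regret when the prediction matches the true losses).
        m = expert_losses[t - 1] if t > 0 else np.zeros(N)
        logits = -eta * (cum_loss + m)
        w = np.exp(logits - logits.max())
        w /= w.sum()                      # normalized expert weights
        meta_loss += float(w @ expert_losses[t])
        cum_loss += expert_losses[t]
    return meta_loss - float(cum_loss.min())
```

When one expert is consistently best, the weights concentrate on it within a few rounds, so the meta-algorithm's cumulative loss stays close to that expert's, mirroring the "tracking the best expert-algorithm" behavior reported in the experiments.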