MetaGrad: Adaptation using Multiple Learning Rates in Online Learning

Cited by: 0
Authors
van Erven, Tim [1 ]
Koolen, Wouter M. [2 ]
van der Hoeven, Dirk [3 ]
Affiliations
[1] Univ Amsterdam, Korteweg Vries Inst Math, Sci Pk 107, NL-1098 XG Amsterdam, Netherlands
[2] Ctr Wiskunde & Informat, Sci Pk 123, NL-1098 XG Amsterdam, Netherlands
[3] Leiden Univ, Math Inst, Niels Bohrweg 1, NL-2300 RA Leiden, Netherlands
Keywords
online convex optimization; adaptivity; frequent directions
DOI
Not available
Chinese Library Classification
TP [Automation Technology; Computer Technology]
Subject Classification Code
0812
Abstract
We provide a new adaptive method for online convex optimization, MetaGrad, that is robust to general convex losses but achieves faster rates for a broad class of special functions, including exp-concave and strongly convex functions, as well as various types of stochastic and non-stochastic functions without any curvature. We prove this by drawing a connection to the Bernstein condition, which is known to imply fast rates in offline statistical learning. MetaGrad further adapts automatically to the size of the gradients. Its main feature is that it simultaneously considers multiple learning rates, which are weighted in direct proportion to their empirical performance on the data using a new meta-algorithm. We provide three versions of MetaGrad. The full-matrix version maintains a full covariance matrix and is applicable to learning tasks for which we can afford an update time that is quadratic in the dimension. The other two versions provide speed-ups for high-dimensional learning tasks with an update time that is linear in the dimension: one is based on sketching, the other on running a separate copy of the basic algorithm per coordinate. We evaluate all versions of MetaGrad on benchmark online classification and regression tasks, on which they consistently outperform both online gradient descent and AdaGrad.
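To make the abstract's central idea concrete, the following is a minimal Python sketch (assuming NumPy) of running one gradient-descent expert per learning rate and letting a meta-algorithm reweight the experts according to their empirical performance, via exponential weights on linearized losses. It only illustrates the multiple-learning-rate principle described above; it is not the paper's actual MetaGrad update, which uses tilted surrogate losses and, in the full-matrix version, a covariance matrix. The function name metagrad_style_sketch, the grid of learning rates, and the grad_fn interface are invented for this example.

    # Illustration only: one online-gradient-descent expert per learning rate,
    # combined by a meta-algorithm that reweights experts according to how well
    # their iterates performed on the observed (linearized) losses.
    # This simplifies the actual MetaGrad update described in the paper.
    import numpy as np

    def metagrad_style_sketch(grad_fn, dim, T, etas=(0.5, 0.1, 0.02)):
        experts = np.zeros((len(etas), dim))   # one iterate per learning rate
        log_w = np.zeros(len(etas))            # meta-algorithm weights (log-space)
        for t in range(T):
            w = np.exp(log_w - log_w.max())
            w /= w.sum()
            x = w @ experts                    # master prediction: weighted average
            g = grad_fn(x, t)                  # (sub)gradient of the round-t loss at x
            for i, eta in enumerate(etas):
                # exponential-weights update on the linearized loss of expert i's
                # iterate relative to the master; better-performing experts gain weight
                log_w[i] -= eta * float(g @ (experts[i] - x))
                experts[i] -= eta * g          # per-learning-rate gradient step
        return x

    # Toy usage: minimize ||x - 1||^2 online; grad_fn returns the gradient 2(x - 1).
    x_hat = metagrad_style_sketch(lambda x, t: 2.0 * (x - np.ones(3)), dim=3, T=500)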
Pages: 61
Related Papers
50 records in total
  • [41] Online sensorimotor learning and adaptation for inverse dynamics control
    Xiong, Xiaofeng
    Manoonpong, Poramate
    NEURAL NETWORKS, 2021, 143 : 525 - 536
  • [42] An Online Learning Framework for Link Adaptation in Wireless Networks
    Daniels, Robert C.
    Heath, Robert W., Jr.
    2009 INFORMATION THEORY AND APPLICATIONS WORKSHOP, 2009, : 135 - 137
  • [43] SOLSA: Neuromorphic Spatiotemporal Online Learning for Synaptic Adaptation
    Zhang, Zhenhang
    Jin, Jingang
    Fang, Haowen
    Qiu, Qinru
    29TH ASIA AND SOUTH PACIFIC DESIGN AUTOMATION CONFERENCE, ASP-DAC 2024, 2024, : 848 - 853
  • [44] Online classifier adaptation for cost-sensitive learning
    Zhang, Junlin
    Garcia, Jose
    NEURAL COMPUTING & APPLICATIONS, 2016, 27 (03): 781 - 789
  • [45] COAL: Convolutional Online Adaptation Learning for Opinion Mining
    Chaturvedi, Iti
    Ragusa, Edoardo
    Gastaldo, Paolo
    Cambria, Erik
    20TH IEEE INTERNATIONAL CONFERENCE ON DATA MINING WORKSHOPS (ICDMW 2020), 2020, : 15 - 22
  • [46] Learning Intention Aware Online Adaptation of Movement Primitives
    Koert, Dorothea
    Pajarinen, Joni
    Schotschneider, Albert
    Trick, Susanne
    Rothkopf, Constantin
    Peters, Jan
    IEEE ROBOTICS AND AUTOMATION LETTERS, 2019, 4 (04) : 3719 - 3726
  • [47] INCREASED RATES OF CONVERGENCE THROUGH LEARNING RATE ADAPTATION
    JACOBS, RA
    NEURAL NETWORKS, 1988, 1 (04) : 295 - 307
  • [48] Deep Reinforcement Learning using Cyclical Learning Rates
    Gulde, Ralf
    Tuscher, Marc
    Csiszar, Akos
    Riedel, Oliver
    Verl, Alexander
    2020 THIRD INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE FOR INDUSTRIES (AI4I 2020), 2020, : 32 - 35
  • [49] Online Virtual Repellent Point Adaptation for Biped Walking using Iterative Learning Control
    Wang, Shengzhi
    Mesesan, George
    Englsberger, Johannes
    Lee, Dongheui
    Ott, Christian
    PROCEEDINGS OF THE 2020 IEEE-RAS 20TH INTERNATIONAL CONFERENCE ON HUMANOID ROBOTS (HUMANOIDS 2020), 2021, : 112 - 119
  • [50] Continuous Model Adaptation Using Online Meta-Learning for Smart Grid Application
    Li, Jinghang
    Hu, Mengqi
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2021, 32 (08) : 3633 - 3642