Projective Approximation Based Gradient Descent Modification

Cited by: 7
Authors
Senov, Alexander [1 ,2 ]
Granichin, Oleg [1 ,2 ,3 ]
Affiliations
[1] St Petersburg State Univ, Fac Math & Mech, 7-9 Universitetskaya Nab, St Petersburg 199034, Russia
[2] Russian Acad Sci, Inst Problems Mech Engn, 61 Bolshoy Pr, St Petersburg, Russia
[3] ITMO Univ, St Petersburg, Russia
Source
IFAC PAPERSONLINE | 2017, Vol. 50, No. 1
Funding
Russian Foundation for Basic Research;
Keywords
Mathematical programming; Parameter estimation; Steepest descent; Least-squares; Function approximation; Convex optimization; Model approximation; Iterative methods; Quadratic programming; Projective methods; OPTIMIZATION; ALGORITHMS;
DOI
10.1016/j.ifacol.2017.08.362
CLC Classification Number
TP [Automation Technology, Computer Technology];
Discipline Code
0812;
Abstract
We present a new modification of the gradient descent algorithm based on surrogate optimization with projection onto a low-dimensional space. The method iteratively approximates the target function in the low-dimensional space and takes the optimum of that approximation, mapped back to the original parameter space, as the next parameter estimate. The main contribution of the proposed method is the application of the projection idea to the approximation process. A major advantage of the proposed modification is that it does not alter the gradient descent iterations themselves, so it can be combined with other variants of gradient descent. We give a theoretical motivation for the proposed algorithm and a theoretical lower bound for its accuracy. Finally, we experimentally study its properties on simulated data. (C) 2017, IFAC (International Federation of Automatic Control) Hosting by Elsevier Ltd. All rights reserved.
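To make the idea in the abstract concrete, here is a minimal Python sketch: ordinary gradient descent whose recent iterates are occasionally projected onto a low-dimensional subspace, where a quadratic surrogate of the objective is fitted and its minimizer, mapped back to the full parameter space, is taken as a candidate next estimate. The window size, the subspace dimension, the PCA-style projection, and the names surrogate_step / projective_gd are illustrative assumptions, not the authors' exact construction.

```python
# Hedged sketch of a projective-approximation step wrapped around plain
# gradient descent; hyperparameters and the projection choice are assumptions.
import numpy as np


def surrogate_step(X, fvals, d=2):
    """Fit a quadratic surrogate of f over a d-dimensional subspace spanned by
    recent iterates X (rows) and return its minimizer in the full space."""
    x0 = X[-1]                                     # expand around the latest iterate
    U, _, _ = np.linalg.svd((X - x0).T, full_matrices=False)
    B = U[:, :d]                                   # orthonormal subspace basis
    Z = (X - x0) @ B                               # low-dimensional coordinates
    # Least-squares fit of f(z) ~ c + g.z + sum_{i<=j} Q_ij z_i z_j.
    feats = np.hstack([np.ones((len(Z), 1)), Z,
                       np.stack([np.outer(z, z)[np.triu_indices(d)] for z in Z])])
    coef, *_ = np.linalg.lstsq(feats, fvals, rcond=None)
    g = coef[1:1 + d]
    H = np.zeros((d, d))
    H[np.triu_indices(d)] = coef[1 + d:]
    H = H + H.T                                    # symmetric H with f ~ c + g.z + 0.5 z'Hz
    if np.min(np.linalg.eigvalsh(H)) <= 1e-12:     # surrogate not strongly convex
        return x0                                  # keep the current iterate
    z_star = -np.linalg.solve(H, g)                # minimizer of the quadratic surrogate
    return x0 + B @ z_star                         # map back to the original space


def projective_gd(f, grad, x, lr=0.01, iters=200, window=6, d=2):
    """Gradient descent with periodic projective-approximation jumps (sketch)."""
    hist_x, hist_f = [x.copy()], [f(x)]
    for t in range(1, iters + 1):
        x = x - lr * grad(x)                       # ordinary gradient-descent step
        hist_x.append(x.copy())
        hist_f.append(f(x))
        if t % window == 0 and len(hist_x) >= window:
            x_cand = surrogate_step(np.array(hist_x[-window:]),
                                    np.array(hist_f[-window:]), d)
            if f(x_cand) < f(x):                   # accept only if it improves f
                x = x_cand
    return x


if __name__ == "__main__":
    # Ill-conditioned quadratic test problem.
    A = np.diag([1.0, 10.0, 100.0])
    f = lambda v: 0.5 * v @ A @ v
    grad = lambda v: A @ v
    print(projective_gd(f, grad, np.ones(3), lr=0.009))
```

Because the surrogate jump only replaces an iterate when it lowers the objective, the underlying gradient-descent update is left untouched, which mirrors the compatibility property claimed in the abstract.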
Pages: 3899-3904
Number of pages: 6