A Sequential Subspace Quasi-Newton Method for Large-Scale Convex Optimization

Cited by: 0
Authors
Senov, Aleksandr [1 ,2 ,3 ]
Granichin, Oleg [1 ,2 ,3 ]
Granichina, Olga [4 ]
Affiliations
[1] St Petersburg State Univ, Fac Math & Mech, Univ Sky Prospekt 28, St Petersburg 198504, Russia
[2] St Petersburg State Univ, Ctr Math Robot & Artificial Intelligence, Univ Sky Prospekt 28, St Petersburg 198504, Russia
[3] Russian Acad Sci, Inst Problems Mech Engn, Moscow 117901, Russia
[4] Herzen State Pedag Univ, St Petersburg, Russia
Funding
Russian Science Foundation
Keywords
Conjugate gradient; Convergence; Algorithm
DOI
10.23919/acc45564.2020.9147989
Chinese Library Classification
TP [Automation and Computer Technology]
Discipline Code
0812
Abstract
Large-scale optimization plays an important role in many control and learning problems. Sequential subspace optimization is a novel approach particularly well suited to large-scale problems: it sequentially reduces the initial optimization problem to a series of optimization problems in a low-dimensional subspace. In this paper we consider the minimization of a multidimensional convex real-valued function. Within the sequential subspace optimization framework we develop a new method that combines quasi-Newton and conjugate gradient steps. We provide its formal justification and derive several of its theoretical properties; in particular, for the quadratic programming problem we prove linear convergence in a finite number of steps. We demonstrate the superiority of the proposed algorithm over common state-of-the-art methods through a comparative analysis on both simulated and real-world optimization problems.
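The abstract's core idea — repeatedly restricting the full problem to a low-dimensional subspace and solving exactly there — can be illustrated for a quadratic objective. The sketch below is a generic sequential subspace minimizer, not the authors' specific quasi-Newton/conjugate-gradient hybrid; the subspace is simply the span of the current gradient and a few previous steps, and the function `sesop_quadratic` and its parameters are illustrative names, not from the paper.

```python
import numpy as np

def sesop_quadratic(A, b, x0, history=2, iters=50, tol=1e-10):
    """Sequential subspace minimization of f(x) = 0.5 x^T A x - b^T x.

    At each iteration the full n-dimensional problem is restricted to the
    span of the current gradient and the last `history` step directions,
    and the restricted quadratic is minimized exactly.
    """
    x = x0.copy()
    steps = []                                    # previous step directions
    for _ in range(iters):
        g = A @ x - b                             # gradient of the quadratic
        if np.linalg.norm(g) < tol:
            break
        D = np.column_stack([g] + steps)          # low-dimensional subspace basis
        # Exact subspace minimizer: solve (D^T A D) alpha = -D^T g for alpha,
        # i.e. minimize f(x + D alpha) over the subspace coordinates alpha.
        alpha = np.linalg.lstsq(D.T @ A @ D, -D.T @ g, rcond=None)[0]
        step = D @ alpha
        x = x + step
        steps = ([step] + steps)[:history]        # keep a short step memory
    return x
```

Because the subspace always contains the gradient, each iteration is at least as good as an exact-line-search gradient step; keeping previous steps in the basis is what gives the method its conjugate-gradient flavor on quadratics.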
Pages: 3627-3632 (6 pages)