Complexity of gradient descent for multiobjective optimization

Cited by: 57
Authors
Fliege, J. [1 ]
Vaz, A. I. F. [2 ]
Vicente, L. N. [3 ]
Affiliations
[1] Univ Southampton, Sch Math Sci, Southampton, Hants, England
[2] Univ Minho, ALGORITMI Res Ctr, Braga, Portugal
[3] Univ Coimbra, Dept Math, CMUC, Coimbra, Portugal
Source
OPTIMIZATION METHODS & SOFTWARE | 2019, Vol. 34, Issue 05
Keywords
Multiobjective optimization; gradient descent; steepest descent; global rates; worst-case complexity; Newton's method
DOI
10.1080/10556788.2018.1510928
CLC Number
TP31 [Computer Software];
Subject Classification Code
081202 ; 0835 ;
Abstract
A number of first-order methods have been proposed for smooth multiobjective optimization for which some form of convergence to first-order criticality has been proved. Such convergence is global in the sense of being independent of the starting point. In this paper, we analyse the rate of convergence of gradient descent for smooth unconstrained multiobjective optimization, and we do it for non-convex, convex, and strongly convex vector functions. These global rates are shown to be the same as for gradient descent in single-objective optimization and correspond to appropriate worst-case complexity bounds. In the convex cases, the rates are given for implicit scalarizations of the problem vector function.
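To make the setting concrete, the sketch below illustrates the kind of multiobjective gradient (steepest-descent) method to which the abstract refers: at each iterate a common descent direction is obtained by solving min_d max_i ∇f_i(x)ᵀd + ½‖d‖², the steepest-descent subproblem standard in this literature (Fliege and Svaiter), and the optimal value of that subproblem acts as the criticality measure whose decay yields the worst-case rates, matching the familiar single-objective gradient-descent rates (order 1/√k for non-convex, 1/k for convex, and linear for strongly convex objectives). This is a minimal sketch, not the authors' code: the NumPy/SciPy routines, the fixed step size, the stopping tolerance, and the quadratic test problem are all illustrative assumptions.

import numpy as np
from scipy.optimize import minimize

def steepest_descent_direction(J):
    """J: (m, n) array whose i-th row is the gradient of f_i at the current point x.

    Solves the dual of  min_d  max_i J[i] @ d + 0.5*||d||^2 :
    minimise 0.5*||J.T @ lam||^2 over lam in the unit simplex, then set d = -J.T @ lam.
    """
    m = J.shape[0]
    obj = lambda lam: 0.5 * np.dot(J.T @ lam, J.T @ lam)
    cons = ({'type': 'eq', 'fun': lambda lam: np.sum(lam) - 1.0},)
    res = minimize(obj, np.full(m, 1.0 / m), bounds=[(0.0, 1.0)] * m, constraints=cons)
    d = -J.T @ res.x                               # common descent direction
    theta = np.max(J @ d) + 0.5 * float(d @ d)     # subproblem value: 0 (up to solver accuracy) iff x is Pareto critical
    return d, theta

def multiobjective_gradient_descent(jac, x0, step=1e-2, tol=1e-8, max_iter=5000):
    """jac(x) returns the (m, n) Jacobian of the objective vector; fixed step size is an assumption."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        d, theta = steepest_descent_direction(jac(x))
        if abs(theta) < tol:                       # first-order (Pareto) criticality reached
            break
        x = x + step * d
    return x

# Hypothetical test problem: two convex quadratics on R^2 with minimisers (1, 0) and (0, 1);
# the iterates approach a Pareto-critical point on the segment joining the two minimisers.
if __name__ == "__main__":
    jac = lambda x: np.array([2.0 * (x - np.array([1.0, 0.0])),
                              2.0 * (x - np.array([0.0, 1.0]))])
    print(multiobjective_gradient_descent(jac, np.array([3.0, 3.0])))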
Pages: 949 - 959
Page count: 11
Related Papers
50 records in total
  • [11] The Complexity of Gradient Descent: CLS = PPAD ∩ PLS
    Fearnley, John
    Goldberg, Paul W.
    Hollender, Alexandros
    Savani, Rahul
    [J]. STOC '21: PROCEEDINGS OF THE 53RD ANNUAL ACM SIGACT SYMPOSIUM ON THEORY OF COMPUTING, 2021: 46 - 59
  • [13] Complexity control by gradient descent in deep networks
    Poggio, Tomaso
    Liao, Qianli
    Banburski, Andrzej
    [J]. NATURE COMMUNICATIONS, 2020, 11 (01)
  • [14] Gain Optimization for SMCSPO with Gradient Descent
    Lee, Jin Hyeok
    Ryu, Hyeon Jae
    Lee, Min Cheol
    [J]. 2021 21ST INTERNATIONAL CONFERENCE ON CONTROL, AUTOMATION AND SYSTEMS (ICCAS 2021), 2021: 575 - 578
  • [15] Combined gradient methods for multiobjective optimization
    Wang, Peng
    Zhu, Detong
    [J]. JOURNAL OF APPLIED MATHEMATICS AND COMPUTING, 2022, 68 (04) : 2717 - 2741
  • [16] Memory gradient method for multiobjective optimization
    Chen, Wang
    Yang, Xinmin
    Zhao, Yong
    [J]. APPLIED MATHEMATICS AND COMPUTATION, 2023, 443
  • [18] Explicit gradient information in multiobjective optimization
    Garcia-Palomares, Ubaldo M.
    Burguillo-Rial, Juan C.
    Gonzalez-Castano, Francisco J.
    [J]. OPERATIONS RESEARCH LETTERS, 2008, 36 (06) : 722 - 725
  • [19] A Descent Method for Nonsmooth Multiobjective Optimization in Hilbert Spaces
    Sonntag, Konstantin
    Gebken, Bennet
    Mueller, Georg
    Peitz, Sebastian
    Volkwein, Stefan
    [J]. JOURNAL OF OPTIMIZATION THEORY AND APPLICATIONS, 2024, : 455 - 487
  • [20] Conditional gradient method for multiobjective optimization
    Assuncao, P. B.
    Ferreira, O. P.
    Prudente, L. F.
    [J]. COMPUTATIONAL OPTIMIZATION AND APPLICATIONS, 2021, 78 (03) : 741 - 768