Complexity of gradient descent for multiobjective optimization

Cited by: 57
|
Authors
Fliege, J. [1 ]
Vaz, A. I. F. [2 ]
Vicente, L. N. [3 ]
Affiliations
[1] Univ Southampton, Sch Math Sci, Southampton, Hants, England
[2] Univ Minho, ALGORITMI Res Ctr, Braga, Portugal
[3] Univ Coimbra, Dept Math, CMUC, Coimbra, Portugal
Source
OPTIMIZATION METHODS & SOFTWARE | 2019, Vol. 34, No. 5
Keywords
Multiobjective optimization; gradient descent; steepest descent; global rates; worst-case complexity; STEEPEST DESCENT; NEWTONS;
DOI
10.1080/10556788.2018.1510928
Chinese Library Classification (CLC)
TP31 [Computer Software]
Subject Classification Codes
081202; 0835
Abstract
A number of first-order methods have been proposed for smooth multiobjective optimization for which some form of convergence to first-order criticality has been proved. Such convergence is global in the sense of being independent of the starting point. In this paper, we analyse the rate of convergence of gradient descent for smooth unconstrained multiobjective optimization, and we do it for non-convex, convex, and strongly convex vector functions. These global rates are shown to be the same as for gradient descent in single-objective optimization and correspond to appropriate worst-case complexity bounds. In the convex cases, the rates are given for implicit scalarizations of the problem vector function.
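For a concrete picture of the method being analysed: in the multiobjective setting, gradient (steepest) descent typically chooses, at each iterate x, a common descent direction by solving min_d max_i ∇f_i(x)^T d + (1/2)||d||^2, and the global rates referred to in the abstract match the familiar single-objective ones (on the order of 1/ε^2 iterations in the non-convex case, 1/ε in the convex case, and a linear rate under strong convexity). The Python sketch below is only an illustration under these assumptions, not the authors' implementation; the function names and the bi-objective quadratic test problem are hypothetical, and the direction subproblem is solved in its equivalent dual form over the unit simplex with SciPy.

# --- Illustrative sketch (not the authors' code): multiobjective steepest descent ---
import numpy as np
from scipy.optimize import minimize


def steepest_descent_direction(jac):
    # Common descent direction d(x) = -J(x)^T lam*, where lam* solves the dual
    # subproblem  min_{lam in simplex} 0.5 * ||J(x)^T lam||^2.
    m = jac.shape[0]
    objective = lambda lam: 0.5 * float(np.dot(jac.T @ lam, jac.T @ lam))
    constraints = ({'type': 'eq', 'fun': lambda lam: np.sum(lam) - 1.0},)
    bounds = [(0.0, 1.0)] * m
    lam0 = np.full(m, 1.0 / m)
    res = minimize(objective, lam0, bounds=bounds, constraints=constraints,
                   method='SLSQP')
    return -(jac.T @ res.x)


def multiobjective_gradient_descent(funcs, grads, x0, tol=1e-6, max_iter=500,
                                    beta=0.5, sigma=1e-4):
    # Steepest descent with Armijo backtracking that enforces sufficient decrease
    # in every objective; stops when the common descent direction is small,
    # i.e. the current point is approximately Pareto critical.
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        jac = np.array([g(x) for g in grads])   # m x n Jacobian of (f_1, ..., f_m)
        d = steepest_descent_direction(jac)
        if np.linalg.norm(d) < tol:
            break
        fx = np.array([f(x) for f in funcs])
        slopes = jac @ d                          # directional derivatives, all <= 0
        t = 1.0
        while t > 1e-12:
            fnew = np.array([f(x + t * d) for f in funcs])
            if np.all(fnew <= fx + sigma * t * slopes):
                break
            t *= beta
        x = x + t * d
    return x


# Hypothetical bi-objective test problem: two shifted quadratics.
f1 = lambda x: float(np.sum((x - 1.0) ** 2))
f2 = lambda x: float(np.sum((x + 1.0) ** 2))
g1 = lambda x: 2.0 * (x - 1.0)
g2 = lambda x: 2.0 * (x + 1.0)

x_star = multiobjective_gradient_descent([f1, f2], [g1, g2], x0=np.array([3.0, -2.0]))
print("approximately Pareto-critical point:", x_star)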
Pages: 949-959
Number of pages: 11
Related Papers
50 records in total
  • [41] Gradient-Based Multiobjective Optimization with Uncertainties
    Peitz, Sebastian
    Dellnitz, Michael
    NEO 2016: RESULTS OF THE NUMERICAL AND EVOLUTIONARY OPTIMIZATION WORKSHOP NEO 2016 AND THE NEO CITIES 2016 WORKSHOP, 2018, 731: 159-182
  • [42] A Barzilai-Borwein descent method for multiobjective optimization problems
    Chen, Jian
    Tang, Liping
    Yang, Xinmin
    EUROPEAN JOURNAL OF OPERATIONAL RESEARCH, 2023, 311(01): 196-209
  • [43] Local search for multiobjective function optimization: Pareto Descent Method
    Harada, Ken
    Sakuma, Jun
    Kobayashi, Shigenobu
    GECCO 2006: GENETIC AND EVOLUTIONARY COMPUTATION CONFERENCE, VOL 1 AND 2, 2006: 659+
  • [44] A Descent Conjugate Gradient Method for Optimization Problems
    Semiu, Ayinde
    Idowu, Osinuga
    Adesina, Adio
    Sunday, Agboola
    Joseph, Adelodun
    Uchenna, Uka
    Olufisayo, Awe
    IAENG International Journal of Applied Mathematics, 2024, 54(09): 1765-1775
  • [45] Ant colony optimization and stochastic gradient descent
    Meuleau, N
    Dorigo, M
    ARTIFICIAL LIFE, 2002, 8(02): 103-121
  • [46] NOMA Codebook Optimization by Batch Gradient Descent
    Si, Zhongwei
    Wen, Shaoguo
    Dong, Bing
    IEEE ACCESS, 2019, 7: 117274-117281
  • [47] Adaptive Gradient Multiobjective Particle Swarm Optimization
    Han, Honggui
    Lu, Wei
    Zhang, Lu
    Qiao, Junfei
    IEEE TRANSACTIONS ON CYBERNETICS, 2018, 48(11): 3067-3079
  • [48] An accelerated proximal gradient method for multiobjective optimization
    Tanabe, Hiroki
    Fukuda, Ellen H.
    Yamashita, Nobuo
    COMPUTATIONAL OPTIMIZATION AND APPLICATIONS, 2023, 86(02): 421-455
  • [49] Multiobjective optimization strategies for linear gradient chromatography
    Nagrath, D
    Bequette, BW
    Cramer, SM
    Messac, A
    AICHE JOURNAL, 2005, 51(02): 511-525
  • [50] Stochastic gradient descent for optimization for nuclear systems
    Williams, Austin
    Walton, Noah
    Maryanski, Austin
    Bogetic, Sandra
    Hines, Wes
    Sobes, Vladimir
    SCIENTIFIC REPORTS, 2023, 13(01)