Globally convergent coderivative-based generalized Newton methods in nonsmooth optimization

Cited: 0
Authors
Pham Duy Khanh
Boris S. Mordukhovich
Vo Thanh Phat
Dat Ba Tran
Affiliations
[1] Ho Chi Minh City University of Education,Group of Analysis and Applied Mathematics, Department of Mathematics
[2] Wayne State University,Department of Mathematics
Source
Mathematical Programming | 2024 / Vol. 205
Keywords
Nonsmooth optimization; Variational analysis; Generalized Newton methods; Global convergence; Linear and superlinear convergence rates; Convex composite optimization; Lasso problems; MSC: 90C31; 49J52; 49J53
DOI: Not available
Abstract
This paper proposes and justifies two globally convergent Newton-type methods to solve unconstrained and constrained problems of nonsmooth optimization by using tools of variational analysis and generalized differentiation. Both methods are coderivative-based and employ generalized Hessians (coderivatives of subgradient mappings) associated with objective functions, which are either of class $\mathcal{C}^{1,1}$ or are represented in the form of convex composite optimization, where one of the terms may be extended-real-valued. The proposed globally convergent algorithms are of two types. The first one extends the damped Newton method and requires positive-definiteness of the generalized Hessians for its well-posedness and efficient performance, while the other algorithm is of regularized Newton type and is well-defined when the generalized Hessians are merely positive-semidefinite. The obtained convergence rates for both methods are at least linear, but become superlinear under the semismooth$^*$ property of subgradient mappings. Problems of convex composite optimization are investigated with and without the strong convexity assumption on the smooth parts of objective functions by implementing the machinery of forward-backward envelopes. Numerical experiments are conducted for Lasso problems and for box-constrained quadratic programs, providing performance comparisons of the new algorithms with some other first-order and second-order methods that are highly recognized in nonsmooth optimization.
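To illustrate the regularized-Newton idea described in the abstract, the following is a minimal sketch, not the authors' exact algorithm: for a smooth objective whose Hessian may be only positive-semidefinite, a regularization term proportional to the gradient norm keeps the Newton system solvable, and an Armijo backtracking line search provides the globalization. All function and parameter names here are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def regularized_newton(f, grad, hess, x0, tol=1e-8, max_iter=100,
                       beta=0.5, sigma=1e-4):
    """Hedged sketch of a globally convergent regularized Newton method.
    f, grad, hess are callables supplied by the user; beta and sigma are
    standard backtracking parameters (illustrative choices)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        gnorm = np.linalg.norm(g)
        if gnorm < tol:
            break
        H = hess(x)
        # Tikhonov-style regularization scaled by the gradient norm:
        # it vanishes near a solution, allowing fast local convergence,
        # while keeping H + mu*I positive-definite when H is merely PSD.
        mu = gnorm
        d = np.linalg.solve(H + mu * np.eye(len(x)), -g)
        # Armijo backtracking line search for global convergence.
        t = 1.0
        while f(x + t * d) > f(x) + sigma * t * g.dot(d):
            t *= beta
        x = x + t * d
    return x
```

On a strongly convex quadratic this sketch reduces to a damped Newton step with shrinking regularization; the paper's actual methods operate on generalized Hessians (coderivatives of subgradient mappings) and handle nonsmooth and extended-real-valued terms, which this smooth sketch does not attempt to reproduce.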
Pages: 373-429 (56 pages)