SCORE: approximating curvature information under self-concordant regularization

Cited by: 1
Authors
Adeoye, Adeyemi D. [1 ]
Bemporad, Alberto [1 ]
Affiliations
[1] IMT Sch Adv Studies, Piazza San Francesco 19, I-55100 Lucca, Italy
Keywords
Self-concordant functions; Gauss-Newton methods; Convex optimization; Overparameterized models; Optimization methods; Algorithm
DOI
10.1007/s10589-023-00502-2
Chinese Library Classification
C93 [Management Science]; O22 [Operations Research]
Discipline classification codes
070105; 12; 1201; 1202; 120202
Abstract
Optimization problems whose objectives include regularization functions arise in many applications. When solving such problems with second-order methods, it can be desirable to exploit specific properties of the regularization function when incorporating curvature information into the solution steps, so as to speed up convergence. In this paper, we propose SCORE (self-concordant regularization), a framework for unconstrained minimization that incorporates second-order information within the Newton-decrement framework for convex optimization. Under this framework, we propose the generalized Gauss-Newton with self-concordant regularization (GGN-SCORE) algorithm, which updates the minimization variables each time a new input batch is received. The algorithm exploits the structure of the second-order information in the Hessian matrix, thereby reducing computational overhead. GGN-SCORE speeds up convergence while also improving model generalization for regularized minimization problems under the SCORE framework. Numerical experiments show the efficiency of the method and its fast convergence, which compare favorably against baseline first-order and quasi-Newton methods. Additional experiments on non-convex (overparameterized) neural-network training problems show that the proposed method is also promising for non-convex optimization.
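To make the abstract concrete: the computational step it describes is a generalized Gauss-Newton (GGN) update whose Hessian approximation combines the model Jacobian with the Hessian of a self-concordant regularizer, damped by the Newton decrement. The Python sketch below illustrates one step under those assumptions only; the function signature, variable names, and the generic self-concordance constant Mg are hypothetical choices made here for illustration, not taken from the paper.

```python
import numpy as np

def ggn_score_step(x, J, r, Q, reg_grad, reg_hess, Mg=1.0):
    """One damped generalized Gauss-Newton step with a self-concordant
    regularizer -- a sketch, not the authors' reference implementation.

    x        : parameter vector, shape (d,)
    J        : Jacobian of the model outputs w.r.t. x, shape (n, d)
    r        : gradient of the loss w.r.t. the model outputs, shape (n,)
    Q        : Hessian of the loss w.r.t. the model outputs, shape (n, n)
    reg_grad : callable, gradient of the regularizer at x, shape (d,)
    reg_hess : callable, Hessian of the regularizer at x, shape (d, d)
    Mg       : self-concordance constant of the regularizer (assumed known)
    """
    grad = J.T @ r + reg_grad(x)      # gradient of the regularized objective
    H = J.T @ Q @ J + reg_hess(x)     # GGN approximation of the Hessian
    step = np.linalg.solve(H, grad)   # Newton-type direction H^{-1} grad
    lam = np.sqrt(grad @ step)        # Newton decrement lambda(x)
    alpha = 1.0 / (1.0 + Mg * lam)    # decrement-based damping of the step
    return x - alpha * step, lam
```

When the batch size n is much smaller than the parameter dimension d (the overparameterized regime mentioned in the abstract), the d-by-d linear solve above can be traded for an n-by-n solve via the Woodbury matrix identity whenever the regularizer's Hessian is cheap to invert; this is one concrete sense in which exploiting the structure of the GGN Hessian reduces computational overhead.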
Pages: 599-626
Page count: 28