The Log-Exponential Smoothing Technique and Nesterov's Accelerated Gradient Method for Generalized Sylvester Problems

Cited by: 6
Authors
Nguyen Thai An [1 ]
Giles, Daniel [2 ]
Nguyen Mau Nam [2 ]
Rector, R. Blake [2 ]
Affiliations
[1] Thua Thien Hue Coll Educ, 123 Nguyen Hue, Hue City, Vietnam
[2] Portland State Univ, Fariborz Maseeh Dept Math & Stat, POB 751, Portland, OR 97207 USA
Funding
U.S. National Science Foundation
Keywords
Log-exponential smoothing technique; Majorization minimization algorithm; Nesterov's accelerated gradient method; Generalized Sylvester problem; Smallest enclosing ball; Algorithms
DOI
10.1007/s10957-015-0811-z
Chinese Library Classification
C93 [Management]; O22 [Operations Research]
Discipline codes
070105; 12; 1201; 1202; 120202
Abstract
The Sylvester or smallest enclosing circle problem involves finding the smallest circle enclosing a finite number of points in the plane. We consider generalized versions of the Sylvester problem in which the points are replaced by sets. Based on the log-exponential smoothing technique and Nesterov's accelerated gradient method, we present an effective numerical algorithm for solving these problems.
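To illustrate the approach the abstract describes, here is a minimal sketch for the classical point case (each target set a singleton): the nonsmooth radius function max_i ||x - a_i|| is replaced by a log-exponential smooth approximation, which is then minimized with Nesterov's accelerated gradient method. The smoothing parameter `p`, the fixed step size, and the iteration count are illustrative assumptions, not the paper's actual parameter schedule (the paper treats general sets and tunes these quantities); `smoothed_max_dist` and `nesterov_center` are hypothetical helper names.

```python
import numpy as np

def smoothed_max_dist(x, pts, p):
    """Log-exponential smoothing of f(x) = max_i ||x - a_i||:
        f_p(x) = p * log( sum_i exp( sqrt(||x - a_i||^2 + p^2) / p ) ).
    Returns the value and gradient of f_p at x."""
    d = x - pts                                   # (m, n) differences
    r = np.sqrt(np.sum(d * d, axis=1) + p * p)    # smoothed distances
    z = (r - r.max()) / p                         # stabilized exponents
    w = np.exp(z) / np.exp(z).sum()               # softmax weights
    val = r.max() + p * np.log(np.exp(z).sum())
    grad = (w[:, None] * d / r[:, None]).sum(axis=0)
    return val, grad

def nesterov_center(pts, p=0.05, step=0.025, iters=3000):
    """Minimize the smoothed radius with Nesterov's accelerated gradient."""
    x = pts[0].astype(float).copy()               # arbitrary starting point
    y = x.copy()
    for k in range(1, iters + 1):
        _, g = smoothed_max_dist(y, pts, p)
        x_new = y - step * g                      # gradient step at the extrapolated point
        y = x_new + (k - 1) / (k + 2) * (x_new - x)   # momentum extrapolation
        x = x_new
    return x

# Four corners of a square: the smallest enclosing circle is centered at (1, 1).
pts = np.array([[0., 0.], [2., 0.], [0., 2.], [2., 2.]])
center = nesterov_center(pts)
radius = np.max(np.linalg.norm(pts - center, axis=1))
```

The step size is chosen conservatively (roughly the reciprocal of the smoothed function's gradient-Lipschitz constant, which grows like 1/p), since too large a step makes the accelerated iteration diverge.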
Pages: 559-583 (25 pages)