A Method for Transforming Non-Convex Optimization Problem to Distributed Form

Times Cited: 1
Authors
Khamisov, Oleg O. [1 ]
Khamisov, Oleg V. [1 ]
Ganchev, Todor D. [2 ]
Semenkin, Eugene S. [3 ]
Affiliations
[1] Melentiev Energy Syst Inst, Dept Appl Math, Irkutsk 664033, Russia
[2] Tech Univ Varna, Dept Comp Sci & Engn, Varna 9010, Bulgaria
[3] Bauman Moscow State Tech Univ, Sci & Educ Ctr Artificial Intelligence Technol, Moscow 105005, Russia
Keywords
distributed optimization; non-convex optimization; gradient descent; Newton's method; real-time; decomposition; algorithm; design
DOI
10.3390/math12172796
Chinese Library Classification (CLC)
O1 [Mathematics]
Discipline Code
0701; 070101
Abstract
We propose a novel distributed method for non-convex optimization problems with coupling equality and inequality constraints. The method transforms the optimization problem into a form that admits distributed implementations of modified gradient descent and Newton's methods. We demonstrate that for the proposed distributed method: (i) communications are significantly less time-consuming than oracle calls; (ii) its convergence rate, measured in oracle calls, matches that of Newton's method; and (iii) when oracle calls are more expensive than communication between agents, the transition from a centralized to a distributed paradigm does not significantly affect computational time. The proposed method is applicable when the objective function is twice differentiable and the constraints are differentiable, which holds for a wide range of machine learning methods and optimization setups.
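The abstract's core idea — agents running local gradient steps on a transformed problem while exchanging only their iterates — can be illustrated with a minimal sketch. This is not the paper's algorithm: the local objectives f1, f2, the consensus-style coupling constraint x1 = x2, and the penalty weight rho are all hypothetical choices made here to show the communication pattern (one message exchange per gradient step).

```python
import numpy as np

# Hypothetical local non-convex objectives for two agents (illustrative only,
# not taken from the paper).
def f1(x): return (x**2 - 1.0)**2
def g1(x): return 4.0 * x * (x**2 - 1.0)               # derivative of f1
def f2(x): return (x - 0.5)**2 + 0.1 * np.sin(5.0 * x)
def g2(x): return 2.0 * (x - 0.5) + 0.5 * np.cos(5.0 * x)  # derivative of f2

rho = 5.0    # penalty weight on the coupling constraint x1 - x2 = 0
step = 0.01  # gradient step size
x1, x2 = 1.2, -0.3

for _ in range(2000):
    # Communication step: each agent sends only its current iterate.
    x1_msg, x2_msg = x1, x2
    # Local oracle calls: each agent takes a gradient step on its own
    # objective plus a quadratic penalty on the coupling constraint.
    x1 -= step * (g1(x1) + rho * (x1 - x2_msg))
    x2 -= step * (g2(x2) + rho * (x2 - x1_msg))

print(x1, x2, abs(x1 - x2))  # iterates approximately satisfy the coupling
```

Because both agents update simultaneously from the previous round's messages, this is exactly plain gradient descent on the penalized joint objective f1(x1) + f2(x2) + (rho/2)(x1 - x2)^2, executed without any agent ever seeing the other's objective — the kind of oracle-call/communication split the paper's complexity claims (i)–(iii) are stated in terms of.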
Page count: 16