Fast Training Methods for Stochastic Compositional Optimization Problems

Cited by: 0
Authors
Gao, Hongchang [1 ]
Huang, Heng [2 ]
Affiliations
[1] Temple Univ, Dept Comp & Informat Sci, Philadelphia, PA 19122 USA
[2] Univ Pittsburgh, Dept Elect & Comp Engn, Pittsburgh, PA 15260 USA
Funding
U.S. National Science Foundation;
Keywords
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104; 0812; 0835; 1405;
Abstract
The stochastic compositional optimization problem covers a wide range of machine learning models, such as sparse additive models and model-agnostic meta-learning, so efficient methods for its optimization are essential. Existing methods for stochastic compositional optimization focus only on the single-machine setting, which is unsatisfactory when data are distributed across different devices. To address this limitation, we propose novel decentralized stochastic compositional gradient descent methods to efficiently solve large-scale stochastic compositional optimization problems. To the best of our knowledge, ours is the first work to enable decentralized training for this class of problems. Furthermore, we provide a convergence analysis for our methods, which shows that they achieve linear speedup with respect to the number of devices. Finally, we apply our decentralized training methods to the model-agnostic meta-learning problem, and the experimental results confirm their superior performance.
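The difficulty the abstract alludes to is that the objective has the nested form min_x f(E[g_xi(x)]), so a plain stochastic gradient of f(g_xi(x)) is biased. A common remedy (in the SCGD style; this is a single-machine illustrative sketch on a hypothetical toy instance, not the paper's decentralized algorithm) is to track the inner expectation with a running average and plug that estimate into the chain rule:

```python
import numpy as np

# Illustrative toy instance (names, data, and step sizes are assumptions):
#   inner mapping  g_xi(x) = A x + noise,   outer function  f(y) = 0.5 * ||y||^2,
# so the composed objective 0.5 * ||A x||^2 is minimized when A x = 0.
rng = np.random.default_rng(0)
d, m = 5, 3
A = rng.standard_normal((m, d))

def g_sample(x):
    """One noisy sample of the inner mapping and its Jacobian."""
    return A @ x + 0.01 * rng.standard_normal(m), A

def grad_f(y):
    """Gradient of the outer function f(y) = 0.5 * ||y||^2."""
    return y

x = rng.standard_normal(d)
y = np.zeros(m)            # running estimate of the inner value E[g_xi(x)]
alpha, beta = 0.05, 0.5    # step size and inner-tracking rate

for _ in range(2000):
    g_val, g_jac = g_sample(x)
    y = (1 - beta) * y + beta * g_val      # track the inner expectation
    x = x - alpha * g_jac.T @ grad_f(y)    # compositional chain-rule step

print(np.linalg.norm(A @ x))  # small: the composed objective is near its minimum
```

A decentralized variant, as studied in the paper, would additionally average the iterates (and inner-value estimates) with neighbors over a communication graph at each step; the tracking variable `y` is what distinguishes this scheme from ordinary SGD.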
Pages: 12