GIANT: Globally Improved Approximate Newton Method for Distributed Optimization

Cited by: 0
Authors
Wang, Shusen [1]
Roosta-Khorasani, Farbod [2]
Xu, Peng [3]
Mahoney, Michael W. [4]
Affiliations
[1] Stevens Inst Technol, Hoboken, NJ 07030 USA
[2] Univ Queensland, Brisbane, Qld, Australia
[3] Stanford Univ, Stanford, CA 94305 USA
[4] Univ Calif Berkeley, Berkeley, CA 94720 USA
Funding
Australian Research Council
Keywords
PARALLEL
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
For distributed computing environments, we consider the empirical risk minimization problem and propose a distributed and communication-efficient Newton-type optimization method. At every iteration, each worker locally finds an Approximate NewTon (ANT) direction, which is sent to the main driver. The main driver then averages all the ANT directions received from the workers to form a Globally Improved ANT (GIANT) direction. GIANT is highly communication-efficient and naturally exploits the trade-off between local computation and global communication: more local computation results in fewer rounds of communication overall. Theoretically, we show that GIANT enjoys an improved convergence rate compared with first-order methods and existing distributed Newton-type methods. Furthermore, and in sharp contrast with many existing distributed Newton-type methods as well as popular first-order methods, GIANT has the highly advantageous practical feature of involving only one tuning parameter. We conduct large-scale experiments on a computer cluster and empirically demonstrate the superior performance of GIANT.
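To make the update described in the abstract concrete, the following is a minimal single-process sketch of one GIANT iteration for ridge-regularized least squares, with the workers simulated as row shards of the data. The function name giant_step, the sharding scheme, the regularization parameter gamma, and the unit step size are illustrative assumptions, not taken from the paper.

import numpy as np

def giant_step(shards, w, gamma):
    """One GIANT iteration: global gradient, local ANT directions, global average."""
    n = sum(A.shape[0] for A, _ in shards)
    # Communication round 1: aggregate the exact global gradient across workers.
    grad = sum(A.T @ (A @ w - b) for A, b in shards) / n + gamma * w
    # Each worker solves its *local* Newton system H_j p_j = grad (the ANT direction).
    directions = []
    for A, b in shards:
        H_local = A.T @ A / A.shape[0] + gamma * np.eye(w.size)
        directions.append(np.linalg.solve(H_local, grad))
    # Communication round 2: average the ANT directions into the GIANT direction.
    p = sum(directions) / len(directions)
    return w - p  # unit step here; in practice a line search can set the step size

# Usage example with synthetic data split across 4 simulated workers.
rng = np.random.default_rng(0)
A_full = rng.standard_normal((400, 10))
b_full = A_full @ rng.standard_normal(10) + 0.01 * rng.standard_normal(400)
shards = [(A_full[i::4], b_full[i::4]) for i in range(4)]
w = np.zeros(10)
for _ in range(5):
    w = giant_step(shards, w, gamma=1e-3)

In this sketch each iteration costs two aggregation rounds, one for the global gradient and one for the averaged direction, which reflects the local-computation/global-communication trade-off the abstract emphasizes.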
Pages: 11
Related Papers
50 records
  • [1] AN APPROXIMATE NEWTON METHOD FOR DISTRIBUTED OPTIMIZATION
    Mokhtari, Aryan
    Ling, Qing
    Ribeiro, Alejandro
[J]. 2015 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH, AND SIGNAL PROCESSING (ICASSP), 2015: 2959-2963
  • [2] Achieving Globally Superlinear Convergence for Distributed Optimization with Adaptive Newton Method
    Zhang, Jiaqi
    You, Keyou
    Basar, Tamer
[J]. 2020 59TH IEEE CONFERENCE ON DECISION AND CONTROL (CDC), 2020: 2329-2334
  • [3] Accelerated Distributed Approximate Newton Method
    Ye, Haishan
    He, Chaoyang
    Chang, Xiangyu
[J]. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2023, 34(11): 8642-8653
  • [4] Communication Efficient Distributed Approximate Newton Method
    Ghosh, Avishek
    Maity, Raj Kumar
    Mazumdar, Arya
    Ramchandran, Kannan
[J]. 2020 IEEE INTERNATIONAL SYMPOSIUM ON INFORMATION THEORY (ISIT), 2020: 2539-2544
  • [5] Communication-Efficient Distributed Optimization using an Approximate Newton-type Method
    Shamir, Ohad
    Srebro, Nathan
    Zhang, Tong
[J]. INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 32 (CYCLE 2), 2014, 32: 1000-1008
  • [6] A Distributed Newton Method for Network Optimization
    Jadbabaie, Ali
    Ozdaglar, Asuman
    Zargham, Michael
[J]. PROCEEDINGS OF THE 48TH IEEE CONFERENCE ON DECISION AND CONTROL, 2009, HELD JOINTLY WITH THE 2009 28TH CHINESE CONTROL CONFERENCE (CDC/CCC 2009), 2009: 2736-2741
  • [7] A Newton consensus method for distributed optimization
    Guay, Martin
[J]. IFAC PAPERSONLINE, 2020, 53(2): 5417-5422
  • [8] Distributed approximate Newton algorithms and weight design for constrained optimization
    Anderson, Tor
    Chang, Chin-Yao
    Martinez, Sonia
    [J]. AUTOMATICA, 2019, 109
  • [9] Weight Design of Distributed Approximate Newton Algorithms for Constrained Optimization
    Anderson, Tor
    Chang, Chin-Yao
    Martinez, Sonia
[J]. 2017 IEEE CONFERENCE ON CONTROL TECHNOLOGY AND APPLICATIONS (CCTA 2017), 2017: 632-637
  • [10] A globally convergent approximate Newton method for non-convex sparse learning
    Ji, Fanfan
    Shuai, Hui
    Yuan, Xiao-Tong
    [J]. PATTERN RECOGNITION, 2022, 126