FedDANE: A Federated Newton-Type Method

Cited by: 0
Authors
Li, Tian [1]
Sahu, Anit Kumar [2]
Zaheer, Manzil [3]
Sanjabi, Maziar [4]
Talwalkar, Ameet [1,5]
Smith, Virginia [1]
Affiliations
[1] Carnegie Mellon Univ, Pittsburgh, PA 15213 USA
[2] Bosch Ctr AI, Renningen, Germany
[3] Google Res, Mountain View, CA USA
[4] Univ Southern Calif, Los Angeles, CA 90007 USA
[5] Determined AI, San Francisco, CA USA
DOI
10.1109/ieeeconf44664.2019.9049023
Chinese Library Classification (CLC): TP [Automation Technology; Computer Technology]
Discipline code: 0812
Abstract
Federated learning aims to jointly learn statistical models over massively distributed remote devices. In this work, we propose FedDANE, an optimization method that we adapt from DANE [8, 9], a method for classical distributed optimization, to handle the practical constraints of federated learning. We provide convergence guarantees for this method when learning over both convex and non-convex functions. Despite encouraging theoretical results, we find that the method has underwhelming performance empirically. In particular, through empirical simulations on both synthetic and real-world datasets, FedDANE consistently underperforms baselines of FedAvg [7] and FedProx [4] in realistic federated settings. We identify low device participation and statistical device heterogeneity as two underlying causes of this underwhelming performance, and conclude by suggesting several directions of future work.
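The round structure the abstract describes can be sketched as follows. This is a hedged illustration of the DANE-style subproblem adapted to partial device participation, not the authors' implementation: the function name, the parameters `mu`, `lr`, and `local_steps`, and the use of a few gradient-descent steps as the inexact local solver are all illustrative assumptions.

```python
import numpy as np

def feddane_round(w_t, device_grads, mu=1.0, lr=0.05, local_steps=25,
                  sample=None, rng=None):
    """One FedDANE-style round (sketch, not the authors' code).

    device_grads: list of callables g_k(w) returning device k's local gradient.
    mu: proximal parameter of the DANE subproblem.
    sample: number of devices to sample per round (None = all devices).
    """
    rng = rng or np.random.default_rng(0)
    n = len(device_grads)
    idx = rng.choice(n, size=sample or n, replace=False)

    # Step 1: aggregate gradients from the sampled devices to
    # approximate the full gradient at the current iterate w_t.
    g_t = np.mean([device_grads[k](w_t) for k in idx], axis=0)

    # Step 2: each sampled device approximately solves the DANE subproblem
    #   min_w  f_k(w) + <g_t - grad f_k(w_t), w> + (mu/2)||w - w_t||^2,
    # here inexactly, via a few gradient-descent steps (an assumption;
    # any local solver could be substituted).
    updates = []
    for k in idx:
        correction = g_t - device_grads[k](w_t)
        w = w_t.copy()
        for _ in range(local_steps):
            w -= lr * (device_grads[k](w) + correction + mu * (w - w_t))
        updates.append(w)

    # Step 3: the server averages the local solutions into the new iterate.
    return np.mean(updates, axis=0)
```

With full participation and exact local solves on quadratic losses, each round contracts toward the global minimizer; the abstract's observation is that low participation and heterogeneous local data degrade the gradient estimate in Step 1, which this sketch makes easy to experiment with by varying `sample`.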
Pages: 1227-1231
Page count: 5
Related Papers (50 total)
  • [1] DONE: Distributed Approximate Newton-type Method for Federated Edge Learning
    Dinh, Canh T.
    Tran, Nguyen H.
    Nguyen, Tuan Dung
    Bao, Wei
    Balef, Amir Rezaei
    Zhou, Bing B.
    Zomaya, Albert Y.
    IEEE TRANSACTIONS ON PARALLEL AND DISTRIBUTED SYSTEMS, 2022, 33 (11) : 2648 - 2660
  • [2] On the DSM Newton-type method
    Ramm, A. G.
    JOURNAL OF APPLIED MATHEMATICS AND COMPUTING, 2012, 38 (1-2) : 523 - 533
  • [3] FedNS: A Fast Sketching Newton-Type Algorithm for Federated Learning
    Li, Jian
    Liu, Yong
    Wang, Weiping
    THIRTY-EIGHTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 38 NO 12, 2024, : 13509 - 13517
  • [4] FedNL: Making Newton-Type Methods Applicable to Federated Learning
    Safaryan, Mher
    Islamov, Rustem
    Qian, Xun
    Richtarik, Peter
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 162, 2022, : 18959 - 19010
  • [5] Newton-type multilevel optimization method
    Ho, Chin Pang
    Kocvara, Michal
    Parpas, Panos
    OPTIMIZATION METHODS & SOFTWARE, 2022, 37 (01) : 45 - 78
  • [6] On the convergence of an inexact Newton-type method
    Zhou, Guanglu
    Qi, Liqun
    OPERATIONS RESEARCH LETTERS, 2006, 34 (06) : 647 - 652
  • [7] FedNew: A Communication-Efficient and Privacy-Preserving Newton-Type Method for Federated Learning
    Elgabli, Anis
    Issaid, Chaouki B.
    Bedi, Amrit S.
    Rajawat, Ketan
    Bennis, Mehdi
    Aggarwal, Vaneet
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 162, 2022,
  • [8] DINO: Distributed Newton-Type Optimization Method
    Crane, Rixon
    Roosta, Fred
    25TH AMERICAS CONFERENCE ON INFORMATION SYSTEMS (AMCIS 2019), 2019,
  • [9] A continuous Newton-type method for unconstrained optimization
    Zhang, Lei-Hong
    Kelley, C. T.
    Liao, Li-Zhi
    PACIFIC JOURNAL OF OPTIMIZATION, 2008, 4 (02) : 259 - 277