Distributed Learning Systems with First-Order Methods

Cited by: 17
Authors
Liu, Ji [1 ,2 ]
Zhang, Ce [3 ]
Affiliations
[1] Univ Rochester, Rochester, NY 14627 USA
[2] Kuaishou Inc, Beijing, Peoples R China
[3] Swiss Fed Inst Technol, Zurich, Switzerland
Source
FOUNDATIONS AND TRENDS IN DATABASES | 2020, Vol. 9, No. 1
Keywords
CONVERGENCE; ALGORITHM;
DOI
10.1561/1900000062
Chinese Library Classification
TP31 [Computer Software];
Discipline Classification Codes
081202 ; 0835 ;
Abstract
Scalable and efficient distributed learning is one of the main driving forces behind the recent rapid advancement of machine learning and artificial intelligence. One prominent feature of this topic is that recent progress has been made by researchers in two communities: (1) the systems community, including database, data management, and distributed systems, and (2) the machine learning and mathematical optimization community. The interaction and knowledge sharing between these two communities has led to the rapid development of new distributed learning systems and theory. In this monograph, we hope to provide a brief introduction to some recently developed distributed learning techniques, namely lossy communication compression (e.g., quantization and sparsification), asynchronous communication, and decentralized communication. One special focus of this monograph is on making sure that it can be easily understood by researchers in both communities: on the system side, we rely on a simplified system model that hides many system details unnecessary for the intuition behind the system speedups; on the theory side, we rely on minimal assumptions and significantly simplify the proofs of some recent work while achieving comparable results.
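To give a flavor of the lossy communication compression the abstract mentions, below is a minimal sketch of top-k gradient sparsification with an error-feedback residual. The function name, the plain-list representation, and the residual convention are illustrative assumptions, not taken from the monograph itself.

```python
def topk_sparsify(grad, k):
    """Keep only the k largest-magnitude entries of a gradient; zero the rest.

    Returns (compressed, residual): `compressed` is what a worker would
    transmit, and `residual` is the dropped mass that error-feedback
    schemes accumulate locally before the next communication round.
    """
    # Indices of the k entries with the largest absolute value.
    idx = sorted(range(len(grad)), key=lambda i: abs(grad[i]), reverse=True)[:k]
    kept = set(idx)
    compressed = [g if i in kept else 0.0 for i, g in enumerate(grad)]
    residual = [0.0 if i in kept else g for i, g in enumerate(grad)]
    return compressed, residual

grad = [0.9, -0.1, 0.05, -1.2, 0.3]
c, r = topk_sparsify(grad, 2)
# c transmits only the two largest-magnitude entries (-1.2 and 0.9);
# r holds the rest for local accumulation.
```

Only the k nonzero values (plus their indices) need to be sent over the network, which is the source of the communication savings; the residual keeps the compression from discarding gradient information permanently.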
Pages: 1-100 (100 pages)