Towards Communication-Efficient Distributed Background Subtraction

Times Cited: 0
Authors
Hung Ngoc Phan [1 ]
Synh Viet-Uyen Ha [2 ]
Phuong Hoai Ha [1 ]
Affiliations
[1] UiT Arctic Univ Norway, Tromso, Norway
[2] Ho Chi Minh Int Univ, Vietnam Natl Univ, Ho Chi Minh City, Vietnam
Keywords
Traffic monitoring; Distributed machine learning; Asynchronous computation; Parallel computing; Background subtraction; Change detection
DOI
10.1007/978-981-19-8234-7_38
CLC Number
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Road traffic monitoring is an essential component of data analysis for urban air pollution prevention. In road traffic monitoring, background subtraction is a critical approach in which moving objects are extracted by separating the motion information of interest from the static surroundings, known as the background. To cope with the varied contextual dynamics of natural scenes, supervised models for background subtraction solve a gradient-based optimization problem on multi-modal video sequences by training a convolutional neural network. As video datasets scale up, distributing the model training across multiple processing elements becomes a pivotal technique for leveraging the computational power of many devices. However, one of the major challenges in distributed machine learning is communication overhead. This paper introduces a new communication-efficient distributed framework for background subtraction (CEDFrame) that alleviates the communication overhead of distributed training on video data. The framework employs event-triggered communication on a ring topology among workers and the Partitioned Global Address Space (PGAS) paradigm for asynchronous computation. Using the framework, we investigate how training a background-subtraction model tolerates the tradeoff between communication avoidance and model accuracy. Experimental results on an NVIDIA DGX-2 with the CDnet-2014 dataset show that the framework reduces communication overhead by at least 94.71% with only a negligible decrease in testing accuracy (at most 2.68%).
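The event-triggered exchange on a ring described in the abstract can be illustrated with a minimal sketch. The code below is illustrative only and is not the authors' implementation: all names (Worker, maybe_send, threshold) are hypothetical, the trigger rule (parameter drift beyond a fixed threshold) is one plausible choice among many, and a real PGAS realization would use one-sided remote memory operations rather than this round-based simulation.

```python
import numpy as np


class Worker:
    """One simulated worker on a logical ring (hypothetical sketch)."""

    def __init__(self, rank, n_workers, dim, threshold=0.5, seed=0):
        self.rank = rank
        self.right = (rank + 1) % n_workers        # ring neighbour we send to
        rng = np.random.default_rng(seed + rank)
        self.params = rng.standard_normal(dim)
        self.last_sent = self.params.copy()
        self.threshold = threshold

    def local_step(self, grad, lr=0.1):
        # Local (asynchronous) gradient step; no communication here.
        self.params -= lr * grad

    def maybe_send(self):
        # Event trigger: communicate only if the parameters drifted far
        # enough from the last transmitted copy; otherwise stay silent.
        if np.linalg.norm(self.params - self.last_sent) > self.threshold:
            self.last_sent = self.params.copy()
            return self.params.copy()
        return None


def simulate(n_workers=4, dim=8, rounds=50):
    workers = [Worker(r, n_workers, dim) for r in range(n_workers)]
    rng = np.random.default_rng(42)
    sent = 0
    for _ in range(rounds):
        inbox = {}                                 # destination rank -> message
        for w in workers:
            w.local_step(rng.standard_normal(dim))
            msg = w.maybe_send()
            if msg is not None:
                inbox[w.right] = msg               # one-hop send on the ring
                sent += 1
        for w in workers:
            if w.rank in inbox:                    # blend in neighbour's copy
                w.params = 0.5 * (w.params + inbox[w.rank])
    print(f"messages sent: {sent} / {n_workers * rounds} possible")


if __name__ == "__main__":
    simulate()
```

Raising the threshold suppresses more messages but lets the workers' model copies drift further apart, which mirrors the communication-avoidance versus accuracy tradeoff the paper studies.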
Pages: 490-502
Page count: 13