Towards Communication-Efficient Distributed Background Subtraction

Cited by: 0
Authors
Hung Ngoc Phan [1]
Synh Viet-Uyen Ha [2]
Phuong Hoai Ha [1]
Affiliations
[1] UiT The Arctic University of Norway, Tromsø, Norway
[2] International University, Vietnam National University, Ho Chi Minh City, Vietnam
Keywords
Traffic monitoring; Distributed machine learning; Asynchronous computation; Parallel computing; Background subtraction; Change detection
DOI
10.1007/978-981-19-8234-7_38
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Road traffic monitoring is an essential component of data analysis for urban air pollution prevention. In road traffic monitoring, background subtraction is a critical approach in which moving objects are extracted by separating motion information of interest from the static surroundings, known as the background. To cope with the varied contextual dynamics of natural scenes, supervised background subtraction models solve a gradient-based optimization problem over multi-modal video sequences by training a convolutional neural network. As video datasets scale up, distributing the model training across multiple processing elements becomes a pivotal technique for leveraging the computational power of many devices. However, one of the major challenges in distributed machine learning is communication overhead. This paper introduces a new communication-efficient distributed framework for background subtraction (CEDFrame) that alleviates the communication overhead of distributed training on video data. The framework employs event-triggered communication on a ring topology among workers and the Partitioned Global Address Space (PGAS) paradigm for asynchronous computation. Using this framework, we investigate how training a background subtraction model tolerates the tradeoff between communication avoidance and accuracy. Experimental results on an NVIDIA DGX-2 using the CDnet-2014 dataset show that the new framework reduces communication overhead by at least 94.71% with only a negligible decrease in testing accuracy (at most 2.68%).
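The abstract gives no implementation details; the following minimal Python sketch (ours, not the authors' code) only illustrates the general idea of event-triggered communication on a ring of workers: each worker takes local update steps and pushes its parameters to its ring neighbor only when they have drifted past a threshold, which is how such schemes trade communication volume against accuracy. All names here (THRESHOLD, local_step, the averaging rule) are assumptions for illustration.

import numpy as np

K = 4             # number of workers on the ring
DIM = 8           # toy parameter dimension
STEPS = 100       # local update steps per worker
THRESHOLD = 0.05  # event trigger: send only if drift exceeds this norm

rng = np.random.default_rng(0)
params = [np.zeros(DIM) for _ in range(K)]  # each worker's parameters
last_sent = [p.copy() for p in params]      # state at the last send
messages_sent = 0

def local_step(p):
    """Stand-in for one SGD step on a worker's local video shard."""
    return p - 0.1 * rng.normal(size=p.shape)

for step in range(STEPS):
    for w in range(K):
        params[w] = local_step(params[w])
        # Event trigger: communicate only when local parameters have
        # drifted far enough since the last send.
        if np.linalg.norm(params[w] - last_sent[w]) > THRESHOLD:
            neighbor = (w + 1) % K  # ring topology: send to the next worker
            # Neighbor folds in the received parameters; a one-sided
            # update in the spirit of asynchronous PGAS-style puts.
            params[neighbor] = 0.5 * (params[neighbor] + params[w])
            last_sent[w] = params[w].copy()
            messages_sent += 1

print(f"messages sent: {messages_sent} of {STEPS * K} possible")

Raising THRESHOLD in this toy suppresses more sends (greater communication avoidance) at the cost of workers drifting further apart, mirroring the communication/accuracy tradeoff the paper studies.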
Pages: 490-502
Page count: 13