Real-time batch processing at a GPU-based edge with a passive optical network

Cited: 0
Authors
Onodera, Yukito [1 ]
Inoue, Yoshiaki [3 ]
Hisano, Daisuke [4 ]
Yoshimoto, Naoto [5 ]
Nakayama, Yu [2 ]
Affiliations
[1] Tokyo Univ Agr & Technol, Tokyo, Japan
[2] Tokyo Univ Agr & Technol, Inst Engn, Tokyo, Japan
[3] Osaka Univ, Commun Engn, Osaka, Japan
[4] Osaka Univ, Elect Elect & Informat Engn, Osaka, Japan
[5] Chitose Inst Sci & Technol, Chitose, Hokkaido, Japan
Keywords
Delays; Cameras; Real-time systems; Servers; Resource management; Optical network units; Low latency communication; BANDWIDTH ALLOCATION; DYNAMIC BANDWIDTH; WAVELENGTH; URLLC;
DOI
10.1364/JOCN.476116
Chinese Library Classification (CLC)
TP3 [computing technology, computer technology];
Discipline Code
0812 ;
Abstract
In recent years, advances in deep learning have significantly improved research and services related to artificial intelligence. Real-time object recognition is an important technique in smart cities, for which low-cost network deployment and low-latency data transfer are key technologies. In this study, we focus on time- and wavelength-division multiplexed passive optical network (TWDM-PON)-based inference systems to deploy cost-efficient networks that accommodate many network cameras. A significant issue for a graphics processing unit (GPU)-based inference system over a TWDM-PON is the optimal allocation of the upstream wavelength and bandwidth to enable real-time inference. However, increasing the batch size of the data arriving at the edge servers while ensuring low-latency transmission has not been considered previously. Therefore, this study proposes an inference system in which a large number of cameras periodically upload image data to a GPU-based server via the TWDM-PON. Moreover, we propose a cooperative wavelength and bandwidth allocation algorithm to ensure low-latency, time-synchronized data arrivals at the edge. The performance of the proposed scheme is verified through computer simulation.
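The core scheduling idea described in the abstract can be illustrated with a toy sketch. This is not the paper's algorithm; it is a minimal illustration, under assumed parameters (frame period, upstream line rate, image size, camera and wavelength counts), of granting each camera a wavelength and a non-overlapping upstream time slot so that all image uploads complete by a common batch deadline, letting the GPU edge server process one synchronized batch per frame period.

```python
# Illustrative sketch only (not the proposed algorithm): assign each
# camera (ONU) a wavelength and a contiguous upstream time slot so
# that every image upload finishes within one camera frame period.

FRAME_PERIOD_MS = 33.3      # camera frame interval (~30 fps), assumed
LINK_RATE_GBPS = 10.0       # upstream rate per wavelength, assumed
IMAGE_BITS = 8 * 200_000    # 200 KB image per camera per frame, assumed

def allocate_grants(num_cameras: int, num_wavelengths: int):
    """Return (camera, wavelength, start_ms, end_ms) grants such that
    all uploads complete before the shared batch deadline."""
    # Time to ship one image on one wavelength, in milliseconds.
    slot_ms = IMAGE_BITS / (LINK_RATE_GBPS * 1e9) * 1e3
    per_wavelength = -(-num_cameras // num_wavelengths)  # ceil division
    if per_wavelength * slot_ms > FRAME_PERIOD_MS:
        raise ValueError("demand exceeds capacity: batch deadline missed")
    grants = []
    for cam in range(num_cameras):
        wavelength = cam % num_wavelengths        # round-robin wavelength
        slot_index = cam // num_wavelengths       # position on that wavelength
        start = slot_index * slot_ms
        grants.append((cam, wavelength, start, start + slot_ms))
    return grants

grants = allocate_grants(num_cameras=64, num_wavelengths=4)
latest_finish = max(end for _, _, _, end in grants)
print(f"all uploads done by {latest_finish:.2f} ms of the frame period")
```

With 64 cameras spread over 4 wavelengths, each wavelength carries 16 back-to-back uploads, so the batch is complete well before the next frame period begins; the paper's cooperative allocation additionally coordinates wavelengths and bandwidth dynamically rather than using this fixed round-robin.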
Pages: 404-414
Page count: 11
Related Papers
50 records in total
  • [1] Real-Time Resource Allocation in Passive Optical Network for Energy-Efficient Inference at GPU-Based Network Edge
    Nakayama, Yu
    Nguyen, Anh Hoang Ngoc
    Hara-Azumi, Yuko
    IEEE INTERNET OF THINGS JOURNAL, 2022, 9 (18): : 17348 - 17358
  • [2] Real-Time and Energy-Efficient Inference at GPU-Based Network Edge using PON
    Onodera, Yukito
    Inoue, Yoshiaki
    Hisano, Daisuke
    Nakayama, Yu
    2021 IEEE 18TH ANNUAL CONSUMER COMMUNICATIONS & NETWORKING CONFERENCE (CCNC), 2021,
  • [3] GPU-based real-time crowd rendering
    Zhang, Yong
    Yin, Bao-Cai
    Kong, De-Hui
    Yang, Guang-Wei
    Beijing Gongye Daxue Xuebao / Journal of Beijing University of Technology, 2009, 35 (10): : 1434 - 1440
  • [4] GPU-based Real-time Face Detector
    Jeong, Jae-chan
    Shin, Ho-chul
    Cho, Jae-il
2012 9TH INTERNATIONAL CONFERENCE ON UBIQUITOUS ROBOTS AND AMBIENT INTELLIGENCE (URAI), 2012, : 173 - 175
  • [5] A GPU-BASED SOFT REAL-TIME SYSTEM FOR SIMULTANEOUS EEG PROCESSING AND VISUALIZATION
    Juhasz, Zoltan
    Kozmann, Gyorgy
    SCALABLE COMPUTING-PRACTICE AND EXPERIENCE, 2016, 17 (02): : 61 - 78
  • [6] GPU-Based Real-Time Software Coincidence Processing for Digital PET System
    Shi, Yu
    Meng, Fanzhen
    Zhou, Jianwei
    Li, Lei
    Li, Juntao
    Zhu, Shouping
    IEEE TRANSACTIONS ON RADIATION AND PLASMA MEDICAL SCIENCES, 2022, 6 (06) : 707 - 720
  • [7] GPU-based real-time deformation with normal reconstruction
    Che, Yinghui
    Wang, Jing
    Liang, Xiaohui
    TECHNOLOGIES FOR E-LEARNING AND DIGITAL ENTERTAINMENT, PROCEEDINGS, 2007, 4469 : 667 - +
  • [8] Real-time GPU-based simulation of dynamic terrain
    Aquilio, Anthony S.
    Brooks, Jeremy C.
    Zhu, Ying
    Owen, G. Scott
    ADVANCES IN VISUAL COMPUTING, PT 1, 2006, 4291 : 891 - +
  • [9] GPU-based real-time RGBD data filtering
    Abdenour Amamra
    Nabil Aouf
    Journal of Real-Time Image Processing, 2018, 14 : 323 - 340
  • [10] GPU-based real-time RGBD data filtering
    Amamra, Abdenour
    Aouf, Nabil
    JOURNAL OF REAL-TIME IMAGE PROCESSING, 2018, 14 (02) : 323 - 340