ASPM: Reliability-Oriented DNN Inference Partition and Offloading in Vehicular Edge Computing

Cited by: 0
Authors
Yan, Guozhi [1 ]
Liu, Chunhui [1 ]
Liu, Kai [1 ]
Affiliation
[1] Chongqing University, College of Computer Science, Chongqing 400044, People's Republic of China
Funding
National Natural Science Foundation of China
Keywords
Vehicular Edge Computing; DNN Inference; Partition; Reliable Task Offloading
DOI
10.1109/ITSC57777.2023.10422172
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
Recent years have witnessed a surge in the deployment of Deep Neural Network (DNN)-based services, which drives the development of emerging intelligent transportation systems (ITSs). However, enabling efficient and reliable DNN inference in Vehicular Edge Computing (VEC) environments remains challenging due to resource constraints and system dynamics. In view of this, this work investigates a DNN inference partition and offloading scenario with environmental uncertainties in VEC, which necessitates striking a balance between inference delay and the success ratio of receiving the offloading outputs. By incorporating communication and computation overheads as well as failed offloading conditions into an analytical model, we propose an Adaptive Splitting, Partitioning, and Merging (ASPM) strategy that reduces inference delay while maintaining a satisfactory offloading success ratio. Specifically, ASPM first splits and partitions the DNN model recursively to find the optimal split blocks with the aim of minimizing inference delay. On this basis, it merges DNN blocks greedily to reduce the number of blocks to be offloaded, thus enhancing the offloading success ratio of the whole DNN inference. Finally, we conduct comprehensive performance evaluations to demonstrate the superiority of our design.
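To make the delay/reliability trade-off described above concrete, the following minimal Python sketch mimics a greedy block-merging pass: starting from a chain of split blocks, it repeatedly fuses the adjacent pair that increases delay the least until an estimated offloading success ratio reaches a target. The Block fields, the two-stage pipeline delay model, the independent per-block delivery probability p_tx, and the additive merge rule are all illustrative assumptions; this is not the paper's analytical model or the actual ASPM algorithm.

```python
from dataclasses import dataclass
from typing import List, Tuple


@dataclass(frozen=True)
class Block:
    """One offloadable DNN segment (illustrative abstraction).
    up:   time to run the local part and upload the intermediate tensor (s)
    down: time for the edge to run its part and return the output (s)"""
    up: float
    down: float


def pipeline_delay(blocks: List[Block]) -> float:
    """Two-stage pipeline makespan: block i+1's local/upload stage may
    overlap block i's edge/return stage (simplified delay model)."""
    t_up = 0.0    # completion time of the upload stage
    t_down = 0.0  # completion time of the edge/return stage
    for b in blocks:
        t_up += b.up
        t_down = max(t_down, t_up) + b.down
    return t_down


def success_ratio(num_blocks: int, p_tx: float) -> float:
    """Every per-block result delivery must succeed; independence assumed."""
    return p_tx ** num_blocks


def merge_pair(blocks: List[Block], i: int) -> List[Block]:
    """Fuse blocks i and i+1; stage costs are taken as additive (assumption)."""
    fused = Block(up=blocks[i].up + blocks[i + 1].up,
                  down=blocks[i].down + blocks[i + 1].down)
    return blocks[:i] + [fused] + blocks[i + 2:]


def greedy_merge(blocks: List[Block], p_tx: float,
                 target_success: float) -> Tuple[List[Block], float, float]:
    """Illustrative stand-in for a merge step: keep fusing the adjacent pair
    that hurts delay least until the estimated offloading success ratio
    meets the target (or only one block remains)."""
    blocks = list(blocks)
    while len(blocks) > 1 and success_ratio(len(blocks), p_tx) < target_success:
        best = min(range(len(blocks) - 1),
                   key=lambda i: pipeline_delay(merge_pair(blocks, i)))
        blocks = merge_pair(blocks, best)
    return blocks, pipeline_delay(blocks), success_ratio(len(blocks), p_tx)


if __name__ == "__main__":
    # Four hypothetical blocks produced by a delay-minimizing split step.
    split = [Block(0.8, 1.2), Block(0.5, 0.9), Block(0.6, 1.1), Block(0.4, 0.7)]
    merged, delay, ratio = greedy_merge(split, p_tx=0.95, target_success=0.90)
    print(f"{len(merged)} block(s), delay {delay:.2f}s, success ratio {ratio:.3f}")
```

Under these assumptions, merging never reduces the pipeline overlap's benefit for free: fewer blocks means fewer result deliveries (higher success ratio) at the cost of a longer, less overlapped schedule, which is the tension the abstract describes.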
Pages: 3298-3303
Page count: 6