Latency Estimation and Computational Task Offloading in Vehicular Mobile Edge Computing Applications

Cited by: 0
Authors
Zhang, Wenhan [1 ]
Feng, Mingjie [1 ,2 ]
Krunz, Marwan [1 ]
Affiliations
[1] Univ Arizona, Dept Elect & Comp Engn, Tucson, AZ 85721 USA
[2] Huazhong Univ Sci & Technol, Wuhan Natl Lab Optoelect, Wuhan, Peoples R China
Keywords
Vehicle-to-everything (V2X) applications; mobile edge computing; task offloading; latency prediction; Long Short-Term Memory (LSTM); end-to-end (E2E) delay; NETWORKS; KALMAN; MINIMIZATION; ALLOCATION; TRACKING; MODEL
DOI
10.1109/TVT.2023.3334192
CLC Number (Chinese Library Classification)
TM [Electrical Technology]; TN [Electronic Technology, Communication Technology]
Discipline Classification Code
0808; 0809
Abstract
Mobile edge computing (MEC) is a key enabler of time-critical vehicle-to-everything (V2X) applications. Under MEC, a vehicle has the option to offload computationally intensive tasks to a nearby edge server or to a remote cloud server. Determining where to execute a task necessitates accurate estimation of the end-to-end (E2E) offloading delay. In this paper, we first conduct extensive measurements of the round-trip time (RTT) between a vehicular user and edge/cloud servers. Using these measurements, we present a latency-estimation framework for optimal task offloading. The propagation delay, measured by the RTT, is divided into two components: one that follows a trackable trend (baseline) and another (residual) that is quasi-random. For the baseline component, we first cluster measured RTTs into several groups based on signal-strength indicators, and develop a Long Short-Term Memory (LSTM) regression model for each group. For the residual component, we provide a statistical prediction approach that combines the Epanechnikov kernel with moving-average functions. Predicted propagation delays are incorporated into virtual simulations to estimate the transmission, queuing, and processing delays, thereby yielding an estimate of the E2E delay. Based on the estimated E2E delay, we design a task offloading scheme that minimizes the offloading latency while maintaining a low packet loss rate. Simulation results show that the proposed offloading strategy can reduce the E2E delay by approximately 60% compared to a random offloading scheme while keeping the packet loss rate below 3%.
Pages: 5808-5823
Number of pages: 16
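
For illustration only, the following Python sketch mirrors two pieces of the pipeline outlined in the abstract: a residual-RTT predictor built as an Epanechnikov-kernel-weighted moving average, and the final offloading decision that picks the server with the lowest estimated E2E delay. The per-cluster LSTM baseline model and the virtual simulation of transmission, queuing, and processing delays are not reproduced here; the function names, window size, and numeric delay values are illustrative assumptions, not the authors' implementation.

# Illustrative sketch (assumptions, not the paper's code): residual-RTT
# smoothing with an Epanechnikov kernel and a minimum-delay offloading choice.
import numpy as np

def epanechnikov_weights(window: int) -> np.ndarray:
    # Kernel K(u) = 0.75 * (1 - u^2); the most recent sample (lag 0)
    # receives the largest weight, older samples are discounted.
    u = np.arange(window)[::-1] / window   # normalized lag in [0, 1)
    w = 0.75 * (1.0 - u ** 2)
    return w / w.sum()

def predict_residual(residual_history: np.ndarray, window: int = 10) -> float:
    # Kernel-weighted moving average over the most recent residual RTT
    # samples (in ms); stands in for the paper's statistical residual model.
    recent = residual_history[-window:]
    w = epanechnikov_weights(len(recent))
    return float(np.dot(w, recent))

def estimate_e2e_delay(baseline_rtt: float, residual_rtt: float,
                       tx_delay: float, queue_delay: float,
                       proc_delay: float) -> float:
    # E2E delay = predicted propagation delay (baseline + residual) plus
    # transmission, queuing, and processing delays (all in ms).
    return baseline_rtt + residual_rtt + tx_delay + queue_delay + proc_delay

def choose_server(candidates: dict) -> str:
    # Offload to the candidate (e.g., 'edge' or 'cloud') with the
    # lowest estimated E2E delay.
    return min(candidates, key=candidates.get)

# Toy usage with made-up delay components (ms).
history = np.array([1.2, 0.8, -0.3, 0.5, 0.9, 1.1, 0.4, -0.2, 0.7, 0.6])
res = predict_residual(history)
delays = {
    "edge":  estimate_e2e_delay(baseline_rtt=12.0, residual_rtt=res,
                                tx_delay=3.0, queue_delay=2.0, proc_delay=8.0),
    "cloud": estimate_e2e_delay(baseline_rtt=45.0, residual_rtt=res,
                                tx_delay=3.0, queue_delay=0.5, proc_delay=2.0),
}
print(choose_server(delays), delays)

In the paper, the baseline term fed into estimate_e2e_delay would come from the per-cluster LSTM regression, and the transmission/queuing/processing terms from the virtual simulations; the fixed numbers above merely make the sketch runnable.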