Energy-efficient Incremental Offloading of Neural Network Computations in Mobile Edge Computing

Cited by: 2
Authors
Guo, Guangfeng [1 ,2 ]
Zhang, Junxing [1 ]
Affiliations
[1] Inner Mongolia Univ, Coll Comp Sci, Hohhot, Peoples R China
[2] Inner Mongolia Univ Sci & Technol, Baotou Teachers Coll, Baotou, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Mobile Edge Computing; Deep Neural Network; Computation Offloading; Energy Efficient;
DOI
10.1109/GLOBECOM42002.2020.9322504
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Deep Neural Networks (DNNs) have shown remarkable success in Computer Vision and Augmented Reality. However, battery-powered devices still cannot afford to run state-of-the-art DNNs. Mobile Edge Computing (MEC) is a promising approach to running DNNs on energy-constrained mobile devices: it uploads the device's DNN model partitions to the nearest edge servers on demand and then offloads DNN computations to the servers to save the device's energy. Nevertheless, the existing all-at-once computation offloading faces two major challenges. The first is how to find the most energy-efficient model partition scheme under different wireless network bandwidths in MEC. The second is how to reduce the time and energy the devices spend waiting for the servers, since uploading all DNN layers of the optimal partition often takes time. To meet these challenges, we propose the following solution. First, we build regression-based energy consumption prediction models by profiling the energy consumption of mobile devices under varied wireless network bandwidths. Then, we present an algorithm that finds the most energy-efficient DNN partition scheme based on the established prediction models and performs incremental computation offloading as soon as each DNN partition finishes uploading. The experimental results show that our solution improves energy efficiency compared to the current all-at-once approach. Under a 100 Mbps bandwidth, when model uploading takes 1/3 of the total uploading time, the proposed solution can reduce energy consumption by around 40%.
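
To make the described approach concrete, the sketch below (not the authors' code) illustrates the two steps named in the abstract: fitting a regression-based transmission-energy model from profiled measurements, and then searching layer by layer for the DNN split point with the lowest predicted device energy. All layer names, sizes, and energy figures are hypothetical placeholders, and the simple linear cost model stands in for whatever regression form the paper actually uses.

    import numpy as np

    def fit_energy_model(sizes_mb, measured_joules):
        # Least-squares linear fit: predicted energy = a * size + b.
        a, b = np.polyfit(sizes_mb, measured_joules, 1)
        return lambda size_mb: a * size_mb + b

    # Hypothetical profiling measurements at one bandwidth (e.g. 100 Mbps):
    # transmission energy in joules for uploads of 1, 5, 10, and 20 MB.
    tx_energy = fit_energy_model([1, 5, 10, 20], [0.4, 1.9, 3.8, 7.5])

    # Per-layer on-device compute energy (J) and output size (MB), also profiled.
    layers = [
        {"name": "conv1", "compute_j": 0.30, "out_mb": 6.0},
        {"name": "conv2", "compute_j": 0.45, "out_mb": 3.0},
        {"name": "fc1",   "compute_j": 0.20, "out_mb": 0.5},
        {"name": "fc2",   "compute_j": 0.05, "out_mb": 0.1},
    ]
    input_mb = 0.6  # raw input sent when everything is offloaded (k = 0)

    def best_partition(layers, tx_energy, input_mb):
        # Split point k: run layers[:k] on the device, offload the rest.
        # k = 0 offloads everything; k = len(layers) keeps everything local.
        costs = []
        for k in range(len(layers) + 1):
            local = sum(l["compute_j"] for l in layers[:k])
            sent_mb = input_mb if k == 0 else layers[k - 1]["out_mb"]
            tx = 0.0 if k == len(layers) else tx_energy(sent_mb)
            costs.append(local + tx)
        k_best = min(range(len(costs)), key=costs.__getitem__)
        return k_best, costs[k_best]

    k, joules = best_partition(layers, tx_energy, input_mb)
    print(f"Best split after layer {k}: predicted device energy {joules:.2f} J")

The search is linear in the number of layers because each candidate split adds only one more layer's compute energy and one transmission term; under the incremental scheme described in the abstract, such a search would be re-evaluated as each uploaded partition becomes available on the server.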
Pages: 6
Related Papers
50 records in total
  • [1] Energy-efficient Computing Offloading Algorithm for Mobile Edge Computing Network
    Zhang, Xiang-Jun
    Wu, Wei-Guo
    Zhang, Chi
    Chai, Yu-Xiang
    Yang, Shi-Yuan
    Wang, Xiong
    [J]. Ruan Jian Xue Bao/Journal of Software, 2023, 34 (02) : 849 - 867
  • [2] Neural Combinatorial Optimization for Energy-Efficient Offloading in Mobile Edge Computing
    Jiang, Qingmiao
    Zhang, Yuan
    Yan, Jinyao
    [J]. IEEE ACCESS, 2020, 8 : 35077 - 35089
  • [3] Energy-efficient Autonomic Offloading in Mobile Edge Computing
    Luo, Changqing
    Salinas, Sergio
    Li, Ming
    Li, Pan
    [J]. 2017 IEEE 15TH INTL CONF ON DEPENDABLE, AUTONOMIC AND SECURE COMPUTING, 15TH INTL CONF ON PERVASIVE INTELLIGENCE AND COMPUTING, 3RD INTL CONF ON BIG DATA INTELLIGENCE AND COMPUTING AND CYBER SCIENCE AND TECHNOLOGY CONGRESS (DASC/PICOM/DATACOM/CYBERSCITECH), 2017, : 581 - 588
  • [4] Energy-efficient cooperative offloading for mobile edge computing
    Shi, Wenjun
    Wu, Jigang
    Chen, Long
    Zhang, Xinxiang
    Wu, Huaiguang
    [J]. WIRELESS NETWORKS, 2023, 29 (06) : 2419 - 2435
  • [6] Discontinuous Computation Offloading for Energy-Efficient Mobile Edge Computing
    Merluzzi, Mattia
    di Pietro, Nicola
    Di Lorenzo, Paolo
    Strinati, Emilio Calvanese
    Barbarossa, Sergio
    [J]. IEEE TRANSACTIONS ON GREEN COMMUNICATIONS AND NETWORKING, 2022, 6 (02): : 1242 - 1257
  • [7] Energy-Efficient Offloading in Mobile Edge Computing with Edge-Cloud Collaboration
    Long, Xin
    Wu, Jigang
    Chen, Long
    [J]. ALGORITHMS AND ARCHITECTURES FOR PARALLEL PROCESSING, ICA3PP 2018, PT III, 2018, 11336 : 460 - 475
  • [8] IONN: Incremental Offloading of Neural Network Computations from Mobile Devices to Edge Servers
    Jeong, Hyuk-Jin
    Lee, Hyeon-Jae
    Shin, Chang Hyun
    Moon, Soo-Mook
    [J]. PROCEEDINGS OF THE 2018 ACM SYMPOSIUM ON CLOUD COMPUTING (SOCC '18), 2018, : 401 - 411
  • [9] Energy-Efficient NOMA-Based Mobile Edge Computing Offloading
    Pan, Yijin
    Chen, Ming
    Yang, Zhaohui
    Huang, Nuo
    Shikh-Bahaei, Mohammad
    [J]. IEEE COMMUNICATIONS LETTERS, 2019, 23 (02) : 310 - 313
  • [10] Energy-Efficient Computation Offloading in Mobile Edge Computing Systems With Uncertainties
    Ji, Tianxi
    Luo, Changqing
    Yu, Lixing
    Wang, Qianlong
    Chen, Siheng
    Thapa, Arun
    Li, Pan
    [J]. IEEE TRANSACTIONS ON WIRELESS COMMUNICATIONS, 2022, 21 (08) : 5717 - 5729