UPOA: A User Preference Based Latency and Energy Aware Intelligent Offloading Approach for Cloud-Edge Systems

Cited by: 1
Authors
Yuan, Jingling [1 ]
Xiang, Yao [1 ]
Deng, Yuhui [2 ]
Zhou, Yi [3 ]
Min, Geyong [4 ]
Affiliations
[1] Wuhan Univ Technol, Sch Comp Sci & Technol, Wuhan 430062, Peoples R China
[2] Jinan Univ, Dept Comp Sci, Guangzhou 510632, Peoples R China
[3] Columbus State Univ, TSYS Sch Comp Sci, Columbus, GA 31907 USA
[4] Univ Exeter, Coll Engn Math & Phys Sci, Exeter EX4 4QF, Devon, England
Funding
National Natural Science Foundation of China
Keywords
Task analysis; Batteries; Energy consumption; Prediction algorithms; Anxiety disorders; Low latency communication; Computational modeling; Cloud-edge system; task offloading; user preference; artificial intelligence; low battery anxiety; NETWORKS;
DOI
10.1109/TCC.2022.3193709
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology]
Discipline Classification Code
0812
Abstract
Task offloading has been widely used to extend the battery life of intelligent mobile devices. Existing task offloading approaches, which focus on optimizing the trade-off between latency and energy consumption, largely ignore the impact of user preferences caused by low battery anxiety. Low battery anxiety, mobile users' common fear of running out of battery energy, especially when the battery is already low, causes users to tolerate higher latency in exchange for prolonged battery life. Taking these user preference effects into account, we propose a novel offloading approach called UPOA that derives refined offloading policies balancing latency and energy consumption according to user preferences. In UPOA, we begin by defining a user preference rule that determines users' offloading preferences according to the battery energy status. We then build a fine-grained task offloading model to delineate the task distribution characteristics of each node in its offloading link. Guided by this model, we develop a task prediction algorithm based on a long short-term memory (LSTM) neural network to provide task predictions that inform offloading policies. Finally, we implement a particle-swarm-optimization-based online offloading algorithm that produces long-term offloading policies by incorporating the user preference determined by our preference rule and the task predictions generated by our prediction algorithm. To quantitatively evaluate the performance of UPOA, we conduct extensive experiments in a real-world cloud-edge environment and compare UPOA with three state-of-the-art offloading approaches, DRA, DRL-E2D, and MUDRL, under various conditions. Experimental results demonstrate that UPOA makes more effective, preference-aware policies than the existing approaches: it reduces average latency by 12.49% when battery energy is sufficient and extends battery life by 20.14% when battery energy is low.
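The abstract describes a preference rule that shifts the latency/energy trade-off with the battery status and a PSO-based search over the resulting weighted objective. The paper's exact formulation is not reproduced in this record, so the following Python sketch only illustrates the general idea: the `preference_weight` thresholds, the weight values, and the normalized cost terms are all assumptions made for illustration, not UPOA's actual rule.

```python
# Hypothetical sketch of a battery-aware user preference rule and the weighted
# latency/energy cost it could feed into a PSO-style offloading search.
# All thresholds, weights, and cost values below are illustrative assumptions,
# not the formulation used in the UPOA paper.

def preference_weight(battery_level: float) -> float:
    """Map the remaining battery fraction (0..1) to an energy-saving weight.

    The lower the battery, the more the user is assumed to tolerate extra
    latency in exchange for saved energy (low battery anxiety).
    """
    if battery_level >= 0.5:   # battery sufficient: favor low latency
        return 0.2
    if battery_level >= 0.2:   # battery getting low: balanced preference
        return 0.5
    return 0.8                 # very low battery: favor energy saving


def offloading_cost(norm_latency: float, norm_energy: float,
                    battery_level: float) -> float:
    """Weighted sum of normalized latency and energy for one candidate policy."""
    w_energy = preference_weight(battery_level)
    w_latency = 1.0 - w_energy
    return w_latency * norm_latency + w_energy * norm_energy


if __name__ == "__main__":
    # Example: with a nearly empty battery, an edge offload that saves energy
    # is preferred even though it incurs higher latency than local execution.
    local = offloading_cost(norm_latency=0.3, norm_energy=0.9, battery_level=0.15)
    edge = offloading_cost(norm_latency=0.6, norm_energy=0.2, battery_level=0.15)
    print("offload to edge" if edge < local else "run locally")
```

In a PSO-based online offloader of the kind the abstract sketches, a cost of this shape would presumably be evaluated for each candidate offloading decision vector in the swarm, with predicted task arrivals supplying the latency and energy estimates.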
Pages: 2188-2203
Page count: 16
Related Papers
50 records in total
  • [1] User Preference-Based Hierarchical Offloading for Collaborative Cloud-Edge Computing
    Tian, Shujuan
    Chang, Chi
    Long, Saiqin
    Oh, Sangyoon
    Li, Zhetao
    Long, Jun
    [J]. IEEE TRANSACTIONS ON SERVICES COMPUTING, 2023, 16 (01) : 684 - 697
  • [2] Latency-aware Scheduling in the Cloud-Edge Continuum
    Chiaro, Cristopher
    Monaco, Doriana
    Sacco, Alessio
    Casetti, Claudio
    Marchetto, Guido
    [J]. PROCEEDINGS OF 2024 IEEE/IFIP NETWORK OPERATIONS AND MANAGEMENT SYMPOSIUM, NOMS 2024, 2024,
  • [3] Primal-Dual-Based Computation Offloading Method for Energy-Aware Cloud-Edge Collaboration
    Su, Qian
    Zhang, Qinghui
    Li, Weidong
    Zhang, Xuejie
    [J]. IEEE TRANSACTIONS ON MOBILE COMPUTING, 2024, 23 (02) : 1534 - 1549
  • [4] iTaskOffloading: Intelligent Task Offloading for a Cloud-Edge Collaborative System
    Hao, Yixue
    Jiang, Yingying
    Chen, Tao
    Cao, Donggang
    Chen, Min
    [J]. IEEE NETWORK, 2019, 33 (05): : 82 - 88
  • [5] Energy-Efficient Offloading for DNN-Based Smart IoT Systems in Cloud-Edge Environments
    Chen, Xing
    Zhang, Jianshan
    Lin, Bing
    Chen, Zheyi
    Wolter, Katinka
    Min, Geyong
    [J]. IEEE TRANSACTIONS ON PARALLEL AND DISTRIBUTED SYSTEMS, 2022, 33 (03) : 683 - 697
  • [6] Latency-Aware Deployment of IoT Services in a Cloud-Edge Environment
    Zhang, Shouli
    Liu, Chen
    Wang, Jianwu
    Yang, Zhongguo
    Han, Yanbo
    Li, Xiaohong
    [J]. SERVICE-ORIENTED COMPUTING (ICSOC 2019), 2019, 11895 : 231 - 236
  • [7] Cloud-edge collaboration-based task offloading strategy in railway IoT for intelligent detection
    Guo, Qichang
    Xu, Zhanyue
    Yuan, Jiabin
    Wei, Yifei
    [J]. WIRELESS NETWORKS, 2024,
  • [8] Cost-minimized User Association and Partial Offloading for Dependent Tasks in Hybrid Cloud-edge Systems
    Yuan, Haitao
    Hu, Qinglong
    Wang, Meijia
    Bi, Jing
    Zhou, MengChu
    [J]. 2022 IEEE 18TH INTERNATIONAL CONFERENCE ON AUTOMATION SCIENCE AND ENGINEERING (CASE), 2022, : 1059 - 1064
  • [9] Low-latency partial resource offloading in cloud-edge elastic optical networks
    Chen, Bowen
    Liu, Ling
    Fan, Yuexuan
    Shao, Weidong
    Gao, Mingyi
    Chen, Hong
    Ju, Weiguo
    Ho, Pin-Han
    Jue, Jason P.
    Shen, Gangxiang
    [J]. JOURNAL OF OPTICAL COMMUNICATIONS AND NETWORKING, 2024, 16 (02) : 142 - 158
  • [10] Energy-Aware Cloud-Edge Collaborative Task Offloading with Adjustable Base Station Radii in Smart Cities
    Su, Qian
    Zhang, Qinghui
    Zhang, Xuejie
    [J]. MATHEMATICS, 2022, 10 (21)