Energy-saving Service Offloading for the Internet of Medical Things Using Deep Reinforcement Learning

Cited: 8
|
Authors
Jiang, Jielin [1 ,2 ]
Guo, Jiajie [3 ]
Khan, Maqbool [4 ,5 ]
Cui, Yan [6 ]
Lin, Wenmin [7 ]
Affiliations
[1] Nanjing Univ Informat Sci & Technol, Sch Comp & Software, Jiangsu Collaborat Innovat Ctr Atmospher Environm, Nanjing, Peoples R China
[2] Nanjing Univ, State Key Lab Novel Software Technol, Nanjing, Peoples R China
[3] Nanjing Univ Informat Sci & Technol, Sch Comp & Software, Nanjing, Peoples R China
[4] Software Competence Ctr Hagenberg GmbH, Softwarepk, Austria
[5] SPCAI Pak Austria Fachhochsch, Inst Appl Sci & Technol, Haripur, Pakistan
[6] Nanjing Normal Univ Special Educ, Coll Math & Informat Sci, Nanjing, Peoples R China
[7] Hangzhou Normal Univ, Inst VR & Intelligent Syst, Alibaba Business Sch, Hangzhou, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Service offloading; asynchronous advantage actor-critic; internet of medical things; deep reinforcement learning; ARTIFICIAL-INTELLIGENCE; RESOURCE-ALLOCATION; EDGE; CLOUD;
DOI
10.1145/3560265
Chinese Library Classification
TP [Automation Technology, Computer Technology];
Discipline Code
0812 ;
Abstract
As a critical branch of the Internet of Things (IoT) in the medical industry, the Internet of Medical Things (IoMT) significantly improves the quality of healthcare through real-time monitoring and low medical cost. Benefiting from edge and cloud computing, IoMT gains computing and storage resources near the terminal to meet the low-delay requirements of computation-intensive services. However, offloading services from health monitoring units (HMUs) to edge servers incurs additional energy consumption. Fortunately, artificial intelligence (AI), which has developed rapidly in recent years, has proved effective in resource allocation applications. Taking both energy consumption and delay into account, we propose ECAC, an energy-aware service offloading algorithm for an end-edge-cloud collaborative IoMT system based on Asynchronous Advantage Actor-Critic (A3C). Technically, ECAC exploits the structural similarity between the naturally distributed IoMT system and A3C, whose parameters are updated asynchronously. Moreover, owing to its delay-sensitivity mechanism and time-energy correction, ECAC adapts dynamically to diverse service types and system requirements. Finally, the effectiveness of ECAC for IoMT is demonstrated on real data.
Pages: 20
Related Papers
50 records
  • [1] Energy-saving service management technology of internet of things using edge computing and deep learning
    Li, Defeng
    Lan, Mingming
    Hu, Yuan
    COMPLEX & INTELLIGENT SYSTEMS, 2022, 8 (05) : 3867 - 3879
  • [2] Associative tasks computing offloading scheme in Internet of medical things with deep reinforcement learning
    Fan, Jiang
    Junwei, Qin
    Lei, Liu
    Hui, Tian
    CHINA COMMUNICATIONS, 2024, 21 (04) : 38 - 52
  • [3] Edge QoE: Computation Offloading With Deep Reinforcement Learning for Internet of Things
    Lu, Haodong
    He, Xiaoming
    Du, Miao
    Ruan, Xiukai
    Sun, Yanfei
    Wang, Kun
    IEEE INTERNET OF THINGS JOURNAL, 2020, 7 (10) : 9255 - 9265
  • [4] Energy-Saving Predictive Video Streaming with Deep Reinforcement Learning
    Liu, Dong
    Zhao, Jianyu
    Yang, Chenyang
    2019 IEEE GLOBAL COMMUNICATIONS CONFERENCE (GLOBECOM), 2019
  • [5] Energy Conservation for Internet of Things Tracking Applications Using Deep Reinforcement Learning
    Sultan, Salman Md
    Waleed, Muhammad
    Pyun, Jae-Young
    Um, Tai-Won
    SENSORS, 2021, 21 (09)
  • [6] Research on Energy-Saving Routing Technology Based on Deep Reinforcement Learning
    Zheng, Xiangyu
    Huang, Wanwei
    Wang, Sunan
    Zhang, Jianwei
    Zhang, Huanlong
    ELECTRONICS, 2022, 11 (13)
  • [7] Energy-Efficient Computation Offloading Based on Multiagent Deep Reinforcement Learning for Industrial Internet of Things Systems
    Chouikhi, Samira
    Esseghir, Moez
    Merghem-Boulahia, Leila
    IEEE INTERNET OF THINGS JOURNAL, 2024, 11 (07) : 12228 - 12239
  • [8] Deep Reinforcement Learning Based Computation Offloading in Fog Enabled Industrial Internet of Things
    Ren, Yijing
    Sun, Yaohua
    Peng, Mugen
    IEEE TRANSACTIONS ON INDUSTRIAL INFORMATICS, 2021, 17 (07) : 4978 - 4987