Intelligent Decision-Based Edge Server Sleep for Green Computing in MEC-Enabled IoV Networks

Cited by: 0
Authors
Hou, Peng [1 ]
Huang, Yi [1 ]
Zhu, Hongbin [2 ]
Lu, Zhihui [1 ]
Huang, Shin-Chia [3 ]
Chai, Hongfeng [2 ]
Affiliations
[1] Fudan Univ, Sch Comp Sci, Shanghai 200438, Peoples R China
[2] Fudan Univ, Inst Fintech, Shanghai 200438, Peoples R China
[3] Natl Taipei Univ Technol, Dept Elect Engn, Taipei 10608, Taiwan
Source
IEEE TRANSACTIONS ON INTELLIGENT VEHICLES
Keywords
Edge computing; intelligent vehicle; Internet of vehicles; reinforcement learning; server sleep; PLACEMENT
DOI
10.1109/TIV.2023.3347833
Chinese Library Classification (CLC)
TP18 [Theory of Artificial Intelligence]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
With the advent of the 6G era, addressing the growing demand for mobile computing has become crucial. In the next-generation Internet of Vehicles (IoV), widespread deployment of edge servers ensures low delay and high reliability for vehicular applications. However, densely deployed edge servers are provisioned for peak traffic, which leads to unnecessary resource waste, high maintenance costs, and excess energy consumption during off-peak hours. Enabling the sleep function of edge servers and dynamically controlling their operating mode is therefore crucial for reducing energy consumption and realizing green IoV. In this paper, we study the dynamic sleep problem of edge servers in the IoV network and propose traffic-aware intelligent sleep decision-making algorithms based on Deep Reinforcement Learning (DRL). We first propose a Centralized DRL-based Dynamic Sleep (CDDS) algorithm, which leverages the deep deterministic policy gradient algorithm to learn the optimal decision policy through interaction with the environment. To improve the stability of agent learning and mitigate the impact of environmental changes, we propose a baseline transformation strategy based on a greedy algorithm. Additionally, to overcome the limitations of CDDS, we combine federated learning with DRL and propose a Federated DRL-based Dynamic Sleep (FDDS) algorithm, which speeds up model training and improves the model's generalization ability. Furthermore, we conduct extensive experimental verification using real-world datasets. The results demonstrate that both CDDS and FDDS learn effective sleep control policies, reducing system cost by 11.06% to 45.36% and 8.71% to 43.77%, respectively, compared to the baselines.
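The record contains no code; as a rough illustration of the federated aggregation idea behind an FDDS-style scheme, the Python sketch below shows a FedAvg-style weighted average of sleep-policy parameters trained locally on each edge server. The function name, the numpy parameter dictionaries, and the weighting by local sample count are assumptions made for illustration, not the authors' implementation.

import numpy as np

def fedavg(local_params, sample_counts):
    """Weighted average of per-edge-server policy parameters (FedAvg-style).

    local_params  : list of dicts mapping layer name -> np.ndarray
    sample_counts : list of ints, local transitions each server trained on
    """
    total = float(sum(sample_counts))
    global_params = {}
    for name in local_params[0]:
        # Aggregate each layer, weighting servers by their share of the data.
        global_params[name] = sum(
            (n / total) * params[name]
            for params, n in zip(local_params, sample_counts)
        )
    return global_params

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Two hypothetical edge servers, each holding a tiny actor network.
    local_models = [
        {"w": rng.normal(size=(4, 2)), "b": rng.normal(size=2)}
        for _ in range(2)
    ]
    aggregated = fedavg(local_models, sample_counts=[300, 700])
    print({name: p.shape for name, p in aggregated.items()})

In a full FDDS-style loop, the aggregated parameters would be broadcast back to the edge servers for the next round of local DRL training; the paper's actual aggregation rule and training schedule may differ from this sketch.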
Pages: 3687-3703
Page count: 17
Related Papers
50 records in total
  • [21] A Hybrid DRL-Based Adaptive Traffic Matching Strategy for Transmitting and Computing in MEC-Enabled IIoT
    Chen, Guang
    Chen, Yueyun
    Du, Jiadong
    Du, Liping
    Mai, Zhiyuan
    Hao, Conghui
    [J]. IEEE COMMUNICATIONS LETTERS, 2024, 28 (01) : 238 - 242
  • [22] Joint offloading strategy based on quantum particle swarm optimization for MEC-enabled vehicular networks
    Shu, Wanneng
    Li, Yan
    [J]. DIGITAL COMMUNICATIONS AND NETWORKS, 2023, 9 (01) : 56 - 66
  • [24] An improved Henry gas optimization algorithm for joint mining decision and resource allocation in a MEC-enabled blockchain networks
    Hussien, Reda M.
    Abohany, Amr A.
    Moustafa, Nour
    Sallam, Karam M.
    [J]. NEURAL COMPUTING & APPLICATIONS, 2023, 35 (25) : 18665 - 18680
  • [25] Learning-Based Sensing and Computing Decision for Data Freshness in Edge Computing-Enabled Networks
    Yun, Sinwoong
    Kim, Dongsun
    Park, Chanwon
    Lee, Jemin
    [J]. IEEE TRANSACTIONS ON WIRELESS COMMUNICATIONS, 2024, 23 (09) : 11386 - 11400
  • [26] Quality-of-Experience-Aware Computation Offloading in MEC-Enabled Blockchain-Based IoT Networks
    Hosseinpour, Mahsa
    Moghaddam, Mohammad Hossein Yaghmaee
    [J]. IEEE INTERNET OF THINGS JOURNAL, 2024, 11 (08) : 14483 - 14493
  • [27] Learning-Aided Dynamic Access Control in MEC-Enabled Green IoT Networks: A Convolutional Reinforcement Learning Approach
    Xu, Lijuan
    Qin, Meng
    Yang, Qinghai
    Kwak, Kyung-Sup
    [J]. IEEE TRANSACTIONS ON VEHICULAR TECHNOLOGY, 2022, 71 (02) : 2098 - 2109
  • [28] Deep Reinforcement Learning-based Task Offloading and Resource Allocation in MEC-enabled Wireless Networks
    Engidayehu, Seble Birhanu
    Mahboob, Tahira
    Chung, Min Young
    [J]. 2022 27TH ASIA PACIFIC CONFERENCE ON COMMUNICATIONS (APCC 2022): CREATING INNOVATIVE COMMUNICATION TECHNOLOGIES FOR POST-PANDEMIC ERA, 2022 : 226 - 230
  • [29] Resource Trading in Edge Computing-Enabled IoV: An Efficient Futures-Based Approach
    Liwang, Minghui
    Chen, Ruitao
    Wang, Xianbin
    [J]. IEEE TRANSACTIONS ON SERVICES COMPUTING, 2022, 15 (05) : 2994 - 3007
  • [30] Intelligent Resource Allocation in UAV-Enabled Mobile Edge Computing Networks
    Wang, Meng
    Shi, Shuo
    Gu, Shushi
    Zhang, Ning
    Gu, Xuemai
    [J]. 2020 IEEE 92ND VEHICULAR TECHNOLOGY CONFERENCE (VTC2020-FALL), 2020