Resilient Load Restoration in Microgrids Considering Mobile Energy Storage Fleets: A Deep Reinforcement Learning Approach

Cited by: 11
Authors
Yao, Shuhan [1 ]
Gu, Jiuxiang [1 ]
Zhang, Huajun [1 ]
Wang, Peng [2 ]
Liu, Xiaochuan [2 ]
Zhao, Tianyang [3 ]
Affiliations
[1] Nanyang Technol Univ, Interdisciplinary Grad Sch, Singapore, Singapore
[2] Nanyang Technol Univ, Sch Elect & Elect Engn, Singapore, Singapore
[3] Nanyang Technol Univ, Energy Res Inst NTU, Singapore, Singapore
Keywords
Microgrid; mobile energy storage; fleet management; deep reinforcement learning; scheduling; resilience; EXTREME;
DOI
10.1109/pesgm41954.2020.9282132
CLC Classification
TE [Petroleum and Natural Gas Industry]; TK [Energy and Power Engineering];
Discipline Classification
0807; 0820;
Abstract
Mobile energy storage systems (MESSs) provide mobility and flexibility that can enhance distribution system resilience. This paper proposes a Markov decision process (MDP) formulation for an integrated service restoration strategy that coordinates the scheduling of MESSs with the resource dispatch of microgrids, while accounting for uncertainty in load consumption. A deep reinforcement learning (DRL) algorithm is used to solve the MDP for optimal scheduling. Specifically, the twin delayed deep deterministic policy gradient (TD3) algorithm is applied to train the deep Q-network and policy network; the trained policy can then be deployed online to perform multiple actions simultaneously. The proposed model is demonstrated on an integrated test system with three microgrids connected by the Sioux Falls transportation network. Simulation results indicate that mobile and stationary energy resources can be coordinated effectively to improve system resilience.
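The abstract names TD3 as the DRL algorithm used to train the critic (Q-network) and actor (policy network). The sketch below is a minimal, generic TD3 update loop in PyTorch, illustrating the three ingredients the method is known for: twin critics with a clipped double-Q target, target policy smoothing, and delayed actor/target updates. All class names, network sizes, hyperparameters, and the [-1, 1] action scaling are assumptions for illustration only; this is not the authors' MESS/microgrid implementation.

```python
# Minimal TD3 sketch (illustrative assumptions throughout; not the paper's code).
import copy
import torch
import torch.nn as nn


class Actor(nn.Module):
    """Deterministic policy network: state -> action in [-1, 1]^act_dim."""
    def __init__(self, obs_dim, act_dim):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(obs_dim, 256), nn.ReLU(),
            nn.Linear(256, 256), nn.ReLU(),
            nn.Linear(256, act_dim), nn.Tanh())

    def forward(self, obs):
        return self.net(obs)


class Critic(nn.Module):
    """Q-network: (state, action) -> scalar value estimate."""
    def __init__(self, obs_dim, act_dim):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(obs_dim + act_dim, 256), nn.ReLU(),
            nn.Linear(256, 256), nn.ReLU(),
            nn.Linear(256, 1))

    def forward(self, obs, act):
        return self.net(torch.cat([obs, act], dim=-1))


class TD3:
    def __init__(self, obs_dim, act_dim, gamma=0.99, tau=0.005,
                 policy_noise=0.2, noise_clip=0.5, policy_delay=2):
        self.actor = Actor(obs_dim, act_dim)
        self.critic1 = Critic(obs_dim, act_dim)
        self.critic2 = Critic(obs_dim, act_dim)  # twin critic to curb overestimation
        self.actor_targ = copy.deepcopy(self.actor)
        self.critic1_targ = copy.deepcopy(self.critic1)
        self.critic2_targ = copy.deepcopy(self.critic2)
        self.actor_opt = torch.optim.Adam(self.actor.parameters(), lr=3e-4)
        self.critic_opt = torch.optim.Adam(
            list(self.critic1.parameters()) + list(self.critic2.parameters()), lr=3e-4)
        self.gamma, self.tau = gamma, tau
        self.policy_noise, self.noise_clip = policy_noise, noise_clip
        self.policy_delay = policy_delay
        self.step = 0

    def update(self, obs, act, rew, next_obs, done):
        """One gradient step from a sampled batch (rew and done shaped [batch, 1])."""
        with torch.no_grad():
            # Target policy smoothing: add clipped noise to the target action.
            noise = (torch.randn_like(act) * self.policy_noise).clamp(
                -self.noise_clip, self.noise_clip)
            next_act = (self.actor_targ(next_obs) + noise).clamp(-1.0, 1.0)
            # Clipped double-Q target: take the minimum of the twin target critics.
            target_q = torch.min(self.critic1_targ(next_obs, next_act),
                                 self.critic2_targ(next_obs, next_act))
            target = rew + self.gamma * (1.0 - done) * target_q

        critic_loss = ((self.critic1(obs, act) - target) ** 2).mean() + \
                      ((self.critic2(obs, act) - target) ** 2).mean()
        self.critic_opt.zero_grad()
        critic_loss.backward()
        self.critic_opt.step()

        # Delayed policy and target-network updates (every policy_delay critic steps).
        self.step += 1
        if self.step % self.policy_delay == 0:
            actor_loss = -self.critic1(obs, self.actor(obs)).mean()
            self.actor_opt.zero_grad()
            actor_loss.backward()
            self.actor_opt.step()
            # Polyak averaging of target networks.
            for targ, src in ((self.actor_targ, self.actor),
                              (self.critic1_targ, self.critic1),
                              (self.critic2_targ, self.critic2)):
                for p_t, p in zip(targ.parameters(), src.parameters()):
                    p_t.data.mul_(1 - self.tau).add_(self.tau * p.data)
```

In the paper's setting, the state would presumably encode MESS locations and energy levels plus microgrid conditions, and the action vector the simultaneous routing/charging and dispatch decisions, but those environment details are not specified in the abstract.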
Pages: 5
Related Papers
50 records in total
  • [41] Deep Reinforcement Learning for Task Allocation in Energy Harvesting Mobile Crowdsensing
    Dongare, Sumedh
    Ortiz, Andrea
    Klein, Anja
    [J]. 2022 IEEE GLOBAL COMMUNICATIONS CONFERENCE (GLOBECOM 2022), 2022, : 269 - 274
  • [42] Load Forecasting-Based Learning System for Energy Management With Battery Degradation Estimation: A Deep Reinforcement Learning Approach
    Zhang, Hongtao
    Zhang, Guanglin
    Zhao, Mingbo
    Liu, Yuping
    [J]. IEEE TRANSACTIONS ON CONSUMER ELECTRONICS, 2024, 70 (01) : 2342 - 2352
  • [43] Energy-Aware Multi-Server Mobile Edge Computing: A Deep Reinforcement Learning Approach
    Naderializadeh, Navid
    Hashemi, Morteza
    [J]. CONFERENCE RECORD OF THE 2019 FIFTY-THIRD ASILOMAR CONFERENCE ON SIGNALS, SYSTEMS & COMPUTERS, 2019, : 383 - 387
  • [44] Deep Reinforcement Learning for Resilient Power and Energy Systems: Progress, Prospects, and Future Avenues
    Gautam, Mukesh
    [J]. ELECTRICITY, 2023, 4 (04): : 336 - 380
  • [45] Service migration in mobile edge computing: A deep reinforcement learning approach
    Wang, Hongman
    Li, Yingxue
    Zhou, Ao
    Guo, Yan
    Wang, Shangguang
    [J]. INTERNATIONAL JOURNAL OF COMMUNICATION SYSTEMS, 2023, 36 (01)
  • [46] Mobile parking incentives for vehicular networks: a deep reinforcement learning approach
    Meiyi Yang
    Nianbo Liu
    Lin Zuo
    Haigang Gong
    Minghui Liu
    Ming Liu
    [J]. CCF Transactions on Pervasive Computing and Interaction, 2020, 2 : 261 - 274
  • [47] Mobile parking incentives for vehicular networks: a deep reinforcement learning approach
    Yang, Meiyi
    Liu, Nianbo
    Zuo, Lin
    Gong, Haigang
    Liu, Minghui
    Liu, Ming
    [J]. CCF TRANSACTIONS ON PERVASIVE COMPUTING AND INTERACTION, 2020, 2 (04) : 261 - 274
  • [48] User Allocation in Mobile Edge Computing: A Deep Reinforcement Learning Approach
    Panda, Subrat Prasad
    Banerjee, Ansuman
    Bhattacharya, Arani
    [J]. 2021 IEEE INTERNATIONAL CONFERENCE ON WEB SERVICES, ICWS 2021, 2021, : 447 - 458
  • [49] Joint Offloading and Streaming in Mobile Edges: A Deep Reinforcement Learning Approach
    Park, Soohyun
    Kim, Junhui
    Kwon, Dohyun
    Shin, MyungJae
    Kim, Joongheon
    [J]. 2019 IEEE VTS ASIA PACIFIC WIRELESS COMMUNICATIONS SYMPOSIUM (APWCS 2019), 2019,
  • [50] Interpretable Deep Reinforcement Learning for Optimizing Heterogeneous Energy Storage Systems
    Xiong, Luolin
    Tang, Yang
    Liu, Chensheng
    Mao, Shuai
    Meng, Ke
    Dong, Zhaoyang
    Qian, Feng
    [J]. IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS I-REGULAR PAPERS, 2024, 71 (02) : 910 - 921