Emergency load shedding strategy for high renewable energy penetrated power systems based on deep reinforcement learning

Cited: 6
Authors
Chen, Hongwei [1 ]
Zhuang, Junzhi [1 ]
Zhou, Gang [1 ]
Wang, Yuwei [1 ]
Sun, Zhenglong [1 ]
Levron, Yoash [2 ]
Affiliations
[1] Northeast Elect Power Univ, Minist Educ, Key Lab Modern Power Syst Simulat & Control & Ren, Jilin 132012, Jilin, Peoples R China
[2] Technion Israel Inst Technol, Andrew & Erna Viterbi Fac Elect Engn, IL-3200003 Haifa, Israel
Keywords
Emergency load shedding; Mismatch scenario; Deep reinforcement learning; Design of decision space
DOI
10.1016/j.egyr.2023.03.027
CLC number
TE [Petroleum and natural gas industry]; TK [Energy and power engineering]
Subject classification codes
0807; 0820
Abstract
Traditional event-driven emergency load shedding determines the quantitative strategy by simulating a specific set of anticipated faults, which requires high model accuracy and a matching operation mode. However, due to the model complexity of renewable power generators and their fluctuating output, the traditional event-driven load shedding strategy faces the risk of mismatch in power systems with high renewable energy penetration. To address these challenges, this paper proposes an emergency load shedding method based on data-driven strategies and deep reinforcement learning (RL). Firstly, the reasons why the event-driven load shedding strategy may mismatch in renewable power systems are analyzed, and a typical mismatch scenario is constructed. Then, the emergency load shedding problem is formulated as a Markov Decision Process (MDP), and its action space, state space, and reward function are designed. On this basis, an emergency control strategy platform based on the Gym framework is established for applying deep reinforcement learning to power system emergency control. To enhance the adaptability and efficiency of the RL agent in multi-fault scenarios, Proximal Policy Optimization (PPO) is adopted to solve the constructed MDP. Finally, the proposed reinforcement-learning-based emergency load shedding strategy is trained and verified on a modified IEEE 39-bus system. The results show that the proposed strategy can effectively produce correct load shedding decisions to restore the system frequency in the event-driven load shedding mismatch scenario, and that it adapts well to different faults and operating scenarios. (c) 2023 The Author(s). Published by Elsevier Ltd. This is an open access article under the CC BY license (http://creativecommons.org/licenses/by/4.0/).
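To make concrete how the MDP ingredients named in the abstract (state space, action space, reward function, a Gym-based environment, and PPO) fit together, the following is a minimal sketch, not the authors' platform: it wraps a deliberately simplified single-area frequency model in a classic Gym environment. The bus count, 5% shedding blocks, inertia constant, reward weights, and termination thresholds are all illustrative assumptions, whereas the paper's environment is driven by a detailed IEEE 39-bus simulation.

```python
# Illustrative sketch only: the state/action/reward designs and the toy
# swing-equation dynamics below are assumptions, not the paper's model.
import numpy as np
import gym
from gym import spaces


class LoadSheddingEnv(gym.Env):
    """Toy emergency load-shedding environment (hypothetical design)."""

    F_NOM = 50.0   # nominal frequency in Hz
    H = 5.0        # aggregated inertia constant in seconds (assumed)

    def __init__(self, n_buses=5, dt=0.1, horizon=100):
        super().__init__()
        self.n = n_buses
        self.dt = dt
        self.horizon = horizon
        # Action: for each sheddable bus, 0 = do nothing, 1 = shed a 5% block of its load.
        self.action_space = spaces.MultiDiscrete([2] * self.n)
        # State: frequency deviation, rate of change of frequency, remaining load per bus.
        self.observation_space = spaces.Box(-np.inf, np.inf,
                                            shape=(2 + self.n,), dtype=np.float32)
        self.reset()

    def reset(self):
        self.t = 0
        self.df = -0.4                  # frequency deviation (Hz) right after the disturbance
        self.load = np.ones(self.n)     # remaining load fraction at each sheddable bus
        self.deficit = 0.15             # generation deficit as a fraction of total system load
        return self._obs()

    def _rocof(self):
        # Aggregated swing-equation proxy: df/dt = -deficit / (2H) * f_nom
        return -self.deficit / (2.0 * self.H) * self.F_NOM

    def _obs(self):
        return np.concatenate(([self.df, self._rocof()], self.load)).astype(np.float32)

    def step(self, action):
        # Each selected bus sheds a 5% block, limited by its remaining load.
        shed = np.minimum(0.05 * np.asarray(action, dtype=float), self.load)
        self.load -= shed
        self.deficit -= shed.sum() / self.n   # each bus carries 1/n of the system load
        self.df += self.dt * self._rocof()
        self.t += 1
        # Reward: penalize frequency deviation and the amount of load curtailed.
        reward = -abs(self.df) - 10.0 * shed.sum()
        done = self.t >= self.horizon or self.df < -2.5
        return self._obs(), float(reward), done, {}
```

A hypothetical training call could then attach any PPO implementation to this environment, for example the third-party stable-baselines3 package (recent releases expect a gymnasium-style environment, so a compatibility wrapper may be needed):

```python
from stable_baselines3 import PPO

# Hyperparameters left at library defaults purely for illustration.
model = PPO("MlpPolicy", LoadSheddingEnv(), verbose=1)
model.learn(total_timesteps=200_000)
```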
Pages: 434-443
Number of pages: 10
Related papers
  • [1] Li, Jian; Chen, Sheng; Wang, Xinying; Pu, Tianjiao. Load Shedding Control Strategy in Power Grid Emergency State Based on Deep Reinforcement Learning. CSEE Journal of Power and Energy Systems, 2022, 8(4): 1175-1182.
  • [2] Vu, Thanh Long; Mukherjee, Sayak; Yin, Tim; Huang, Renke; Tan, Jie; Huang, Qiuhua. Safe Reinforcement Learning for Emergency Load Shedding of Power Systems. 2021 IEEE Power & Energy Society General Meeting (PESGM), 2021.
  • [3] Pei, Yangzhou; Yang, Jun; Wang, Jundong; Xu, Peidong; Zhou, Ting; Wu, Fuzhang. An emergency control strategy for undervoltage load shedding of power system: A graph deep reinforcement learning method. IET Generation, Transmission & Distribution, 2023, 17(9): 2130-2141.
  • [4] Zhang, Haotian; Sun, Xinfeng; Lee, Myoung Hoon; Moon, Jun. Deep Reinforcement Learning-Based Active Network Management and Emergency Load-Shedding Control for Power Systems. IEEE Transactions on Smart Grid, 2024, 15(2): 1423-1437.
  • [5] Guo, Lei; Guo, Jun; Zhang, Yong; Guo, Wanshu; Zhou, Xinsheng; Wang, Lipeng. Deep Reinforcement Learning Generation Scheduling Scheme for High-Renewable Penetrated Power System. 2022 IEEE/IAS Industrial and Commercial Power System Asia (I&CPS Asia 2022), 2022: 682-686.
  • [6] Ruan, Yingjun; Liang, Zhenyu; Qian, Fanyue; Meng, Hua; Gao, Yuan. Operation strategy optimization of combined cooling, heating, and power systems with energy storage and renewable energy based on deep reinforcement learning. Journal of Building Engineering, 2023, 65.
  • [7] Koley, Indrajit; Datta, Asim; Panda, Goutam Kumar. Load Frequency Control in Renewable Energy Penetrated Hybrid Power Systems. Journal of New Materials for Electrochemical Systems, 2023, 26(4): 268-276.
  • [8] Bai, Yuyang; Chen, Siyuan; Zhang, Jun; Xu, Jian; Gao, Tianlu; Wang, Xiaohui; Gao, David Wenzhong. An adaptive active power rolling dispatch strategy for high proportion of renewable energy based on distributed deep reinforcement learning. Applied Energy, 2023, 330.