Deep reinforcement learning for energy management in a microgrid with flexible demand

Cited by: 113
Authors
Nakabi, Taha Abdelhalim [1 ]
Toivanen, Pekka [1 ]
Affiliations
[1] Univ Eastern Finland, Sch Comp, Kuopio Campus,POB 1627, Kuopio 70211, Finland
Source
Keywords
Artificial intelligence; Deep reinforcement learning; Demand response; Dynamic pricing; Energy management system; Microgrid; Neural networks; Price-responsive loads; Smart grid; Thermostatically controlled loads; Model-predictive control; Load; Storage; Coordination; Algorithms; Generation; Operation
DOI
10.1016/j.segan.2020.100413
Chinese Library Classification
TE [Petroleum and natural gas industries]; TK [Energy and power engineering]
Subject Classification Codes
0807; 0820
Abstract
In this paper, we study the performance of various deep reinforcement learning algorithms in enhancing the energy management system of a microgrid. We propose a novel microgrid model that consists of a wind turbine generator, an energy storage system, a set of thermostatically controlled loads, a set of price-responsive loads, and a connection to the main grid. The proposed energy management system is designed to coordinate among the different flexible sources by defining the priority resources, direct demand control signals, and electricity prices. Seven deep reinforcement learning algorithms were implemented and empirically compared in this paper. The numerical results show that the deep reinforcement learning algorithms differ widely in their ability to converge to optimal policies. By adding an experience replay and a semi-deterministic training phase to the well-known asynchronous advantage actor-critic algorithm, we achieved the highest model performance as well as convergence to near-optimal policies. © 2020 Elsevier Ltd. All rights reserved.
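As an illustration of the setting the abstract describes, the following is a minimal sketch of a toy microgrid environment (battery storage, stochastic wind, a grid connection, time-of-use prices) together with the kind of experience-replay buffer the paper adds to A3C. All names, dynamics, and parameter values below are simplified assumptions made for illustration only; they are not the authors' model.

```python
import random
from collections import deque

class MicrogridEnv:
    """Toy microgrid: battery storage, stochastic wind, grid import.
    Dynamics and numbers are illustrative assumptions, not the paper's model."""
    def __init__(self, capacity=10.0, seed=0):
        self.capacity = capacity          # battery capacity (kWh)
        self.rng = random.Random(seed)
        self.reset()

    def reset(self):
        self.soc = self.capacity / 2      # battery state of charge (kWh)
        self.t = 0                        # hour of the episode
        return (self.soc, self.t)

    def step(self, action):
        """action: battery charge (+) or discharge (-) power in kW over one hour."""
        wind = self.rng.uniform(0.0, 3.0)               # stochastic wind output (kW)
        demand = 2.0 + self.rng.uniform(-0.5, 0.5)      # baseline load (kW)
        self.soc = min(max(self.soc + action, 0.0), self.capacity)
        grid_import = max(demand + max(action, 0.0) - wind, 0.0)
        price = 0.1 + 0.05 * (self.t % 24 > 17)         # simple evening price peak
        reward = -price * grid_import                   # objective: minimize energy cost
        self.t += 1
        done = self.t >= 24                             # one-day episode
        return (self.soc, self.t), reward, done

class ReplayBuffer:
    """Uniform experience replay, as added to the actor-critic training loop."""
    def __init__(self, maxlen=10_000):
        self.buf = deque(maxlen=maxlen)

    def push(self, transition):
        self.buf.append(transition)       # transition = (state, action, reward, next_state)

    def sample(self, batch_size):
        return random.sample(self.buf, min(batch_size, len(self.buf)))
```

A random-policy rollout fills the buffer with 24 hourly transitions, which an off-policy learner could then sample in mini-batches; the paper's contribution is combining such a buffer (plus a semi-deterministic training phase) with A3C rather than the toy dynamics shown here.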
Pages: 16
Related Papers (50 total)
  • [1] Deep Reinforcement Learning Microgrid Optimization Strategy Considering Priority Flexible Demand Side
    Sang, Jinsong
    Sun, Hongbin
    Kou, Lei
    [J]. SENSORS, 2022, 22 (06)
  • [2] Reinforcement learning for microgrid energy management
    Kuznetsova, Elizaveta
    Li, Yan-Fu
    Ruiz, Carlos
    Zio, Enrico
    Ault, Graham
    Bell, Keith
    [J]. ENERGY, 2013, 59 : 133 - 146
  • [3] Online Microgrid Energy Management Based on Safe Deep Reinforcement Learning
    Li, Hepeng
    Wang, Zhenhua
    Li, Lusi
    He, Haibo
    [J]. 2021 IEEE SYMPOSIUM SERIES ON COMPUTATIONAL INTELLIGENCE (IEEE SSCI 2021), 2021
  • [4] Energy Management System by Deep Reinforcement Learning Approach in a Building Microgrid
    Dini, Mohsen
    Ossart, Florence
    [J]. ELECTRIMACS 2022, VOL 2, 2024, 1164 : 257 - 269
  • [5] Energy Optimization Management of Multi-microgrid using Deep Reinforcement Learning
    Zhang, Tingjun
    Yue, Dong
    Zhao, Nan
    [J]. 2020 CHINESE AUTOMATION CONGRESS (CAC 2020), 2020, : 4049 - 4053
  • [6] Real-Time Energy Management of a Microgrid Using Deep Reinforcement Learning
    Ji, Ying
    Wang, Jianhui
    Xu, Jiacan
    Fang, Xiaoke
    Zhang, Huaguang
    [J]. ENERGIES, 2019, 12 (12)
  • [7] Microgrid energy management using deep Q-network reinforcement learning
    Alabdullah, Mohammed H.
    Abido, Mohammad A.
    [J]. ALEXANDRIA ENGINEERING JOURNAL, 2022, 61 (11) : 9069 - 9078
  • [8] Novel Architecture of Energy Management Systems Based on Deep Reinforcement Learning in Microgrid
    Lee, Seongwoo
    Seon, Joonho
    Sun, Young Ghyu
    Kim, Soo Hyun
    Kyeong, Chanuk
    Kim, Dong In
    Kim, Jin Young
    [J]. IEEE TRANSACTIONS ON SMART GRID, 2024, 15 (02) : 1646 - 1658
  • [9] Multiagent Bayesian Deep Reinforcement Learning for Microgrid Energy Management Under Communication Failures
    Zhou, Hao
    Aral, Atakan
    Brandic, Ivona
    Erol-Kantarci, Melike
    [J]. IEEE INTERNET OF THINGS JOURNAL, 2021, 9 (14): 11685 - 11698
  • [10] Reward Mechanism Design for Deep Reinforcement Learning-Based Microgrid Energy Management
    Hu, Mingjie
    Han, Baohui
    Lv, Shilin
    Bao, Zhejing
    Lu, Lingxia
    Yu, Miao
    [J]. 2023 6TH INTERNATIONAL CONFERENCE ON RENEWABLE ENERGY AND POWER ENGINEERING, REPE 2023, 2023, : 201 - 205