Battery Scheduling Control of a Microgrid Trading with Utility Grid Using Deep Reinforcement Learning

Cited: 2
Authors
Mohamed, Mahmoud [1]
Tsuji, Takao [1]
Affiliations
[1] Yokohama Natl Univ, Grad Sch Engn Sci, Div Phys Elect & Comp Engn, 79-1 Tokiwadai, Hodogaya Ku, Yokohama, Kanagawa 240-8501, Japan
Keywords
deep reinforcement learning; battery energy storage systems; energy trading; microgrid; solar power
DOI
10.1002/tee.23768
Chinese Library Classification
TM [Electrical Engineering]; TN [Electronic Technology, Communication Technology]
Discipline Classification Code
0808; 0809
Abstract
Managing microgrids (MGs) with variable renewable energy (VRE) is challenging because of uncertainties in electricity production, load, and energy prices, so flexible control strategies for the battery energy storage system (BESS) are needed to handle these challenges. Model-based approaches require a precise model of the MG to give accurate results, but obtaining such a model can be difficult in continually changing environments. We introduced a new day-ahead optimization method that controls BESS scheduling and power exchange between the utility grid and an MG comprising a load, photovoltaics, and a BESS, with the aim of minimizing energy cost. Deep reinforcement learning (DRL) was used to optimize the sequential actions of the BESS over the scheduling horizon. A theoretical optimum schedule was derived with a linear programming optimization for comparison with the DRL agent, and both no-battery and greedy control algorithms were used as baselines. Numerical simulations using a whole year's data showed that the proposed technique outperformed these baselines.
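The abstract refers to a linear programming (LP) "theoretical optimum" used as a benchmark for the DRL agent. As a rough, hedged illustration of what such a day-ahead formulation can look like, the sketch below poses BESS charge/discharge and grid import/export scheduling as an LP solved with scipy.optimize.linprog. All profiles, tariffs, battery parameters, and variable names are illustrative assumptions, not values or notation from the paper.

```python
# Minimal sketch of a perfect-foresight day-ahead BESS schedule as a linear
# program, analogous in spirit to the LP benchmark mentioned in the abstract.
# Every numeric value below is an illustrative assumption.
import numpy as np
from scipy.optimize import linprog

T = 24                       # hourly day-ahead horizon
rng = np.random.default_rng(0)

# Assumed forecasts and tariffs (illustrative only)
pv     = np.clip(np.sin(np.linspace(0, np.pi, T)) * 4.0, 0, None)        # kW
load   = 2.0 + rng.uniform(0.0, 1.5, T)                                   # kW
p_buy  = np.where((np.arange(T) >= 8) & (np.arange(T) < 20), 0.30, 0.15)  # $/kWh
p_sell = 0.10 * np.ones(T)                                                 # $/kWh

# Assumed battery parameters
soc0 = 5.0                   # initial state of charge, kWh
soc_min, soc_max = 1.0, 9.0  # usable SoC window, kWh
c_max = d_max = 3.0          # charge/discharge power limits, kW
eta_c = eta_d = 0.95         # charge/discharge efficiencies

# Decision vector x = [charge(T), discharge(T), buy(T), sell(T), soc(T)]
n = 5 * T
def block(i):                # column slice of the i-th variable group
    return slice(i * T, (i + 1) * T)

cost = np.zeros(n)
cost[block(2)] = p_buy       # pay for grid imports
cost[block(3)] = -p_sell     # revenue from grid exports

A_eq = np.zeros((2 * T, n))
b_eq = np.zeros(2 * T)

# Power balance at each hour: buy - sell + discharge - charge = load - pv
for t in range(T):
    A_eq[t, block(2).start + t] = 1.0
    A_eq[t, block(3).start + t] = -1.0
    A_eq[t, block(1).start + t] = 1.0
    A_eq[t, block(0).start + t] = -1.0
    b_eq[t] = load[t] - pv[t]

# SoC dynamics: soc_t - soc_{t-1} = eta_c * charge_t - discharge_t / eta_d
for t in range(T):
    r = T + t
    A_eq[r, block(4).start + t] = 1.0
    A_eq[r, block(0).start + t] = -eta_c
    A_eq[r, block(1).start + t] = 1.0 / eta_d
    if t == 0:
        b_eq[r] = soc0
    else:
        A_eq[r, block(4).start + t - 1] = -1.0

bounds = ([(0, c_max)] * T + [(0, d_max)] * T +
          [(0, None)] * T + [(0, None)] * T +
          [(soc_min, soc_max)] * T)

res = linprog(cost, A_eq=A_eq, b_eq=b_eq, bounds=bounds, method="highs")
print("optimal day-ahead energy cost:", round(res.fun, 2))
```

This LP only illustrates the perfect-foresight benchmark side of the comparison; in the paper the DRL agent is the component that must act sequentially under uncertainty, and its learned schedule is evaluated against this kind of optimum and against the no-battery and greedy baselines.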
Pages: 665 - 677
Page count: 13