Battery Scheduling Control of a Microgrid Trading with Utility Grid Using Deep Reinforcement Learning

Cited by: 2
Authors
Mohamed, Mahmoud [1 ]
Tsuji, Takao [1 ]
Affiliations
[1] Yokohama Natl Univ, Grad Sch Engn Sci, Div Phys Elect & Comp Engn, 79-1 Tokiwadai, Hodogaya Ku, Yokohama, Kanagawa 240-8501, Japan
Keywords
deep reinforcement learning; battery energy storage systems; energy trading; microgrid; solar power; ENERGY; MANAGEMENT;
DOI
10.1002/tee.23768
CLC classification number
TM [Electrical Engineering]; TN [Electronics and Communication Technology];
Discipline classification code
0808; 0809
Abstract
Managing microgrids (MGs) with variable renewable energy (VRE) is challenging because of uncertainty in electricity production, load, and energy prices, so flexible control strategies for the battery energy storage system (BESS) are needed to handle these challenges. Model-based approaches require a precise model of the MG to give accurate results, but obtaining such a model is difficult in a continually changing environment. We introduce a new day-ahead optimization method that controls BESS scheduling and the power exchanged between the utility grid and an MG comprising a load, photovoltaics, and a BESS, with the aim of minimizing energy cost. Deep reinforcement learning (DRL) is used to optimize the sequential actions of the BESS over the scheduling horizon. A theoretically optimal schedule, derived with linear programming, serves as a benchmark for the DRL agent, while no-battery and greedy control algorithms serve as baselines. Numerical simulations using a full year of data show that the proposed technique outperforms both baselines.
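The day-ahead cost-minimization problem described in the abstract can be written as a small linear program. The sketch below is illustrative only, not the authors' implementation: it assumes an hourly horizon, a simple charging-efficiency battery model, and made-up PV, load, and tariff profiles, and it uses CVXPY as the LP modeling tool (an assumption; the paper only states that linear programming was used) to compute the kind of theoretical-optimum benchmark the DRL agent is compared against.

```python
# Minimal sketch of a day-ahead LP benchmark for BESS scheduling and grid trading.
# All parameter names, profiles, and values are illustrative assumptions,
# not taken from the paper.
import numpy as np
import cvxpy as cp

T, dt = 24, 1.0                                   # hourly day-ahead horizon [h]

# Illustrative input profiles (kW, $/kWh); the paper uses measured data instead.
rng = np.random.default_rng(0)
pv = np.clip(5.0 * np.sin(np.linspace(0, np.pi, T)), 0, None)
load = 3.0 + 1.5 * rng.random(T)
price_buy = 0.20 + 0.10 * (np.arange(T) >= 17)    # evening-peak purchase tariff
price_sell = 0.08 * np.ones(T)                    # feed-in tariff

# Assumed BESS parameters: capacity [kWh], power limit [kW], efficiency
cap, p_max, eta, soc0 = 10.0, 3.0, 0.95, 5.0

# Decision variables
p_ch = cp.Variable(T, nonneg=True)    # battery charging power
p_dis = cp.Variable(T, nonneg=True)   # battery discharging power
g_buy = cp.Variable(T, nonneg=True)   # power imported from the utility grid
g_sell = cp.Variable(T, nonneg=True)  # power exported to the utility grid
soc = cp.Variable(T + 1)              # stored energy

constraints = [soc[0] == soc0, soc >= 0, soc <= cap,
               p_ch <= p_max, p_dis <= p_max]
for t in range(T):
    # Battery energy balance with charging/discharging losses
    constraints += [soc[t + 1] == soc[t] + eta * p_ch[t] * dt - p_dis[t] * dt / eta]
    # Power balance at the microgrid bus
    constraints += [pv[t] + p_dis[t] + g_buy[t] == load[t] + p_ch[t] + g_sell[t]]

# Energy cost of trading with the utility grid over the horizon
cost = cp.sum(cp.multiply(price_buy, g_buy) - cp.multiply(price_sell, g_sell)) * dt
prob = cp.Problem(cp.Minimize(cost), constraints)
prob.solve()
print(f"Optimal day-ahead trading cost: {prob.value:.2f}")
```

A typical greedy baseline of the kind mentioned in the abstract would, by contrast, charge from any PV surplus and discharge to cover any deficit at each step without looking ahead at future prices; the paper's exact greedy rule may differ.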
Pages: 665-677
Number of pages: 13
Related papers
50 records in total
  • [1] A Priority Scheduling Strategy of a Microgrid Using a Deep Reinforcement Learning Method
    Dong, Lun
    Huang, Yuan
    Xu, Xiao
    Zhang, Zhenyuan
    Liu, Junyong
    Pan, Li
    Hu, Weihao
    2023 IEEE/IAS INDUSTRIAL AND COMMERCIAL POWER SYSTEM ASIA, I&CPS ASIA, 2023: 1490-1496
  • [2] Online optimal scheduling of a microgrid based on deep reinforcement learning
    Ji, Ying
    Wang, Jian-Hui
    Kongzhi yu Juece/Control and Decision, 2022, 37 (07): 1675-1684
  • [3] Operation of Distributed Battery Considering Demand Response Using Deep Reinforcement Learning in Grid Edge Control
    Li, Wenying
    Tang, Ming
    Zhang, Xinzhen
    Gao, Danhui
    Wang, Jian
    ENERGIES, 2021, 14 (22)
  • [4] Multi-agent Deep Reinforcement Learning for Microgrid Energy Scheduling
    Zuo, Zhiqiang
    Li, Zhi
    Wang, Yijing
    2022 41ST CHINESE CONTROL CONFERENCE (CCC), 2022: 6184-6189
  • [5] Training A Deep Reinforcement Learning Agent for Microgrid Control using PSCAD Environment
    Soofi, Arash Farokhi
    Bayani, Reza
    Yazdanibiouki, Mehrdad
    Manshadi, Saeed D.
    2023 IEEE PES GRID EDGE TECHNOLOGIES CONFERENCE & EXPOSITION, GRID EDGE, 2023
  • [6] Renewable energy integration and microgrid energy trading using multi-agent deep reinforcement learning
    Harrold, Daniel J. B.
    Cao, Jun
    Fan, Zhong
    APPLIED ENERGY, 2022, 318
  • [7] Peer-to-peer trading in smart grid with demand response and grid outage using deep reinforcement learning
    Alsolami, Mohammed
    Alferidi, Ahmad
    Lami, Badr
    Ben Slama, Sami
    AIN SHAMS ENGINEERING JOURNAL, 2023, 14 (12)
  • [8] Battery Energy Management in a Microgrid Using Batch Reinforcement Learning
    Mbuwir, Brida V.
    Ruelens, Frederik
    Spiessens, Fred
    Deconinck, Geert
    ENERGIES, 2017, 10 (11)
  • [9] Cryptocurrency Trading Agent Using Deep Reinforcement Learning
    Suliman, Uwais
    van Zyl, Terence L.
    Paskaramoorthy, Andrew
    2022 9TH INTERNATIONAL CONFERENCE ON SOFT COMPUTING & MACHINE INTELLIGENCE, ISCMI, 2022: 6-10
  • [10] Designing an optimal microgrid control system using deep reinforcement learning: A systematic review
    Dinata, Noer Fadzri Perdana
    Ramli, Makbul Anwari Muhammad
    Jambak, Muhammad Irfan
    Sidik, Muhammad Abu Bakar
    Alqahtani, Mohammed M.
    ENGINEERING SCIENCE AND TECHNOLOGY-AN INTERNATIONAL JOURNAL-JESTECH, 2024, 51