Optimal real-time scheduling of battery operation using reinforcement learning

Cited by: 3
Authors
Juarez, Carolina Quiroz [1 ]
Musilek, Petr [1 ]
Affiliations
[1] University of Alberta, Electrical & Computer Engineering, Edmonton, AB, Canada
Keywords
Battery energy storage system; adaptive control; neural network; Q-learning; load-shifting operational strategy; residential solar; self-consumption; energy storage; photovoltaic systems; management; buildings
DOI
10.1109/CCECE53047.2021.9569124
Chinese Library Classification
TP301 [Theory and Methods]
Discipline code
081202
Abstract
Adoption of battery energy storage systems (BESS) operating alongside distributed solar photovoltaic systems in residential households depends strongly on their return on investment. BESS technology costs have decreased sharply over the last decade. However, this trend must be supported by optimal real-time BESS operation strategies that adapt to stochastic operating conditions (residential load, solar generation, and electricity prices) and minimize the customer's electricity bill. This work presents a real-time adaptive BESS controller that implements a load-shifting strategy under time-of-use and feed-in-tariff (microFIT) regulatory incentives. The battery operating strategy is optimized by a Q-learning algorithm and then encoded in a neural network that implements the optimal strategy at a fraction of the computational cost. Real residential demand and solar generation profiles from the summer and winter seasons in Edmonton, Canada, are used to train and test the controller. Two battery technologies, lithium-ion and vanadium redox flow, are simulated using real charge-discharge experimental data from an installed system. The proposed adaptive controller outperforms the optimal strategy during both the summer and winter testing periods.
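The abstract describes a two-stage pipeline: a Q-learning agent first optimizes the load-shifting strategy, and the resulting policy is then encoded in a neural network for cheap real-time execution. The sketch below only illustrates that general idea; it is not the authors' implementation, and every detail in it (the toy time-of-use price, load and PV profiles, the 10 kWh capacity, the state discretization, and the use of scikit-learn's MLPClassifier for the distillation step) is an illustrative assumption.

```python
# Minimal sketch (not the paper's code): tabular Q-learning for a toy
# load-shifting battery controller under a time-of-use tariff, followed by
# encoding the greedy policy in a small neural network. All environment
# parameters below are illustrative assumptions.
import numpy as np
from sklearn.neural_network import MLPClassifier  # used only for the distillation step

rng = np.random.default_rng(0)

HOURS = 24
SOC_LEVELS = 11                       # battery state of charge discretized into 11 bins
ACTIONS = np.array([-1.0, 0.0, 1.0])  # discharge / idle / charge power (kW)
CAPACITY_KWH = 10.0

# Illustrative time-of-use price ($/kWh), residential load (kW) and PV output (kW)
hour = np.arange(HOURS)
price = np.where((hour >= 16) & (hour < 21), 0.30, 0.10)   # on-peak 16:00-21:00
load = 1.0 + 0.5 * np.sin(np.linspace(0.0, 2.0 * np.pi, HOURS))
pv = np.clip(2.0 * np.sin(np.linspace(-np.pi / 2, 1.5 * np.pi, HOURS)), 0.0, None)

def step(h, soc_bin, a_idx):
    """One-hour transition: returns the next SoC bin and the reward (negative energy cost)."""
    soc_kwh = soc_bin / (SOC_LEVELS - 1) * CAPACITY_KWH
    new_soc = np.clip(soc_kwh + ACTIONS[a_idx], 0.0, CAPACITY_KWH)
    delivered = new_soc - soc_kwh                     # energy actually charged (+) or discharged (-)
    grid_import = max(load[h] - pv[h] + delivered, 0.0)  # toy model: no feed-in credit
    next_bin = int(round(new_soc / CAPACITY_KWH * (SOC_LEVELS - 1)))
    return next_bin, -price[h] * grid_import

# ---- Q-learning over discretized (hour, SoC) states ----
Q = np.zeros((HOURS, SOC_LEVELS, len(ACTIONS)))
alpha, gamma, eps = 0.1, 0.95, 0.1
for _ in range(5000):                                 # each episode is one simulated day
    soc = SOC_LEVELS // 2
    for h in range(HOURS):
        a = int(rng.integers(len(ACTIONS))) if rng.random() < eps else int(Q[h, soc].argmax())
        soc_next, r = step(h, soc, a)
        target = r if h == HOURS - 1 else r + gamma * Q[h + 1, soc_next].max()
        Q[h, soc, a] += alpha * (target - Q[h, soc, a])
        soc = soc_next

# ---- Encode the greedy Q-table policy in a small neural network ----
X = np.array([[h / HOURS, s / (SOC_LEVELS - 1)] for h in range(HOURS) for s in range(SOC_LEVELS)])
y = np.array([int(Q[h, s].argmax()) for h in range(HOURS) for s in range(SOC_LEVELS)])
policy_net = MLPClassifier(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0).fit(X, y)
print("Policy network agreement with Q-table:", policy_net.score(X, y))
```

In the paper itself, the states and rewards would instead be driven by the measured residential demand, solar generation, and tariff data described in the abstract, and the battery model by the experimental charge-discharge characteristics of the installed lithium-ion and vanadium redox flow systems.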
Pages: 7
Related papers (50 in total)
  • [11] Real-Time IDS Using Reinforcement Learning
    Sagha, Hesam; Shouraki, Saeed Bagheri; Khasteh, Hosein; Dehghani, Mahdi
    2008 International Symposium on Intelligent Information Technology Application, Vol. II, Proceedings, 2008: 593+
  • [12] Real-time optimization using reinforcement learning
    Powell, Kody M.; Machalek, Derek; Quah, Titus
    Computers & Chemical Engineering, 2020, 143
  • [13] Real-Time Reinforcement Learning
    Ramstedt, Simon; Pal, Christopher
    Advances in Neural Information Processing Systems 32 (NeurIPS 2019), 2019
  • [14] Hybrid DVFS Scheduling for Real-Time Systems Based on Reinforcement Learning
    ul Islam, Fakhruddin Muhammad Mahbub; Lin, Man
    IEEE Systems Journal, 2017, 11(2): 931-940
  • [15] Hierarchical Reinforcement-Learning for Real-Time Scheduling of Agile Satellites
    Ren, Lili; Ning, Xin; Li, Jiayin
    IEEE Access, 2020, 8: 220523-220532
  • [16] Real-time power scheduling through reinforcement learning from demonstrations
    Liu, Shaohuai; Liu, Jinbo; Yang, Nan; Huang, Yupeng; Jiang, Qirong; Gao, Yang
    Electric Power Systems Research, 2024, 235
  • [17] Distributed Real-Time Scheduling in Cloud Manufacturing by Deep Reinforcement Learning
    Zhang, Lixiang; Yang, Chen; Yan, Yan; Hu, Yaoguang
    IEEE Transactions on Industrial Informatics, 2022, 18(12): 8999-9007
  • [18] A Real-Time and Optimal Hypersonic Entry Guidance Method Using Inverse Reinforcement Learning
    Su, Linfeng; Wang, Jinbo; Chen, Hongbo; Pezzella, Giuseppe
    Aerospace, 2023, 10(11)
  • [19] Real-Time Scheduling of Operational Time for Smart Home Appliances Based on Reinforcement Learning
    Khan, Murad; Seo, Junho; Kim, Dongkyun
    IEEE Access, 2020, 8: 116520-116534
  • [20] Real-Time Optimal Energy Management of Electrified Powertrains with Reinforcement Learning
    Biswas, Atriya; Anselma, Pier G.; Emadi, Ali
    2019 IEEE Transportation Electrification Conference and Expo (ITEC), 2019