Q-Learning-Based Model Predictive Control for Energy Management in Residential Aggregator

Cited by: 45
Authors
Ojand, Kianoosh [1]
Dagdougui, Hanane [1,2]
Affiliations
[1] Polytech Montreal, Dept Math & Ind Engn, Montreal, PQ H3T 1J4, Canada
[2] GERAD Res Ctr, Montreal, PQ H3T 2A7, Canada
Keywords
HVAC; State of charge; Buildings; Uncertainty; Real-time systems; Load modeling; Energy management systems; Demand response (DR); distributed energy resources (DERs); electric vehicles (EVs); mixed-integer linear programming (MILP); model predictive control (MPC); reinforcement learning; residential community; thermostatically controlled loads (TCLs); DEMAND RESPONSE; BUILDINGS; STRATEGY; NETWORK;
DOI
10.1109/TASE.2021.3091334
Chinese Library Classification (CLC) number
TP [Automation Technology, Computer Technology];
Subject classification code
0812;
Abstract
This article presents a demand response scheduling model for a residential community using an energy management system aggregator. The aggregator manages a set of resources, including a photovoltaic system, an energy storage system, thermostatically controlled loads, and electric vehicles. The solution dynamically controls the power demand and the distributed energy resources to improve the match between renewable power generation and consumption at the community level, while trading electricity in both the day-ahead and real-time markets to reduce the aggregator's operational costs. The problem is formulated as a mixed-integer linear program whose objective is to minimize the operation costs and the degradation costs of the energy storage system and the electric vehicle batteries. To mitigate the uncertainties associated with system operation, a two-level model predictive control (MPC) scheme integrating a Q-learning reinforcement learning model is designed to coordinate controllers operating on different time scales: the MPC algorithm makes day-ahead decisions based on predictions of the uncertain parameters, whereas the Q-learning algorithm handles real-time decisions based on real-time data. The problem is solved for various sets of houses, and the results demonstrate that houses gain greater benefits when operating in the aggregated mode.
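To illustrate the two-level idea described in the abstract, the following is a minimal sketch, not the authors' implementation: the day-ahead MILP is replaced here by a greedy heuristic (day_ahead_schedule), and the real-time level is a plain tabular Q-learning agent that corrects the shared battery set-point. All profiles, bin sizes, parameter values, and names are hypothetical assumptions introduced for illustration only.

```python
# Minimal sketch of a two-level scheme: day-ahead scheduling from forecasts plus
# a tabular Q-learning agent for real-time set-point corrections. All quantities
# below are toy placeholders, not the paper's model or data.
import numpy as np

rng = np.random.default_rng(0)

H = 24                                                                # hourly horizon
price = 0.10 + 0.05 * np.sin(np.linspace(0, 2 * np.pi, H))            # toy prices ($/kWh)
pv_fcst = np.clip(np.sin(np.linspace(-0.5 * np.pi, 1.5 * np.pi, H)), 0, None)  # toy PV (kW)
load_fcst = 1.0 + 0.3 * rng.random(H)                                 # toy community load (kW)
CAP, P_MAX = 5.0, 1.0                                                 # battery capacity (kWh), power limit (kW)


def day_ahead_schedule(pv, load):
    """Greedy stand-in for a day-ahead MILP: charge the battery on PV surplus,
    discharge on deficit, respecting power and energy limits."""
    soc, plan = 0.5 * CAP, np.zeros(H)
    for t in range(H):
        surplus = pv[t] - load[t]
        if surplus > 0:                                # charge from excess PV
            plan[t] = min(P_MAX, surplus, CAP - soc)
        else:                                          # discharge to cover the deficit
            plan[t] = -min(P_MAX, -surplus, soc)
        soc += plan[t]
    return plan


# Real-time level: tabular Q-learning over (SoC bin, forecast-error bin) states,
# choosing a correction to the day-ahead battery set-point.
actions = np.array([-0.5, 0.0, 0.5])                   # kW correction to the planned dispatch
Q = np.zeros((5, 3, len(actions)))                     # 5 SoC bins x 3 error bins x 3 actions
alpha, gamma, eps = 0.1, 0.95, 0.1
plan = day_ahead_schedule(pv_fcst, load_fcst)


def state(soc, err):
    """Discretize the continuous state into table indices."""
    return int(np.clip(soc / CAP * 5, 0, 4)), int(np.sign(round(err, 1))) + 1


for episode in range(300):                             # train on simulated operating days
    soc = 0.5 * CAP
    err = rng.normal(0.0, 0.2)                         # realized net-load forecast error
    s = state(soc, err)
    for t in range(H):
        a = rng.integers(len(actions)) if rng.random() < eps else int(np.argmax(Q[s]))
        b = np.clip(plan[t] + actions[a], -min(P_MAX, soc), min(P_MAX, CAP - soc))
        soc += b
        grid = (load_fcst[t] + err) - pv_fcst[t] + b   # realized import (+) / export (-)
        reward = -price[t] * max(grid, 0.0)            # toy cost: pay for imports only
        err_next = rng.normal(0.0, 0.2)
        s_next = state(soc, err_next)
        Q[s + (a,)] += alpha * (reward + gamma * Q[s_next].max() - Q[s + (a,)])
        s, err = s_next, err_next

print("Learned greedy corrections per (SoC bin, error bin), in kW:")
print(actions[np.argmax(Q, axis=-1)])
```

In this sketch the day-ahead level fixes a battery plan once per day from forecasts, and the learned policy only nudges that plan as forecast errors are revealed, mirroring the paper's separation between slow (day-ahead MPC) and fast (real-time Q-learning) decisions.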
Pages: 70-81
Number of pages: 12
Related papers
50 records in total
  • [1] Q-Learning-Based Model Predictive Control for Nonlinear Continuous-Time Systems
    Zhang, Hao
    Li, Shaoyuan
    Zheng, Yi
    INDUSTRIAL & ENGINEERING CHEMISTRY RESEARCH, 2020, 59 (40) : 17987 - 17999
  • [2] Q-Learning-based model predictive variable impedance control for physical human-robot collaboration
    Roveda, Loris
    Testa, Andrea
    Shahid, Asad Ali
    Braghin, Francesco
    Piga, Dario
    ARTIFICIAL INTELLIGENCE, 2022, 312
  • [3] Q-Learning-based Finite Control Set Model Predictive Control for LCL-Coupled Inverters with Deviated Parameters
    Zhang, Lei
    Peng, Yunjian
    Sun, Weijie
    Li, Jinze
    2023 IEEE 12TH DATA DRIVEN CONTROL AND LEARNING SYSTEMS CONFERENCE, DDCLS, 2023, : 949 - 955
  • [4] Energy management in residential microgrid using model predictive control-based reinforcement learning and Shapley value
    Cai, Wenqi
    Kordabad, Arash Bahari
    Gros, Sebastien
    ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE, 2023, 119
  • [5] Q-Learning-based fuzzy energy management for fuel cell/supercapacitor HEV
    Tao, Jili
    Zhang, Ridong
    Qiao, Zhijun
    Ma, Longhua
    TRANSACTIONS OF THE INSTITUTE OF MEASUREMENT AND CONTROL, 2022, 44 (10) : 1939 - 1949
  • [6] Q-Learning-Based Model Predictive Variable Impedance Control for Physical Human-Robot Collaboration (Extended Abstract)
    Roveda, Loris
    Testa, Andrea
    Shahid, Asad Ali
    Braghin, Francesco
    Piga, Dario
    PROCEEDINGS OF THE THIRTY-SECOND INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, IJCAI 2023, 2023, : 6959 - 6963
  • [7] Anomaly Detection Networks and Fuzzy Control Modules for Energy Grid Management with Q-Learning-Based Decision Making
    Syu, Jia-Hao
    Lin, Jerry Chun-Wei
    Yu, Philip S.
    PROCEEDINGS OF THE 2023 SIAM INTERNATIONAL CONFERENCE ON DATA MINING, SDM, 2023, : 397 - 405
  • [8] Q-learning-based H∞ control for LPV systems
    Wang, Hongye
    Wen, Jiwei
    Wan, Haiying
    Xue, Huiwen
ASIAN JOURNAL OF CONTROL, 2024
  • [9] Critical Reliability Improvement Using Q-Learning-Based Energy Management System for Microgrids
    Maharjan, Lizon
    Ditsworth, Mark
    Fahimi, Babak
    ENERGIES, 2022, 15 (23)
  • [10] A Q-learning-based Downlink Power Control Algorithm for Energy Efficiency in LTE Femtocells
    Huang, Lianfen
    Wen, Bin
    Gao, Zhibin
    Cai, Hongxiang
    Li, Yujie
MECHATRONICS ENGINEERING, COMPUTING AND INFORMATION TECHNOLOGY, 2014, 556-562: 1766+