Constrained EV Charging Scheduling Based on Safe Deep Reinforcement Learning

Cited by: 194
Authors
Li, Hepeng [1 ]
Wan, Zhiqiang [1 ]
He, Haibo [1 ]
Affiliations
[1] Univ Rhode Isl, Dept Elect Comp & Biomed Engn, South Kingstown, RI 02881 USA
Funding
U.S. National Science Foundation;
Keywords
Electric vehicle charging; Schedules; Real-time systems; Scheduling; Batteries; Reinforcement learning; Neural networks; Constrained Markov decision process; safe deep reinforcement learning; model-free; EV charging scheduling; ELECTRIC VEHICLES; DEMAND RESPONSE; ENERGY; SMART; MODEL;
DOI
10.1109/TSG.2019.2955437
Chinese Library Classification (CLC)
TM [Electrical Engineering]; TN [Electronics & Communication Technology];
Discipline Codes
0808 ; 0809 ;
Abstract
Electric vehicles (EVs) have been widely adopted and deployed over the past few years because they are environmentally friendly. When integrated into smart grids, EVs can operate as flexible loads or energy storage devices to participate in demand response (DR). By taking advantage of time-varying electricity prices in DR, the charging cost can be reduced by optimizing the charging/discharging schedules. However, since there exists randomness in the arrival and departure times of an EV and in the electricity price, it is difficult to determine the optimal charging/discharging schedules that guarantee the EV is fully charged upon departure. To address this issue, we formulate the EV charging/discharging scheduling problem as a constrained Markov decision process (CMDP). The aim is to find a constrained charging/discharging scheduling strategy that minimizes the charging cost while guaranteeing that the EV can be fully charged. To solve the CMDP, a model-free approach based on safe deep reinforcement learning (SDRL) is proposed. The proposed approach does not require any domain knowledge about the randomness. It directly learns to generate the constrained optimal charging/discharging schedules with a deep neural network (DNN). Unlike existing reinforcement learning (RL) or deep RL (DRL) paradigms, the proposed approach does not need to manually design a penalty term or tune a penalty coefficient. Numerical experiments with real-world electricity prices demonstrate the effectiveness of the proposed approach.
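The CMDP formulation described in the abstract can be illustrated with a toy environment: the reward is the negative charging cost under a time-varying price, and the constraint signal is the state-of-charge shortfall at departure. Everything here (horizon, power limits, the random-walk price model, and the naive always-charge policy standing in for the learned SDRL policy) is an illustrative assumption, not the paper's actual model.

```python
import random

class EVChargingCMDP:
    """Toy CMDP for EV charging/discharging scheduling.

    Illustrative sketch only: numbers and dynamics are assumptions,
    not taken from the paper. Reward = negative charging cost;
    constraint cost = energy shortfall from full charge at departure.
    """

    def __init__(self, horizon=12, capacity=24.0, max_rate=6.0,
                 target_soc=24.0, seed=0):
        self.horizon = horizon        # hourly decision steps until departure
        self.capacity = capacity      # battery capacity (kWh)
        self.max_rate = max_rate      # max charge/discharge power (kW)
        self.target_soc = target_soc  # required energy at departure (kWh)
        self.rng = random.Random(seed)

    def reset(self):
        self.t = 0
        self.soc = 6.0                # energy on arrival (kWh), assumed
        self.price = 0.10             # initial price ($/kWh), assumed
        return (self.t, self.soc, self.price)

    def step(self, action):
        """action in [-1, 1]: fraction of max_rate (negative = discharge)."""
        power = max(-1.0, min(1.0, action)) * self.max_rate
        # clip so the state of charge stays within physical bounds
        power = max(-self.soc, min(self.capacity - self.soc, power))
        self.soc += power
        reward = -self.price * power  # negative of the charging cost this hour
        # random-walk electricity price (a stand-in for real DR prices)
        self.price = max(0.02, self.price + self.rng.uniform(-0.02, 0.02))
        self.t += 1
        done = self.t >= self.horizon
        # constraint signal: shortfall from full charge, counted at departure
        cost = max(0.0, self.target_soc - self.soc) if done else 0.0
        return (self.t, self.soc, self.price), reward, cost, done

# Roll out a naive always-charge policy; an SDRL agent would instead learn
# a policy minimizing total cost subject to the departure constraint.
env = EVChargingCMDP()
state = env.reset()
total_reward, final_cost = 0.0, 0.0
while True:
    state, r, c, done = env.step(1.0)
    total_reward += r
    final_cost = c
    if done:
        break
```

With these assumed parameters the always-charge policy trivially satisfies the constraint (final shortfall is zero) but pays whatever price happens to prevail; the point of the CMDP view is that a learned policy can shift charging toward cheap hours without hand-tuned penalty coefficients.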
Pages: 2427-2439
Page count: 13
Related Papers
50 records in total
  • [41] Beam Hopping Scheduling Based on Deep Reinforcement Learning
    Deng, Huimin
    Ying, Kai
    Gui, Lin
    [J]. 2023 INTERNATIONAL CONFERENCE ON FUTURE COMMUNICATIONS AND NETWORKS, FCN, 2023,
  • [42] DEEP REINFORCEMENT LEARNING-BASED IRRIGATION SCHEDULING
    Yang, Y.
    Hu, J.
    Porter, D.
    Marek, T.
    Heflin, K.
    Kong, H.
    Sun, L.
    [J]. TRANSACTIONS OF THE ASABE, 2020, 63 (03) : 549 - 556
  • [43] A self-sustained EV charging framework with N-step deep reinforcement learning
    Sykiotis, Stavros
    Menos-Aikateriniadis, Christoforos
    Doulamis, Anastasios
    Doulamis, Nikolaos
    Georgilakis, Pavlos S.
    [J]. SUSTAINABLE ENERGY GRIDS & NETWORKS, 2023, 35
  • [44] Electric Vehicle Charging Management Based on Deep Reinforcement Learning
    Li, Sichen
    Hu, Weihao
    Cao, Di
    Dragicevic, Tomislav
    Huang, Qi
    Chen, Zhe
    Blaabjerg, Frede
    [J]. JOURNAL OF MODERN POWER SYSTEMS AND CLEAN ENERGY, 2022, 10 (03) : 719 - 730
  • [45] Deep Reinforcement Learning for EV Charging Navigation by Coordinating Smart Grid and Intelligent Transportation System
    Qian, Tao
    Shao, Chengcheng
    Wang, Xiuli
    Shahidehpour, Mohammad
    [J]. IEEE TRANSACTIONS ON SMART GRID, 2020, 11 (02) : 1714 - 1723
  • [46] Deep Reinforcement Learning Based Data Collection with Charging Stations
    Hao, Fuxin
    Hu, Yifan
    Fu, Junjie
    [J]. 2023 35TH CHINESE CONTROL AND DECISION CONFERENCE, CCDC, 2023, : 3344 - 3349
  • [47] Multiobjective Battery Charging Strategy Based on Deep Reinforcement Learning
    Xiong, Zheng
    Luo, Biao
    Wang, Bing-Chuan
    Xu, Xiaodong
    Huang, Tingwen
    [J]. IEEE TRANSACTIONS ON TRANSPORTATION ELECTRIFICATION, 2024, 10 (03): : 6893 - 6903
  • [48] Deep Reinforcement Learning-Based Security-Constrained Battery Scheduling in Home Energy System
    Wang, Bo
    Zha, Zhongyi
    Zhang, Lijun
    Liu, Lei
    Fan, Huijin
    [J]. IEEE TRANSACTIONS ON CONSUMER ELECTRONICS, 2024, 70 (01) : 3548 - 3561
  • [49] Electric Vehicle Charging Management Based on Deep Reinforcement Learning
    Sichen Li
    Weihao Hu
    Di Cao
    Tomislav Dragicevic
    Qi Huang
    Zhe Chen
    Frede Blaabjerg
    [J]. Journal of Modern Power Systems and Clean Energy, 2022, 10 (03) : 719 - 730
  • [50] A Two-Tailed Pricing Scheme for Optimal EV Charging Scheduling Using Multiobjective Reinforcement Learning
    Adetunji, Kayode E.
    Hofsajer, Ivan W.
    Abu-Mahfouz, Adnan M.
    Cheng, Ling
    [J]. IEEE TRANSACTIONS ON INDUSTRIAL INFORMATICS, 2024, 20 (03) : 3361 - 3370