Approximate Cost-Optimal Energy Management of Hydrogen Electric Multiple Unit Trains Using Double Q-Learning Algorithm

Cited by: 74
Authors
Li, Qi [1 ]
Meng, Xiang [1 ]
Gao, Fei [3 ,4 ]
Zhang, Guorui [2 ]
Chen, Weirong [1 ]
Affiliations
[1] Southwest Jiaotong Univ, Dept Elect Engn, Chengdu 611756, Sichuan, Peoples R China
[2] CRRC Qingdao Sifang Co Ltd, Qingdao 266111, Shandong, Peoples R China
[3] Univ Bourgogne Franche Comte, FEMTO ST Inst, Rue Ernest Thierry Mieg, F-90010 Belfort, France
[4] Univ Bourgogne Franche Comte, FCLAB, UTBM, CNRS, Rue Ernest Thierry Mieg, F-90010 Belfort, France
Keywords
Fuel cells; Energy management; Optimization; Hydrogen; Resistance; Batteries; Hybrid power systems; fuel cell; hydrogen; rail transportation; POWER MANAGEMENT; HYBRID; STRATEGY; OPTIMIZATION; OPERATION; BATTERY; CONSUMPTION; VEHICLE; SYSTEM;
DOI
10.1109/TIE.2021.3113021
CLC Classification Number
TP [Automation Technology, Computer Technology]
Discipline Classification Code
0812
Abstract
Energy management strategy (EMS) is the key to the performance of a fuel cell/battery hybrid system. Reinforcement learning (RL) has recently been introduced into this field and has gradually become a research focus. However, traditional EMSs consider only energy consumption when optimizing operating economy and ignore the cost caused by power source degradation, which leads to poor operating economy in terms of total cost of ownership (TCO). Moreover, most studied RL algorithms suffer from overestimation and an improper way of restricting battery state of charge (SOC), which also leads to relatively poor control performance. To address these problems, this article first establishes a TCO model that includes energy consumption, equivalent energy consumption, and degradation of the power sources, and then adopts a Double Q-learning RL algorithm with a state constraint and a variable action space to determine the optimal EMS. Finally, using a hardware-in-the-loop platform, the feasibility, superiority, and generalization of the proposed EMS are demonstrated by comparison with optimal dynamic programming, a traditional RL EMS, and the equivalent consumption minimization strategy (ECMS) under both training and unknown operating conditions. The results show that the proposed strategy achieves high global optimality and excellent SOC control under both training and unknown conditions.
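The abstract describes a Double Q-learning EMS with a state (SOC) constraint enforced through a variable action space, but the record does not reproduce the paper's reward, TCO model, or discretization. The Python sketch below only illustrates the generic Double Q-learning update combined with a hypothetical SOC-dependent action mask; all names, grids, and the toy reward/transition are assumptions for illustration, not the authors' implementation.

```python
# Minimal sketch: tabular Double Q-learning with an SOC-dependent (variable) action
# space, in the spirit of the EMS outlined in the abstract. Everything below
# (grids, reward, masking thresholds) is an assumption, not the paper's model.
import numpy as np

rng = np.random.default_rng(0)

N_SOC = 21                                        # hypothetical SOC grid: 0.0 ... 1.0
ACTIONS = np.array([-1.0, -0.5, 0.0, 0.5, 1.0])   # hypothetical normalized battery power levels
N_A = len(ACTIONS)

QA = np.zeros((N_SOC, N_A))                       # the two Q-tables of Double Q-learning
QB = np.zeros((N_SOC, N_A))

def allowed_actions(s):
    """Variable action space: mask actions that would push SOC outside [0.2, 0.8].
    This masking rule is an assumed stand-in for the paper's state constraint."""
    soc = s / (N_SOC - 1)
    mask = np.ones(N_A, dtype=bool)
    if soc <= 0.2:
        mask[ACTIONS > 0] = False                 # forbid further discharge at low SOC
    if soc >= 0.8:
        mask[ACTIONS < 0] = False                 # forbid further charge at high SOC
    return np.flatnonzero(mask)

def step(s, a_idx):
    """Toy transition and reward: SOC moves with battery power; the reward penalizes
    a stand-in 'TCO' term (energy use plus a degradation-like penalty on large power)."""
    soc_change = -int(round(ACTIONS[a_idx] * 2))  # discharging lowers SOC on the grid
    s_next = int(np.clip(s + soc_change, 0, N_SOC - 1))
    reward = -abs(ACTIONS[a_idx]) - 0.5 * ACTIONS[a_idx] ** 2
    return s_next, reward

alpha, gamma, eps = 0.1, 0.95, 0.1

for episode in range(2000):
    s = N_SOC // 2                                # start near mid SOC
    for t in range(50):
        acts = allowed_actions(s)
        # epsilon-greedy over the *allowed* actions, selecting with QA + QB
        if rng.random() < eps:
            a = rng.choice(acts)
        else:
            a = acts[np.argmax((QA + QB)[s, acts])]
        s_next, r = step(s, a)
        acts_next = allowed_actions(s_next)
        # Double Q-learning update: one table selects the greedy next action,
        # the other evaluates it, which reduces the maximization bias of Q-learning.
        if rng.random() < 0.5:
            a_star = acts_next[np.argmax(QA[s_next, acts_next])]
            QA[s, a] += alpha * (r + gamma * QB[s_next, a_star] - QA[s, a])
        else:
            b_star = acts_next[np.argmax(QB[s_next, acts_next])]
            QB[s, a] += alpha * (r + gamma * QA[s_next, b_star] - QB[s, a])
        s = s_next
```

The two-table update is the defining feature of Double Q-learning; restricting the argmax and the epsilon-greedy choice to the masked action set is one simple way to keep SOC within bounds without a separate penalty term.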
Pages: 9099-9110
Number of pages: 12
Related Papers
50 records in total (items [31]-[40] shown below)
  • [31] Angano, Walter; Musau, Peter; Wekesa, Cyrus Wabuge. Design and Testing of a Demand Response Q-Learning Algorithm for a Smart Home Energy Management System. 2021 IEEE PES/IAS PowerAfrica Conference, 2021, pp. 328-332.
  • [32] Abbass, Waseem; Hussain, Riaz; Frnda, Jaroslav; Khan, Irfan Latif; Javed, Muhammad Awais; Malik, Shahzad A. Optimal Resource Allocation for GAA Users in Spectrum Access System Using Q-Learning Algorithm. IEEE Access, 2022, 10: 60790-60804.
  • [33] Li, Yang; Tao, Jili; Xie, Liang; Zhang, Ridong; Ma, Longhua; Qiao, Zhijun. Enhanced Q-learning for real-time hybrid electric vehicle energy management with deterministic rule. Measurement & Control, 2020, 53(7-8): 1493-1503.
  • [34] Huang, Ruchen; He, Hongwen; Gao, Miaojue. Training-efficient and cost-optimal energy management for fuel cell hybrid electric bus based on a novel distributed deep reinforcement learning framework. Applied Energy, 2023, 346.
  • [35] Ye, Yiming; Zhang, Jiangfeng; Xu, Bin. A Fast Q-learning Energy Management Strategy for Battery/Supercapacitor Electric Vehicles Considering Energy Saving and Battery Aging. International Conference on Electrical, Computer and Energy Technologies (ICECET 2021), 2021, pp. 1639-1644.
  • [36] Chiu, Wei-Yu; Hu, Chan-Wei; Chiu, Kun-Yen. Renewable Energy Bidding Strategies Using Multiagent Q-Learning in Double-Sided Auctions. IEEE Systems Journal, 2022, 16(1): 985-996.
  • [37] Peng, Yang; Lu, Shaofeng; Huang, Yaoming; Wu, Chaoxian; Zhang, Bolun; Lin, Zhenhong; Gao, Hongguang. Optimal sizing of energy storage system for hydrogen-electric intercity trains based on life cycle cost analysis. Journal of Energy Storage, 2025, 107.
  • [38] Xu, Bin; Shi, Junzhe; Li, Sixu; Li, Huayi; Wang, Zhe. Energy consumption and battery aging minimization using a Q-learning strategy for a battery/ultracapacitor electric vehicle. Energy, 2021, 229.
  • [39] Liu, Chang; Murphey, Yi Lu. Optimal Power Management Based on Q-Learning and Neuro-Dynamic Programming for Plug-in Hybrid Electric Vehicles. IEEE Transactions on Neural Networks and Learning Systems, 2020, 31(6): 1942-1954.
  • [40] Kumari, Aparna; Tanwar, Sudeep. Multiagent-based secure energy management for multimedia grid communication using Q-learning. Multimedia Tools and Applications, 2022, 81: 36645-36665.