Energy management strategy for fuel cell electric vehicles based on scalable reinforcement learning in novel environment

Cited by: 3
Authors
Wang, Da [1 ]
Mei, Lei [1 ]
Xiao, Feng [2 ]
Song, Chuanxue [1 ]
Qi, Chunyang [2 ]
Song, Shixin [3 ]
Affiliations
[1] Jilin Univ, Coll Automot Engn, Changchun 130022, Peoples R China
[2] Jilin Univ, State Key Lab Automot Simulat & Control, Changchun 130022, Peoples R China
[3] Jilin Univ, Sch Mech & Aerosp Engn, Changchun 130022, Peoples R China
Keywords
Novel and dynamic environments; Energy management strategy; Scalable reinforcement learning; Fuel cell electric vehicle; POWER MANAGEMENT; STORAGE SYSTEM
DOI
10.1016/j.ijhydene.2024.01.335
Chinese Library Classification (CLC)
O64 [Physical chemistry (theoretical chemistry), chemical physics]
Discipline codes
070304; 081704
Abstract
To optimize hydrogen consumption, the lifespan of the fuel cell (FC), and the durability of the lithium-ion battery in fuel cell electric vehicles (FCEVs), energy management strategies (EMSs) have been developed and implemented in vehicle control units (VCUs) by engineers. Recently, with the rapid development of artificial intelligence, reinforcement learning (RL)-based EMSs have shown great promise among the various strategies. However, an RL-based EMS tends to exhibit relatively poor performance when faced with a novel environment, owing to its need to learn from experience. This study proposes an enhanced EMS, the Scalable Learning in Novel Environment (SLNE)-based EMS. It performs well in unknown and dynamic environments by integrating a deep Q-network (DQN) with a memory library (ML) built on a Dirichlet process (DRP) clustering algorithm combined with the Chinese restaurant process (CRP) and updated via the expectation-maximization (EM) algorithm. Simulations and comparisons are conducted under two novel driving cycles to evaluate the performance of the dynamic programming (DP)-based, rule-based, DQN-based, and SLNE-based EMSs. Comparisons of reward, power, and degradation indicate that the proposed SLNE-based EMS demonstrates excellent convergence and adaptability when facing a novel and dynamic environment. Additionally, compared with the DQN-based EMS, the proposed SLNE-based EMS achieves an approximately 5% improvement in fuel economy, an approximately 4.5% reduction in the fuel cell degradation rate, and improved durability of the lithium-ion battery.
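The key mechanism described in the abstract is the CRP prior inside the memory library: an incoming driving condition either joins an existing learned environment cluster (with probability proportional to that cluster's size) or opens a new one (with probability proportional to a concentration parameter). As a minimal illustrative sketch of that decision rule only, not of the paper's actual implementation (the function name and interface here are hypothetical):

```python
import random

def crp_assign(cluster_sizes, alpha, rng=random.random):
    """Chinese restaurant process: choose a cluster for a new observation.

    Existing cluster k is chosen with probability n_k / (n + alpha);
    a brand-new cluster is opened with probability alpha / (n + alpha),
    where n is the total number of observations seen so far.
    """
    n = sum(cluster_sizes)
    r = rng() * (n + alpha)      # a point on [0, n + alpha)
    acc = 0.0
    for k, n_k in enumerate(cluster_sizes):
        acc += n_k
        if r < acc:
            return k             # reuse an existing environment cluster
    return len(cluster_sizes)    # open a new "table": novel environment
```

In an SLNE-style EMS, a return value of `len(cluster_sizes)` would correspond to detecting a novel environment, at which point the memory library could allocate a fresh model to be refined by the EM updates and the DQN.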
Pages: 668-678
Page count: 11
Related papers
50 records
  • [1] Reinforcement Learning based Energy Management for Fuel Cell Hybrid Electric Vehicles
    Guo, Liang
    Li, Zhongliang
    Outbib, Rachid
    [J]. IECON 2021 - 47TH ANNUAL CONFERENCE OF THE IEEE INDUSTRIAL ELECTRONICS SOCIETY, 2021,
  • [2] Deep stochastic reinforcement learning-based energy management strategy for fuel cell hybrid electric vehicles
    Jouda, Basel
    Al-Mahasneh, Ahmad Jobran
    Abu Mallouh, Mohammed
    [J]. ENERGY CONVERSION AND MANAGEMENT, 2024, 301
  • [3] Energy Management Strategy of Fuel Cell Vehicles Based on Reinforcement Learning and Traffic Information
    Song, Zhen
    Min, Dehao
    Chen, Huicui
    Pan, Yue
    Zhang, Tong
    [J]. Tongji Daxue Xuebao/Journal of Tongji University, 2021, 49 : 211 - 216
  • [4] A Long-term Energy Management Strategy for Fuel Cell Electric Vehicles Using Reinforcement Learning
    Zhou, Y. F.
    Huang, L. J.
    Sun, X. X.
    Li, L. H.
    Lian, J.
    [J]. FUEL CELLS, 2020, 20 (06) : 753 - 761
  • [5] A collaborative energy management strategy based on multi-agent reinforcement learning for fuel cell hybrid electric vehicles
    Xiao, Yao
    Fu, Shengxiang
    Choi, Jongwoo
    Zheng, Chunhua
    [J]. 2023 IEEE 98TH VEHICULAR TECHNOLOGY CONFERENCE, VTC2023-FALL, 2023,
  • [6] A Deep Reinforcement Learning Based Energy Management Strategy for Hybrid Electric Vehicles in Connected Traffic Environment
    Li, Jie
    Wu, Xiaodong
    Hu, Sunan
    Fan, Jiawei
    [J]. IFAC PAPERSONLINE, 2021, 54 (10): : 150 - 156
  • [7] Energy management strategy of fuel cell electric vehicles
    Sun, Yan
    Xia, Chang-Gao
    Yin, Bi-Feng
    Han, Jiang-Yi
    Gao, Hai-Yu
    Liu, Jing
    [J]. Jilin Daxue Xuebao (Gongxueban)/Journal of Jilin University (Engineering and Technology Edition), 2022, 52 (09): : 2130 - 2138
  • [8] A reinforcement learning energy management strategy for fuel cell hybrid electric vehicles considering driving condition classification
    Kang, Xu
    Wang, Yujie
    Chen, Zonghai
    [J]. SUSTAINABLE ENERGY GRIDS & NETWORKS, 2024, 38
  • [9] Online energy management strategy of fuel cell hybrid electric vehicles based on rule learning
    Liu, Yonggang
    Liu, Junjun
    Qin, Datong
    Li, Guang
    Chen, Zheng
    Zhang, Yi
    [J]. JOURNAL OF CLEANER PRODUCTION, 2020, 260
  • [10] Deep Reinforcement Learning Based Energy Management Strategy for Fuel Cell and Battery Powered Rail Vehicles
    Deng, Kai
    Hai, Di
    Peng, Hujun
    Loewenstein, Lars
    Hameyer, Kay
    [J]. 2021 IEEE VEHICLE POWER AND PROPULSION CONFERENCE (VPPC), 2021,