Prediction of Battery SOH by CNN-BiLSTM Network Fused with Attention Mechanism

Cited by: 33
Authors
Sun, Shuo [1 ]
Sun, Junzhong [1 ]
Wang, Zongliang [1 ]
Zhou, Zhiyong [1 ]
Cai, Wei [1 ]
Affiliations
[1] Navy Submarine Acad, Qingdao 266042, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
state of health (SOH); convolutional neural network (CNN); bidirectional long short-term memory network (BiLSTM); attention mechanism (Attention); multi-step prediction; STATE-OF-CHARGE ESTIMATION; LITHIUM-ION BATTERY; HEALTH ESTIMATION; STATE;
DOI
10.3390/en15124428
Chinese Library Classification
TE [Petroleum and natural gas industry]; TK [Energy and power engineering];
Discipline Codes
0807 ; 0820 ;
Abstract
During the use and management of lead-acid batteries, accurate prediction of the battery's state of health (SOH) is very important. To this end, this paper proposes an SOH prediction method for lead-acid batteries based on a CNN-BiLSTM-Attention model. The model uses a convolutional neural network (CNN) to perform feature extraction and dimension reduction on the model's input factors, which are then fed into a bidirectional long short-term memory network (BiLSTM). The BiLSTM learns the temporal correlations in the local features of the input time series in both directions. An attention mechanism is introduced to assign greater weight to the key features of the input sequence that most strongly influence the output, and finally multi-step prediction of the battery SOH is realized. Compared with SOH predictions obtained by other neural network methods, the proposed method achieves higher accuracy and accurate multi-step prediction of battery SOH. Measured results show that most of the multi-step prediction errors of the proposed method are within 3%.
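The attention step described in the abstract (weighting the BiLSTM's hidden states so that the most influential time steps dominate the prediction) can be sketched in a few lines. The following is a minimal, hypothetical illustration: the score vector `w` stands in for learned attention parameters, and the hidden-state matrix is random; the paper's actual architecture and parameterization are not reproduced here.

```python
import numpy as np

def attention_pool(hidden, w):
    """Attention-weighted pooling over a sequence of hidden states.
    hidden: (T, d) array of BiLSTM outputs, one row per time step.
    w:      (d,) learned score vector (hypothetical stand-in here).
    Returns the context vector (d,) and the attention weights (T,)."""
    scores = hidden @ w                  # one scalar relevance score per time step
    e = np.exp(scores - scores.max())    # numerically stable softmax
    alpha = e / e.sum()                  # attention weights, summing to 1
    context = alpha @ hidden             # weighted sum of hidden states
    return context, alpha

rng = np.random.default_rng(0)
H = rng.standard_normal((20, 8))         # 20 time steps, 8-dim BiLSTM output
w = rng.standard_normal(8)
ctx, alpha = attention_pool(H, w)
print(ctx.shape, float(alpha.sum()))     # (8,) 1.0
```

The context vector `ctx` would then feed a final dense layer that emits the multi-step SOH forecast, with high-weight time steps contributing most to the result.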
Pages: 17
Related Papers (50 records)
  • [1] A CNN-BiLSTM model with attention mechanism for earthquake prediction
    Kavianpour, Parisa
    Kavianpour, Mohammadreza
    Jahani, Ehsan
    Ramezani, Amin
    JOURNAL OF SUPERCOMPUTING, 2023, 79 (17): : 19194 - 19226
  • [3] Correction to: A CNN-BiLSTM model with attention mechanism for earthquake prediction
    Parisa Kavianpour
    Mohammadreza Kavianpour
    Ehsan Jahani
    Amin Ramezani
    The Journal of Supercomputing, 2024, 80 : 2913 - 2913
  • [4] A Prediction Method of Consumer Buying Behavior Based on Attention Mechanism and CNN-BiLSTM
    Wang, Jian-Nan
    Cui, Jian-Feng
    Chen, Chin-Ling
    Journal of Network Intelligence, 2022, 7 (02): : 375 - 385
  • [5] CNN-BiLSTM hybrid neural networks with attention mechanism for well log prediction
    Shan, Liqun
    Liu, Yanchang
    Tang, Min
    Yang, Ming
    Bai, Xueyuan
    JOURNAL OF PETROLEUM SCIENCE AND ENGINEERING, 2021, 205
  • [6] PM2.5 Concentration Prediction Based on CNN-BiLSTM and Attention Mechanism
    Zhang, Jinsong
    Peng, Yongtao
    Ren, Bo
    Li, Taoying
    ALGORITHMS, 2021, 14 (07)
  • [7] Life Prediction for Machinery Components Based on CNN-BiLSTM Network and Attention Model
    Wang, Mengyong
    Cheng, Jian
    Zhai, Hongyu
    PROCEEDINGS OF 2020 IEEE 5TH INFORMATION TECHNOLOGY AND MECHATRONICS ENGINEERING CONFERENCE (ITOEC 2020), 2020, : 851 - 855
  • [8] Remaining Useful Life Prediction of Milling Cutters Based on CNN-BiLSTM and Attention Mechanism
    Nie, Lei
    Zhang, Lvfan
    Xu, Shiyi
    Cai, Wentao
    Yang, Haoming
    SYMMETRY-BASEL, 2022, 14 (11):
  • [9] Attention-based CNN-BiLSTM for SOH and RUL estimation of lithium-ion batteries
    Zhu, Zhenyu
    Yang, Qing
    Liu, Xin
    Gao, Dexin
    JOURNAL OF ALGORITHMS & COMPUTATIONAL TECHNOLOGY, 2022, 16
  • [10] An Improved Facial Expression Recognition using CNN-BiLSTM with Attention Mechanism
    Jayaraman, Samanthisvaran
    Mahendran, Anand
    INTERNATIONAL JOURNAL OF ADVANCED COMPUTER SCIENCE AND APPLICATIONS, 2024, 15 (05) : 1307 - 1315