Fusion k-means clustering and multi-head self-attention mechanism for a multivariate time series prediction model with feature selection

Cited: 0
Authors
Cai, Mingwei [1 ]
Zhan, Jianming [1 ]
Zhang, Chao [2 ]
Liu, Qi [1 ]
Affiliations
[1] Hubei Minzu Univ, Sch Math & Stat, Enshi 445000, Hubei, Peoples R China
[2] Shanxi Univ, Sch Comp & Informat Technol, Key Lab Computat Intelligence & Chinese Informat P, Taiyuan 030006, Shanxi, Peoples R China
Keywords
k-means clustering; Multi-head self-attention mechanism; Feature selection; LSTM; Transformer
DOI
10.1007/s13042-024-02490-z
CLC number
TP18 [Artificial intelligence theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
As the demand for precise predictions grows across various industries due to advancements in sensor technology and computer hardware, multi-feature time series prediction shows significant promise in fields such as information fusion, finance, energy, and meteorology. However, traditional machine learning methods often struggle to forecast future events given the increasing complexity of the data. To address this challenge, the paper introduces an innovative approach that combines an improved k-means clustering with a multi-head self-attention mechanism. This method utilizes long short-term memory (LSTM) neural networks to filter and identify the most effective feature subset for prediction. In the enhanced k-means clustering algorithm, a novel similarity formula named Feature Vector Similarity (FVS) and a method for automatically determining the number of cluster centers are proposed. This advancement improves the rationality of cluster center selection and enhances overall clustering performance. The multi-head self-attention mechanism computes attention weights between cluster centers and the objects within each cluster partition, optimizing feature selection and improving computational efficiency. The fusion of k-means clustering, the multi-head self-attention mechanism, and LSTM networks yields a new feature selection method, referred to as KMAL. To further refine the prediction process, we integrate KMAL with LSTM, known for its strong performance in predicting long-term time series, to develop a novel prediction model: KMAL-LSTM. In the subsequent comparative experiments, the prediction performance of the models is assessed using mean absolute error (MAE), mean bias error (MBE), and root mean square error (RMSE). The proposed KMAL-LSTM model consistently exhibits superior validity, stability, and performance when compared to seven other prediction models across six distinct datasets.
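The feature-selection pipeline described above (cluster the feature vectors, then weight features by attention against their cluster centers, then keep the top-scoring subset) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the abstract does not give the FVS similarity formula or the automatic center-count method, so plain Euclidean k-means and random query/key projections stand in for them, and the function names (`kmeans_features`, `attention_scores`) are hypothetical.

```python
import numpy as np

def kmeans_features(X, k, iters=50, seed=0):
    """Cluster feature vectors with plain Euclidean k-means.

    Stands in for the paper's FVS-based variant, whose similarity
    formula is not given in the abstract.
    """
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)].copy()
    for _ in range(iters):
        # assign each feature vector to its nearest center
        labels = np.argmin(((X[:, None, :] - centers[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels, centers

def attention_scores(X, centers, labels, n_heads=2, seed=1):
    """Score each feature by softmax-normalized similarity to its own
    cluster center, averaged over heads with random projections
    (a stand-in for learned multi-head query/key weights)."""
    d = X.shape[1]
    rng = np.random.default_rng(seed)
    scores = np.zeros(len(X))
    for _ in range(n_heads):
        W = rng.standard_normal((d, d)) / np.sqrt(d)
        q, kproj = X @ W, centers @ W
        sim = (q * kproj[labels]).sum(axis=1) / np.sqrt(d)  # scaled dot product
        e = np.exp(sim - sim.max())
        scores += e / e.sum()  # softmax over all features
    return scores / n_heads

# Toy demo: 6 feature vectors in 2-D, keep the 3 highest-scoring features.
X = np.array([[1.0, 0.1], [0.9, 0.0], [0.1, 1.0],
              [0.0, 0.9], [0.5, 0.5], [0.45, 0.55]])
labels, centers = kmeans_features(X, k=3)
scores = attention_scores(X, centers, labels)
selected = np.argsort(scores)[::-1][:3]
print(selected)
```

The selected feature indices would then index the columns of the multivariate series fed to the downstream LSTM predictor.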
Pages: 19
Related papers
50 total
  • [31] Drug-Target Interaction Prediction Using Multi-Head Self-Attention and Graph Attention Network
    Cheng, Zhongjian
    Yan, Cheng
    Wu, Fang-Xiang
    Wang, Jianxin
    IEEE-ACM TRANSACTIONS ON COMPUTATIONAL BIOLOGY AND BIOINFORMATICS, 2022, 19 (04) : 2208 - 2218
  • [32] PrMFTP: Multi-functional therapeutic peptides prediction based on multi-head self-attention mechanism and class weight optimization
    Yan, Wenhui
    Tang, Wending
    Wang, Lihua
    Bin, Yannan
    Xia, Junfeng
    PLOS COMPUTATIONAL BIOLOGY, 2022, 18 (09)
  • [33] Integrated Multi-Head Self-Attention Transformer model for electricity demand prediction incorporating local climate variables
    Ghimire, Sujan
    Nguyen-Huy, Thong
    AL-Musaylh, Mohanad S.
    Deo, Ravinesh C.
    Casillas-Perez, David
    Salcedo-Sanz, Sancho
    ENERGY AND AI, 2023, 14
  • [34] AttentionSplice: An Interpretable Multi-Head Self-Attention Based Hybrid Deep Learning Model in Splice Site Prediction
    Yan, Wenjing
    Zhang, Baoyu
    Zuo, Min
    Zhang, Qingchuan
    Wang, Hong
    Mao, Da
    CHINESE JOURNAL OF ELECTRONICS, 2022, 31 (05) : 870 - 887
  • [35] CPMA: Spatio-Temporal Network Prediction Model Based on Convolutional Parallel Multi-head Self-attention
    Liu, Tiantian
    You, Xin
    Ma, Ming
    ADVANCED INTELLIGENT COMPUTING TECHNOLOGY AND APPLICATIONS, PT II, ICIC 2024, 2024, 14876 : 113 - 124
  • [37] Reference Crop Evapotranspiration Prediction Based on Gated Recurrent Unit with Quantum Inspired Multi-head Self-attention Mechanism
    Gao, Zehai
    Yang, Dongzhe
    Li, Baojun
    Gao, Zijun
    Li, Chengcheng
    WATER RESOURCES MANAGEMENT, 2025, 39 (03) : 1481 - 1501
  • [38] An interactive multi-head self-attention capsule network model for aspect sentiment classification
    She, Lina
    Gong, Hongfang
    Zhang, Siyu
    JOURNAL OF SUPERCOMPUTING, 2024, 80 (07): : 9327 - 9352
  • [40] An integrated multi-head dual sparse self-attention network for remaining useful life prediction
    Zhang, Jiusi
    Li, Xiang
    Tian, Jilun
    Luo, Hao
    Yin, Shen
    RELIABILITY ENGINEERING & SYSTEM SAFETY, 2023, 233