Fusion k-means clustering and multi-head self-attention mechanism for a multivariate time prediction model with feature selection

Cited: 0
Authors
Cai, Mingwei [1 ]
Zhan, Jianming [1 ]
Zhang, Chao [2 ]
Liu, Qi [1 ]
Affiliations
[1] Hubei Minzu Univ, Sch Math & Stat, Enshi 445000, Hubei, Peoples R China
[2] Shanxi Univ, Sch Comp & Informat Technol, Key Lab Computat Intelligence & Chinese Informat Proc, Taiyuan 030006, Shanxi, Peoples R China
Keywords
k-means clustering; Multi-head self-attention mechanism; Feature selection; LSTM; Transformer
DOI
10.1007/s13042-024-02490-z
CLC Classification Number
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
As demand for precise predictions grows across industries, driven by advances in sensor technology and computer hardware, multi-feature time series prediction shows significant promise in fields such as information fusion, finance, energy, and meteorology. However, traditional machine learning methods often struggle to forecast future events as data complexity increases. To address this challenge, this paper introduces an approach that combines improved k-means clustering with a multi-head self-attention mechanism, using long short-term memory (LSTM) neural networks to filter and identify the most effective feature subset for prediction. The enhanced k-means algorithm introduces a novel similarity measure, termed Feature Vector Similarity (FVS), together with a method for automatically determining the number of cluster centers; these advances make cluster-center selection more principled and improve overall clustering performance. The multi-head self-attention mechanism computes attention weights for the cluster centers and the objects within each cluster partition, optimizing feature selection and improving computational efficiency. Fusing k-means clustering, the multi-head self-attention mechanism, and LSTM networks yields a new feature selection method, referred to as KMAL. To further refine the prediction process, KMAL is integrated with LSTM, which performs strongly on long-term time series, to form a novel prediction model, KMAL-LSTM. In comparative experiments, prediction performance is assessed using mean absolute error (MAE), mean bias error (MBE), and root mean square error (RMSE); the proposed KMAL-LSTM model consistently exhibits superior validity, stability, and performance against seven other prediction models across six distinct datasets.
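The abstract describes the KMAL pipeline only at a high level; neither the FVS formula nor the attention-weighted selection step is specified here. The following minimal Python sketch illustrates the general idea under stated assumptions: feature columns are grouped by standard k-means (standing in for the paper's FVS-based clustering), one representative feature per cluster is kept by correlation with the target (standing in for the multi-head self-attention weighting), and the three reported error metrics are implemented from their standard definitions. All helper names below are illustrative, not the authors' implementation.

# Minimal sketch (Python / NumPy / scikit-learn). Assumptions:
#  - standardized profiles + KMeans stand in for the paper's FVS-based clustering
#  - correlation-with-target scoring stands in for the multi-head self-attention weighting
# Helper names (select_features_by_clustering, mae, mbe, rmse) are illustrative.
import numpy as np
from sklearn.cluster import KMeans

def select_features_by_clustering(X, y, n_clusters=3):
    # Standardize each feature column, then treat each column as one sample,
    # so k-means groups features with similar temporal profiles.
    profiles = (X - X.mean(axis=0)) / (X.std(axis=0) + 1e-8)
    labels = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(profiles.T)
    selected = []
    for c in range(n_clusters):
        members = np.where(labels == c)[0]
        # Keep the cluster member most correlated with the target
        # (a stand-in for the paper's attention-weight scoring).
        scores = [abs(np.corrcoef(X[:, j], y)[0, 1]) for j in members]
        selected.append(members[int(np.argmax(scores))])
    return sorted(selected)

# The three error metrics reported in the paper, from their standard definitions.
def mae(y_true, y_pred):
    return np.mean(np.abs(y_true - y_pred))

def mbe(y_true, y_pred):
    # Mean bias error: mean of (prediction - observation); sign conventions vary.
    return np.mean(y_pred - y_true)

def rmse(y_true, y_pred):
    return np.sqrt(np.mean((y_true - y_pred) ** 2))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 8))                      # 200 time steps, 8 candidate features
    y = 0.7 * X[:, 1] + 0.3 * X[:, 5] + rng.normal(scale=0.1, size=200)
    feats = select_features_by_clustering(X, y)
    naive = np.full_like(y, y.mean())                  # naive mean predictor as a baseline
    print("selected feature indices:", feats)
    print("MAE=%.3f MBE=%.3f RMSE=%.3f" % (mae(y, naive), mbe(y, naive), rmse(y, naive)))

A full KMAL-LSTM would replace the correlation scoring with multi-head self-attention over the cluster centers and feed the selected features to an LSTM forecaster; those components are omitted here because the abstract does not give their details.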
Pages: 19
Related Papers
50 records in total
  • [1] Multi-modal feature fusion with multi-head self-attention for epileptic EEG signals
    Huang, Ning
    Xi, Zhengtao
    Jiao, Yingying
    Zhang, Yudong
    Jiao, Zhuqing
    Li, Xiaona
MATHEMATICAL BIOSCIENCES AND ENGINEERING, 2024, 21 (08) : 6918 - 6935
  • [2] Modality attention fusion model with hybrid multi-head self-attention for video understanding
    Zhuang, Xuqiang
Liu, Fang'ai
    Hou, Jian
    Hao, Jianhua
    Cai, Xiaohong
    PLOS ONE, 2022, 17 (10):
  • [3] Multi-head Self-attention Recommendation Model based on Feature Interaction Enhancement
    Yin, Yunfei
    Huang, Caihao
    Sun, Jingqin
    Huang, Faliang
    IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS (ICC 2022), 2022, : 1740 - 1745
  • [4] Multi-head self-attention mechanism-based global feature learning model for ASD diagnosis
    Zhao, Feng
    Feng, Fan
    Ye, Shixin
    Mao, Yanyan
    Chen, Xiaobo
    Li, Yuan
    Ning, Mao
    Zhang, Mingli
    BIOMEDICAL SIGNAL PROCESSING AND CONTROL, 2024, 91
  • [5] Deep Bug Triage Model Based on Multi-head Self-attention Mechanism
    Yu, Xu
    Wan, Fayang
    Tang, Bin
    Zhan, Dingjia
    Peng, Qinglong
    Yu, Miao
    Wang, Zhaozhe
    Cui, Shuang
    COMPUTER SUPPORTED COOPERATIVE WORK AND SOCIAL COMPUTING, CHINESECSCW 2021, PT II, 2022, 1492 : 107 - 119
  • [6] Epilepsy detection based on multi-head self-attention mechanism
    Ru, Yandong
    An, Gaoyang
    Wei, Zheng
    Chen, Hongming
    PLOS ONE, 2024, 19 (06):
  • [7] Microblog Sentiment Analysis with Multi-Head Self-Attention Pooling and Multi-Granularity Feature Interaction Fusion
    Yan S.
    Wang J.
    Liu X.
    Cui Y.
    Tao Z.
    Zhang X.
DATA ANALYSIS AND KNOWLEDGE DISCOVERY, 2023, 7 (04) : 32 - 45
  • [8] LSTM-MH-SA landslide displacement prediction model based on multi-head self-attention mechanism
    Zhang, Zhen-kung
    Zhang, Dong-mei
    Li, Jiang
    Wu, Yi-ping
    ROCK AND SOIL MECHANICS, 2022, 43 : 477 - +
  • [9] Arrhythmia classification algorithm based on multi-head self-attention mechanism
    Wang, Yue
    Yang, Guanci
    Li, Shaobo
    Li, Yang
    He, Ling
    Liu, Dan
    BIOMEDICAL SIGNAL PROCESSING AND CONTROL, 2023, 79
  • [10] A Model for Sea Ice Segmentation based on Feature Pyramid Network and Multi-head Self-attention
    Xu, Yuanxiang
    Feng, Yuan
    Song, Shengyu
    Liu, Jiahao
PROCEEDINGS OF THE 2024 27TH INTERNATIONAL CONFERENCE ON COMPUTER SUPPORTED COOPERATIVE WORK IN DESIGN, CSCWD 2024, 2024, : 97 - 102