Fusion k-means clustering and multi-head self-attention mechanism for a multivariate time prediction model with feature selection

Cited: 0
Authors
Cai, Mingwei [1 ]
Zhan, Jianming [1 ]
Zhang, Chao [2 ]
Liu, Qi [1 ]
Institutions
[1] Hubei Minzu Univ, Sch Math & Stat, Enshi 445000, Hubei, Peoples R China
[2] Shanxi Univ, Sch Comp & Informat Technol, Key Lab Computat Intelligence & Chinese Informat Proc, Taiyuan 030006, Shanxi, Peoples R China
Keywords
k-means clustering; Multi-head self-attention mechanism; Feature selection; LSTM; Transformer
DOI
10.1007/s13042-024-02490-z
CLC Number
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
As the demand for precise predictions grows across various industries, driven by advances in sensor technology and computer hardware, multi-feature time series prediction shows significant promise in fields such as information fusion, finance, energy, and meteorology. However, traditional machine learning methods often struggle to forecast future events given the increasing complexity of the data. To address this challenge, this paper introduces an innovative approach that combines an improved k-means clustering algorithm with a multi-head self-attention mechanism, using long short-term memory (LSTM) neural networks to filter and identify the most effective feature subset for prediction. For the enhanced k-means algorithm, a novel similarity measure named Feature Vector Similarity (FVS) and a method for automatically determining the number of cluster centers are proposed; these improve the rationality of cluster center selection and enhance overall clustering performance. The multi-head self-attention mechanism computes the cluster centers and the attention weights of objects within each cluster partition, optimizing feature selection and improving computational efficiency. The fusion of k-means clustering, the multi-head self-attention mechanism, and LSTM networks yields a new feature selection method, referred to as KMAL. To further refine the prediction process, KMAL is integrated with LSTM, known for its strong performance on long-term time series, to develop a novel prediction model: KMAL-LSTM. In comparative experiments, prediction performance is assessed using mean absolute error (MAE), mean bias error (MBE), and root mean square error (RMSE). The proposed KMAL-LSTM model consistently exhibits superior validity, stability, and performance compared with seven other prediction models across six distinct datasets.
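Since the record names the three evaluation metrics but does not reproduce their formulas, a minimal NumPy sketch of the standard definitions of MAE, MBE, and RMSE is given below; the function names and the MBE sign convention (prediction minus observation) are assumptions for illustration, not details taken from the paper.

```python
# Minimal sketch of the three metrics named in the abstract (NumPy).
# The MBE sign convention (prediction minus truth) is an assumption.
import numpy as np

def mae(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    # Mean absolute error: average magnitude of the residuals.
    return float(np.mean(np.abs(y_true - y_pred)))

def mbe(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    # Mean bias error: signed average residual; positive values
    # indicate systematic over-prediction.
    return float(np.mean(y_pred - y_true))

def rmse(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    # Root mean square error: penalizes large residuals more than MAE.
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

# Toy usage:
y = np.array([1.0, 2.0, 3.0])
yhat = np.array([1.1, 1.9, 3.3])
print(mae(y, yhat), mbe(y, yhat), rmse(y, yhat))
```

Lower MAE and RMSE indicate better accuracy, and an MBE near zero indicates little systematic bias; RMSE weights large errors more heavily than MAE.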
Pages: 19
Related Papers
50 items in total
  • [41] A Bearing Fault Diagnosis Method Based on Dilated Convolution and Multi-Head Self-Attention Mechanism
    Hou, Peng
    Zhang, Jianjie
    Jiang, Zhangzheng
    Tang, Yiyu
    Lin, Ying
    APPLIED SCIENCES-BASEL, 2023, 13 (23)
  • [42] Fusion of multivariate time series meteorological and static soil data for multistage crop yield prediction using multi-head self attention network
    Kaur, Arshveer
    Goyal, Poonam
    Rajhans, Rohit
    Agarwal, Lakshya
    Goyal, Navneet
    EXPERT SYSTEMS WITH APPLICATIONS, 2023, 226
  • [43] Detection of malicious URLs using Temporal Convolutional Network and Multi-Head Self-Attention mechanism
    Nguyet Quang Do
    Selamat, Ali
    Krejcar, Ondrej
    Fujita, Hamido
    APPLIED SOFT COMPUTING, 2025, 169
  • [44] Multi-Modal Fusion Network with Multi-Head Self-Attention for Injection Training Evaluation in Medical Education
    Li, Zhe
    Kanazuka, Aya
    Hojo, Atsushi
    Nomura, Yukihiro
    Nakaguchi, Toshiya
    ELECTRONICS, 2024, 13 (19)
  • [45] The effect of the head number for multi-head self-attention in remaining useful life prediction of rolling bearing and interpretability
    Zhao, Qiwu
    Zhang, Xiaoli
    Wang, Fangzhen
    Fan, Panfeng
    Mbeka, Erick
    NEUROCOMPUTING, 2025, 616
  • [46] Multi-view Feature Fusion Based on Self-attention Mechanism for Drug-drug Interaction Prediction
    Han, Hui
    Zhang, Weiyu
    Sun, Xu
    Lu, Wenpeng
    2023 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, IJCNN, 2023
  • [47] ATM-TCR: TCR-Epitope Binding Affinity Prediction Using a Multi-Head Self-Attention Model
    Cai, Michael
    Bang, Seojin
    Zhang, Pengfei
    Lee, Heewook
    FRONTIERS IN IMMUNOLOGY, 2022, 13
  • [48] TLS-MHSA: An Efficient Detection Model for Encrypted Malicious Traffic based on Multi-Head Self-Attention Mechanism
    Chen, Jinfu
    Song, Luo
    Cai, Saihua
    Xie, Haodi
    Yin, Shang
    Ahmad, Bilal
    ACM TRANSACTIONS ON PRIVACY AND SECURITY, 2023, 26 (04)
  • [49] A malicious network traffic detection model based on bidirectional temporal convolutional network with multi-head self-attention mechanism
    Cai, Saihua
    Xu, Han
    Liu, Mingjie
    Chen, Zhilin
    Zhang, Guofeng
    COMPUTERS & SECURITY, 2024, 136
  • [50] Multi-head self-attention mechanism combined with feedforward network for time-varying nonlinear digital self-interference cancellation
    Li, Xiang
    Ye, Fang
    Tian, Yuan
    Li, Yibing
    DIGITAL SIGNAL PROCESSING, 2024, 155