Riding feeling recognition based on multi-head self-attention LSTM for driverless automobile

Cited: 1
Authors
Tang, Xianzhi [1 ]
Xie, Yongjia [1 ]
Li, Xinlong [1 ]
Wang, Bo [1 ]
Affiliations
[1] Yanshan Univ, Sch Vehicles & Energy, Hebei Key Lab Special Carrier Equipment, Hebei St, Qinhuangdao 066004, Hebei, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Electroencephalography (EEG); Attention; Feature extraction; Driving experience;
DOI
10.1016/j.patcog.2024.111135
CLC Number
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
With the emergence of driverless technology, passenger ride comfort has become an issue of concern. In recent years, driving fatigue detection and braking sensation evaluation based on EEG signals have received increasing attention, and analyzing ride comfort from EEG signals is likewise an intuitive approach. However, finding an effective method or model to evaluate passenger comfort remains a challenge. In this paper, we propose a long short-term memory (LSTM) network model based on a multi-head self-attention mechanism for passenger comfort detection. By applying the multi-head attention mechanism to the feature extraction process, more efficient classification results are obtained. The results show that the LSTM network using the multi-head self-attention mechanism is efficient in decision making and achieves higher classification accuracy. In conclusion, the classifier based on the multi-head attention mechanism proposed in this paper performs excellently in EEG classification of different emotional states and has broad development prospects in brain-computer interaction.
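The abstract outlines a pipeline in which multi-head self-attention re-weights extracted EEG features before an LSTM performs the temporal classification. Below is a minimal PyTorch sketch of such an architecture, not the authors' code: the module layout, the input shape (batch, time, channels), and every hyperparameter (32 channels, 4 heads, 64 hidden units, 2 comfort classes) are illustrative assumptions rather than details taken from the paper.

```python
# Minimal sketch (assumed, not the authors' implementation) of an LSTM
# classifier with multi-head self-attention over EEG feature sequences.
import torch
import torch.nn as nn

class AttentionLSTMClassifier(nn.Module):
    def __init__(self, n_channels=32, n_heads=4, hidden=64, n_classes=2):
        super().__init__()
        # Multi-head self-attention re-weights the EEG feature sequence
        # before temporal modelling, as the abstract describes.
        self.attn = nn.MultiheadAttention(embed_dim=n_channels,
                                          num_heads=n_heads,
                                          batch_first=True)
        self.lstm = nn.LSTM(input_size=n_channels, hidden_size=hidden,
                            batch_first=True)
        self.fc = nn.Linear(hidden, n_classes)

    def forward(self, x):
        # x: (batch, time, channels) -- windowed EEG features (assumed shape)
        attn_out, _ = self.attn(x, x, x)   # self-attention: Q = K = V = x
        _, (h_n, _) = self.lstm(attn_out)  # keep the final hidden state
        return self.fc(h_n[-1])            # class logits (e.g., comfortable vs. not)

model = AttentionLSTMClassifier()
logits = model(torch.randn(8, 128, 32))    # 8 windows, 128 time steps, 32 channels
print(logits.shape)                        # torch.Size([8, 2])
```

Attending over the feature sequence before the LSTM lets each time step aggregate context from the whole window, which is one plausible reading of "applying the attention mechanism to the feature extraction process".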
Pages: 12
Related Papers
50 records
  • [1] An adaptive multi-head self-attention coupled with attention filtered LSTM for advanced scene text recognition
    Selvam, Prabu
    Kumar, S. N.
    Kannadhasan, S.
    INTERNATIONAL JOURNAL ON DOCUMENT ANALYSIS AND RECOGNITION, 2025
  • [2] Lip Recognition Based on Bi-GRU with Multi-Head Self-Attention
    Ni, Ran
    Jiang, Haiyang
    Zhou, Lu
    Lu, Yuanyao
    ARTIFICIAL INTELLIGENCE APPLICATIONS AND INNOVATIONS, PT III, AIAI 2024, 2024, 713 : 99 - 110
  • [3] The sentiment analysis model with multi-head self-attention and Tree-LSTM
    Li, Lei
    Pei, Yijian
    Jin, Chenyang
    SIXTH INTERNATIONAL WORKSHOP ON PATTERN RECOGNITION, 2021, 11913
  • [4] Local Multi-Head Channel Self-Attention for Facial Expression Recognition
    Pecoraro, Roberto
    Basile, Valerio
    Bono, Viviana
    INFORMATION, 2022, 13 (09)
  • [5] Adaptive Pruning for Multi-Head Self-Attention
    Messaoud, Walid
    Trabelsi, Rim
    Cabani, Adnane
    Abdelkefi, Fatma
    ARTIFICIAL INTELLIGENCE AND SOFT COMPUTING, ICAISC 2023, PT II, 2023, 14126 : 48 - 57
  • [6] Epilepsy detection based on multi-head self-attention mechanism
    Ru, Yandong
    An, Gaoyang
    Wei, Zheng
    Chen, Hongming
    PLOS ONE, 2024, 19 (06)
  • [7] ViolenceNet: Dense Multi-Head Self-Attention with Bidirectional Convolutional LSTM for Detecting Violence
    Rendon-Segador, Fernando J.
    Alvarez-Garcia, Juan A.
    Enriquez, Fernando
    Deniz, Oscar
    ELECTRONICS, 2021, 10 (13)
  • [8] A Supervised Multi-Head Self-Attention Network for Nested Named Entity Recognition
    Xu, Yongxiu
    Huang, Heyan
    Feng, Chong
    Hu, Yue
    THIRTY-FIFTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THIRTY-THIRD CONFERENCE ON INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE AND THE ELEVENTH SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2021, 35 : 14185 - 14193
  • [9] Fast Neural Chinese Named Entity Recognition with Multi-head Self-attention
    Qi, Tao
    Wu, Chuhan
    Wu, Fangzhao
    Ge, Suyu
    Liu, Junxin
    Huang, Yongfeng
    Xie, Xing
    KNOWLEDGE GRAPH AND SEMANTIC COMPUTING: KNOWLEDGE COMPUTING AND LANGUAGE UNDERSTANDING, 2019, 1134 : 98 - 110
  • [10] DILATED RESIDUAL NETWORK WITH MULTI-HEAD SELF-ATTENTION FOR SPEECH EMOTION RECOGNITION
    Li, Runnan
    Wu, Zhiyong
    Jia, Jia
    Zhao, Sheng
    Meng, Helen
    2019 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2019, : 6675 - 6679