Phase lag index-based graph attention networks for detecting driving fatigue

Cited: 19
Authors
Wang, Zhongmin [1 ,2 ]
Zhao, Yupeng [1 ]
He, Yan [1 ,2 ]
Zhang, Jie [1 ,2 ]
Affiliations
[1] Xian Univ Posts & Telecommun, Sch Comp Sci & Technol, Xian 710121, Shaanxi, Peoples R China
[2] Xian Univ Posts & Telecommun, Shaanxi Key Lab Network Data Anal & Intelligent, Xian 710121, Shaanxi, Peoples R China
Source
REVIEW OF SCIENTIFIC INSTRUMENTS | 2021, Vol. 92, No. 09
Funding
National Natural Science Foundation of China;
Keywords
EEG;
DOI
10.1063/5.0056139
Chinese Library Classification (CLC)
TH7 [Instruments and Meters];
Discipline Codes
0804; 080401; 081102;
Abstract
It is important to understand how the characteristics of the brain network change in the state of driving fatigue and to reveal the pattern of functional connectivity between brain regions when fatigue occurs. This paper proposes a method for detecting driving fatigue from electroencephalogram (EEG) signals using a phase lag index graph attention network (PLI-GAT). Phase synchronization between EEG signals is a key attribute for establishing communication links among different regions of the brain, so the PLI is used to construct a functional brain network reflecting the relationships between EEG signals from different channels. Multi-channel EEG time-frequency features are then modeled as graph data, and the driving-fatigue monitoring model is trained using a GAT. Compared with traditional graph neural networks, the GAT aggregates features from adjacent EEG channels through an attention mechanism, adaptively assigning different weights to neighbors and thereby greatly improving the expressiveness of the graph neural network model. The proposed method is validated on the publicly available SEED-VIG dataset, where the accuracy of fatigue-state recognition reaches 85.53%. The results show that functional connectivity among different channels is significantly enhanced in the fatigue state. Published under an exclusive license by AIP Publishing.
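The PLI construction described in the abstract can be sketched as follows. This is a minimal illustration of the standard phase lag index (phase extraction via the Hilbert transform, then the absolute mean sign of the pairwise phase differences), not the paper's actual SEED-VIG preprocessing; the channel count, sample length, and random test signal are illustrative assumptions.

```python
# Sketch: building a PLI adjacency matrix from multi-channel EEG.
import numpy as np
from scipy.signal import hilbert

def phase_lag_index(eeg):
    """eeg: (channels, samples) array -> (channels, channels) PLI matrix.

    PLI = |mean over time of sign(sin(phase_i - phase_j))|. It is 0 for a
    symmetric phase-difference distribution (e.g. zero lag) and approaches
    1 when one channel consistently leads or lags another.
    """
    phase = np.angle(hilbert(eeg, axis=1))        # instantaneous phase per channel
    diff = phase[:, None, :] - phase[None, :, :]  # all pairwise phase differences
    return np.abs(np.mean(np.sign(np.sin(diff)), axis=2))

rng = np.random.default_rng(0)
eeg = rng.standard_normal((4, 1000))              # 4 channels, 1000 samples (toy data)
pli = phase_lag_index(eeg)                        # symmetric, values in [0, 1],
assert np.allclose(np.diag(pli), 0.0)             # zero lag of a channel with itself
```

In a pipeline like the paper's, such a matrix (typically thresholded or weighted) would serve as the graph adjacency over EEG channels that the GAT attends over.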
Pages: 9
Related Papers
50 records
  • [31] Revisiting Attention-Based Graph Neural Networks for Graph Classification
    Tao, Ye
    Li, Ying
    Wu, Zhonghai
    PARALLEL PROBLEM SOLVING FROM NATURE - PPSN XVII, PPSN 2022, PT I, 2022, 13398 : 442 - 458
  • [32] Attention based spatiotemporal graph attention networks for traffic flow forecasting
    Wang, Yi
    Jing, Changfeng
    Xu, Shishuo
    Guo, Tao
    INFORMATION SCIENCES, 2022, 607 : 869 - 883
  • [33] Attention-based graph neural networks: a survey
    Sun, Chengcheng
    Li, Chenhao
    Lin, Xiang
    Zheng, Tianji
    Meng, Fanrong
    Rui, Xiaobin
    Wang, Zhixiao
    ARTIFICIAL INTELLIGENCE REVIEW, 2023, 56 (SUPPL 2) : 2263 - 2310
  • [34] Graph Convolutional Networks with Motif-based Attention
    Lee, John Boaz
    Rossi, Ryan A.
    Kong, Xiangnan
    Kim, Sungchul
    Koh, Eunyee
    Rao, Anup
    PROCEEDINGS OF THE 28TH ACM INTERNATIONAL CONFERENCE ON INFORMATION & KNOWLEDGE MANAGEMENT (CIKM '19), 2019, : 499 - 508
  • [35] Path reliability-based graph attention networks
    Li, Yayang
    Liang, Shuqing
    Jiang, Yuncheng
    NEURAL NETWORKS, 2023, 159 : 153 - 160
  • [36] MGATs: Motif-Based Graph Attention Networks
    Sheng, Jinfang
    Zhang, Yufeng
    Wang, Bin
    Chang, Yaoxing
    MATHEMATICS, 2024, 12 (02)
  • [37] Predicting Propositional Satisfiability Based on Graph Attention Networks
    Chang, Wenjing
    Zhang, Hengkai
    Luo, Junwei
    INTERNATIONAL JOURNAL OF COMPUTATIONAL INTELLIGENCE SYSTEMS, 2022, 15 (01)
  • [39] Mahalanobis Distance-Based Graph Attention Networks
    Mardani, Konstantina
    Vretos, Nicholas
    Daras, Petros
    IEEE ACCESS, 2024, 12 : 166923 - 166935