Cognitive Workload Estimation Using Variational Autoencoder and Attention-Based Deep Model

Cited by: 6
Authors
Chakladar, Debashis Das [1 ]
Datta, Sumalyo [2 ]
Roy, Partha Pratim [1 ]
Prasad, Vinod A. [2 ]
Affiliations
[1] Indian Inst Technol Roorkee, Dept Comp Sci & Engn, Roorkee 247667, India
[2] Hong Kong Univ Sci & Technol, Dept Elect & Comp Engn, Hong Kong, Peoples R China
Keywords
Convolutional block attention module (CBAM); convolutional neural network (CNN); electroencephalogram (EEG); long short-term memory (LSTM); variational autoencoder (VAE); CLASSIFICATION
DOI
10.1109/TCDS.2022.3163020
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
The estimation of cognitive workload from electroencephalogram (EEG) signals is an emerging research area. However, owing to the poor spatial resolution of EEG, features extracted from the raw signals often yield weak classification results. As a generative model, the variational autoencoder (VAE) extracts noise-free, robust features from its latent space, leading to better classification performance, while a spatial attention method, the convolutional block attention module (CBAM), can improve the spatial resolution of EEG representations. In this article, we propose an effective VAE-CBAM-based deep model for estimating cognitive states from topographical videos. Topographical videos of four conditions of a mental arithmetic task [baseline (BL), low workload (LW), medium workload (MW), and high workload (HW)] are used in the experiment. First, the VAE extracts localized features from the input images (frames extracted from the topographical video); CBAM then infers spatial- and channel-level attention features from those localized features. Finally, a deep CNN-BLSTM model learns these attention-based spatial features in a time-distributed manner to classify the cognitive state. The proposed model achieves 83.13% and 92.09% accuracy for four-class and two-class classification, respectively, and it broadens the scope of future attention-based research in EEG applications.
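The CBAM stage described in the abstract applies channel attention followed by spatial attention to a feature map. The sketch below is an illustrative NumPy approximation, not the authors' implementation: the function name `cbam`, the randomly initialized MLP weights, and the parameter-free spatial map (mean + max over channels, where the original CBAM uses a 7x7 convolution) are all assumptions made for this example.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cbam(feature_map, reduction=2, rng=None):
    """Apply simplified CBAM-style attention to a (C, H, W) feature map.

    Channel attention: a shared two-layer MLP over global average- and
    max-pooled channel descriptors. Spatial attention: a sigmoid over the
    channel-wise mean + max maps (a stand-in for CBAM's 7x7 convolution).
    """
    C, H, W = feature_map.shape
    if rng is None:
        rng = np.random.default_rng(0)
    # Randomly initialized weights of the shared MLP (illustration only).
    W1 = rng.standard_normal((C // reduction, C)) * 0.1
    W2 = rng.standard_normal((C, C // reduction)) * 0.1
    mlp = lambda v: W2 @ np.maximum(W1 @ v, 0.0)  # ReLU hidden layer

    # Channel attention: weight each of the C feature channels.
    avg = feature_map.mean(axis=(1, 2))           # (C,)
    mx = feature_map.max(axis=(1, 2))             # (C,)
    ch_att = sigmoid(mlp(avg) + mlp(mx))          # (C,), values in (0, 1)
    x = feature_map * ch_att[:, None, None]

    # Spatial attention: weight each of the H x W spatial locations.
    sp_att = sigmoid(x.mean(axis=0) + x.max(axis=0))  # (H, W)
    return x * sp_att[None, :, :]
```

Because both attention maps lie in (0, 1), the module only rescales the input feature map (shape preserved, magnitudes never amplified); in the paper's pipeline the reweighted maps are then fed to the CNN-BLSTM classifier.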
Pages: 581-590 (10 pages)