Human behavior recognition based on sparse transformer with channel attention mechanism

Cited: 0
|
Authors
Cao, Keyan [1 ]
Wang, Mingrui [1 ]
Affiliation
[1] Shenyang Jianzhu Univ, Sch Comp Sci & Engn, Shenyang, Liaoning, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
human activity recognition; wearable biosensors; sparse transformer; attention; time series;
DOI
10.3389/fphys.2023.1239453
CLC number
Q4 [Physiology];
Discipline code
071003;
Abstract
Human activity recognition (HAR) has recently become a popular research field in wearable sensor technology. By analyzing human behavior data, disease risks or potential health issues can be detected, and patients' rehabilitation progress can be evaluated. Given the excellent performance of the Transformer in natural language processing and vision tasks, researchers have begun to explore its application to time series. The Transformer models long-term dependencies between sequence elements through self-attention mechanisms, capturing contextual information over extended periods. In this paper, we propose a hybrid model based on the channel attention mechanism and the Transformer to improve the feature representation ability of sensor-based HAR. Extensive experiments were conducted on three public HAR datasets; our network achieved accuracies of 98.10%, 97.21%, and 98.82% on the HARTH, PAMAP2, and UCI-HAR datasets, respectively. The overall performance is on par with state-of-the-art methods.
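The two building blocks the abstract names, channel attention that reweights sensor channels and self-attention that models long-range temporal dependencies, can be sketched in NumPy. This is a minimal illustration only, not the authors' implementation: the SE-style squeeze-and-excitation gating, the single attention head, and all weight shapes are assumptions made for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def channel_attention(x, w1, w2):
    # x: (T, C) sensor window; squeeze over time, excite per channel (SE-style)
    s = x.mean(axis=0)                       # (C,) global average over time
    h = np.maximum(0.0, w1 @ s)              # reduction MLP with ReLU
    g = 1.0 / (1.0 + np.exp(-(w2 @ h)))      # sigmoid gates in (0, 1), one per channel
    return x * g                             # reweight each sensor channel

def self_attention(x, wq, wk, wv):
    # scaled dot-product self-attention over the T time steps
    q, k, v = x @ wq, x @ wk, x @ wv
    scores = q @ k.T / np.sqrt(q.shape[-1])
    a = np.exp(scores - scores.max(axis=-1, keepdims=True))
    a /= a.sum(axis=-1, keepdims=True)       # row-wise softmax attention weights
    return a @ v

T, C, r = 128, 6, 2                          # window length, channels, SE reduction
x = rng.standard_normal((T, C))              # one accelerometer/gyroscope window
w1 = rng.standard_normal((C // r, C)) * 0.1
w2 = rng.standard_normal((C, C // r)) * 0.1
wq, wk, wv = (rng.standard_normal((C, C)) * 0.1 for _ in range(3))

out = self_attention(channel_attention(x, w1, w2), wq, wk, wv)
print(out.shape)  # (128, 6)
```

Channel attention here acts before the temporal attention, so uninformative sensor axes are down-weighted prior to modeling dependencies across time steps; in a full model the output would feed a feed-forward layer and a classification head.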
Pages: 10