A Lightweight Action Recognition Method for Deployable Embedded Devices for Human-Computer Interaction

Cited: 0
Authors
Hu, Nanjie [1 ]
Wang, Ningyu [1 ]
Lin, Jie [1 ]
Fu, Qinghao [1 ]
Tan, Benying [1 ]
Affiliations
[1] Guilin Univ Elect Technol, Sch Artificial Intelligence, Guilin, Peoples R China
Keywords
action recognition; embedded devices; human-computer interaction; lightweight; temporal modeling;
DOI
10.1109/MCSoC60832.2023.00046
CLC Classification
TP31 [Computer Software];
Discipline Codes
081202; 0835;
Abstract
In recent years, numerous researchers have proposed various solutions to address the challenges in action recognition. However, most existing approaches suffer from high computational requirements and significant memory usage, making them impractical for real-time deployment on embedded devices. This paper introduces a lightweight action recognition method suitable for deployment on embedded devices for human-computer interaction, denoted as LARMDED-HCI (Lightweight Action Recognition for Mobile and Deployable Human-Computer Interaction). Our method employs MobileNetV3 as the backbone network, significantly reducing computational load and parameter count. We incorporate the Temporal Shift Module (TSM) to model the temporal aspect of video frames, enabling the model to capture temporal features effectively. Additionally, a 1D temporal convolutional layer is introduced to enhance feature extraction in the temporal dimension, improving the model's ability to capture temporal differences between adjacent frames. Experiments conducted on the Jester and Something-Something-V2 datasets demonstrate that our approach exhibits a noticeable competitive advantage in recognition accuracy compared to other methods. Furthermore, our method achieves satisfactory real-time performance when deployed on embedded devices, facilitating human-computer interaction on such platforms.
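The Temporal Shift Module mentioned in the abstract works by shifting a fraction of feature channels along the time axis, so that per-frame 2D convolutions can mix information from neighboring frames at zero extra parameter cost. The following is a minimal NumPy sketch of the standard zero-padded channel shift from the TSM literature; the function name, the `shift_div` default, and the tensor layout are illustrative assumptions, not taken from this paper's code.

```python
import numpy as np

def temporal_shift(x, shift_div=8):
    """Zero-padded temporal shift, as described in the TSM literature
    (illustrative sketch; not the paper's actual implementation).

    x: feature tensor of shape (N, T, C, H, W).
    shift_div: 1/shift_div of the channels are shifted each way in time.
    """
    n, t, c, h, w = x.shape
    fold = c // shift_div
    out = np.zeros_like(x)
    # First `fold` channels: pull features from the *next* frame.
    out[:, :-1, :fold] = x[:, 1:, :fold]
    # Next `fold` channels: pull features from the *previous* frame.
    out[:, 1:, fold:2 * fold] = x[:, :-1, fold:2 * fold]
    # Remaining channels are left unshifted.
    out[:, :, 2 * fold:] = x[:, :, 2 * fold:]
    return out
```

Because the shift is a pure memory movement, it adds no multiply-accumulate operations, which is what makes it attractive for embedded deployment on top of a lightweight backbone such as MobileNetV3.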
Pages: 262-267 (6 pages)