Real-Time Myocontrol of a Human-Computer Interface by Paretic Muscles After Stroke

Cited by: 14
Authors
Yang, Chun [1 ]
Long, Jinyi [2 ]
Urbin, M. A. [3 ]
Feng, Yanyun [4 ]
Song, Ge [1 ]
Weng, Jian [2 ]
Li, Zhijun [5 ]
Affiliations
[1] South China Agr Univ, Coll Math & Informat, Guangzhou 510642, Guangdong, Peoples R China
[2] Jinan Univ, Coll Informat Sci & Technol, Guangzhou 510632, Guangdong, Peoples R China
[3] Univ Pittsburgh, Dept Phys Med & Rehabil, Pittsburgh, PA 15213 USA
[4] First Peoples Hosp Foshan, Dept Radiol, Foshan 528000, Peoples R China
[5] South China Univ Technol, Sch Automat Sci & Engn, Guangzhou 510640, Guangdong, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Electromyography; hand gesture recognition; hemiparesis; human-computer interface (HCI); rehabilitation; stroke; GESTURE RECOGNITION; SYNERGIES; MOVEMENT; SYSTEM;
DOI
10.1109/TCDS.2018.2830388
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104; 0812; 0835; 1405;
Abstract
Biosignals from skeletal muscle have been used to control human-machine interfaces. Signals from the paretic muscles of people with stroke are distorted and highly variable. Here, we examine the stability of surface electromyography (sEMG) features from paretic hand muscles to enable continuous, real-time multicommand control of a human-computer interface (HCI). Subjects with long-standing cortical strokes (>6 months, n = 12) and neurologically intact controls (n = 12) performed two wrist rotations (wrist extension and wrist flexion) and two grips (power grip and fine grip) with the nondominant (controls) or paretic (stroke group) hand. Data reduction analyses revealed a distinct pattern of coactivation across muscles for each gesture. These synergies were similar between the control and stroke groups and stable across sessions. Offline experiments involving wrist rotations and hand grips confirmed that gestures performed in isolation or in combination were recognized above chance level in both groups (p < 0.01). In online experiments, HCI control was evaluated with a balloon-shooter game. Users in both groups controlled the direction and speed of a simulated bullet toward a balloon target with above-chance accuracy (p < 0.01). Taken together, these results demonstrate that sEMG synergy features from paretic hand muscles can drive continuous, real-time multicommand control of an HCI.
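The abstract refers to "data reduction analyses" that reveal coactivation (synergy) patterns across muscles, but this record does not name the method. The sketch below is a minimal illustration of one common approach: rectified, low-pass-filtered sEMG envelopes factored with non-negative matrix factorization (NMF) to obtain synergy features. The sampling rate, channel count, number of synergies, filter settings, and the use of NumPy/SciPy/scikit-learn are assumptions for illustration, not details taken from the paper.

```python
# Illustrative sketch (not the authors' code): extracting muscle-synergy
# features from multichannel sEMG with NMF, one common data-reduction method
# for coactivation patterns. All constants below are assumed values.
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.decomposition import NMF

FS = 1000          # assumed sEMG sampling rate (Hz)
N_CHANNELS = 8     # assumed number of electrodes over the paretic forearm/hand
N_SYNERGIES = 4    # e.g., one synergy per gesture class as a starting point

def emg_envelope(raw, fs=FS, cutoff=5.0):
    """Rectify and low-pass filter raw sEMG (samples x channels) to activation envelopes."""
    b, a = butter(4, cutoff / (fs / 2), btype="low")
    env = filtfilt(b, a, np.abs(raw), axis=0)
    return np.clip(env, 0.0, None)  # clip filter ringing so NMF input stays non-negative

def extract_synergies(envelopes, n_synergies=N_SYNERGIES):
    """Factor envelopes into time-varying activations W and fixed synergy vectors H."""
    model = NMF(n_components=n_synergies, init="nndsvda", max_iter=500)
    W = model.fit_transform(envelopes)   # activation of each synergy over time
    H = model.components_                # coactivation pattern across muscles
    return W, H, model

if __name__ == "__main__":
    # Random stand-in for a training recording containing the four gestures.
    raw = np.random.rand(10 * FS, N_CHANNELS)
    env = emg_envelope(raw)
    W, H, _ = extract_synergies(env)
    print("synergy activations:", W.shape, "synergy vectors:", H.shape)
```

In a real-time pipeline of the kind the abstract describes, the synergy vectors would be fit on training data, and incoming sEMG windows would then be projected onto them to produce the continuous, multicommand control signals driving the HCI.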
Pages: 1126-1132
Page count: 7