An EEG/EMG/EOG-Based Multimodal Human-Machine Interface to Real-Time Control of a Soft Robot Hand

Cited: 87
Authors
Zhang, Jinhua [1 ]
Wang, Baozeng [1 ]
Zhang, Cheng [1 ]
Xiao, Yanqing [2 ]
Wang, Michael Yu [1 ,3 ,4 ]
Affiliations
[1] Xi An Jiao Tong Univ, Sch Mech Engn, Educ Minist Modern Design & Rotor Bearing Syst, Key Lab, Xian, Shaanxi, Peoples R China
[2] Beihang Univ, Sch Biol Sci & Med Engn, Beijing, Peoples R China
[3] Hong Kong Univ Sci & Technol, HKUST Robot Inst, Kowloon, Dept Mech & Aerosp Engn, Hong Kong, Peoples R China
[4] Hong Kong Univ Sci & Technol, HKUST Robot Inst, Kowloon, Dept Elect & Comp Engn, Hong Kong, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
electroencephalogram (EEG); electromyogram (EMG); electrooculogram (EOG); multimodal human-machine interface (mHMI); soft robot hand; BRAIN PLASTICITY; MOTOR IMAGERY; THERAPY; STROKE; EMG; REHABILITATION; CLASSIFICATION; DESIGN; HEAD; EOG;
DOI
10.3389/fnbot.2019.00007
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104; 0812; 0835; 1405;
Abstract
Brain-computer interface (BCI) technology shows potential for application to motor rehabilitation therapies that use neural plasticity to restore motor function and improve the quality of life of stroke survivors. However, it is often difficult for BCI systems to provide the variety of control commands necessary for natural multi-task real-time control of a soft robot. In this study, a novel multimodal human-machine interface (mHMI) system is developed using combinations of electrooculography (EOG), electroencephalography (EEG), and electromyography (EMG) to generate numerous control instructions. Moreover, we also explore subject acceptance of an affordable wearable soft robot that performs basic hand actions during robot-assisted movement. Six healthy subjects separately perform left- and right-hand motor imagery, left and right eye movements, and different hand gestures in different modes to control a soft robot in a variety of actions. The results indicate that the number of mHMI control instructions is significantly greater than is achievable with any individual mode. Furthermore, the mHMI achieves an average classification accuracy of 93.83% with an average information transfer rate of 47.41 bits/min, equivalent to a control speed of 17 actions per minute. This study is expected to yield a more user-friendly mHMI for real-time control of a soft robot to help healthy or disabled persons perform basic hand movements in a friendly and convenient way.
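The abstract does not state how the 47.41 bits/min figure relates to the 93.83% accuracy and the rate of 17 actions per minute. A standard way to connect these quantities is the Wolpaw information transfer rate; the sketch below assumes that definition and a hypothetical count of 10 command classes (neither is confirmed by the record), which happens to roughly reproduce the reported number:

```python
import math

def wolpaw_itr(n_classes: int, accuracy: float, selections_per_min: float) -> float:
    """Wolpaw information transfer rate in bits/min.

    Bits per selection: log2(N) + P*log2(P) + (1-P)*log2((1-P)/(N-1)),
    where N is the number of classes and P the classification accuracy.
    """
    p = accuracy
    bits = math.log2(n_classes)
    if 0.0 < p < 1.0:  # the entropy terms vanish at P = 1 and are undefined at P = 0
        bits += p * math.log2(p) + (1.0 - p) * math.log2((1.0 - p) / (n_classes - 1))
    return bits * selections_per_min

# Assumed 10 classes, the reported 93.83% accuracy, and 17 actions/min
# give roughly 47.5 bits/min, close to the paper's 47.41 bits/min.
print(round(wolpaw_itr(10, 0.9383, 17), 2))
```

The small residual gap from 47.41 bits/min would be consistent with rounding in the reported accuracy or a different class count; the sketch is illustrative, not the authors' calculation.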
Pages: 13
Related Papers
50 records
  • [41] Real-Time Hand Gesture Recognition With EMG Using Machine Learning
    Jaramillo, Andres G.
    Benalcazar, Marco E.
    2017 IEEE SECOND ECUADOR TECHNICAL CHAPTERS MEETING (ETCM), 2017,
  • [42] Real-Time Sensing of Trust in Human-Machine Interactions
    Hu, Wan-Lin
    Akash, Kumar
    Jain, Neera
    Reid, Tahira
    IFAC PAPERSONLINE, 2016, 49 (32): : 48 - 53
  • [43] Human-Machine Interaction for Real-time Linear Optimization
    Hamel, Simon
    Gaudreault, Jonathan
    Quimper, Claude-Guy
    Bouchard, Mathieu
    Marier, Philippe
    PROCEEDINGS 2012 IEEE INTERNATIONAL CONFERENCE ON SYSTEMS, MAN, AND CYBERNETICS (SMC), 2012, : 673 - 680
  • [44] Real time posture estimation of human hand for robot hand interface
    Tanimoto, Takanobu
    Hoshino, Kiyoshi
    PROCEEDINGS OF THE SECOND INTERNATIONAL SYMPOSIUM ON UNIVERSAL COMMUNICATION, 2008, : 303 - 308
  • [45] Multimodal human-machine interface based on a Brain-Computer Interface and an electrooculography interface
    Ianez, Eduardo
    Ubeda, Andres
    Azorin, Jose M.
    2011 ANNUAL INTERNATIONAL CONFERENCE OF THE IEEE ENGINEERING IN MEDICINE AND BIOLOGY SOCIETY (EMBC), 2011, : 4572 - 4575
  • [46] Hybrid Human-Machine Interface for Gait Decoding Through Bayesian Fusion of EEG and EMG Classifiers
    Tortora, Stefano
    Tonin, Luca
    Chisari, Carmelo
    Micera, Silvestro
    Menegatti, Emanuele
    Artoni, Fiorenzo
    FRONTIERS IN NEUROROBOTICS, 2020, 14
  • [47] Real-time Vision-based Telepresence Robot Hand Control
    Deng, Chuanyun
    Lu, Jie
    Lam, Tin Lun
    2014 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND BIOMIMETICS IEEE-ROBIO 2014, 2014, : 463 - 468
  • [48] Robot-Audition-Based Human-Machine Interface for a Car
    Nakadai, Kazuhiro
    Mizumoto, Takeshi
    Nakamura, Keisuke
    2015 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS), 2015, : 6129 - 6136
  • [49] An EEG/EOG-based hybrid brain-neural computer interaction (BNCI) system to control an exoskeleton for the paralyzed hand
    Soekadar, Surjo R.
    Witkowski, Matthias
    Vitiello, Nicola
    Birbaumer, Niels
    BIOMEDICAL ENGINEERING-BIOMEDIZINISCHE TECHNIK, 2015, 60 (03): : 199 - 205
  • [50] EMG-based Human Machine Interface Control
    Patel, Aditya
    Ramsay, James
    Imtiaz, Mohammad
    Lu, Yufeng
    2019 12TH INTERNATIONAL CONFERENCE ON HUMAN SYSTEM INTERACTION (HSI), 2019, : 127 - 131