An EEG/EMG/EOG-Based Multimodal Human-Machine Interface to Real-Time Control of a Soft Robot Hand

Cited by: 87
Authors
Zhang, Jinhua [1]
Wang, Baozeng [1]
Zhang, Cheng [1]
Xiao, Yanqing [2]
Wang, Michael Yu [1,3,4]
Affiliations
[1] Xi An Jiao Tong Univ, Sch Mech Engn, Key Lab Educ Minist Modern Design & Rotor Bearing Syst, Xian, Shaanxi, Peoples R China
[2] Beihang Univ, Sch Biol Sci & Med Engn, Beijing, Peoples R China
[3] Hong Kong Univ Sci & Technol, Dept Mech & Aerosp Engn, HKUST Robot Inst, Kowloon, Hong Kong, Peoples R China
[4] Hong Kong Univ Sci & Technol, Dept Elect & Comp Engn, HKUST Robot Inst, Kowloon, Hong Kong, Peoples R China
Source
FRONTIERS IN NEUROROBOTICS, 2019
Funding
National Natural Science Foundation of China;
Keywords
electroencephalogram (EEG); electromyogram (EMG); electrooculogram (EOG); multimodal human-machine interface (mHMI); soft robot hand; BRAIN PLASTICITY; MOTOR IMAGERY; THERAPY; STROKE; EMG; REHABILITATION; CLASSIFICATION; DESIGN; HEAD; EOG;
DOI
10.3389/fnbot.2019.00007
CLC Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104; 0812; 0835; 1405;
Abstract
Brain-computer interface (BCI) technology shows potential for application to motor rehabilitation therapies that use neural plasticity to restore motor function and improve the quality of life of stroke survivors. However, it is often difficult for BCI systems to provide the variety of control commands needed for natural, multi-task, real-time control of a soft robot. In this study, a novel multimodal human-machine interface (mHMI) is developed that combines electrooculography (EOG), electroencephalography (EEG), and electromyography (EMG) to generate numerous control instructions. We also explore subject acceptance of an affordable wearable soft robot that assists basic hand actions during robot-assisted movement. Six healthy subjects separately performed left- and right-hand motor imagery, leftward and rightward eye movements, and different hand gestures in different modes to control the soft robot through a variety of actions. The results indicate that the number of control instructions available with the mHMI is significantly greater than that achievable with any individual modality. Furthermore, the mHMI achieves an average classification accuracy of 93.83% with an average information transfer rate of 47.41 bits/min, which corresponds to a control speed of approximately 17 actions per minute. This study is expected to lead to a more user-friendly mHMI for real-time control of a soft robot, helping healthy or disabled persons perform basic hand movements in a convenient way.
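The reported relationship between classification accuracy, information transfer rate (ITR), and control speed can be illustrated with the standard Wolpaw ITR formula. The sketch below is illustrative only: the number of command classes (`n_classes`) and the per-action selection time (`trial_duration_s`) are assumed values not given in this record, so the output does not reproduce the paper's exact 47.41 bits/min figure.

```python
import math

def wolpaw_itr_bits_per_selection(accuracy: float, n_classes: int) -> float:
    """Wolpaw information transfer rate per selection (bits).

    ITR = log2(N) + P*log2(P) + (1 - P)*log2((1 - P) / (N - 1)),
    where N is the number of classes and P is the classification accuracy.
    """
    if n_classes < 2:
        raise ValueError("Need at least two classes")
    p = accuracy
    itr = math.log2(n_classes)
    if 0.0 < p < 1.0:
        itr += p * math.log2(p) + (1.0 - p) * math.log2((1.0 - p) / (n_classes - 1))
    return itr

# Hypothetical example: 93.83% accuracy over 8 command classes,
# with one selection (robot action) issued every ~3.5 s (both assumed).
accuracy = 0.9383
n_classes = 8            # assumed, not stated in this record
trial_duration_s = 3.5   # assumed, not stated in this record

bits_per_selection = wolpaw_itr_bits_per_selection(accuracy, n_classes)
selections_per_min = 60.0 / trial_duration_s
itr_bits_per_min = bits_per_selection * selections_per_min

print(f"{bits_per_selection:.2f} bits/selection, "
      f"{selections_per_min:.1f} actions/min, "
      f"{itr_bits_per_min:.1f} bits/min")
```

With these assumed values the formula yields roughly 2.5 bits per selection at about 17 selections per minute, i.e. on the order of 40-plus bits/min, which shows how a high-accuracy, multi-class command set can produce ITR and action-rate figures of the magnitude reported in the abstract.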
Pages: 13
Related Papers
50 records in total
  • [1] An EOG-Based Human-Machine Interface for Wheelchair Control
    Huang, Qiyun
    He, Shenghong
    Wang, Qihong
    Gu, Zhenghui
    Peng, Nengneng
    Li, Kai
    Zhang, Yuandong
    Shao, Ming
    Li, Yuanqing
    IEEE TRANSACTIONS ON BIOMEDICAL ENGINEERING, 2018, 65 (09) : 2023 - 2032
  • [2] Real-Time EEG-EMG Human-Machine Interface-Based Control System for a Lower-Limb Exoskeleton
    Gordleeva, Susanna Yu
    Lobov, Sergey A.
    Grigorev, Nikita A.
    Savosenkov, Andrey O.
    Shamshin, Maxim O.
Lukoyanov, Maxim V.
    Khoruzhko, Maxim A.
    Kazantsev, Victor B.
    IEEE ACCESS, 2020, 8 : 84070 - 84081
  • [3] EOG/ERP Hybrid Human-Machine Interface for Robot Control
    Ma, Jiaxin
    Zhang, Yu
    Nam, Yunjun
    Cichocki, Andrzej
    Matsuno, Fumitoshi
    2013 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS), 2013, : 859 - 864
  • [4] Effects of output speed threshold on real-time continuous EMG human-machine interface control
    Chung, Sang Hun
    Crouch, Dustin L.
    Huang, He
    2017 IEEE INTERNATIONAL CONFERENCE ON SYSTEMS, MAN, AND CYBERNETICS (SMC), 2017, : 1375 - 1380
  • [5] An EOG-Based Human-Machine Interface to Control a Smart Home Environment for Patients With Severe Spinal Cord Injuries
    Zhang, Rui
    He, Shenghong
    Yang, Xinghua
    Wang, Xiaoyun
    Li, Kai
    Huang, Qiyun
    Yu, Zhuliang
    Zhang, Xichun
    Tang, Dan
    Li, Yuanqing
    IEEE TRANSACTIONS ON BIOMEDICAL ENGINEERING, 2019, 66 (01) : 89 - 100
  • [6] A Novel EOG/EEG Hybrid Human-Machine Interface Adopting Eye Movements and ERPs: Application to Robot Control
    Ma, Jiaxin
    Zhang, Yu
    Cichocki, Andrzej
    Matsuno, Fumitoshi
    IEEE TRANSACTIONS ON BIOMEDICAL ENGINEERING, 2015, 62 (03) : 876 - 889
  • [7] Real-time EMG-based Human Machine Interface Using Dynamic Hand Gestures
    Shin, Sungtae
    Tafreshi, Reza
    Langari, Reza
    2017 AMERICAN CONTROL CONFERENCE (ACC), 2017, : 5456 - 5461
  • [8] Application of EEG for Multimodal Human-Machine Interface
    Park, Jangwoo
    Woo, Il
    Park, Shinsuk
    2012 12TH INTERNATIONAL CONFERENCE ON CONTROL, AUTOMATION AND SYSTEMS (ICCAS), 2012, : 1869 - 1873
  • [9] A Human-Machine Interface Based on an EOG and a Gyroscope for Humanoid Robot Control and Its Application to Home Services
    Wang, Fan
    Li, Xiongzi
    Pan, Jiahui
    JOURNAL OF HEALTHCARE ENGINEERING, 2022, 2022
  • [10] Real-time humanoid avatar for multimodal human-machine interaction
    Fu, Yun
    Li, Renxiang
    Huang, Thomas S.
    Danielsen, Mike
2007 IEEE INTERNATIONAL CONFERENCE ON MULTIMEDIA AND EXPO, VOLS 1-5, 2007, : 991+