A Multimodal Human-Robot Interface to Drive a Neuroprosthesis for Tremor Management

Cited: 31
Authors
Alvaro Gallego, Juan [1]
Ibanez, Jaime [1]
Dideriksen, Jakob Lund [2]
Ignacio Serrano, Jose [1]
Dolores del Castillo, Maria [1]
Farina, Dario [3]
Rocon, Eduardo [1]
Affiliations
[1] CSIC, Bioengn Grp, E-28006 Madrid, Spain
[2] Aalborg Univ, Ctr Sensory Motor Interact, Dept Hlth Sci & Technol, DK-9220 Aalborg, Denmark
[3] Univ Gottingen, Dept Neurorehabil Engn, Bernstein Focus Neurotechnol Goettingen, Bernstein Ctr Computat Neurosci, D-37073 Gottingen, Germany
Keywords
Electroencephalography; electromyography; neural engineering; sensor fusion; EVENT-RELATED DESYNCHRONIZATION; PATHOLOGICAL TREMORS; PHYSIOLOGICAL TREMOR; VOLUNTARY MOVEMENT; AMPLITUDE; SYNCHRONIZATION; CLASSIFICATION; STIMULATION
DOI
10.1109/TSMCC.2012.2200101
Chinese Library Classification (CLC)
TP18 [Artificial intelligence theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Tremor is the most prevalent movement disorder, and its incidence increases with aging. Despite the numerous therapeutic solutions available, 65% of those suffering from upper limb tremor report serious difficulties in their daily living. This motivates research on alternative treatments, among which wearable robots that apply selective mechanical loads constitute an appealing approach. In this context, the current work presents a multimodal human-robot interface to drive a neuroprosthesis for tremor management. Our approach relies on the precise characterization of the tremor to modulate a functional electrical stimulation system that compensates for it. The neuroprosthesis is triggered by the detection of the intention to move, derived from the analysis of electroencephalographic activity, which provides a natural interface with the user. When a prediction is delivered, surface electromyography serves to detect the actual onset of the tremor in the presence of volitional activity. This information in turn triggers the stimulation, which relies on tremor parameters (amplitude and frequency) derived from a pair of inertial sensors that record the kinematics of the affected joint. Surface electromyography also yields a first characterization of the tremor, together with precise information on the preferred stimulation site. Apart from allowing for an optimized performance of the system, our multimodal approach permits the implementation of redundant methods to both enhance the reliability of the system and adapt to the specific needs of different users. Results with a representative group of patients illustrate the performance of the interface presented here and demonstrate its feasibility.
Pages: 1159-1168
Page count: 10
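The abstract describes deriving the tremor parameters (amplitude and frequency) from inertial sensors that record the kinematics of the affected joint. As a minimal illustration of that step only (not the authors' actual algorithm, whose details are in the full paper), the sketch below locates the dominant spectral peak of a gyroscope signal inside a typical pathological-tremor band; the sampling rate, band limits, and signal values are illustrative assumptions.

```python
import numpy as np

def estimate_tremor(signal, fs, band=(3.0, 12.0)):
    """Estimate the dominant tremor frequency (Hz) and its amplitude
    from a 1-D inertial signal by finding the spectral peak within the
    assumed tremor band, ignoring slower voluntary movement."""
    n = len(signal)
    window = np.hanning(n)
    spectrum = np.abs(np.fft.rfft(signal * window))
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    peak = np.argmax(spectrum * in_band)  # zero out-of-band bins, take the max
    # Rescale the peak bin to the amplitude of the underlying sinusoid,
    # compensating for the Hann window's coherent gain.
    amplitude = 2.0 * spectrum[peak] / np.sum(window)
    return freqs[peak], amplitude

if __name__ == "__main__":
    fs = 200.0                      # assumed inertial sampling rate (Hz)
    t = np.arange(0, 5.0, 1.0 / fs)
    # Synthetic gyro trace: ~1 Hz voluntary movement plus a 5 Hz tremor
    gyro = 1.0 * np.sin(2 * np.pi * 1.0 * t) + 0.5 * np.sin(2 * np.pi * 5.0 * t)
    freq, amp = estimate_tremor(gyro, fs)
    print(f"tremor: {freq:.2f} Hz, amplitude ~{amp:.2f}")
```

Restricting the peak search to the 3-12 Hz band is what lets the estimate ignore the larger, slower voluntary component, mirroring the paper's need to characterize tremor in the presence of volitional activity.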