Brain-computer interface for robot control with eye artifacts for assistive applications

Cited by: 6
Authors
Karas, Kaan [1 ]
Pozzi, Luca [1 ]
Pedrocchi, Alessandra [2 ]
Braghin, Francesco [1 ]
Roveda, Loris [3 ]
Affiliations
[1] Politecn Milan, Dept Mech Engn, Via La Masa 1, I-20156 Milan, Italy
[2] Politecn Milan, Dept Elect Informat & Bioengn, Nearlab, Via Giuseppe Colombo 40, I-20133 Milan, Italy
[3] Univ Svizzera Italiana USI, Ist Dalle Molle Intelligenza Artificiale IDSIA, Scuola Univ Profess Svizzera Italiana SUPSI, Via La Santa 1, CH-6962 Lugano, Switzerland
Keywords
WHEELCHAIR; MOVEMENTS; EOG; ARM;
DOI
10.1038/s41598-023-44645-y
Chinese Library Classification
O [Mathematical Sciences and Chemistry]; P [Astronomy and Earth Sciences]; Q [Biological Sciences]; N [General Natural Sciences];
Discipline Codes
07; 0710; 09;
Abstract
Human-robot interaction is a rapidly developing field, and robots are taking increasingly active roles in our daily lives. Patient care is one of the fields in which robots are becoming more present, especially for people with disabilities. People with neurodegenerative disorders may be unable to consciously or voluntarily produce movements other than those involving the eyes or eyelids. In this context, Brain-Computer Interface (BCI) systems offer an alternative way to communicate or interact with the external world. To improve the lives of people with disabilities, this paper presents a novel BCI to control an assistive robot with the user's eye artifacts. In this study, eye artifacts that contaminate the electroencephalogram (EEG) signals are treated as a valuable source of information thanks to their high signal-to-noise ratio and intentional generation. The proposed methodology detects eye artifacts from EEG signals through the characteristic shapes that occur during these events. Lateral eye movements are distinguished by their ordered peak-and-valley formation and by the opposite phase of the signals measured at the F7 and F8 channels. To the best of the authors' knowledge, this is the first method to use this behavior to detect lateral eye movements. For blink detection, a double-thresholding method is proposed to catch weak blinks as well as regular ones, unlike other algorithms in the literature that normally use a single threshold. Events detected in real time, together with their virtual time stamps, are fed into a second algorithm that distinguishes double and quadruple blinks from single blinks based on occurrence frequency. After being tested offline and in real time, the algorithm is implemented on the device. The resulting BCI was used to control an assistive robot through a graphical user interface. Validation experiments with five participants demonstrate that the developed BCI is able to control the robot.
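The abstract describes two signal-level detectors (an opposite-phase peak/valley pattern at F7 and F8 for lateral eye movements, and a double-threshold rule for blinks) followed by a second stage that groups blink time stamps into single, double, or quadruple commands. The Python sketch below is a minimal illustration of those ideas, not the authors' implementation; the sampling rate, the Fp1 channel choice for blinks, all threshold values, and the burst gap are placeholder assumptions for the example.

```python
# Minimal sketch of the detectors described in the abstract, assuming EEG
# windows arrive as NumPy arrays. All numeric parameters are illustrative
# placeholders, not the published values.
from typing import List, Optional

import numpy as np

FS = 250  # assumed sampling rate in Hz


def detect_lateral_movement(f7: np.ndarray, f8: np.ndarray,
                            amp_thresh: float = 60.0) -> Optional[str]:
    """Classify a window as a 'left'/'right' lateral eye movement, or None.

    The cue is an ordered peak/valley in one frontal channel mirrored in
    opposite phase by the contralateral channel.
    """
    f7 = f7 - f7.mean()
    f8 = f8 - f8.mean()
    # Opposite phase: strongly negative correlation between F7 and F8.
    if np.corrcoef(f7, f8)[0, 1] > -0.5:
        return None
    # Require a large deflection so noise is not mistaken for a movement.
    if max(np.abs(f7).max(), np.abs(f8).max()) < amp_thresh:
        return None
    # The order of the extrema gives the direction; the left/right
    # assignment below is an assumption made for this sketch.
    peak_first = np.argmax(f7) < np.argmin(f7)
    return "left" if peak_first else "right"


def detect_blink(fp1: np.ndarray,
                 strong_thresh: float = 100.0,
                 weak_thresh: float = 50.0) -> bool:
    """Double-threshold blink detector.

    A sample above the strong threshold counts immediately; a deflection
    above the weak threshold only counts if it is sustained, so weak blinks
    are caught without firing on noise.
    """
    x = fp1 - fp1.mean()
    if x.max() >= strong_thresh:
        return True
    above_weak = x >= weak_thresh
    min_samples = int(0.03 * FS)  # "sustained" = at least 30 ms (assumed)
    run = 0
    for flag in above_weak:
        run = run + 1 if flag else 0
        if run >= min_samples:
            return True
    return False


def count_blink_burst(stamps: List[float], max_gap: float = 0.5) -> int:
    """Second-stage grouping of blink time stamps (in seconds).

    Consecutive blinks closer than max_gap belong to one burst, so bursts of
    1, 2, or 4 blinks can be mapped to different robot commands.
    """
    if not stamps:
        return 0
    count = 1
    for prev, cur in zip(stamps, stamps[1:]):
        if cur - prev <= max_gap:
            count += 1
        else:
            break
    return count
```

The opposite-phase check is what separates a lateral movement, which deflects F7 and F8 in opposite directions, from a blink or vertical movement, which deflects both frontal channels with the same polarity; this is presumably why the abstract singles out that behavior as the detection cue.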
Pages: 16
Related Papers (10 of 50 shown)
  • [1] Brain-computer interface for robot control with eye artifacts for assistive applications
    Kaan Karas
    Luca Pozzi
    Alessandra Pedrocchi
    Francesco Braghin
    Loris Roveda
    [J]. Scientific Reports, 13
  • [2] Robot control with multitasking of brain-computer interface
    Zhou, Yajun
    Lu, Zilin
    Li, Yuanqing
    [J]. 2022 17TH INTERNATIONAL CONFERENCE ON CONTROL, AUTOMATION, ROBOTICS AND VISION (ICARCV), 2022, : 155 - 160
  • [3] A Hybrid Brain-Computer Interface Implementation for the Control of an Assistive Device
    Ianosi-Andreeva-Dimitrova, Alexandru
    Mandru, Dan Silviu
    Abrudean, Adrian
    [J]. 2017 IEEE INTERNATIONAL CONFERENCE ON E-HEALTH AND BIOENGINEERING CONFERENCE (EHB), 2017, : 543 - 546
  • [4] SSVEP Based Brain-Computer Interface for Robot Control
    Ortner, Rupert
    Guger, Christoph
    Prueckl, Robert
    Gruenbacher, Engelbert
    Edlinger, Guenter
    [J]. COMPUTERS HELPING PEOPLE WITH SPECIAL NEEDS, PROCEEDINGS, PT 2, 2010, 6180 : 85 - 90
  • [5] Real-Time Detection and Filtering of Eye Blink Related Artifacts for Brain-Computer Interface Applications
    Binias, Bartosz
    Palus, Henryk
    Jaskot, Krzysztof
    [J]. MAN-MACHINE INTERACTIONS 4, ICMMI 2015, 2016, 391 : 281 - 290
  • [6] EEG-Based Brain-Computer Interface for Control of Assistive Devices
    Kapralov, Nikolay V.
    Ekimovskii, Jaroslav V.
    Potekhin, Vyacheslav V.
    [J]. CYBER-PHYSICAL SYSTEMS AND CONTROL, 2020, 95 : 536 - 543
  • [7] Development of a Motor Imagery Based Brain-computer Interface for Humanoid Robot Control Applications
    Prakaksita, Narendra
    Kuo, Chen-Yun
    Kuo, Chung-Hsien
    [J]. PROCEEDINGS 2016 IEEE INTERNATIONAL CONFERENCE ON INDUSTRIAL TECHNOLOGY (ICIT), 2016, : 1607 - 1613
  • [8] A Brain-Computer Interface for Robot Navigation
    Nawroj, Ahsan I.
    Wang, Siyuan
    Yu, Yih-Choung
    Gabel, Lisa
    [J]. 2012 38TH ANNUAL NORTHEAST BIOENGINEERING CONFERENCE (NEBEC), 2012, : 15 - +
  • [9] Developing an Optical Brain-Computer Interface for Humanoid Robot Control
    Batula, Alyssa M.
    Mark, Jesse
    Kim, Youngmoo E.
    Ayaz, Hasan
    [J]. FOUNDATIONS OF AUGMENTED COGNITION: NEUROERGONOMICS AND OPERATIONAL NEUROSCIENCE, AC 2016, PT I, 2016, 9743 : 3 - 13
  • [10] Control of a humanoid robot by a noninvasive brain-computer interface in humans
    Bell, Christian J.
    Shenoy, Pradeep
    Chalodhorn, Rawichote
    Rao, Rajesh P. N.
    [J]. JOURNAL OF NEURAL ENGINEERING, 2008, 5 (02) : 214 - 220