An EEG & eye-tracking dataset of ALS patients & healthy people during eye-tracking-based spelling system usage

Cited by: 0

Authors
Ngo, Thi Duyen [1]
Kieu, Hai Dang [1]
Nguyen, Minh Hoa [1]
Nguyen, The Hoang-Anh [2]
Can, Van Mao [3]
Nguyen, Ba Hung [3]
Le, Thanh Ha [1]
Affiliations
[1] Vietnam Natl Univ, Univ Engn & Technol, Hanoi, Vietnam
[2] Minist Sci & Technol, Vietnam Korea Inst Sci & Technol, Hanoi, Vietnam
[3] Vietnam Mil Med Univ, Hanoi, Vietnam
Keywords
BRAIN-COMPUTER INTERFACES; COMMUNICATION
DOI
10.1038/s41597-024-03501-y
Chinese Library Classification (CLC)
O [Mathematical Sciences and Chemistry]; P [Astronomy and Earth Sciences]; Q [Biological Sciences]; N [General Natural Sciences]
Discipline codes
07; 0710; 09
Abstract
This research presents a dataset consisting of electroencephalogram and eye tracking recordings obtained from six patients with amyotrophic lateral sclerosis (ALS) in a locked-in state and one hundred seventy healthy individuals. The ALS patients exhibited varying degrees of disease progression, ranging from partial mobility and weakened speech to complete paralysis and loss of speech. Despite these physical impairments, the ALS patients retained good eye function, which allowed them to use a virtual keyboard for communication. Data from ALS patients was recorded multiple times at their homes, while data from healthy individuals was recorded once in a laboratory setting. For each data recording, the experimental design involved nine recording sessions per participant, each corresponding to a common human action or demand. This dataset can serve as a valuable benchmark for several applications, such as improving spelling systems with brain-computer interfaces, investigating motor imagination, exploring motor cortex function, monitoring motor impairment progress in patients undergoing rehabilitation, and studying the effects of ALS on cognitive and motor processes.
Pages: 11