High performance P300 spellers using GPT2 word prediction with cross-subject training

Cited by: 0
Authors
Parthasarathy, Nithin [1]
Soetedjo, James [2]
Panchavati, Saarang [3]
Parthasarathy, Nitya [1,4]
Arnold, Corey [3]
Pouratian, Nader [5]
Speier, William [3]
Affiliations
[1] Univ Illinois, Dept Comp Sci, Champaign, IL 61820 USA
[2] Univ Washington, Dept Bioengn, Seattle, WA USA
[3] UCLA, Dept Radiol Sci, Los Angeles, CA USA
[4] PE Investments, Boston, MA USA
[5] Univ Texas Southwestern, Dept Neurol Surg, Dallas, TX USA
Keywords
Amyotrophic lateral sclerosis (ALS); brain-computer interface (BCI); P300; EEG; generative pre-trained transformer (GPT2); language models; information
DOI
10.1080/2326263X.2024.2413214
CLC Number
R318 [Biomedical Engineering]
Discipline Code
0831
Abstract
Amyotrophic lateral sclerosis (ALS) severely impairs patients' ability to communicate, often leading to a decline in their quality of life within a few years of diagnosis. The P300 speller brain-computer interface (BCI) offers an alternative communication method by interpreting a subject's EEG response to flashing characters presented on a grid interface. This paper addresses the common speed limitations encountered in training efficient P300-based multi-subject classifiers by introducing innovative 'across-subject' classifiers. We leverage a combination of the second-generation Generative Pre-Trained Transformer (GPT2) and Dijkstra's algorithm to optimize stimuli and suggest word completion choices based on subjects' typing history. Additionally, we employ a multi-layered smoothing technique to accommodate out-of-vocabulary (OOV) words. Through extensive simulations employing random sampling of EEG data from multiple subjects, we demonstrate significant speed enhancements in typing passages containing rare and OOV words. These optimizations result in approximately 10% improvement in character-level typing speed and up to 40% improvement in multi-word prediction. We demonstrate that augmenting standard row/column highlighting techniques with layered word prediction yields close-to-optimal performance. Furthermore, we explore both 'within-subject' and 'across-subject' training techniques, showing that speed improvements are consistent across both approaches.
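As an illustration of the word-prediction component described in the abstract, the Python sketch below ranks word-completion candidates with GPT2 and mixes the language-model probabilities with a uniform floor, a simplified stand-in for the paper's multi-layered smoothing. It is not the authors' implementation: the Dijkstra-based stimulus ordering is omitted, and the model name "gpt2", the candidate vocabulary, and the mixing weight alpha are illustrative assumptions.

    # Hedged sketch: GPT2-based word completion with a uniform-floor back-off
    # for rare / OOV-like candidates. Not the paper's exact pipeline.
    import math
    import torch
    from transformers import GPT2LMHeadModel, GPT2TokenizerFast

    tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
    model = GPT2LMHeadModel.from_pretrained("gpt2")
    model.eval()

    def continuation_logprob(history: str, continuation: str) -> float:
        """Log P(continuation | history) under GPT2, summed over BPE tokens."""
        hist_ids = tokenizer.encode(history)
        cont_ids = tokenizer.encode(continuation)
        input_ids = torch.tensor([hist_ids + cont_ids])
        with torch.no_grad():
            logits = model(input_ids).logits            # (1, seq_len, vocab)
        log_probs = torch.log_softmax(logits, dim=-1)
        total = 0.0
        for pos in range(len(hist_ids), len(hist_ids) + len(cont_ids)):
            token_id = input_ids[0, pos].item()
            # The distribution over the token at `pos` comes from position pos - 1.
            total += log_probs[0, pos - 1, token_id].item()
        return total

    def rank_completions(history, prefix, vocabulary, alpha=0.9):
        """Rank candidate words that extend the partially typed `prefix`.

        The LM probability is mixed with a uniform floor (1 - alpha) so that
        candidates GPT2 assigns near-zero mass (rare or OOV-like words) still
        appear in the suggestion list.
        """
        matches = [w for w in vocabulary if w.startswith(prefix)]
        if not matches:
            return []
        floor = (1.0 - alpha) / len(matches)
        scores = {}
        for word in matches:
            lp = continuation_logprob(history, " " + word)  # leading space starts a new GPT2 word
            scores[word] = alpha * math.exp(lp) + floor
        return sorted(matches, key=scores.get, reverse=True)

    if __name__ == "__main__":
        # Example: the subject has typed "my name is" and begun the next word with "n".
        print(rank_completions("my name is", "n", ["nathan", "nithin", "nice", "nobody"]))

In an online speller, the returned ranking would populate the word-suggestion slots on the grid; the uniform floor is a deliberately simple smoothing choice, whereas the paper describes a multi-layered scheme for handling OOV words.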
Pages: 210-224
Page count: 15