Brain-Computer Interface Integrated With Augmented Reality for Human-Robot Interaction

Cited by: 15
Authors
Fang, Bin [1 ]
Ding, Wenlong [2 ]
Sun, Fuchun [1 ]
Shan, Jianhua [2 ]
Wang, Xiaojia [3 ]
Wang, Chengyin [2 ]
Zhang, Xinyu [4 ]
Affiliations
[1] Tsinghua Univ, Dept Comp Sci & Technol, Beijing 100084, Peoples R China
[2] Anhui Univ Technol, Dept Mech Engn, Maanshan 243002, Anhui, Peoples R China
[3] Clemson Univ, Dept Elect & Comp Engn, Clemson, SC 29634 USA
[4] Tsinghua Univ, State Key Lab Automot Safety & Energy, Beijing 100084, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Augmented reality (AR); brain-computer interface (BCI) system; FB-tCNN; human-robot interaction; steady-state visual evoked potential (SSVEP); stimulation interface; visual information; COMMUNICATION;
DOI
10.1109/TCDS.2022.3194603
CLC Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The brain-computer interface (BCI) is increasingly used in human-robot interaction systems. The steady-state visual evoked potential (SSVEP), a paradigm of electroencephalography (EEG), has attracted growing attention in BCI research due to its stability and efficiency. However, a traditional SSVEP-BCI system requires an independent monitor to display stimulus targets, and each stimulus target is fixedly mapped to a preset command. These limitations hinder the development of SSVEP-BCI application systems in complex and changeable scenarios. In this study, an SSVEP-BCI system integrated with augmented reality (AR) is proposed. Furthermore, a stimulation interface is built by merging the visual information of the objects with the stimulus targets; it automatically updates the mapping between stimulus targets and objects to adapt to changes in the workspace. In the online experiment of the AR-based SSVEP-BCI cue-guided task with a robotic arm, the grasping success rate is 87.50 +/- 3.10% with an SSVEP-EEG data recognition time of 0.5 s based on FB-tCNN. The proposed AR-based SSVEP-BCI system enables users to select intended targets more ecologically and to grasp more kinds of objects with a limited number of stimulus targets, giving it the potential to be used in complex and changeable scenarios.
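The dynamic mapping described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the flicker frequencies, object labels, and function names are all hypothetical, and the real system pairs stimulus targets with objects detected in the AR view rather than with plain strings.

```python
# A minimal sketch of the dynamic target-to-object mapping idea:
# a fixed set of SSVEP flicker frequencies (the stimulus targets) is
# re-assigned to whichever objects the vision pipeline currently
# reports, so the same limited set of targets can address changing
# workspace contents. All names here are illustrative assumptions.

FLICKER_FREQS_HZ = [8.0, 9.0, 10.0, 11.0]  # fixed stimulus targets

def update_mapping(detected_objects):
    """Assign each currently detected object to a flicker frequency.

    detected_objects: object labels from a (hypothetical) vision
    pipeline. Only as many objects as there are stimulus targets can
    be addressed at once.
    """
    if len(detected_objects) > len(FLICKER_FREQS_HZ):
        raise ValueError("more objects than available stimulus targets")
    return dict(zip(FLICKER_FREQS_HZ, detected_objects))

def decode_selection(recognized_freq, mapping, tol=0.25):
    """Return the object whose stimulus frequency matches the frequency
    recognized from the user's EEG, within a small tolerance."""
    for freq, obj in mapping.items():
        if abs(freq - recognized_freq) <= tol:
            return obj
    return None  # recognized frequency matches no active target

# If the workspace changes, update_mapping is simply called again with
# the new object list; the stimulus targets themselves never change.
mapping = update_mapping(["cup", "ball", "pen"])
print(decode_selection(10.0, mapping))  # -> pen
```

In the paper's pipeline, `recognized_freq` would come from the FB-tCNN classifier operating on 0.5 s of EEG data; here it is treated as a given input.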
Pages: 1702 - 1711
Page count: 10
Related Papers
50 items
  • [31] BARI: An Affordable Brain-Augmented Reality Interface to Support Human-Robot Collaboration in Assembly Tasks
    Sanna, Andrea
    Manuri, Federico
    Fiorenza, Jacopo
    De Pace, Francesco
    INFORMATION, 2022, 13 (10)
  • [32] Reliable Planning and Execution of a Human-Robot Cooperative System Based on Noninvasive Brain-Computer Interface with Uncertainty
    Jia, Wenchuan
    Huang, Dandan
    Bai, Ou
    Pu, Huayan
    Luo, Xin
    Chen, Xuedong
    2012 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS), 2012, : 3798 - 3805
  • [33] Robotic arm control system based on augmented reality brain-computer interface and computer vision
    Chen X.
    Li K.
    Shengwu Yixue Gongchengxue Zazhi/Journal of Biomedical Engineering, 2021, 38 (03): : 483 - 491
  • [34] An Affective Interaction System using Virtual Reality and Brain-Computer Interface
    Chin, Zheng Yang
    Zhang, Zhuo
    Wang, Chuanchu
    Ang, Kai Keng
    2021 43RD ANNUAL INTERNATIONAL CONFERENCE OF THE IEEE ENGINEERING IN MEDICINE & BIOLOGY SOCIETY (EMBC), 2021, : 6183 - 6186
  • [35] Augmented Reality in Human-Robot Cooperative Search
    Lee, Kevin
    Reardon, Christopher
    Fink, Jonathan
    2018 IEEE INTERNATIONAL SYMPOSIUM ON SAFETY, SECURITY, AND RESCUE ROBOTICS (SSRR), 2018,
  • [36] A Survey of Augmented Reality for Human-Robot Collaboration
    Chang, Christine T.
    Hayes, Bradley
    MACHINES, 2024, 12 (08)
  • [37] A Brain-Computer Interface and Augmented Reality Neurofeedback to Treat ADHD: A Virtual Telekinesis Approach
    Reddy, G. S. Rajshekar
    Lingaraju, G. M.
    ADJUNCT PROCEEDINGS OF THE 2020 IEEE INTERNATIONAL SYMPOSIUM ON MIXED AND AUGMENTED REALITY (ISMAR-ADJUNCT 2020), 2020, : 123 - 128
  • [38] A Robotic Teleoperation System Enhanced by Augmented Reality for Natural Human-Robot Interaction
    Wang, Xingchao
    Guo, Shuqi
    Xu, Zijian
    Zhang, Zheyuan
    Sun, Zhenglong
    Xu, Yangsheng
    CYBORG AND BIONIC SYSTEMS, 2024, 5 : 1 - 12
  • [39] Virtual, Augmented, and Mixed Reality for Human-Robot Interaction (VAM-HRI)
    Wozniak, Maciej
    Chang, Christine T.
    Luebbers, Matthew B.
    Ikeda, Bryce
    Walker, Michael
    Rosen, Eric
    Groechel, Thomas Roy
    COMPANION OF THE ACM/IEEE INTERNATIONAL CONFERENCE ON HUMAN-ROBOT INTERACTION, HRI 2023, 2023, : 938 - 940
  • [40] Virtual, Augmented, and Mixed Reality for Human-Robot Interaction (VAM-HRI)
    Williams, Tom
    Szafir, Daniel
    Chakraborti, Tathagata
    Khim, Ong Soh
    Rosen, Eric
    Booth, Serena
    Groechel, Thomas
    HRI'20: COMPANION OF THE 2020 ACM/IEEE INTERNATIONAL CONFERENCE ON HUMAN-ROBOT INTERACTION, 2020, : 663 - 664