Brain-Computer Interface Integrated With Augmented Reality for Human-Robot Interaction

Cited by: 15
Authors
Fang, Bin [1 ]
Ding, Wenlong [2 ]
Sun, Fuchun [1 ]
Shan, Jianhua [2 ]
Wang, Xiaojia [3 ]
Wang, Chengyin [2 ]
Zhang, Xinyu [4 ]
Affiliations
[1] Tsinghua Univ, Dept Comp Sci & Technol, Beijing 100084, Peoples R China
[2] Anhui Univ Technol, Dept Mech Engn, Maanshan 243002, Anhui, Peoples R China
[3] Clemson Univ, Dept Elect & Comp Engn, Clemson, SC 29634 USA
[4] Tsinghua Univ, State Key Lab Automot Safety & Energy, Beijing 100084, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Augmented reality (AR); brain-computer interface (BCI) system; FB-tCNN; human-robot interaction; steady-state visual evoked potential (SSVEP); stimulation interface; visual information; COMMUNICATION;
DOI
10.1109/TCDS.2022.3194603
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104; 0812; 0835; 1405;
Abstract
Brain-computer interfaces (BCIs) are increasingly used in human-robot interaction systems. The steady-state visual evoked potential (SSVEP), an electroencephalography (EEG) paradigm, has attracted growing attention in BCI research due to its stability and efficiency. However, a traditional SSVEP-BCI system requires an independent monitor to display the stimulus targets, and each stimulus target is fixedly mapped to a preset command. These limitations hinder the application of SSVEP-BCI systems in complex and changeable scenarios. In this study, an SSVEP-BCI system integrated with augmented reality (AR) is proposed. Furthermore, a stimulation interface is constructed by merging the visual information of the objects with the stimulus targets, so that the mapping between stimulus targets and objects is updated automatically to adapt to changes of the objects in the workspace. In an online AR-based SSVEP-BCI cue-guided grasping experiment with a robotic arm, the success rate was 87.50 +/- 3.10% with an SSVEP-EEG recognition time of 0.5 s based on FB-tCNN. The proposed AR-based SSVEP-BCI system enables users to select intended targets more ecologically and to grasp a greater variety of objects with a limited number of stimulus targets, giving it the potential to be used in complex and changeable scenarios.
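The core idea of the abstract — a limited, fixed set of SSVEP stimulus targets whose mapping to workspace objects is refreshed automatically as the scene changes — can be sketched as follows. This is a minimal illustration under assumed names (`update_mapping`, `command_for`, the frequency list, and object labels are all hypothetical), not the paper's actual FB-tCNN pipeline or AR implementation.

```python
# Hypothetical sketch: remap a fixed set of SSVEP flicker frequencies
# onto whatever objects are currently detected in the AR view, so a
# limited number of stimulus targets can address a changing workspace.

FLICKER_FREQS_HZ = [8.0, 10.0, 12.0, 15.0]  # fixed stimulus targets

def update_mapping(detected_objects):
    """Assign each currently detected object to one flicker frequency.

    zip() stops at the shorter sequence, so objects beyond the number
    of available targets are simply left unmapped in this sketch.
    """
    return {freq: obj for freq, obj in zip(FLICKER_FREQS_HZ, detected_objects)}

def command_for(recognized_freq, mapping, tol=0.5):
    """Translate a recognized SSVEP frequency into a grasp command,
    tolerating small frequency-estimation error."""
    for freq, obj in mapping.items():
        if abs(freq - recognized_freq) <= tol:
            return f"grasp:{obj}"
    return None  # frequency matches no currently mapped target

# When the workspace changes, the same stimulus targets address
# different objects; the mapping follows the scene.
mapping = update_mapping(["cup", "pen", "apple"])
print(command_for(10.0, mapping))  # prints "grasp:pen"
```

The point of the design is that the stimulation interface stays fixed (the user always sees the same flickering targets) while the semantics of each target track the scene, which is what lets a small target set cover many object types.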
Pages: 1702-1711
Page count: 10