Brain-Computer Interface Integrated With Augmented Reality for Human-Robot Interaction

Cited by: 15
Authors:
Fang, Bin [1 ]
Ding, Wenlong [2 ]
Sun, Fuchun [1 ]
Shan, Jianhua [2 ]
Wang, Xiaojia [3 ]
Wang, Chengyin [2 ]
Zhang, Xinyu [4 ]
Affiliations:
[1] Tsinghua Univ, Dept Comp Sci & Technol, Beijing 100084, Peoples R China
[2] Anhui Univ Technol, Dept Mech Engn, Maanshan 243002, Anhui, Peoples R China
[3] Clemson Univ, Dept Elect & Comp Engn, Clemson, SC 29634 USA
[4] Tsinghua Univ, State Key Lab Automot Safety & Energy, Beijing 100084, Peoples R China
Funding:
National Natural Science Foundation of China
Keywords:
Augmented reality (AR); brain-computer interface (BCI) system; FB-tCNN; human-robot interaction; steady-state visual evoked potential (SSVEP); stimulation interface; visual information; COMMUNICATION;
DOI:
10.1109/TCDS.2022.3194603
CLC classification:
TP18 [Artificial Intelligence Theory]
Discipline codes:
081104; 0812; 0835; 1405
Abstract:
Brain-computer interfaces (BCIs) are increasingly used in human-robot interaction systems. The steady-state visual evoked potential (SSVEP), an electroencephalography (EEG) paradigm, has attracted growing attention in BCI research because of its stability and efficiency. However, a traditional SSVEP-BCI system requires a separate monitor to display the stimulus targets, and each stimulus target is fixedly mapped to a preset command. These constraints limit the application of SSVEP-BCI systems in complex and changeable scenarios. In this study, an SSVEP-BCI system integrated with augmented reality (AR) is proposed. Furthermore, a stimulation interface is constructed by merging the visual information of the objects with the stimulus targets, so that the mapping between stimulus targets and objects is updated automatically as the objects in the workspace change. In an online AR-based SSVEP-BCI cue-guided grasping experiment with a robotic arm, the grasping success rate reached 87.50 +/- 3.10% with an SSVEP-EEG recognition time of 0.5 s based on FB-tCNN. The proposed AR-based SSVEP-BCI system enables users to select intended targets more ecologically and to grasp a greater variety of objects with a limited number of stimulus targets, giving it the potential to be used in complex and changeable scenarios.
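The following is a minimal sketch, not taken from the paper, of the dynamic mapping idea the abstract describes: a fixed set of SSVEP flicker frequencies is re-bound to whatever objects are currently detected in the workspace, and a decoded EEG window selects the mapped object. The class names, the 8-channel/250 Hz window shape, and the classifier interface (standing in for the paper's FB-tCNN) are all assumptions for illustration.

```python
# Sketch (assumption, not the authors' code): re-map SSVEP stimulus frequencies
# to the objects currently detected in the workspace, then resolve a decoded
# EEG window to the object the user attended to.
from dataclasses import dataclass
from typing import Callable, Dict, List, Optional

import numpy as np

STIMULUS_FREQS_HZ = [8.0, 9.0, 10.0, 11.0]  # hypothetical flicker frequencies


@dataclass
class DetectedObject:
    name: str
    position_xyz: tuple  # object pose from the AR / vision pipeline (assumed)


def update_mapping(objects: List[DetectedObject]) -> Dict[float, DetectedObject]:
    """Bind each detected object to one stimulus frequency.

    The AR display then overlays a flicker target at that frequency on the
    corresponding object, so the mapping follows the scene instead of being
    fixed to preset commands.
    """
    if len(objects) > len(STIMULUS_FREQS_HZ):
        raise ValueError("more objects than available stimulus targets")
    return {f: obj for f, obj in zip(STIMULUS_FREQS_HZ, objects)}


def select_object(
    eeg_window: np.ndarray,                 # shape (channels, samples), e.g. 0.5 s of EEG
    mapping: Dict[float, DetectedObject],
    classify: Callable[[np.ndarray], int],  # e.g. a trained classifier returning a class index
) -> Optional[DetectedObject]:
    """Decode the attended stimulus frequency and return the mapped object."""
    class_idx = classify(eeg_window)        # index into STIMULUS_FREQS_HZ
    freq = STIMULUS_FREQS_HZ[class_idx]
    return mapping.get(freq)                # None if that target is currently unused


if __name__ == "__main__":
    scene = [DetectedObject("cup", (0.4, 0.1, 0.0)),
             DetectedObject("pen", (0.3, -0.2, 0.0))]
    mapping = update_mapping(scene)

    def dummy_classifier(eeg: np.ndarray) -> int:
        # Stand-in for a trained FB-tCNN model; always predicts the first target.
        return 0

    fake_eeg = np.zeros((8, 125))           # 8 channels, 0.5 s at 250 Hz (assumed)
    target = select_object(fake_eeg, mapping, dummy_classifier)
    print("Selected object:", target.name if target else "none")
```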
Pages: 1702-1711
Page count: 10
Related papers (50 in total):
  • [21] An Augmented Reality Based Human-Robot Interaction Interface Using Kalman Filter Sensor Fusion
    Li, Chunxu
    Fahmy, Ashraf
    Sienz, Johann
    SENSORS, 2019, 19 (20)
  • [22] Brain-Computer Interface and Hand-Guiding Control in a Human-Robot Collaborative Assembly Task
    Dmytriyev, Yevheniy
    Insero, Federico
    Carnevale, Marco
    Giberti, Hermes
    MACHINES, 2022, 10 (08)
  • [23] Transparent Robot Behavior Using Augmented Reality in Close Human-Robot Interaction
    Bolano, Gabriele
    Juelg, Christian
    Roennau, Arne
    Dillmann, Ruediger
    2019 28TH IEEE INTERNATIONAL CONFERENCE ON ROBOT AND HUMAN INTERACTIVE COMMUNICATION (RO-MAN), 2019,
  • [24] Human-Robot Cooperation via Brain Computer Interface
    Foresi, Gabriele
    Freddi, Alessandro
    Iarlori, Sabrina
    Monteriu, Andrea
    Ortenzi, Davide
    Pagnotta, Daniele Proietti
    2017 IEEE 7TH INTERNATIONAL CONFERENCE ON CONSUMER ELECTRONICS - BERLIN (ICCE-BERLIN), 2017, : 1 - 2
  • [25] Brain-Computer Interface in Virtual Reality
    Abbasi-Asl, Reza
    Keshavarzi, Mohammad
    Chan, Dorian Yao
    2019 9TH INTERNATIONAL IEEE/EMBS CONFERENCE ON NEURAL ENGINEERING (NER), 2019, : 1220 - 1224
  • [26] Mixed Reality as a Bidirectional Communication Interface for Human-Robot Interaction
    Rosen, Eric
    Whitney, David
    Fishman, Michael
    Ullman, Daniel
    Tellex, Stefanie
    2020 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS), 2020, : 11431 - 11438
  • [27] Assisted Human-Robot Interaction for Industry Application Based Augmented Reality
    Fang, Haonan
    Wen, Jingqian
    Yang, XiaoNan
    Wang, Peng
    Li, Yinqian
    VIRTUAL, AUGMENTED AND MIXED REALITY: APPLICATIONS IN EDUCATION, AVIATION AND INDUSTRY, PT II, 2022, 13318 : 291 - 301
  • [28] A human-robot interaction system for navigation supervision based on augmented reality
    Nunez, P.
    Bandera, J. P.
    Perez-Lorenzo, J. M.
    Sandoval, F.
    CIRCUITS AND SYSTEMS FOR SIGNAL PROCESSING, INFORMATION AND COMMUNICATION TECHNOLOGIES, AND POWER SOURCES AND SYSTEMS, VOL 1 AND 2, PROCEEDINGS, 2006, : 441 - 444
  • [29] A Brain-Computer Interface for Robot Navigation
    Nawroj, Ahsan I.
    Wang, Siyuan
    Yu, Yih-Choung
    Gabel, Lisa
    2012 38TH ANNUAL NORTHEAST BIOENGINEERING CONFERENCE (NEBEC), 2012, : 15 - +
  • [30] Human-Computer Interface Issues in Controlling Virtual Reality With Brain-Computer Interface
    Friedman, Doron
    Leeb, Robert
    Pfurtscheller, Gert
    Slater, Mel
    HUMAN-COMPUTER INTERACTION, 2010, 25 (01): : 67 - 94