A multimodal teleoperation interface for human-robot collaboration

Cited by: 4
Authors
Si, Weiyong [1 ,2 ]
Zhong, Tianjian [3 ]
Wang, Ning [1 ,2 ]
Yang, Chenguang [1 ,2 ]
Affiliations
[1] Univ West England, Fac Environm & Technol, Bristol BS16 1QY, Avon, England
[2] Univ West England, Bristol Robot Lab, Bristol BS16 1QY, Avon, England
[3] Bristol Robot Lab, Bristol BS8 1TH, Avon, England
Keywords
Immersive teleoperation; Human-in-the-loop; Human-robot interface
DOI
10.1109/ICM54990.2023.10102060
Chinese Library Classification (CLC): TP [Automation and computer technology]
Discipline code: 0812
Abstract
Human-robot collaboration provides an effective way to combine human intelligence with the autonomy of robots, which can improve the safety and efficiency of robot operation. However, developing an intuitive and immersive human-robot interface with multimodal feedback for human-robot interaction and collaboration remains challenging. In this paper, we developed a multimodal human-robot interface to involve humans in the loop. A Unity-based virtual reality (VR) environment, including the virtual robot manipulator and its workspace, was developed to simulate the robot's real working environment. We integrated a digital twin mechanism into the VR environment, which provides a virtual model corresponding to the physical task. The virtual environment visualizes the visual and haptic feedback captured by the robot's multimodal sensors, providing an immersive and user-friendly teleoperation environment for human operators. We conducted a user study, evaluated with the NASA Task Load Index, on a physical contact scanning task. The results show that the proposed multimodal interface reduced cognitive and physical workload by 31.8% compared with the commercial teleoperation device Touch X.
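For context on the evaluation metric: the NASA Task Load Index combines six subscale ratings (mental, physical, and temporal demand, performance, effort, frustration) into one workload score, conventionally weighted by tallies from 15 pairwise comparisons. A minimal sketch of that computation (function and subscale names are illustrative, not taken from the paper):

```python
# Conventional NASA-TLX scoring: six subscales rated 0-100, with weights
# given by tallies from 15 pairwise comparisons (weights must sum to 15).
SUBSCALES = ("mental", "physical", "temporal", "performance", "effort", "frustration")

def nasa_tlx_weighted(ratings, weights):
    """Return the overall weighted workload score (0-100)."""
    if sum(weights[s] for s in SUBSCALES) != 15:
        raise ValueError("pairwise-comparison weights must sum to 15")
    return sum(ratings[s] * weights[s] for s in SUBSCALES) / 15.0

def nasa_tlx_raw(ratings):
    """Unweighted (Raw TLX) variant: simple mean of the six ratings."""
    return sum(ratings[s] for s in SUBSCALES) / len(SUBSCALES)
```

For example, uniform ratings of 50 on every subscale yield an overall score of 50 regardless of the weights, since the weights sum to 15 and cancel out.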
Pages: 6