Mixed Reality-Based User Interaction Feedback for a Hand-Controlled Interface Targeted to Robot Teleoperation

Cited by: 7
Authors
Cancedda, Laura [1 ]
Cannavo, Alberto [1 ]
Garofalo, Giuseppe [1 ]
Lamberti, Fabrizio [1 ]
Montuschi, Paolo [1 ]
Paravati, Gianluca [1 ]
Affiliations
[1] Politecn Torino, Dipartimento Automat & Informat, Corso Duca Abruzzi 24, I-10129 Turin, Italy
Keywords
Human-robot interaction; Robot teleoperation; 3D user interface; Mixed reality; Visual feedback; Hand-based control;
DOI
10.1007/978-3-319-60928-7_38
CLC number
TP3 [Computing technology, computer technology];
Subject classification code
0812;
Abstract
The continuous progress in the field of robotics and the diffusion of its application scenarios in today's world make human interaction and communication with robots an aspect of fundamental importance. The development of interfaces based on natural interaction paradigms is becoming an increasingly compelling topic in Human-Robot Interaction (HRI), owing to their intrinsic capability to provide ever more intuitive and effective control modalities. Teleoperation systems require users to handle a non-negligible amount of information coming from on-board sensors as well as input devices, thus increasing the workload of remote operators. This paper presents the design of a 3D User Interface (3DUI) for the control of teleoperated robotic platforms, aimed at increasing interaction efficiency. A hand-gesture-driven controller is used as the input modality, naturally mapping the position and gestures of the user's hand to suitable commands for controlling the platform components. The designed interface leverages mixed reality to provide visual feedback on the control commands issued by the user: the visualization of the 3DUI is superimposed on the video stream provided by an on-board camera. A user study confirmed that the proposed solution improves interaction efficiency by significantly reducing the completion time for tasks assigned in a remote reach-and-pick scenario.
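For illustration only, the minimal sketch below shows the kind of hand-to-command mapping the abstract describes: palm displacement is turned into platform velocities and a pinch gesture drives the gripper. All names, gains, and thresholds here (HandSample, DEAD_ZONE, MAX_LINEAR, and so on) are hypothetical placeholders under assumed conventions, not the authors' implementation or API.

from dataclasses import dataclass

@dataclass
class HandSample:
    palm_x: float   # lateral palm offset from the rest pose, metres (positive = right)
    palm_z: float   # forward palm offset from the rest pose, metres (positive = forward)
    pinch: float    # pinch strength, 0.0 (open hand) to 1.0 (full pinch)

@dataclass
class RobotCommand:
    linear: float         # forward velocity, m/s
    angular: float        # turn rate, rad/s (positive = counter-clockwise)
    gripper_closed: bool

DEAD_ZONE = 0.03    # metres of palm motion ignored around the rest pose
FULL_SCALE = 0.15   # metres of palm motion mapped to the maximum command
MAX_LINEAR = 0.5    # m/s
MAX_ANGULAR = 1.0   # rad/s

def _scaled(offset: float, gain: float) -> float:
    """Apply a dead zone, clamp the offset to full scale, and scale by the gain."""
    if abs(offset) < DEAD_ZONE:
        return 0.0
    return max(-1.0, min(1.0, offset / FULL_SCALE)) * gain

def map_hand_to_command(hand: HandSample) -> RobotCommand:
    """Map palm displacement to platform velocities and the pinch gesture to the gripper."""
    return RobotCommand(
        linear=_scaled(hand.palm_z, MAX_LINEAR),     # push the palm forward -> drive forward
        angular=_scaled(-hand.palm_x, MAX_ANGULAR),  # move the palm left -> turn left
        gripper_closed=hand.pinch > 0.7,             # a firm pinch closes the gripper
    )

if __name__ == "__main__":
    sample = HandSample(palm_x=-0.05, palm_z=0.10, pinch=0.9)
    print(map_hand_to_command(sample))

In the paper's setup, the mixed-reality layer would then render feedback on the issued commands on top of the on-board camera stream; the sketch above covers only the input-mapping side.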
Pages: 447-463
Page count: 17
Related papers
50 records in total
  • [21] An experimental study on mixed reality-based user interface for collaborative operation of high-precision process equipment
    Wang, Zhuo
    Li, Liang
    Liu, Ye
    Jiang, Yan
    Wang, Yang
    Dai, Yuwei
    [J]. INTERNATIONAL JOURNAL OF ADVANCED MANUFACTURING TECHNOLOGY, 2024, 132 (5-6): : 2333 - 2345
  • [22] An experimental study on mixed reality-based user interface for collaborative operation of high-precision process equipment
    Zhuo Wang
    Liang Li
    Ye Liu
    Yan Jiang
    Yang Wang
    Yuwei Dai
    [J]. The International Journal of Advanced Manufacturing Technology, 2024, 132 : 2443 - 2459
  • [23] Designing Affordances for Virtual Reality-Based Services with Natural User Interaction
    Miura, Takayuki
    Yoshii, Akihito
    Nakajima, Tatsuo
    [J]. Design, User Experience, and Usability: Technological Contexts, Pt III, 2016, 9748 : 266 - 277
  • [24] Mixed Reality-Based Teleoperation of Mobile Robotic Arm: System Apparatus and Experimental Case Study
    Jarecki, Annalisa
    Lee, Kiju
    [J]. 2024 21ST INTERNATIONAL CONFERENCE ON UBIQUITOUS ROBOTS, UR 2024, 2024, : 198 - 203
  • [25] Mixed-Reality-Enhanced Human-Robot Interaction with an Imitation-Based Mapping Approach for Intuitive Teleoperation of a Robotic Arm-Hand System
    Su, Yun-Peng
    Chen, Xiao-Qi
    Zhou, Tony
    Pretty, Christopher
    Chase, Geoffrey
    [J]. APPLIED SCIENCES-BASEL, 2022, 12 (09):
  • [26] Mixed reality human teleoperation with device-agnostic remote ultrasound: Communication and user interaction
    Black, David
    Nogami, Mika
    Salcudean, Septimiu
    [J]. COMPUTERS & GRAPHICS-UK, 2024, 118 : 184 - 193
  • [27] Hand-adaptive user interface: improved gestural interaction in virtual reality
    Lou, Xiaolong
    Li, Xiangdong A.
    Hansen, Preben
    Du, Peng
    [J]. VIRTUAL REALITY, 2021, 25 (02) : 367 - 382
  • [28] Mixed Reality as a Bidirectional Communication Interface for Human-Robot Interaction
    Rosen, Eric
    Whitney, David
    Fishman, Michael
    Ullman, Daniel
    Tellex, Stefanie
    [J]. 2020 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS), 2020, : 11431 - 11438
  • [29] Hand-adaptive user interface: improved gestural interaction in virtual reality
    Xiaolong Lou
    Xiangdong A. Li
    Preben Hansen
    Peng Du
    [J]. Virtual Reality, 2021, 25 : 367 - 382
  • [30] The Cyber-Physical Control Room: A Mixed Reality Interface for Mobile Robot Teleoperation and Human-Robot Teaming
    Walker, Michael E.
    Gramopadhye, Maitrey
    Ikeda, Bryce
    Burns, Jack
    Szafir, Daniel
    [J]. PROCEEDINGS OF THE 2024 ACM/IEEE INTERNATIONAL CONFERENCE ON HUMAN-ROBOT INTERACTION, HRI 2024, 2024, : 762 - 771