Level of Robot Autonomy and Information Aids in Human-Robot Interaction Affect Human Mental Workload - An Investigation in Virtual Reality

Cited by: 8
Authors
Kaufeld, Mara [1 ]
Nickel, Peter [2 ]
Affiliations
[1] Fraunhofer Inst Commun Informat Proc & Ergon FKIE, Human Factors, Bonn, Germany
[2] German Social Accid Insurance IFA, Inst Occupat Safety & Hlth, Accid Prevent Prod Safety, St Augustin, Germany
Keywords
Human-robot interaction; Work systems design; Occupational safety and health; Virtual reality; Mental workload; Human-automation interaction; Human-system interaction;
DOI
10.1007/978-3-030-22216-1_21
CLC Classification
TP [Automation Technology, Computer Technology];
Discipline Classification Code
0812 ;
Abstract
In future work systems, humans may interact with scalable industrial robots. In a virtual reality simulation study, human mental workload effects were analyzed in human-robot interaction (HRI) under variations in design requirements regarding human factors and ergonomics (HFE) as well as occupational safety and health (OSH). Each of 20 participants performed their own task while interacting with two virtual robots in a manufacturing environment. Task performance results indicated relatively lower human mental workload when robots acted at a lower level of robot autonomy (lower LORA) and the human operator was informed about upcoming HRI by multi-modal signaling (Information Aid 'on'). However, this workload pattern, evident in the performance measures, was not reflected in mental workload ratings. Hence, compensational adjustments in operator performance were assumed. It was concluded that a combination of less autonomous robots and multi-modal feedback results in relatively less operator distraction from task performance and, thus, less impairment in operator workload. HFE and OSH may improve when HRI is audio-visually indicated and robot activities are adapted to human operator task requirements (low LORA). Therefore, the results have the potential to inform future design of HRI regarding HFE and OSH at different workplaces in industry and services.
Pages: 278-291
Page count: 14
Related Papers
50 items in total
  • [41] Human-Robot Interaction
    Jia, Yunyi
    Zhang, Biao
    Li, Miao
    King, Brady
    Meghdari, Ali
    JOURNAL OF ROBOTICS, 2018, 2018
  • [42] Eye-Tracking in Physical Human-Robot Interaction: Mental Workload and Performance Prediction
    Upasani, Satyajit
    Srinivasan, Divya
    Zhu, Qi
    Du, Jing
    Leonessa, Alexander
    HUMAN FACTORS, 2024, 66 (08) : 2104 - 2119
  • [43] Human-robot interaction and robot control
    Sequeira, Joao
    Ribeiro, Maria Isabel
    ROBOT MOTION AND CONTROL: RECENT DEVELOPMENTS, 2006, 335 : 375 - 390
  • [44] Human-Robot Interaction
    Sidobre, Daniel
    Broquere, Xavier
    Mainprice, Jim
    Burattini, Ernesto
    Finzi, Alberto
    Rossi, Silvia
    Staffa, Mariacarla
    ADVANCED BIMANUAL MANIPULATION: RESULTS FROM THE DEXMART PROJECT, 2012, 80 : 123 - +
  • [45] Human-Robot Interaction
    Sethumadhavan, Arathi
    ERGONOMICS IN DESIGN, 2012, 20 (03) : 27 - +
  • [46] Human-robot interaction
    Murphy R.R.
    Nomura T.
    Billard A.
    Burke J.L.
    IEEE Robotics and Automation Magazine, 2010, 17 (02): : 85 - 89
  • [47] Human-robot interaction
    Kosuge, K
    Hirata, Y
    IEEE ROBIO 2004: PROCEEDINGS OF THE IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND BIOMIMETICS, 2004, : 8 - 11
  • [48] Human-robot interaction
    Sidobre, Daniel
    Broquère, Xavier
    Mainprice, Jim
    Burattini, Ernesto
    Finzi, Alberto
    Rossi, Silvia
    Staffa, Mariacarla
    Springer Tracts in Advanced Robotics, 2012, 80 (STAR): : 123 - 172
  • [49] Human-Robot Interaction
    Ivaldi, Serena
    Pateraki, Maria
    ERCIM NEWS, 2018, (114): : 6 - 7
  • [50] Multimodal Information Fusion for Human-Robot Interaction
    Luo, Ren C.
    Wu, Y. C.
    Lin, P. H.
    2015 IEEE 10TH JUBILEE INTERNATIONAL SYMPOSIUM ON APPLIED COMPUTATIONAL INTELLIGENCE AND INFORMATICS (SACI), 2015, : 535 - 540