Exploring the Effects of Perceptual Separability on Human-Automation Team Efficiency

Cited by: 0
Authors
Scott-Sharoni S.T. [1 ]
Yamani Y. [1 ]
Kneeland C.M. [2 ]
Long S.K. [1 ]
Chen J. [1 ]
Houpt J.W. [3 ]
Institutions
[1] Department of Psychology, Old Dominion University, Norfolk, VA
[2] Wright State University, Dayton, OH
[3] University of Texas at San Antonio, San Antonio, TX
Funding
U.S. National Science Foundation
Keywords
Display design; Human-automation interaction; Perceptual separability; Systems factorial technology;
DOI
10.1007/s42113-021-00108-z
Abstract
The purpose of the current experiment was to examine the effect of perceptual separability on human-automation team efficiency in a speeded judgment task. Human operators in applied environments interact with automated systems via visual displays that contain both complex raw data and automated support, requiring operators to mentally integrate both sources of information. Participants performed a speeded length-judgment task with or without decisional cues issued by a reliable automated aid. The cue was rendered in a format either perceptually separable from (color) or configural with (area) the raw stimulus information (length). Workload capacity measures quantified human-automation team efficiency. Participants responded more slowly following the onset of the aid’s decisional cue in the area display format, which produced limited-capacity processing, than in the color display format, which led to unlimited-capacity processing. The color display format can support unlimited-capacity processing without slowing operators’ responses, whereas the area display format may produce limited-capacity processing, delaying them. Automation and display designers should consider using perceptually separable display elements in visual interfaces to improve human-automation team efficiency in speeded perceptual-cognitive tasks. © 2021, Society for Mathematical Psychology.
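The workload capacity measures mentioned in the abstract come from Systems Factorial Technology. As a hedged sketch not given in this record, the standard OR (first-terminating) capacity coefficient compares performance on the combined display against the summed single-source baselines:

```latex
% Capacity coefficient for an OR design (standard SFT definition):
%   H_X(t) = -\ln\bigl(1 - F_X(t)\bigr) is the cumulative hazard of
%   the response-time distribution F_X(t) for condition X.
C_{\mathrm{OR}}(t) \;=\; \frac{H_{AB}(t)}{H_{A}(t) + H_{B}(t)}
% Interpretation:
%   C_OR(t) = 1  -> unlimited-capacity processing
%   C_OR(t) < 1  -> limited capacity (slowing with added sources)
%   C_OR(t) > 1  -> super capacity (speedup with added sources)
```

Under this convention, the abstract's finding maps onto C(t) near 1 for the color (separable) display and C(t) below 1 for the area (configural) display.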
Pages: 486-496 (10 pages)
Related Papers (50 items)
  • [21] The role of social support in human-automation interaction
    Sauer, Juergen
    Sonderegger, Andreas
    Semmer, Norbert K.
    ERGONOMICS, 2024, 67 (06) : 732 - 743
  • [22] Intuitive Cognition and Models of Human-Automation Interaction
    Patterson, Robert Earl
    HUMAN FACTORS, 2017, 59 (01) : 101 - 115
  • [23] Factors Affecting Performance of Human-Automation Teams
    Baker, Anthony L.
    Keebler, Joseph R.
    ADVANCES IN HUMAN FACTORS IN ROBOTS AND UNMANNED SYSTEMS, 2017, 499 : 331 - 340
  • [24] Cognitive Ergonomics for Situated Human-Automation Collaboration
    Brinkman, Willem-Paul
    Neerincx, Mark A.
    van Oostendorp, Herre
    INTERACTING WITH COMPUTERS, 2011, 23 (04) : III - IV
  • [25] The Role of Simulation in Designing Human-Automation Systems
    Rusnock, Christina F.
    Boubin, Jayson G.
    Giametta, Joseph J.
    Goodman, Tyler J.
    Hillesheim, Anthony J.
    Kim, Sungbin
    Meyer, David R.
    Watson, Michael E.
    FOUNDATIONS OF AUGMENTED COGNITION: NEUROERGONOMICS AND OPERATIONAL NEUROSCIENCE, PT II, 2016, 9744 : 361 - 370
  • [26] Effects of Augmented Feedback and Motor Learning Adaptation on Human-Automation Interaction Factors
    O'Meara, Sarah M.
    Karasinski, John A.
    Miller, Casey L.
    Joshi, Sanjay S.
    Robinson, Stephen K.
    JOURNAL OF AEROSPACE INFORMATION SYSTEMS, 2021, 18 (06): : 377 - 390
  • [27] Work domain modeling of human-automation interaction for in-vehicle automation
    Zhang, You
    Lintern, Gavan
    COGNITION TECHNOLOGY & WORK, 2024, 26 (04) : 585 - 601
  • [28] Effects of task allocation on air traffic management human-automation system performance
    Martin, Lynne
    Bienert, Nancy
    Claudatos, Lauren
    Gujral, Vimmy
    Kraut, Joshua
    Mercer, Joey
    2016 IEEE/AIAA 35TH DIGITAL AVIONICS SYSTEMS CONFERENCE (DASC), 2016,
  • [29] A human-automation interface model to guide automation design of system functions
    Kennedy, Joshua S.
    McCauley, Michael E.
    NAVAL ENGINEERS JOURNAL, 2007, 119 (01): : 109 - 124
  • [30] Blame Attribution Asymmetry in Human-Automation Cooperation
    Liu, Peng
    Du, Yong
    RISK ANALYSIS, 2022, 42 (08) : 1769 - 1783