Transparency improves the accuracy of automation use, but automation confidence information does not

Cited by: 1
Authors
Tatasciore, Monica [1]
Strickland, Luke [1,2]
Loft, Shayne [1]
Affiliations
[1] Univ Western Australia, 35 Stirling Highway, Perth, WA 6009, Australia
[2] Curtin Univ, Perth, Australia
Funding
Australian Research Council
Keywords
Automation and human cognition; Automation transparency; Automation reliability; Uninhabited vehicle control; Decision-support systems; Automation confidence; AGENT TRANSPARENCY; TRUST; COMPLACENCY; REPRESENTATION; ATTENTION
DOI
10.1186/s41235-024-00599-x
Chinese Library Classification (CLC)
B84 [Psychology]
Discipline classification code
04; 0402
Abstract
Increased automation transparency can improve the accuracy of automation use but can also increase bias towards agreeing with automated advice. Information about the automation's confidence in its advice may additionally increase the predictability of automation errors. We examined the effects of automation transparency, automation confidence information, and their potential interaction on the accuracy of automation use and other outcomes. Participants completed an uninhabited vehicle (UV) management task in which they selected the optimal UV to complete missions. Low or high automation transparency was provided, and participants agreed or disagreed with automated advice on each mission. We manipulated between participants whether the automated advice was accompanied by confidence information, which indicated on each trial whether the automation was "somewhat" or "highly" confident in its advice. Higher transparency improved the accuracy of automation use and led to faster decisions, lower perceived workload, and increased trust and perceived usability. Providing participants with automation confidence information, compared with not providing it, had no overall impact on any outcome variable and did not interact with transparency. Despite this lack of benefit, participants who were provided confidence information did use it: on trials where lower rather than higher confidence information was presented, hit rates decreased, correct rejection rates increased, decision times slowed, and perceived workload increased, all suggestive of decreased reliance on automated advice. These trial-by-trial shifts in automation use bias and other outcomes were not moderated by transparency. These findings can potentially inform the design of automated decision-support systems that are more understandable by humans, in order to optimise human-automation interaction.
Pages: 15