Deep Automation Bias: How to Tackle a Wicked Problem of AI?

Cited by: 18
Author
Strauss, Stefan [1]
Affiliation
[1] Austrian Acad Sci, Inst Technol Assessment ITA, A-1030 Vienna, Austria
Keywords
artificial intelligence; machine learning; automation bias; fairness; transparency; accountability; explicability; uncertainty; human-in-the-loop; awareness raising
DOI
10.3390/bdcc5020018
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
The increasing use of AI in different societal contexts has intensified the debate on risks, ethical problems and bias. Accordingly, promising research activities focus on debiasing to strengthen fairness, accountability and transparency in machine learning. There is, though, a tendency to fix societal and ethical issues with technical solutions that may cause additional, wicked problems. Alternative analytical approaches are thus needed to avoid this and to comprehend how societal and ethical issues arise in AI systems. Regardless of the particular form of bias, risks ultimately result from potential rule conflicts between the behavior of the AI system, driven by feature complexity, and user practices that leave limited options for scrutiny. Hence, although different forms of bias can occur, automation is their common ground. The paper highlights the role of automation and explains why deep automation bias (DAB) is a metarisk of AI. Building on previous work, it elaborates the main influencing factors and develops a heuristic model for assessing DAB-related risks in AI systems. This model aims to raise problem awareness and support training on the sociotechnical risks resulting from AI-based automation, and it contributes to improving the general explicability of AI systems beyond technical issues.
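To make the idea of a heuristic DAB risk assessment more concrete, the following minimal Python sketch scores a hypothetical AI deployment along a handful of risk dimensions suggested by the paper's keywords (feature complexity, limited scrutiny, human-in-the-loop, explicability). The factor names, ratings and thresholds are illustrative assumptions only; they are not the influencing factors or the model developed in the paper.

    from dataclasses import dataclass

    @dataclass
    class DABFactors:
        """Hypothetical DAB risk factors, each rated 0.0 (low) to 1.0 (high).

        The factor names are assumptions derived from the paper's keywords,
        not the influencing factors of the authors' actual model.
        """
        feature_complexity: float      # opacity of the system's feature space
        scrutiny_limitation: float     # how limited the user's options for scrutiny are
        automation_reliance: float     # degree of unreflected reliance on system output
        missing_human_in_loop: float   # absence of meaningful human oversight
        explicability_gap: float       # lack of explanations users can act on

    def dab_risk_score(f: DABFactors) -> float:
        """Average the factor ratings into a single heuristic score in [0, 1]."""
        ratings = [
            f.feature_complexity,
            f.scrutiny_limitation,
            f.automation_reliance,
            f.missing_human_in_loop,
            f.explicability_gap,
        ]
        return sum(ratings) / len(ratings)

    def dab_risk_level(score: float) -> str:
        """Map the score to a coarse risk level (thresholds are illustrative)."""
        if score >= 0.66:
            return "high"
        if score >= 0.33:
            return "medium"
        return "low"

    if __name__ == "__main__":
        # Example assessment of a hypothetical, highly automated decision system.
        example = DABFactors(
            feature_complexity=0.8,
            scrutiny_limitation=0.7,
            automation_reliance=0.6,
            missing_human_in_loop=0.9,
            explicability_gap=0.5,
        )
        score = dab_risk_score(example)
        print(f"DAB risk score: {score:.2f} ({dab_risk_level(score)})")

In this reading, a high score simply flags that a deployment deserves closer sociotechnical scrutiny; it is a problem-awareness aid in the spirit of the abstract, not a quantitative measure taken from the paper.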
Pages: 14
Related Papers
50 records in total
  • [1] 10.E. Workshop: Childhood obesity: how to tackle this wicked problem?
    Verhoeff, Arnoud
    [J]. EUROPEAN JOURNAL OF PUBLIC HEALTH, 2020, 30
  • [2] Automation Bias in Breast AI
    Baltzer, Pascal A. T.
    [J]. RADIOLOGY, 2023, 307 (04)
  • [3] How to Tackle a Problem and Solve It
    Pollock, T.
    [J]. METAL STAMPING, 1969, 3 (08): 25
  • [4] Check the box! How to deal with automation bias in AI-based personnel selection
    Kupfer, Cordula
    Prassl, Rita
    Fleiss, Juergen
    Malin, Christine
    Thalmann, Stefan
    Kubicek, Bettina
    [J]. FRONTIERS IN PSYCHOLOGY, 2023, 14
  • [5] How do people react to AI failure? Automation bias, algorithmic aversion, and perceived controllability
    Jones-Jang, S. Mo
    Park, Yong Jin
    [J]. JOURNAL OF COMPUTER-MEDIATED COMMUNICATION, 2022, 28 (01)
  • [6] How deep the bias
    Jagsi, Reshma
    [J]. JAMA-JOURNAL OF THE AMERICAN MEDICAL ASSOCIATION, 2008, 299 (03): 259-260
  • [7] How AI Will Add Brains to Office Automation
    Wolfe, A.
    [J]. ELECTRONICS, 1986, 59 (34): 63-66
  • [8] How to tackle my dry skin problem?
    Lo, K. K.
    [J]. HONG KONG JOURNAL OF DERMATOLOGY & VENEREOLOGY, 2011, 19 (04): 183-185
  • [9] The Happiness of the Wicked: How Tokugawa Thinkers Dealt with the Problem
    Ansart, Olivier
    [J]. ASIAN PHILOSOPHY, 2012, 22 (02): 161-175
  • [10] How AI for Synthesis Can Help Tackle Challenges in Molecular Discovery
    Thakkar, Amol
    Schwaller, Philippe
    [J]. CHIMIA, 2021, 75 (7-8): 677-678