THE JUDICIAL DEMAND FOR EXPLAINABLE ARTIFICIAL INTELLIGENCE

Cited by: 5
Author: Deeks, Ashley [1]
Affiliation: [1] Univ Virginia, Law Sch, Charlottesville, VA 22903 USA
Keywords: COMMON-LAW; DECISION-MAKING; EXPLANATION; AUTOMATION; BIAS
DOI: Not available
Chinese Library Classification (CLC): D9 [Law]; DF [Law]
Subject Classification Code: 0301
Abstract
A recurrent concern about machine learning algorithms is that they operate as "black boxes," making it difficult to identify how and why the algorithms reach particular decisions, recommendations, or predictions. Yet judges are confronting machine learning algorithms with increasing frequency, including in criminal, administrative, and civil cases. This Essay argues that judges should demand explanations for these algorithmic outcomes. One way to address the "black box" problem is to design systems that explain how the algorithms reach their conclusions or predictions. If and as judges demand these explanations, they will play a seminal role in shaping the nature and form of "explainable AI" (xAI). Using the tools of the common law, courts can develop what xAI should mean in different legal contexts. There are advantages to having courts play this role: Judicial reasoning that builds from the bottom up, using case-by-case consideration of the facts to produce nuanced decisions, is a pragmatic way to develop rules for xAI. Further, courts are likely to stimulate the production of different forms of xAI that are responsive to distinct legal settings and audiences. More generally, we should favor the greater involvement of public actors in shaping xAI, which to date has largely been left in private hands.
Pages: 1829-1850
Number of pages: 22