When, what, and how should generative artificial intelligence explain to users?
Cited by: 0
Authors:
Jang, Soobin [1]
Lee, Haeyoon [2]
Kim, Yujin [3]
Lee, Daeho [1,2,3]
Shin, Jungwoo [4,5]
Nam, Jungwoo [6]
Affiliations:
[1] Sungkyunkwan Univ, Dept Appl Artificial Intelligence, Seoul, South Korea
[2] Sungkyunkwan Univ, Dept Interact Sci, Seoul, South Korea
[3] Sungkyunkwan Univ, Sch Convergence, Seoul, South Korea
[4] Kyung Hee Univ, Dept Ind & Management Syst Engn, Yongin, South Korea
[5] Kyung Hee Univ, Dept Big Data Analyt, Seoul, South Korea
[6] Sungkyunkwan Univ, Dept Human Artificial Intelligence Interact, Seoul, South Korea
Funding:
National Research Foundation of Singapore;
Keywords:
Generative AI;
Conversational user interface;
Explainable AI;
Conjoint analysis;
SERVICES;
DOI:
10.1016/j.tele.2024.102175
Chinese Library Classification (CLC):
G25 [Library science; librarianship];
G35 [Information science; information work];
Discipline classification codes:
1205;
120501;
Abstract:
With the commercialization of ChatGPT, generative artificial intelligence (AI) has been applied almost everywhere in our lives. However, even though generative AI has become a daily technology that anyone can use, most non-expert users need to know the process behind and the reasoning for its results, because the technology can otherwise be misused owing to insufficient knowledge and misunderstanding. Therefore, this study investigated users' preferences for when, what, and how generative AI should explain the generation process and the reasoning behind its results, using a conjoint method and mixed logit analysis. The results show that users are most sensitive to the timing of providing eXplainable AI (XAI) and that they want additional information only when they ask for explanations while using generative AI. These findings will help shape the XAI design of future generative AI from a user perspective and improve usability.
Pages: 14