"I'm Sorry, but I Can't Assist": Bias in Generative AI

Cited by: 0
Author
Smith, Julie M. [1 ]
Affiliation
[1] Inst Adv Comp Educ, Peoria, IL 61629 USA
Funding
U.S. National Science Foundation
Keywords
equity; student advising; artificial intelligence; large language models; generative AI; racism
DOI
10.1145/3653666.3656065
CLC number
G40 [Education]
Subject classification codes
040101; 120403
Abstract
Research Questions: (1) Is there a pattern of racial bias in the student advising recommendations made by generative AI? (2) What safeguards can promote equity when generative AI is used in high-stakes decision-making? Methodology: Using lists of names associated with various ethnic/racial groups, we asked ChatGPT and Claude AI to recommend colleges and majors for each student. Results: ChatGPT was more likely to recommend STEM majors to some student groups. ChatGPT did not show systematic bias across several metrics of school quality, but Claude AI did. The two models also differed overall in the colleges they recommended. Implications: We offer cautions and recommendations for using generative AI in high-stakes tasks.
Pages: 75-80 (6 pages)