The effect of gender stereotypes on artificial intelligence recommendations

Cited by: 81
Authors
Ahn, Jungyong [1]
Kim, Jungwon [2]
Sung, Yongjun [2]
Affiliations
[1] Korea Univ, Sch Media & Commun, Seoul, South Korea
[2] Korea Univ, Sch Psychol, Seoul, South Korea
Funding
National Research Foundation of Singapore;
Keywords
Artificial Intelligence (AI); AI agent; Gender stereotypes; AI recommendations; UNIVERSAL DIMENSIONS; SOCIAL-PERCEPTION; BRAND TRUST; WARMTH; COMPETENCE; EXPECTATIONS; PERSONALITY; ACTIVATION; COMPUTERS; WORKPLACE;
DOI
10.1016/j.jbusres.2021.12.007
Chinese Library Classification
F [Economics];
Discipline classification code
02;
Abstract
This study explores how gender stereotypes affect the evaluation of artificial intelligence (AI) recommendations. We predict that gender stereotypes shape human-AI interactions, producing different persuasive effects of AI recommendations for utilitarian versus hedonic products. Participants in the male AI agent condition rated the agent as more competent than those in the female AI agent condition, whereas perceived warmth was higher in the female AI agent condition than in the male condition. More importantly, a significant interaction between AI gender and product type emerged: participants showed more positive attitudes toward the AI recommendation when the male AI agent recommended a utilitarian (vs. hedonic) product, whereas a hedonic product was evaluated more positively when recommended by the female (vs. male) AI agent.
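The key result described above is a 2 × 2 between-subjects interaction (AI agent gender × product type) on attitude toward the recommendation. The sketch below shows how such an interaction can be tested with a two-way ANOVA; it uses simulated data and hypothetical variable names (ai_gender, product_type, attitude) and is not the authors' analysis code or data.

```python
# Minimal sketch of a 2 x 2 between-subjects ANOVA (AI gender x product type).
# Simulated data with assumed effect sizes -- not the study's actual data.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_per_cell = 50  # hypothetical cell size

rows = []
for ai_gender in ("male", "female"):
    for product_type in ("utilitarian", "hedonic"):
        # Assumed crossover pattern: male AI + utilitarian and
        # female AI + hedonic receive a small "stereotype match" bonus.
        match = (ai_gender == "male") == (product_type == "utilitarian")
        mean = 5.6 if match else 5.0  # attitude on a 7-point scale
        for score in rng.normal(mean, 1.0, n_per_cell):
            rows.append({"ai_gender": ai_gender,
                         "product_type": product_type,
                         "attitude": score})

df = pd.DataFrame(rows)

# Two-way ANOVA including the AI gender x product type interaction term.
model = smf.ols("attitude ~ C(ai_gender) * C(product_type)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))
```

Because the simulated "match" bonus is symmetric across the two matched cells, only the interaction term should be significant, mirroring the crossover pattern described in the abstract.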
Pages: 50 - 59
Number of pages: 10
Related Papers (50 in total)
  • [1] Artificial intelligence reinforces stereotypes
    Spjeldreaes, Amanda
    TIDSSKRIFT FOR DEN NORSKE LAEGEFORENING, 2024, 144 (05)
  • [2] Gender stereotypes in artificial intelligence within the accounting profession using large language models
    Leong, Kelvin
    Sung, Anna
    HUMANITIES & SOCIAL SCIENCES COMMUNICATIONS, 2024, 11 (01):
  • [3] Exposing implicit biases and stereotypes in human and artificial intelligence: state of the art and challenges with a focus on gender
    Marinucci, Ludovica
    Mazzuca, Claudia
    Gangemi, Aldo
    AI & SOCIETY, 2023, 38 (02) : 747 - 761
  • [4] Artificial intelligence knowledge of evidence-based recommendations in gender affirmation surgery and gender identity: is ChatGPT aware of WPATH recommendations?
    Najafali, Daniel
    Hinson, Chandler
    Camacho, Justin M.
    Galbraith, Logan G.
    Tople, Tannon L.
    Eble, Danielle
    Weinstein, Brielle
    Schechter, Loren S.
    Dorafshar, Amir H.
    Morrison, Shane D.
    EUROPEAN JOURNAL OF PLASTIC SURGERY, 2023, 46 (06) : 1169 - 1176
  • [5] Gender biases in artificial intelligence
    de Zarate Alcarazo, Lucia Ortiz
    REVISTA DE OCCIDENTE, 2023, (502) : 5 - 20
  • [6] From alchemy to artificial intelligence: stereotypes of the scientist in Western literature
    Haynes, R
    PUBLIC UNDERSTANDING OF SCIENCE, 2003, 12 (03) : 243 - 253
  • [7] Beyond the stereotypes: Artificial Intelligence image generation and diversity in anesthesiology
    Gisselbaek, Mia
    Minsart, Laurens
    Koselerli, Ekin
    Suppan, Melanie
    Meco, Basak Ceyda
    Seidel, Laurence
    Albert, Adelin
    Barreto Chang, Odmara L.
    Saxena, Sarah
    Berger-Estilita, Joana
    FRONTIERS IN ARTIFICIAL INTELLIGENCE, 2024, 7
  • [8] Data Science and Artificial Intelligence for Responsible Recommendations
    Wang, Shoujin
    Liu, Ninghao
    Zhang, Xiuzhen
    Wang, Yan
    Ricci, Francesco
    Mobasher, Bamshad
    PROCEEDINGS OF THE 28TH ACM SIGKDD CONFERENCE ON KNOWLEDGE DISCOVERY AND DATA MINING, KDD 2022, 2022, : 4904 - 4905