Exposing implicit biases and stereotypes in human and artificial intelligence: state of the art and challenges with a focus on gender

Cited by: 0
Authors
Ludovica Marinucci
Claudia Mazzuca
Aldo Gangemi
Affiliations
[1] National Research Council (CNR), Institute of Cognitive Sciences and Technologies (ISTC)
[2] Sapienza University of Rome, Department of Dynamic, Clinical Psychology and Health
[3] University of Bologna, Department of Classical and Italian Philology
Source
AI & SOCIETY | 2023 / Volume 38
Keywords
Knowledge graph; Word embeddings; Implicit biases; Gender; Cognitive semantics
DOI
Not available
Abstract
Biases in cognition are ubiquitous. Social psychologists have suggested that biases and stereotypes serve a multifarious set of cognitive goals, while at the same time stressing their potential harmfulness. Recently, biases and stereotypes have become the subject of heated debate in the machine learning community as well. Researchers and developers are increasingly aware that some biases, such as gender and race biases, are entrenched in the algorithms that some AI applications rely upon. Here, taking into account several existing approaches to the problem of implicit biases and stereotypes, we propose that a strategy for coping with this phenomenon is to unmask the biases found in AI systems by understanding their cognitive dimension, rather than simply trying to correct the algorithms. To this end, we present a discussion bridging findings from cognitive science and insights from machine learning that can be integrated into a state-of-the-art semantic network. Remarkably, this resource can assist scholars (e.g., cognitive and computer scientists) while also contributing to refining AI regulations affecting social life. We show that only through a thorough understanding of the cognitive processes leading to biases, and through an interdisciplinary effort, can we make the best of AI technology.
Pages: 747–761
Page count: 14
Related papers
50 records
  • [1] Exposing implicit biases and stereotypes in human and artificial intelligence: state of the art and challenges with a focus on gender
    Marinucci, Ludovica
    Mazzuca, Claudia
    Gangemi, Aldo
    AI & SOCIETY, 2023, 38 (02) : 747 - 761
  • [2] HUMAN INTELLIGENCE AND ARTIFICIAL INTELLIGENCE AND THE CHALLENGES OF BIASES IN AI ALGORITHMS
    Fernandes, Erika Ribeiro
    Graglia, Marcelo Augusto Vieira
    RISUS-JOURNAL ON INNOVATION AND SUSTAINABILITY, 2024, 15 (01):
  • [3] Gender biases in artificial intelligence
    de Zarate Alcarazo, Lucia Ortiz
    REVISTA DE OCCIDENTE, 2023, (502) : 5 - 20
  • [4] The effect of gender stereotypes on artificial intelligence recommendations
    Ahn, Jungyong
    Kim, Jungwon
    Sung, Yongjun
    JOURNAL OF BUSINESS RESEARCH, 2022, 141 : 50 - 59
  • [5] Math and language gender stereotypes: Age and gender differences in implicit biases and explicit beliefs
    Vuletich, Heidi A.
    Kurtz-Costes, Beth
    Cooley, Erin
    Payne, B. Keith
    PLOS ONE, 2020, 15 (09):
  • [6] Acceptance, initial trust formation, and human biases in artificial intelligence: Focus on clinicians
    Choudhury, Avishek
    Elkefi, Safa
    FRONTIERS IN DIGITAL HEALTH, 2022, 4
  • [7] Design of Antennas through Artificial Intelligence: State of the Art and Challenges
    Goudos, Sotirios K. K.
    Diamantoulakis, Panagiotis D. D.
    Matin, Mohammad A. A.
    Sarigiannidis, Panagiotis
    Wan, Shaohua
    Karagiannidis, George K. K.
    IEEE COMMUNICATIONS MAGAZINE, 2022, 60 (12) : 96 - 102
  • [8] When debugging encounters artificial intelligence: state of the art and open challenges
    Yi Song
    Xiaoyuan Xie
    Baowen Xu
    Science China Information Sciences, 2024, 67
  • [9] Change Detection Based on Artificial Intelligence: State-of-the-Art and Challenges
    Shi, Wenzhong
    Zhang, Min
    Zhang, Rui
    Chen, Shanxiong
    Zhan, Zhao
    REMOTE SENSING, 2020, 12 (10)
  • [10] Artificial Intelligence Aided Engineering Education: State of the Art, Potentials and Challenges
    Martin Nunez, Jose L.
    Diaz Lantada, Andres
    INTERNATIONAL JOURNAL OF ENGINEERING EDUCATION, 2020, 36 (06) : 1740 - 1751