Is Attention All You Need? Toward a Conceptual Model for Social Awareness in Large Language Models

Cited: 0
Authors
Voria, Gianmario [1 ]
Catolino, Gemma [1 ]
Palomba, Fabio [1 ]
Affiliations
[1] Univ Salerno, Salerno, Italy
Keywords
Social Awareness; Software Engineering for Artificial Intelligence; Large Language Models;
DOI
10.1145/3650105.3652294
CLC Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Large Language Models (LLMs) are revolutionizing the landscape of Artificial Intelligence (AI) due to recent technological breakthroughs. Their remarkable success in aiding various Software Engineering (SE) tasks through AI-powered tools and assistants has led to the integration of LLMs as active contributors within development teams, ushering in novel modes of communication and collaboration. However, with great power comes great responsibility: ensuring that these models meet fundamental ethical principles such as fairness remains an open challenge. In this light, our vision paper analyzes the existing body of knowledge to propose a conceptual model designed to frame the ethical, social, and cultural considerations that researchers and practitioners should take into account when defining, employing, and validating LLM-based approaches for software engineering tasks.
Pages: 69-73
Number of pages: 5
Related Papers
50 items in total
  • [41] Understanding the Effect of Model Compression on Social Bias in Large Language Models
    Goncalves, Gustavo
    Strubell, Emma
    2023 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING, EMNLP 2023, 2023, : 2663 - 2675
  • [42] CAN LARGE LANGUAGE MODELS GENERATE CONCEPTUAL HEALTH ECONOMIC MODELS?
    Chhatwal, J.
    Yildirim, I.
    Balta, D.
    Ermis, T.
    Tenkin, S.
    Samur, S.
    Ayer, T.
    VALUE IN HEALTH, 2024, 27 (06) : S123 - S123
  • [43] Upstream Mitigation Is Not All You Need: Testing the Bias Transfer Hypothesis in Pre-Trained Language Models
    Steed, Ryan
    Panda, Swetasudha
    Kobren, Ari
    Wick, Michael
    PROCEEDINGS OF THE 60TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2022), VOL 1 (LONG PAPERS), 2022, : 3524 - 3542
  • [44] Text Is All You Need: Learning Language Representations for Sequential Recommendation
    Li, Jiacheng
    Wang, Ming
    Li, Jin
    Fu, Jinmiao
    Shen, Xin
    Shang, Jingbo
    McAuley, Julian
    PROCEEDINGS OF THE 29TH ACM SIGKDD CONFERENCE ON KNOWLEDGE DISCOVERY AND DATA MINING, KDD 2023, 2023, : 1258 - 1267
  • [45] Fortify the Shortest Stave in Attention: Enhancing Context Awareness of Large Language Models for Effective Tool-Use
    Chen, Yuhan
    Lv, Ang
    Lin, Ting-En
    Chen, Changyu
    Wu, Yuchuan
    Huang, Fei
    Li, Yongbin
    Yan, Rui
    PROCEEDINGS OF THE 62ND ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, VOL 1: LONG PAPERS, 2024, : 11160 - 11174
  • [46] Rapid Speaker Adaptation for Conformer Transducer: Attention and Bias are All You Need
    Huang, Yan
    Ye, Guoli
    Li, Jinyu
    Gong, Yifan
    INTERSPEECH 2021, 2021, : 1309 - 1313
  • [47] Sparse attention is all you need for pre-training on tabular data
    Isomura, Tokimasa
    Shimizu, Ryotaro
    Goto, Masayuki
    NEURAL COMPUTING AND APPLICATIONS, 2025, 37 (3) : 1509 - 1522
  • [48] A Little Bit Attention Is All You Need for Person Re-Identification
    Eisenbach, Markus
    Luebberstedt, Jannik
    Aganian, Dustin
    Gross, Horst-Michael
    2023 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA 2023), 2023, : 7598 - 7605
  • [49] Attention (to Virtuosity) Is All You Need: Religious Studies Pedagogy and Generative AI
    Barlow, Jonathan
    Holt, Lynn
    RELIGIONS, 2024, 15 (09)
  • [50] CROSS-ATTENTION WATERMARKING OF LARGE LANGUAGE MODELS
    Baldassini, Folco Bertini
    Nguyen, Huy H.
    Chang, Ching-Chung
    Echizen, Isao
    2024 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING, ICASSP 2024, 2024, : 4625 - 4629