A Communication Theory Perspective on Prompting Engineering Methods for Large Language Models

Cited by: 0
Authors
Song, Yuan-Feng [1 ]
He, Yuan-Qin [1 ]
Zhao, Xue-Fang [1 ]
Gu, Han-Lin [1 ]
Jiang, Di [1 ]
Yang, Hai-Jun [1 ]
Fan, Li-Xin [1 ]
Affiliations
[1] AI Group, WeBank Co., Ltd., Shenzhen 518000, China
Keywords
Contrastive learning; Modeling languages; Natural language processing systems; Self-supervised learning; Semi-supervised learning; Zero-shot learning
DOI
10.1007/s11390-024-4058-8
Abstract
The rapid rise of large language models (LLMs) has shifted the community from single-task-oriented natural language processing (NLP) research to a holistic, end-to-end, multi-task learning paradigm. Along this line of research, LLM-based prompting methods have attracted much attention, partly because of the practical advantages brought by prompt engineering (PE) and partly because of the underlying NLP principles that various prompting methods reveal. Traditional supervised learning usually requires training a model on labeled data before making predictions. In contrast, PE methods directly exploit the capabilities of existing LLMs (e.g., GPT-3 and GPT-4) by composing appropriate prompts, especially in few-shot or zero-shot scenarios. Given the abundance of prompting studies and the ever-evolving nature of this field, this article aims to 1) offer a novel perspective for reviewing existing PE methods within the well-established framework of communication theory, 2) facilitate a deeper understanding of the development trends of PE methods used in three typical tasks, and 3) shed light on promising research directions for future PE methods. © Institute of Computing Technology, Chinese Academy of Sciences 2024.
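To make the abstract's contrast between supervised learning and prompting concrete, below is a minimal Python sketch (not taken from the paper) of how zero-shot and few-shot prompts might be composed for a sentiment-classification task. The call_llm function is a hypothetical placeholder for any LLM completion API (e.g., a GPT-3 or GPT-4 client) and is not part of the reviewed methods.

    # Minimal sketch of zero-shot vs. few-shot prompt composition for
    # sentiment classification. `call_llm` is a hypothetical placeholder
    # for an actual LLM completion API; swap in a real client to use it.

    def call_llm(prompt: str) -> str:
        """Hypothetical LLM call; replace with a real API client."""
        raise NotImplementedError("plug in an actual LLM completion API here")

    def zero_shot_prompt(text: str) -> str:
        # Zero-shot: the task is described in natural language, with no examples.
        return (
            "Classify the sentiment of the following review as positive or negative.\n"
            f"Review: {text}\nSentiment:"
        )

    def few_shot_prompt(text: str, examples: list[tuple[str, str]]) -> str:
        # Few-shot: a handful of labeled demonstrations precede the query,
        # standing in for the labeled training set that supervised learning needs.
        demos = "\n".join(f"Review: {x}\nSentiment: {y}" for x, y in examples)
        return (
            "Classify the sentiment of each review as positive or negative.\n"
            f"{demos}\nReview: {text}\nSentiment:"
        )

    if __name__ == "__main__":
        demos = [("Great battery life.", "positive"), ("Arrived broken.", "negative")]
        print(zero_shot_prompt("The screen is too dim."))
        print(few_shot_prompt("The screen is too dim.", demos))
        # answer = call_llm(few_shot_prompt("The screen is too dim.", demos))

No model parameters are updated in either case; the task specification lives entirely in the prompt, which is the property the surveyed PE methods build on.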
Pages: 984-1004
Page count: 20