LLMLingua-2: Data Distillation for Efficient and Faithful Task-Agnostic Prompt Compression

Cited by: 0
Authors
Pan, Zhuoshi [1 ]
Wu, Qianhui [2 ]
Jiang, Huiqiang [2 ]
Xia, Menglin [2 ]
Luo, Xufang [2 ]
Zhang, Jue [2 ]
Lin, Qingwei [2 ]
Ruhle, Victor [2 ]
Yang, Yuqing [2 ]
Lin, Chin-Yew [2 ]
Zhao, H. Vicky [1 ]
Qiu, Lili [2 ]
Zhang, Dongmei [2 ]
Affiliations
[1] Tsinghua University, Beijing, China
[2] Microsoft Corporation, Redmond, WA 98052, USA
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
This paper focuses on task-agnostic prompt compression for better generalizability and efficiency. Considering the redundancy in natural language, existing approaches compress prompts by removing tokens or lexical units according to their information entropy obtained from a causal language model such as LLaMA-7B. The challenge is that information entropy may be a suboptimal compression metric: (i) it only leverages unidirectional context and may fail to capture all essential information needed for prompt compression; (ii) it is not aligned with the prompt compression objective. To address these issues, we propose a data distillation procedure to derive knowledge from an LLM to compress prompts without losing crucial information, and meanwhile introduce an extractive text compression dataset. We formulate prompt compression as a token classification problem to guarantee the faithfulness of the compressed prompt to the original one, and use a Transformer encoder as the base architecture to capture all essential information for prompt compression from the full bidirectional context. Our approach leads to lower latency by explicitly learning the compression objective with smaller models such as XLM-RoBERTa-large and mBERT. We evaluate our method on both in-domain and out-of-domain datasets, including MeetingBank, LongBench, ZeroSCROLLS, GSM8K, and BBH. Despite its small size, our model shows significant performance gains over strong baselines and demonstrates robust generalization ability across different LLMs. Additionally, our model is 3x-6x faster than existing prompt compression methods, while accelerating the end-to-end latency by 1.6x-2.9x with compression ratios of 2x-5x.
Pages: 963-981 (19 pages)
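
To make the formulation in the abstract concrete: the paper casts prompt compression as binary token classification with a bidirectional encoder, and assembles the compressed prompt from the tokens predicted as "preserve", in their original order. The following is a minimal Python sketch of that idea, not the authors' released implementation: the two-label scheme, the keep_ratio budget heuristic, and the use of the off-the-shelf xlm-roberta-large checkpoint are assumptions for illustration, and the classification head here is untrained, so real use would require training on the distilled extractive-compression data described in the paper.

```python
# Sketch: task-agnostic prompt compression as binary token classification.
# ASSUMPTIONS: label 1 = "preserve"; the head below is randomly initialized
# and would need training on the distilled dataset described in the paper.
import torch
from transformers import AutoModelForTokenClassification, AutoTokenizer

MODEL_NAME = "xlm-roberta-large"  # base encoder named in the abstract

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForTokenClassification.from_pretrained(MODEL_NAME, num_labels=2)
model.eval()

def compress(prompt: str, keep_ratio: float = 0.5) -> str:
    """Keep the tokens with the highest P(preserve), in original order."""
    enc = tokenizer(prompt, return_tensors="pt", truncation=True)
    with torch.no_grad():
        logits = model(**enc).logits           # shape: (1, seq_len, 2)
    p_keep = logits.softmax(dim=-1)[0, :, 1]   # per-token "preserve" prob.
    ids = enc["input_ids"][0]
    k = max(1, int(keep_ratio * len(ids)))     # target token budget
    mask = torch.zeros(len(ids), dtype=torch.bool)
    mask[p_keep.topk(k).indices] = True        # select top-k tokens
    return tokenizer.decode(ids[mask], skip_special_tokens=True)

print(compress("A long, partly redundant instruction prompt ...", keep_ratio=0.4))
```

Because the output is strictly a subsequence of the input tokens, this extractive formulation cannot introduce content absent from the original prompt, which is the faithfulness guarantee the abstract refers to; the bidirectional encoder lets each keep/drop decision condition on the full context rather than only the leftward context available to a causal LM.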