Black-Box Prompt Tuning With Subspace Learning

Cited: 0
Authors
Zheng, Yuanhang [1 ]
Tan, Zhixing [2 ]
Li, Peng [3 ,4 ]
Liu, Yang [1 ,3 ,4 ,5 ]
Affiliations
[1] Tsinghua Univ, Dept Comp Sci & Technol, Beijing 100084, Peoples R China
[2] Zhongguancun Lab, Beijing 100086, Peoples R China
[3] Tsinghua Univ, Inst AI Ind Res AIR, Beijing 100084, Peoples R China
[4] Shanghai Artificial Intelligence Lab, Shanghai 200030, Peoples R China
[5] Jiangsu Collaborat Innovat Ctr Language Competence, Xuzhou 221116, Jiangsu, Peoples R China
Funding
National Key R&D Program of China;
Keywords
Task analysis; Tuning; Closed box; Speech processing; Metalearning; Sun; Optimization; Black-box; large language models (LLMs); meta-learning; prompt tuning; subspace learning; ADAPTATION;
DOI
10.1109/TASLP.2024.3407519
CLC Number
O42 [Acoustics];
Discipline Codes
070206; 082403;
Abstract
Black-box prompt tuning employs derivative-free optimization algorithms to learn prompts within low-dimensional subspaces rather than back-propagating through the network of Large Language Models (LLMs). Recent studies reveal that black-box prompt tuning lacks versatility across tasks and LLMs, which we believe is related to the suboptimal choice of subspaces. In this paper, we introduce Black-box prompt tuning with Subspace Learning (BSL) to enhance the versatility of black-box prompt tuning. Based on the assumption that nearly optimal prompts for similar tasks reside in a common subspace, we propose identifying such subspaces through meta-learning on a collection of similar source tasks. Consequently, for a target task that shares similarities with the source tasks, we expect that optimizing within the identified subspace can yield a prompt that performs well on the target task. Experimental results confirm that our BSL framework consistently achieves competitive performance across various downstream tasks and LLMs.
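The core idea in the abstract — derivative-free search over a low-dimensional subspace that is projected up into the full soft-prompt space — can be sketched in a few lines. The sketch below is illustrative only, not the paper's implementation: the projection matrix `A` stands in for the subspace that BSL would meta-learn on source tasks (here it is random), the quadratic `black_box_loss` stands in for querying an LLM with a candidate prompt, and the optimizer is a simple (1+λ) evolution strategy rather than the algorithm used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions: full soft-prompt space vs. the searched subspace.
PROMPT_DIM = 20 * 768   # e.g. 20 prompt tokens x 768-dim embeddings
SUBSPACE_DIM = 16       # low intrinsic dimension searched by the optimizer

# Fixed projection from subspace to prompt space. In BSL this subspace would
# be identified by meta-learning on similar source tasks; here it is random.
A = rng.standard_normal((PROMPT_DIM, SUBSPACE_DIM)) / np.sqrt(SUBSPACE_DIM)

# Hidden optimum placed inside the subspace, mirroring the paper's assumption
# that near-optimal prompts for similar tasks live in a common subspace.
_target = A @ rng.standard_normal(SUBSPACE_DIM)

def black_box_loss(prompt: np.ndarray) -> float:
    """Stand-in for an LLM-as-a-service query with a soft prompt:
    only scalar function values are observed, no gradients."""
    return float(np.mean((prompt - _target) ** 2))

def tune(iterations: int = 200, pop: int = 8, sigma: float = 0.3):
    """Derivative-free (1+lambda) evolution strategy in the subspace."""
    z = np.zeros(SUBSPACE_DIM)          # subspace coordinates of the prompt
    best = black_box_loss(A @ z)
    for _ in range(iterations):
        # Sample a population of perturbed candidates around the incumbent.
        cands = z + sigma * rng.standard_normal((pop, SUBSPACE_DIM))
        losses = [black_box_loss(A @ c) for c in cands]
        i = int(np.argmin(losses))
        if losses[i] < best:            # keep the best candidate so far
            best, z = losses[i], cands[i]
    return z, best

z_star, final_loss = tune()
print(f"final loss: {final_loss:.6f}")
```

Only the 16-dimensional vector `z` is ever optimized; the derivative-free search never touches the ~15k-dimensional prompt directly, which is what makes the black-box query budget tractable.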
Pages: 3002-3013
Page count: 12
Related Papers
50 results total
  • [1] Black-box Prompt Tuning for Vision-Language Model as a Service
    Yu, Lang
    Chen, Qin
    Lin, Jiaju
    He, Liang
    PROCEEDINGS OF THE THIRTY-SECOND INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, IJCAI 2023, 2023, : 1686 - 1694
  • [2] Black-box learning of multigrid parameters
    Katrutsa, Alexandr
    Daulbaev, Talgat
    Oseledets, Ivan
    JOURNAL OF COMPUTATIONAL AND APPLIED MATHEMATICS, 2020, 368 (368)
  • [3] Black-box electronics and passive learning
    Hess, Karl
    PHYSICS TODAY, 2014, 67 (02) : 11 - 12
  • [4] Active Learning in Black-Box Settings
    Rubens, Neil
    Sheinman, Vera
    Tomioka, Ryota
    Sugiyama, Masashi
    AUSTRIAN JOURNAL OF STATISTICS, 2011, 40 (1-2) : 125 - 135
  • [5] Adaptive Hyperparameter Tuning for Black-box LiDAR Odometry
    Koide, Kenji
    Yokozuka, Masashi
    Oishi, Shuji
    Banno, Atsuhiko
    2021 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS), 2021, : 7708 - 7714
  • [6] Exploring the vulnerability of black-box adversarial attack on prompt-based learning in language models
    Tan, Zihao
    Chen, Qingliang
    Zhu, Wenbin
    Huang, Yongjian
    Liang, Chen
    Neural Computing and Applications, 2025, 37 (3) : 1457 - 1473
  • [7] Black-Box Tuning for Language-Model-as-a-Service
    Sun, Tianxiang
    Shao, Yunfan
    Qian, Hong
    Huang, Xuanjing
    Qiu, Xipeng
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 162, 2022,
  • [8] TrojLLM: A Black-box Trojan Prompt Attack on Large Language Models
    Xue, Jiaqi
    Zheng, Mengxin
    Hua, Ting
    Shen, Yilin
    Liu, Yepeng
    Boloni, Ladislau
    Lou, Qian
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023,
  • [9] THE BLACK-BOX
    KYLE, SA
    NEW SCIENTIST, 1986, 110 (1512) : 61 - 61
  • [10] THE BLACK-BOX
    WISEMAN, J
    ECONOMIC JOURNAL, 1991, 101 (404): : 149 - 155