Large Product Key Memory for Pretrained Language Models

Cited by: 0
Authors
Kim, Gyuwan [1]
Jung, Tae-Hwan [1,2]
Affiliations
[1] NAVER Corp, Clova AI, Seongnam, South Korea
[2] Kyung Hee Univ, Seoul, South Korea
Keywords
DOI
Not available
CLC Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104; 0812; 0835; 1405;
Abstract
Product key memory (PKM), proposed by Lample et al. (2019), enables improving prediction accuracy by increasing model capacity efficiently with insignificant computational overhead. However, its empirical application has been limited to causal language modeling. Motivated by the recent success of pretrained language models (PLMs), we investigate how to incorporate large PKM into PLMs that can be finetuned for a wide variety of downstream NLP tasks. We define a new memory usage metric, and careful observation using this metric reveals that most memory slots remain outdated during the training of PKM-augmented models. To train better PLMs by tackling this issue, we propose simple but effective solutions: (1) initialization from the model weights pretrained without memory and (2) augmenting PKM by addition rather than replacing a feed-forward network. We verify that both of them are crucial for the pretraining of PKM-augmented PLMs, enhancing memory utilization and downstream performance. Code and pretrained weights are available at https://github.com/clovaai/pkm-transformers.
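The abstract describes the PKM mechanism only at a high level. The sketch below illustrates the core idea referenced there (split the query into halves, score each half against a small sub-key table, take the top-k over the Cartesian product of the two halves, and return a softmax-weighted sum of the selected value slots). It is a minimal single-head sketch in PyTorch under assumed names and dimensions; it omits the multi-head queries, query batch normalization, and sparse value updates used in practice and is not the authors' released implementation.

```python
# Minimal sketch of a product-key memory (PKM) layer, following the high-level
# description in Lample et al. (2019). All dimensions and the single-head setup
# are illustrative assumptions, not the code from the linked repository.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ProductKeyMemory(nn.Module):
    def __init__(self, d_model=512, n_subkeys=128, k=32, d_value=512):
        super().__init__()
        self.half = d_model // 2          # the query is split into two halves
        self.k = k
        self.n_subkeys = n_subkeys
        # Two small sub-key tables; their Cartesian product spans n_subkeys**2 slots,
        # which is what makes the memory "large" at low lookup cost.
        self.subkeys1 = nn.Parameter(torch.randn(n_subkeys, self.half))
        self.subkeys2 = nn.Parameter(torch.randn(n_subkeys, self.half))
        # Large value table with one vector per memory slot.
        self.values = nn.Embedding(n_subkeys ** 2, d_value)
        self.query_proj = nn.Linear(d_model, d_model)

    def forward(self, x):                  # x: (batch, d_model)
        q = self.query_proj(x)
        q1, q2 = q[:, : self.half], q[:, self.half:]
        # Score each half against its own sub-key table and keep the top-k per half.
        s1, i1 = (q1 @ self.subkeys1.t()).topk(self.k, dim=-1)   # (batch, k)
        s2, i2 = (q2 @ self.subkeys2.t()).topk(self.k, dim=-1)   # (batch, k)
        # Candidate scores are sums over the k x k Cartesian product of the halves.
        cand_scores = s1.unsqueeze(2) + s2.unsqueeze(1)           # (batch, k, k)
        cand_ids = i1.unsqueeze(2) * self.n_subkeys + i2.unsqueeze(1)
        cand_scores = cand_scores.view(x.size(0), -1)
        cand_ids = cand_ids.view(x.size(0), -1)
        # Re-select the global top-k slots among the k*k candidates.
        top_scores, top_pos = cand_scores.topk(self.k, dim=-1)
        slot_ids = cand_ids.gather(1, top_pos)                    # (batch, k)
        weights = F.softmax(top_scores, dim=-1)
        # Output is the weighted sum of the selected memory values.
        return (weights.unsqueeze(-1) * self.values(slot_ids)).sum(dim=1)
```

On this reading, the paper's second remedy (augmenting by addition rather than replacing a feed-forward network) would amount to computing something like h = h + ffn(h) + memory(h) in the chosen Transformer blocks instead of substituting the memory for the FFN; this is an interpretation of the abstract, not a description of the released code.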
Pages: 4060-4069
Number of pages: 10
Related Papers
50 records in total
  • [1] Using Large Pretrained Language Models for Answering User Queries from Product Specifications
    Roy, Kalyani
    Shah, Smit
    Pai, Nithish
    Ramtej, Jaidam
    Nadkarn, Prajit Prashant
    Banerjee, Jyotirmoy
    Goyal, Pawan
    Kumar, Surender
    [J]. WORKSHOP ON E-COMMERCE AND NLP (ECNLP 3), 2020, : 35 - 39
  • [2] Application of Pretrained Large Language Models in Embodied Artificial Intelligence
    A. K. Kovalev
    A. I. Panov
    [J]. Doklady Mathematics, 2022, 106 : S85 - S90
  • [3] Generalized Planning in PDDL Domains with Pretrained Large Language Models
    Silver, Tom
    Dan, Soham
    Srinivas, Kavitha
    Tenenbaum, Joshua B.
    Kaelbling, Leslie
    Katz, Michael
    [J]. THIRTY-EIGHTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 38 NO 18, 2024, : 20256 - 20264
  • [4] Application of Pretrained Large Language Models in Embodied Artificial Intelligence
    Kovalev, A. K.
    Panov, A. I.
    [J]. DOKLADY MATHEMATICS, 2022, 106 (SUPPL 1) : S85 - S90
  • [5] A Survey of Pretrained Language Models
    Sun, Kaili
    Luo, Xudong
    Luo, Michael Y.
    [J]. KNOWLEDGE SCIENCE, ENGINEERING AND MANAGEMENT, PT II, 2022, 13369 : 442 - 456
  • [6] Geographic Adaptation of Pretrained Language Models
    Hofmann, Valentin
    Glavas, Goran
    Ljubesic, Nikola
    Pierrehumbert, Janet B.
    Schuetze, Hinrich
    [J]. TRANSACTIONS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, 2024, 12 : 411 - 431
  • [7] Generating Datasets with Pretrained Language Models
    Schick, Timo
    Schuetze, Hinrich
    [J]. 2021 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP 2021), 2021, : 6943 - 6951
  • [8] Discourse Probing of Pretrained Language Models
    Koto, Fajri
    Lau, Jey Han
    Baldwin, Timothy
    [J]. 2021 CONFERENCE OF THE NORTH AMERICAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS: HUMAN LANGUAGE TECHNOLOGIES (NAACL-HLT 2021), 2021, : 3849 - 3864
  • [9] Investigating Transferability in Pretrained Language Models
    Tamkin, Alex
    Singh, Trisha
    Giovanardi, Davide
    Goodman, Noah
    [J]. FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, EMNLP 2020, 2020, : 1393 - 1401
  • [10] Textually Pretrained Speech Language Models
    Hassid, Michael
    Remez, Tal
    Nguyen, Tu Anh
    Gat, Itai
    Conneau, Alexis
    Kreuk, Felix
    Copet, Jade
    Defossez, Alexandre
    Synnaeve, Gabriel
    Dupoux, Emmanuel
    Schwartz, Roy
    Adi, Yossi
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023,