Gradient Sparsification For Masked Fine-Tuning of Transformers

Cited by: 0
Authors
O'Neill, James [1 ]
Dutta, Sourav [1 ]
Affiliations
[1] Huawei Ireland Res Ctr, Dublin, Ireland
Keywords
neural nets; sparse regularization; fine-tuning;
DOI
10.1109/IJCNN54540.2023.10191206
CLC Classification
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Fine-tuning pretrained self-supervised language models is widely adopted for transfer learning to downstream tasks. Fine-tuning can be achieved by freezing the gradients of the pretrained network and updating only a newly added classification layer, or by performing gradient updates on all parameters. Gradual unfreezing trades off between the two by progressively unfreezing the gradients of whole layers during training, and has been an effective strategy for balancing storage and training speed against generalization performance. However, it is unclear whether gradually unfreezing layers throughout training is optimal compared to sparse variants of gradual unfreezing that may improve fine-tuning performance. In this paper, we propose stochastically masking gradients to regularize pretrained language models and improve overall fine-tuned performance. We introduce GradDrop and variants thereof, a class of gradient sparsification methods that mask gradients during the backward pass, acting as gradient noise. Unlike gradual freezing, GradDrop is sparse and stochastic. Extensive experiments on the multilingual XGLUE benchmark with XLMR-Large show that GradDrop is competitive against methods that use additional translated data for intermediate pretraining, and outperforms standard fine-tuning and gradual unfreezing. A post-analysis shows how GradDrop improves performance on languages it was not trained on, such as under-resourced languages.
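The core idea in the abstract, masking gradients element-wise during the backward pass so that only a random subset of parameters is updated, can be sketched as follows. This is an illustrative approximation assuming a simple Bernoulli drop mask applied to a raw gradient array; the function name `grad_drop` and the drop rate `p` are hypothetical and this is not the authors' released implementation.

```python
import numpy as np

def grad_drop(grad, p=0.5, rng=None):
    """Sketch of GradDrop-style gradient sparsification (hypothetical).

    Each gradient element is zeroed out with probability p, so the
    optimizer step touches only a random sparse subset of parameters.
    """
    rng = rng or np.random.default_rng(0)
    mask = rng.random(grad.shape) >= p  # keep each element w.p. 1 - p
    return grad * mask

# Usage: apply to the gradient during the backward pass,
# before the optimizer consumes it.
g = np.ones((4, 4))          # stand-in for a layer's gradient
sparse_g = grad_drop(g, p=0.5)
```

In a deep-learning framework this masking would typically be attached as a backward hook on each parameter, which is how it acts as structured gradient noise while leaving the forward pass untouched.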
Pages: 8