50 records in total
- [41] Ouroboros: On Accelerating Training of Transformer-Based Language Models [J]. Advances in Neural Information Processing Systems 32 (NIPS 2019), 2019, 32
- [44] Blockwise compression of transformer-based models without retraining [J]. Neural Networks, 2024, 171: 423-428
- [45] Transformer-Based Federated Learning Models for Recommendation Systems [J]. IEEE Access, 2024, 12: 109596-109607
- [46] A Comparison of Transformer-Based Language Models on NLP Benchmarks [J]. Natural Language Processing and Information Systems (NLDB 2022), 2022, 13286: 490-501
- [50] TAG: Gradient Attack on Transformer-based Language Models [J]. Findings of the Association for Computational Linguistics: EMNLP 2021, 2021: 3600-3610