- [1] Detecting Syntactic Change with Pre-trained Transformer Models. Findings of the Association for Computational Linguistics: EMNLP 2023, 2023: 3564-3574
- [2] On the effect of dropping layers of pre-trained transformer models. Computer Speech and Language, 2022, 77
- [5] Deep Entity Matching with Pre-Trained Language Models. Proceedings of the VLDB Endowment, 2020, 14(1): 50-60
- [6] Pre-Trained Image Processing Transformer. 2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR 2021), 2021: 12294-12305
- [7] Compression of Generative Pre-trained Language Models via Quantization. Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (ACL 2022), Vol. 1 (Long Papers), 2022: 4821-4836
- [8] Simple and Effective Multimodal Learning Based on Pre-Trained Transformer Models. IEEE Access, 2022, 10: 29821-29833
- [10] Leveraging Generative Pre-Trained Transformer Models for Standardizing Nursing Data. 2024 IEEE 12th International Conference on Healthcare Informatics (ICHI 2024), 2024: 386-391