50 items in total
- [22] Pre-trained models for natural language processing: A survey [J]. Science China Technological Sciences, 2020, 63: 1872-1897
- [25] Leveraging pre-trained language models for code generation [J]. Complex & Intelligent Systems, 2024, 10(3): 3955-3980
- [26] Towards a Transformer-Based Pre-trained Model for IoT Traffic Classification [J]. Proceedings of 2024 IEEE/IFIP Network Operations and Management Symposium, NOMS 2024, 2024
- [27] A Robust Approach to Fine-tune Pre-trained Transformer-based Models for Text Summarization through Latent Space Compression [J]. 2022 21st IEEE International Conference on Machine Learning and Applications, ICMLA, 2022: 160-167
- [29] Extremely Low Resource Text Simplification with Pre-trained Transformer Language Model [J]. Proceedings of the 2019 International Conference on Asian Language Processing (IALP), 2019: 53-58