Total: 50 items
- [1] Self-Supervised Quantization of Pre-Trained Neural Networks for Multiplierless Acceleration [J]. 2019 Design, Automation & Test in Europe Conference & Exhibition (DATE), 2019: 1094-1099
- [2] Enhancing Pre-trained Language Models by Self-supervised Learning for Story Cloze Test [J]. Knowledge Science, Engineering and Management (KSEM 2020), Pt I, 2020, 12274: 271-279
- [3] Knowledge Distillation for Neural Transducers from Large Self-Supervised Pre-Trained Models [J]. 2022 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2022: 8527-8531
- [4] Self-supervised Bidirectional Prompt Tuning for Entity-enhanced Pre-trained Language Model [J]. 2023 International Joint Conference on Neural Networks (IJCNN), 2023
- [5] A Study of Pre-trained Language Models in Natural Language Processing [J]. 2020 IEEE International Conference on Smart Cloud (SmartCloud 2020), 2020: 116-121
- [8] Pre-trained Models for Natural Language Processing: A Survey [J]. Science China Technological Sciences, 2020, 63: 1872-1897
- [10] SPIQ: A Self-Supervised Pre-Trained Model for Image Quality Assessment [J]. IEEE Signal Processing Letters, 2022, 29: 513-517