50 entries in total
- [43] Towards Efficient Post-training Quantization of Pre-trained Language Models. Advances in Neural Information Processing Systems 35 (NeurIPS 2022), 2022.
- [44] Bit-shrinking: Limiting Instantaneous Sharpness for Improving Post-training Quantization. 2023 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2023: 16196-16205.
- [46] RepQ-ViT: Scale Reparameterization for Post-Training Quantization of Vision Transformers. 2023 IEEE/CVF International Conference on Computer Vision (ICCV 2023), 2023: 17181-17190.
- [48] NoisyQuant: Noisy Bias-Enhanced Post-Training Activation Quantization for Vision Transformers. 2023 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2023: 20321-20330.
- [49] FGPTQ-ViT: Fine-Grained Post-training Quantization for Vision Transformers. Pattern Recognition and Computer Vision (PRCV 2023), Part IX, 2024, 14433: 79-90.
- [50] Exploring Post-training Quantization in LLMs from Comprehensive Study to Low Rank Compensation. Thirty-Eighth AAAI Conference on Artificial Intelligence (AAAI 2024), Vol. 38, No. 17, 2024: 19377-19385.