50 records in total
- [1] Mixture-of-Experts Learner for Single Long-Tailed Domain Generalization. Proceedings of the 31st ACM International Conference on Multimedia (MM 2023), 2023: 290-299
- [2] MDCS: More Diverse Experts with Consistency Self-distillation for Long-tailed Recognition. 2023 IEEE/CVF International Conference on Computer Vision (ICCV 2023), 2023: 11563-11574
- [3] Balanced Product of Calibrated Experts for Long-Tailed Recognition. 2023 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2023: 19967-19977
- [5] Self Supervision to Distillation for Long-Tailed Visual Recognition. 2021 IEEE/CVF International Conference on Computer Vision (ICCV 2021), 2021: 610-619
- [6] MoDE: A Mixture-of-Experts Model with Mutual Distillation among the Experts. Thirty-Eighth AAAI Conference on Artificial Intelligence, Vol. 38, No. 14, 2024: 16067-16075
- [7] Virtual Student Distribution Knowledge Distillation for Long-Tailed Recognition. Pattern Recognition and Computer Vision (PRCV 2024), Part IV, 2025, 15034: 406-419
- [8] Towards Long-Tailed Recognition for Graph Classification via Collaborative Experts. IEEE Transactions on Big Data, 2023, 9(6): 1683-1696
- [9] Relational Subsets Knowledge Distillation for Long-Tailed Retinal Diseases Recognition. Medical Image Computing and Computer Assisted Intervention (MICCAI 2021), Part VIII, 2021, 12908: 3-12
- [10] VideoLT: Large-scale Long-tailed Video Recognition. 2021 IEEE/CVF International Conference on Computer Vision (ICCV 2021), 2021: 7940-7949