50 entries in total
- [32] MoME: Mixture-of-Masked-Experts for Efficient Multi-Task Recommendation [C]. Proceedings of the 47th International ACM SIGIR Conference on Research and Development in Information Retrieval (SIGIR 2024), 2024: 2527-2531
- [33] Beyond Distillation: Task-level Mixture-of-Experts for Efficient Inference [C]. Findings of the Association for Computational Linguistics: EMNLP 2021, 2021: 3577-3599
- [34] A Hybrid Approach Model for Weather Forecasting Using Multi-Task Agent [C]. 2015 2nd International Conference on Electronics and Communication Systems (ICECS), 2015: 1675-1678
- [35] Hierarchical Strategy of Model Partitioning for VLSI-Design Using an Improved Mixture of Experts Approach [C]. Proceedings of the Tenth Workshop on Parallel and Distributed Simulation (PADS 96), 1996: 106-113
- [37] A Mixture of Experts Approach Towards Intelligibility Classification of Pathological Speech [C]. 2015 IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP), 2015: 1986-1990
- [39] Mixture of Experts Approach for Behavioral Modeling of RF Power Amplifiers [C]. 2021 IEEE Topical Conference on RF/Microwave Power Amplifiers for Radio and Wireless Applications (PAWR), 2021: 1-3
- [40] Unseen Family Member Classification Using Mixture of Experts [C]. ICIEA 2010: Proceedings of the 5th IEEE Conference on Industrial Electronics and Applications, Vol 1, 2010: 359-+