TranSlider: Transfer Ensemble Learning from Exploitation to Exploration

Cited by: 7
Authors
Zhong, Kuo [1 ,2 ]
Wei, Ying [1 ]
Yuan, Chun [2 ,3 ]
Bai, Haoli [4 ]
Huang, Junzhou [1 ]
Affiliations
[1] Tencent AI Lab, Shenzhen, Peoples R China
[2] Tsinghua Shenzhen Int Grad Sch, Shenzhen, Peoples R China
[3] Peng Cheng Lab, Shenzhen, Peoples R China
[4] Chinese Univ Hong Kong, Hong Kong, Peoples R China
Source
KDD '20: PROCEEDINGS OF THE 26TH ACM SIGKDD INTERNATIONAL CONFERENCE ON KNOWLEDGE DISCOVERY & DATA MINING | 2020
Keywords
Transfer Learning; Ensemble Learning; Exploitation; Exploration;
DOI
10.1145/3394486.3403079
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
In transfer learning, what and where to transfer has been widely studied. Nevertheless, the learned transfer strategies are at high risk of over-fitting, especially when only a few annotated instances are available in the target domain. In this paper, we introduce the concept of transfer ensemble learning, a new direction for tackling the over-fitting of transfer strategies. Intuitively, models with different transfer strategies offer various perspectives on what and where to transfer. Therefore, a core problem is to search among these diversely transferred models for an ensemble that achieves better generalization. Towards this end, we propose the Transferability Slider (TranSlider) for transfer ensemble learning. By decreasing the transferability, we obtain a spectrum of base models ranging from pure exploitation of the source model to unconstrained exploration for the target domain. Furthermore, decreasing transferability through parameter sharing guarantees fast optimization at no additional training cost. Finally, we conduct extensive experiments with various analyses, which demonstrate that TranSlider achieves state-of-the-art performance on comprehensive benchmark datasets.
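The abstract describes a spectrum of base models spanning pure exploitation of the source model to unconstrained exploration on the target domain, whose predictions are then ensembled. A minimal sketch of that idea follows, under loud assumptions: the paper's actual slider mechanism is not specified here, so this sketch simply interpolates linearly between source and target parameters, and the function names, the toy linear model, and the choice of slider values are all hypothetical illustrations rather than the authors' method.

```python
import numpy as np

def interpolate_params(source, target, alpha):
    """Blend source and target parameters; alpha=1 corresponds to pure
    exploitation of the source model, alpha=0 to unconstrained exploration
    (a fully fine-tuned target model). Linear blending is an illustrative
    assumption, not the paper's exact mechanism."""
    return {k: alpha * source[k] + (1.0 - alpha) * target[k] for k in source}

def linear_predict(params, x):
    """A toy linear model standing in for an arbitrary network."""
    return x @ params["w"] + params["b"]

def transfer_ensemble(source, target, x, alphas):
    """Average the predictions of base models sampled along the
    exploitation-to-exploration spectrum."""
    preds = [linear_predict(interpolate_params(source, target, a), x)
             for a in alphas]
    return np.mean(preds, axis=0)

# Toy source/target parameters and a single input.
source = {"w": np.array([1.0, 0.0]), "b": np.array([0.0])}
target = {"w": np.array([0.0, 1.0]), "b": np.array([1.0])}
x = np.array([[2.0, 4.0]])

# Five slider positions from exploration (0.0) to exploitation (1.0).
ensemble = transfer_ensemble(source, target, x,
                             alphas=[0.0, 0.25, 0.5, 0.75, 1.0])
```

Because all base models share the same parameter tensors up to the blending coefficient, producing the whole spectrum requires no extra training passes, which matches the abstract's claim of no additional training cost.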
Pages: 368-378
Page count: 11