ReFine: Re-randomization before Fine-tuning for Cross-domain Few-shot Learning

Cited by: 4
|
Authors
Oh, Jaehoon [1 ]
Kim, Sungnyun [2 ]
Ho, Namgyu [2 ]
Kim, Jin-Hwa [3 ]
Song, Hwanjun [3 ]
Yun, Se-Young [2 ]
Affiliations
[1] KAIST DS, Daejeon, South Korea
[2] KAIST AI, Seoul, South Korea
[3] NAVER AI Lab, Sungnam, South Korea
Keywords
cross-domain; few-shot; transfer learning; re-randomization;
DOI
10.1145/3511808.3557681
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
Cross-domain few-shot learning (CD-FSL), in which only a few target samples are available and the source and target domains differ substantially, has recently attracted considerable attention. Recent studies on CD-FSL generally focus on transfer-learning-based approaches, where a neural network is pre-trained on a large labeled source-domain dataset and then transferred to the target-domain data. Although the labeled source data may provide suitable initial parameters for the target task, the domain gap between source and target can hinder fine-tuning on the target domain. This paper proposes a simple yet powerful method that re-randomizes the parameters fitted to the source domain before adapting to the target data. The re-randomization resets the source-specific parameters of the pre-trained model and thus facilitates fine-tuning on the target domain, improving few-shot performance.
Pages: 4359-4363
Page count: 5
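The following is a minimal sketch of the "re-randomize, then fine-tune" idea described in the abstract: reset the parameters of the most source-specific part of a source-pretrained backbone before fine-tuning on the few labeled target samples. The choice of backbone, the decision to reset only the last residual stage, and all hyperparameters below are illustrative assumptions, not the authors' exact configuration.

```python
import torch
import torch.nn as nn
from torchvision import models


def re_randomize(module: nn.Module) -> None:
    """Re-initialize every submodule that defines reset_parameters()."""
    for m in module.modules():
        if hasattr(m, "reset_parameters"):
            m.reset_parameters()


# 1) Source pre-trained backbone (ImageNet weights as a stand-in source domain).
backbone = models.resnet18(weights=models.ResNet18_Weights.IMAGENET1K_V1)

# 2) Re-randomize the last residual stage (assumed here to carry the most
#    source-specific features) and attach a fresh head for the target classes.
re_randomize(backbone.layer4)
num_target_classes = 5  # e.g., a 5-way few-shot task
backbone.fc = nn.Linear(backbone.fc.in_features, num_target_classes)

# 3) Fine-tune the whole network on the few labeled target samples.
optimizer = torch.optim.SGD(backbone.parameters(), lr=1e-2, momentum=0.9)
criterion = nn.CrossEntropyLoss()


def finetune(model: nn.Module, support_loader, epochs: int = 100) -> None:
    model.train()
    for _ in range(epochs):
        for images, labels in support_loader:  # target support set
            optimizer.zero_grad()
            loss = criterion(model(images), labels)
            loss.backward()
            optimizer.step()
```

Resetting only the top block keeps the generic low-level features learned on the source domain while freeing the most domain-specific capacity for the target; which layers to reset is treated here as a tunable assumption rather than a fixed prescription.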