Optimal Transport of Diverse Unsupervised Tasks for Robust Learning from Noisy Few-Shot Data

Cited: 0
Authors
Que, Xiaofan [1 ]
Yu, Qi [1 ]
Affiliation
[1] Rochester Inst Technol, Rochester, NY 14623 USA
Keywords
Noisy few-shot learning; Auxiliary task; Optimal transport;
DOI
10.1007/978-3-031-72933-1_17
CLC classification
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Noisy few-shot learning (NFSL) presents novel challenges primarily due to the interplay between noisy labels and limited training data. While data cleansing offers a viable solution to noisy labels in general learning settings, it exacerbates information loss in FSL due to the limited training data, resulting in inadequate model training. To best recover the underlying task manifold corrupted by the noisy labels, we resort to learning from uniquely designed unsupervised auxiliary tasks to compensate for the information loss. Using unsupervised tasks effectively avoids additional annotation costs and minimizes the risk of introducing additional label noise. However, a randomly constructed unsupervised task may misguide the model into learning sample-specific features that are likely to compromise the primary few-shot learning task due to the noisy, weak learning signals. We propose a novel auxiliary task selection scheme that ensures intra-diversity among the unlabeled samples within a task. Domain-invariant features are then learned from the carefully constructed auxiliary tasks to best recover the original data manifold. We conduct a theoretical analysis to derive novel generalization bounds for learning with auxiliary tasks. Extensive experiments demonstrate that our method outperforms existing noisy few-shot learning methods on various in-domain and cross-domain few-shot classification benchmarks.
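The abstract's auxiliary-task construction hinges on selecting unlabeled samples with high intra-diversity. As an illustration only (the paper's actual selection criterion is not given in this record), one simple way to realize such diversity is greedy farthest-point selection in feature space; the function name and setup below are assumptions, not the authors' implementation:

```python
import numpy as np

def select_diverse_task(features, k, seed=0):
    """Greedy farthest-point selection: pick k samples whose pairwise
    feature distances are large, as a proxy for intra-task diversity.
    Illustrative sketch only; not the paper's actual selection method."""
    rng = np.random.default_rng(seed)
    n = features.shape[0]
    chosen = [int(rng.integers(n))]  # arbitrary starting sample
    # Distance of every sample to its nearest already-chosen sample.
    d = np.linalg.norm(features - features[chosen[0]], axis=1)
    while len(chosen) < k:
        nxt = int(np.argmax(d))  # sample farthest from the current set
        chosen.append(nxt)
        d = np.minimum(d, np.linalg.norm(features - features[nxt], axis=1))
    return chosen

# Toy check: with two tight clusters, a 2-sample "task" should span both,
# whereas a random draw could easily land in a single cluster.
X = np.vstack([np.zeros((5, 2)), np.ones((5, 2)) * 10.0])
idx = select_diverse_task(X, k=2)
```

Selecting mutually distant samples discourages the auxiliary objective from latching onto sample-specific features, which is the failure mode the abstract attributes to randomly constructed unsupervised tasks.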
Pages: 294 - 311
Page count: 18