Open set transfer learning through distribution driven active learning

Cited by: 1
Authors
Wang, Min [1 ]
Wen, Ting [1 ]
Jiang, Xiao-Yu [1 ]
Zhang, An-An [1 ]
Affiliations
[1] Southwest Petr Univ, Sch Elect Engn & Informat, Chengdu 610500, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Active learning; Transfer learning; Evidence learning; Uncertainty analysis;
DOI
10.1016/j.patcog.2023.110055
CLC Number
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104; 0812; 0835; 1405;
Abstract
Domain adaptation enables effective transfer between source and target domains with different distributions. The latest research focuses on open set domain adaptation, in which the target domain contains unknown categories that do not exist in the source domain. Existing open set domain adaptation methods, however, cannot achieve fine-grained recognition of these unknown categories. In this paper, we propose an uncertainty analysis evidence model and design a distribution driven active transfer learning (DATL) algorithm. DATL achieves fine-grained recognition of unknown categories without requiring the source domain to contain them. To explore unknown distributions, the uncertainty analysis evidence model is adopted to divide the high-uncertainty space. To select critical instances, a cluster-diversity query strategy is proposed to identify new categories. To enrich the label categories of the source domain, a global dynamic alignment strategy is designed to avoid negative transfer. Comparative experiments with state-of-the-art methods on the standard Office-31/Office-Home/Office-Caltech10 benchmarks showed that the DATL algorithm: (1) outperformed its competitors; (2) accurately identified unknown subcategories from a fine-grained perspective; and (3) achieved outstanding performance even with a very high degree of openness.
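The query step the abstract describes, selecting high-uncertainty instances while enforcing diversity so that one cluster cannot dominate the batch, can be sketched roughly as follows. This is an illustrative assumption, not the paper's actual DATL implementation: the entropy criterion, the greedy distance filter, and all names (`query_batch`, `min_dist`) are hypothetical stand-ins for the uncertainty analysis evidence model and the cluster-diversity query strategy.

```python
import math

def entropy(probs):
    """Shannon entropy of a predicted class distribution (uncertainty score)."""
    return -sum(p * math.log(p) for p in probs if p > 0)

def query_batch(candidates, budget, min_dist=1.0):
    """Greedily select up to `budget` high-uncertainty instances that are
    mutually distant in feature space (a simple diversity filter).

    candidates: list of (feature_vector, class_probs) pairs.
    Returns the selected (feature_vector, class_probs) pairs.
    """
    # Rank by uncertainty, highest entropy first.
    ranked = sorted(candidates, key=lambda c: entropy(c[1]), reverse=True)
    chosen = []
    for feat, probs in ranked:
        # Keep an instance only if it is far from everything already chosen,
        # so near-duplicates from one dense cluster are skipped.
        if all(math.dist(feat, f) >= min_dist for f, _ in chosen):
            chosen.append((feat, probs))
        if len(chosen) == budget:
            break
    return chosen

# Example: two near-duplicate uncertain points, one distant uncertain point,
# one confident point; with budget 2 the filter picks one from each region.
cands = [
    ((0.0, 0.0), (0.5, 0.5)),    # maximally uncertain
    ((0.1, 0.0), (0.5, 0.5)),    # near-duplicate, skipped by diversity filter
    ((5.0, 5.0), (0.6, 0.4)),    # distant, still fairly uncertain
    ((9.0, 9.0), (0.99, 0.01)),  # confident, ranked last
]
picked = query_batch(cands, budget=2)
```

In this sketch the confident instance never enters the batch and the near-duplicate is rejected by the distance check, mirroring the intent of combining uncertainty with cluster diversity.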
Pages: 15