Advocacy Learning: Learning through Competition and Class-Conditional Representations

Cited by: 0
Authors
Fox, Ian [1 ]
Wiens, Jenna [1 ]
Affiliation
[1] Univ Michigan, Dept Comp Sci & Engn, Ann Arbor, MI 48109 USA
DOI: not available
CLC classification: TP18 [Artificial Intelligence Theory]
Discipline codes: 081104; 0812; 0835; 1405
Abstract
We introduce advocacy learning, a novel supervised training scheme for attention-based classification problems. Advocacy learning relies on a framework consisting of two connected networks: 1) N Advocates (one for each class), each of which outputs an argument in the form of an attention map over the input, and 2) a Judge, which predicts the class label based on these arguments. Each Advocate produces a class-conditional representation with the goal of convincing the Judge that the input example belongs to their class, even when the input belongs to a different class. Applied to several different classification tasks, we show that advocacy learning can lead to small improvements in classification accuracy over an identical supervised baseline. Through a series of follow-up experiments, we analyze when and how such class-conditional representations improve discriminative performance. Though somewhat counter-intuitive, a framework in which subnetworks are trained to competitively provide evidence in support of their class shows promise, in many cases performing on par with standard learning approaches. This provides a foundation for further exploration into competition and class-conditional representations in supervised learning.
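The Advocate/Judge structure described in the abstract can be sketched in a few lines. The sketch below is a hypothetical, untrained toy with random weights, intended only to show the data flow: each of the N Advocates emits an attention map over the input in support of its own class, and the Judge classifies from the concatenated class-conditional arguments. None of the names (`advocate`, `judge`, the weight shapes) come from the paper; in the actual method both modules are neural networks trained jointly.

```python
import numpy as np

rng = np.random.default_rng(0)
N_CLASSES, INPUT_DIM = 3, 8

# Hypothetical toy weights standing in for trained networks.
advocate_W = rng.normal(size=(N_CLASSES, INPUT_DIM, INPUT_DIM))
judge_W = rng.normal(size=(N_CLASSES * INPUT_DIM, N_CLASSES))

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def advocate(x, c):
    """Advocate for class c: an attention map (sums to 1) over the input."""
    return softmax(advocate_W[c] @ x)

def judge(arguments):
    """Judge: predict class probabilities from all N arguments."""
    evidence = np.concatenate(arguments)   # shape (N_CLASSES * INPUT_DIM,)
    return softmax(evidence @ judge_W)

x = rng.normal(size=INPUT_DIM)
# Each Advocate highlights the parts of x that support its own class...
arguments = [advocate(x, c) * x for c in range(N_CLASSES)]
# ...and the Judge weighs the competing arguments to predict the label.
probs = judge(arguments)
```

In training, the competitive element comes from each Advocate being rewarded for convincing the Judge of its class label, while the Judge is trained on the true labels; the sketch above only shows the forward pass.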
Pages: 2315 - 2321 (7 pages)