Stochastic Deep Networks with Linear Competing Units for Model-Agnostic Meta-Learning

Cited by: 0
Authors
Kalais, Konstantinos [1 ]
Chatzis, Sotirios [1 ]
Affiliations
[1] Cyprus Univ Technol, Dept Elect Eng Comp Eng & Informat, Limassol, Cyprus
Keywords: (none listed)
DOI: (not available)
CLC number: TP18 [Artificial Intelligence Theory]
Discipline codes: 081104; 0812; 0835; 1405
Abstract
This work addresses meta-learning (ML) by considering deep networks with stochastic local winner-takes-all (LWTA) activations. Units of this type yield sparse representations at each model layer, as the units are organized into blocks in which only one unit generates a non-zero output. The main operating principle of the introduced units relies on stochastic arguments: the network performs posterior sampling over the competing units of each block to select the winner. The proposed networks are therefore explicitly designed to extract input data representations of a sparse, stochastic nature, as opposed to the currently standard deterministic representation paradigm. Our approach yields state-of-the-art predictive accuracy on few-shot image classification and regression experiments, as well as reduced predictive error in an active learning setting; these improvements come at an immensely reduced computational cost. Code is available at: https://github.com/Kkalais/StochLWTA-ML
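The stochastic LWTA mechanism described in the abstract can be sketched as follows. This is an illustrative NumPy toy, not the authors' implementation (see the linked repository for that): linear pre-activations are grouped into blocks of competing units, a single winner per block is sampled from a softmax over the block's units, and all losing units output zero, producing a sparse stochastic representation.

```python
import numpy as np

rng = np.random.default_rng(0)

def stochastic_lwta(x, W, num_blocks):
    """Stochastic local winner-takes-all layer (illustrative sketch).

    Pre-activations are split into blocks of competing units; within
    each block one winner is sampled from a softmax over the block,
    and the losers are zeroed out, giving a sparse stochastic output.
    """
    h = x @ W                                    # linear pre-activations
    h = h.reshape(num_blocks, -1)                # (blocks, units_per_block)
    logits = h - h.max(axis=1, keepdims=True)    # numerical stability
    probs = np.exp(logits)
    probs /= probs.sum(axis=1, keepdims=True)    # softmax within each block
    winners = np.array([rng.choice(p.size, p=p) for p in probs])
    mask = np.zeros_like(h)
    mask[np.arange(num_blocks), winners] = 1.0   # one-hot winner per block
    return (h * mask).reshape(-1)                # sparse representation

x = rng.standard_normal(8)
W = rng.standard_normal((8, 12))
y = stochastic_lwta(x, W, num_blocks=4)          # 4 blocks of 3 competing units
```

At most one unit per block is non-zero, so `y` has at most 4 active entries out of 12; repeated calls sample different winners, which is the stochastic behavior the paper exploits. (In practice, a differentiable relaxation such as Gumbel-softmax is typically used so that winner sampling admits gradient-based training.)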
Pages: 10586-10597 (12 pages)