Learning from Few Samples with Memory Network

Cited by: 0
Authors
Shufei Zhang
Kaizhu Huang
Rui Zhang
Amir Hussain
Affiliations
[1] Xi'an Jiaotong-Liverpool University, Department of Electrical and Electronics Engineering, SIP
[2] Xi'an Jiaotong-Liverpool University, Department of Mathematical Sciences, SIP
[3] University of Stirling, Computing Science and Mathematics
Source
Cognitive Computation | 2018, Vol. 10
Keywords
Memory; Multi-layer perceptron; Neural network; Recognition; Prior knowledge
DOI
Not available
Abstract
Neural networks (NN) have achieved great successes in pattern recognition and machine learning. However, the success of a NN usually relies on the provision of a sufficiently large number of data samples as training data. When fed with a limited data set, a NN's performance may be degraded significantly. In this paper, a novel NN structure, called a memory network, is proposed. It is inspired by the cognitive mechanism of human beings, who can learn effectively even from limited data. By taking advantage of the memory of previous samples, the new model achieves a remarkable improvement in performance when trained on limited data. The memory network is demonstrated here using the multi-layer perceptron (MLP) as a base model; however, it would be straightforward to extend the idea to other neural networks, e.g., convolutional neural networks (CNN). In this paper, the memory network structure is detailed, the training algorithm is presented, and a series of experiments are conducted to validate the proposed framework. Experimental results show that the proposed model outperforms traditional MLP-based models as well as other competitive algorithms on two real benchmark data sets.
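The abstract does not specify the architecture or training algorithm. As a purely illustrative sketch of the general idea of augmenting an MLP's hidden layer with a memory accumulated from previously seen samples, the following toy NumPy code may help; every name, the blending weight, and the moving-average update rule here are assumptions for illustration, not the paper's actual method:

```python
import numpy as np

rng = np.random.default_rng(0)

class MemoryMLP:
    """Toy MLP whose hidden activation is blended with a running memory
    of hidden states from previously seen samples (illustrative only)."""

    def __init__(self, n_in, n_hidden, n_out, alpha=0.5):
        self.W1 = rng.normal(0, 0.1, (n_in, n_hidden))
        self.W2 = rng.normal(0, 0.1, (n_hidden, n_out))
        self.memory = np.zeros(n_hidden)  # running memory of past hidden states
        self.alpha = alpha                # blend weight (assumed hyperparameter)

    def forward(self, x, update_memory=True):
        h = np.tanh(x @ self.W1)
        # Inject "prior knowledge" from earlier samples into the current pass.
        h_aug = (1 - self.alpha) * h + self.alpha * self.memory
        if update_memory:
            # Exponential moving average over samples (assumed update rule).
            self.memory = 0.9 * self.memory + 0.1 * h
        return h_aug @ self.W2

net = MemoryMLP(n_in=4, n_hidden=8, n_out=3)
out = net.forward(rng.normal(size=4))
print(out.shape)  # (3,)
```

With few training samples, the memory term acts as a shared representation that regularizes each individual forward pass; the actual mechanism in the paper may differ substantially.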
Pages: 15–22
Page count: 7
Related Papers
50 entries in total
  • [1] Learning from Few Samples with Memory Network
    Zhang, Shufei
    Huang, Kaizhu
    Zhang, Rui
    Hussain, Amir
    [J]. COGNITIVE COMPUTATION, 2018, 10 (01) : 15 - 22
  • [2] Learning from Few Samples with Memory Network
    Zhang, Shufei
    Huang, Kaizhu
    [J]. NEURAL INFORMATION PROCESSING, ICONIP 2016, PT I, 2016, 9947 : 606 - 614
  • [3] A survey on machine learning from few samples
    Lu, Jiang
    Gong, Pinghua
    Ye, Jieping
    Zhang, Jianwei
    Zhang, Changshui
    [J]. PATTERN RECOGNITION, 2023, 139
  • [4] Learning Convolutional Neural Networks From Few Samples
    Wagner, Raimar
    Thom, Markus
    Schweiger, Roland
    Palm, Guenther
    Rothermel, Albrecht
    [J]. 2013 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2013,
  • [5] Memory-Augmented Relation Network for Few-Shot Learning
    He, Jun
    Hong, Richang
    Liu, Xueliang
    Xu, Mingliang
    Zha, Zheng-Jun
    Wang, Meng
    [J]. MM '20: PROCEEDINGS OF THE 28TH ACM INTERNATIONAL CONFERENCE ON MULTIMEDIA, 2020, : 1236 - 1244
  • [6] MANNWARE: A Malware Classification Approach with a Few Samples Using a Memory Augmented Neural Network
    Tran, Kien
    Sato, Hiroshi
    Kubo, Masao
    [J]. INFORMATION, 2020, 11 (01)
  • [7] Learning People Detection Models from Few Training Samples
    Pishchulin, Leonid
    Jain, Arjun
    Wojek, Christian
    Andriluka, Mykhaylo
    Thormaehlen, Thorsten
    Schiele, Bernt
    [J]. 2011 IEEE CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2011, : 1473 - 1480
  • [8] Transfer Learning and Few-Shot Learning Based Deep Neural Network Models for Underwater Sonar Image Classification With a Few Samples
    Chungath, Tincy Thomas
    Nambiar, Athira M. M.
    Mittal, Anurag
    [J]. IEEE JOURNAL OF OCEANIC ENGINEERING, 2024, 49 (01) : 294 - 310
  • [9] Dictionary Learning With Few Samples and Matrix Concentration
    Luh, Kyle
    Vu, Van
    [J]. IEEE TRANSACTIONS ON INFORMATION THEORY, 2016, 62 (03) : 1516 - 1527
  • [10] PSOM network: Learning with few examples
    Walter, JA
    [J]. 1998 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION, VOLS 1-4, 1998, : 2054 - 2059