Learning from Few Samples with Memory Network

Cited: 1
Authors
Zhang, Shufei [1 ]
Huang, Kaizhu [1 ]
Affiliation
[1] Xian Jiaotong Liverpool Univ, SIP, Dept EEE, Suzhou 215123, Peoples R China
Keywords
Memory; Multi-layer perceptron; STYLE
DOI
10.1007/978-3-319-46687-3_67
CLC Number
TP18 [Theory of Artificial Intelligence]
Discipline Code
081104; 0812; 0835; 1405
Abstract
Neural Networks (NNs) have achieved great success in pattern recognition and machine learning. However, this success usually relies on a sufficiently large number of training samples; when fed with limited data, an NN's performance may degrade significantly. In this paper, we introduce a novel neural network, called the Memory Network, which learns better from limited data. By taking advantage of memory accumulated from previous samples, the new model achieves remarkable performance improvements on limited data. We demonstrate the Memory Network with a Multi-Layer Perceptron (MLP), though it is straightforward to extend the idea to other neural networks, e.g., Convolutional Neural Networks (CNNs). We detail the network structure, present the training algorithm, and conduct a series of experiments to validate the proposed framework. Experimental results show that our model outperforms the traditional MLP and other competitive algorithms on two real data sets.
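The abstract describes the model only at a high level. As a minimal, hypothetical sketch (not the authors' published architecture), the PyTorch code below shows one way a learnable memory, read by soft similarity-based addressing, could augment an MLP's hidden layer; the class name MemoryMLP, the number of memory slots, and the attention-style read are all assumptions introduced here for illustration.

import torch
import torch.nn as nn
import torch.nn.functional as F

class MemoryMLP(nn.Module):
    """Hypothetical sketch: an MLP hidden layer augmented with a learnable memory
    read by soft, similarity-based addressing. Illustration only; not the
    architecture published in the paper."""
    def __init__(self, in_dim, hid_dim, n_classes, mem_slots=128):
        super().__init__()
        self.encoder = nn.Linear(in_dim, hid_dim)             # standard MLP hidden layer
        self.memory = nn.Parameter(0.01 * torch.randn(mem_slots, hid_dim))  # learnable memory slots
        self.classifier = nn.Linear(2 * hid_dim, n_classes)   # classify from [hidden, memory read]

    def forward(self, x):
        h = torch.tanh(self.encoder(x))                # hidden representation of the input
        attn = F.softmax(h @ self.memory.t(), dim=-1)  # address memory slots by similarity
        read = attn @ self.memory                      # weighted read from memory
        return self.classifier(torch.cat([h, read], dim=-1))

# Example usage with made-up dimensions:
# model = MemoryMLP(in_dim=784, hid_dim=64, n_classes=10)
# logits = model(torch.randn(32, 784))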
Pages: 606-614
Number of pages: 9
Related Papers
50 records in total
  • [1] Learning from Few Samples with Memory Network
    Zhang, Shufei
    Huang, Kaizhu
    Zhang, Rui
    Hussain, Amir
    [J]. COGNITIVE COMPUTATION, 2018, 10 (01) : 15 - 22
  • [2] A survey on machine learning from few samples
    Lu, Jiang
    Gong, Pinghua
    Ye, Jieping
    Zhang, Jianwei
    Zhang, Changshui
    [J]. PATTERN RECOGNITION, 2023, 139
  • [3] Learning Convolutional Neural Networks From Few Samples
    Wagner, Raimar
    Thom, Markus
    Schweiger, Roland
    Palm, Guenther
    Rothermel, Albrecht
    [J]. 2013 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2013,
  • [4] Memory-Augmented Relation Network for Few-Shot Learning
    He, Jun
    Hong, Richang
    Liu, Xueliang
    Xu, Mingliang
    Zha, Zheng-Jun
    Wang, Meng
    [J]. MM '20: PROCEEDINGS OF THE 28TH ACM INTERNATIONAL CONFERENCE ON MULTIMEDIA, 2020, : 1236 - 1244
  • [5] MANNWARE: A Malware Classification Approach with a Few Samples Using a Memory Augmented Neural Network
    Tran, Kien
    Sato, Hiroshi
    Kubo, Masao
    [J]. INFORMATION, 2020, 11 (01)
  • [6] Learning People Detection Models from Few Training Samples
    Pishchulin, Leonid
    Jain, Arjun
    Wojek, Christian
    Andriluka, Mykhaylo
    Thormaehlen, Thorsten
    Schiele, Bernt
    [J]. 2011 IEEE CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2011, : 1473 - 1480
  • [7] Transfer Learning and Few-Shot Learning Based Deep Neural Network Models for Underwater Sonar Image Classification With a Few Samples
    Chungath, Tincy Thomas
    Nambiar, Athira M. M.
    Mittal, Anurag
    [J]. IEEE JOURNAL OF OCEANIC ENGINEERING, 2024, 49 (01) : 294 - 310
  • [8] Dictionary Learning With Few Samples and Matrix Concentration
    Luh, Kyle
    Vu, Van
    [J]. IEEE TRANSACTIONS ON INFORMATION THEORY, 2016, 62 (03) : 1516 - 1527
  • [9] PSOM network: Learning with few examples
    Walter, JA
    [J]. 1998 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION, VOLS 1-4, 1998, : 2054 - 2059