Enhancing network modularity to mitigate catastrophic forgetting

Cited by: 2
Authors
Chen, Lu [1 ]
Murata, Masayuki [2 ]
Institutions
[1] Kyoto Inst Technol, Dept Informat & Human Sci, Kyoto, Japan
[2] Osaka Univ, Dept Informat Sci & Technol, Osaka, Japan
Keywords
Modularity; Catastrophic forgetting; Neural network
DOI
10.1007/s41109-020-00332-9
CLC number
TP301 [Theory, Methods]
Subject classification code
081202
Abstract
Catastrophic forgetting occurs when a learning algorithm changes connections used to encode previously acquired skills in order to learn a new skill. A modular approach to neural networks has recently been considered necessary as learning problems grow in scale and complexity, since it should intuitively reduce learning interference by separating functionality into physically distinct network modules. However, such an approach is difficult in practice because it requires expert design and trial and error. Kashtan et al. found that evolution under an environment that changes in a modular fashion leads to the spontaneous emergence of a modular network structure. In this paper, we aim to solve the reverse problem of the modularly varying goal (MVG) approach, namely to obtain a highly modular structure that can mitigate catastrophic forgetting, so that the approach also applies to realistic data. First, we confirm that a configuration with a highly modular structure exists by applying an MVG to a realistic dataset and verify that this neural network can mitigate catastrophic forgetting. Next, we solve the reverse problem: we propose a method that obtains a highly modular structure able to mitigate catastrophic forgetting. Since the MVG-obtained neural network keeps the intra-module weight elements relatively stable while leaving the inter-module elements relatively variable, we propose restricting the inter-module weight elements so that they remain relatively variable compared with the intra-module ones. The results show that the obtained neural network has a highly modular structure and learns an unlearned goal faster than a network trained without this method.
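As a rough illustration of the restriction described in the abstract, the following is a minimal sketch, not the authors' implementation, of penalising connections that cross module boundaries more strongly than those inside a module. The network shape, the module labels, the helper names (TwoLayerNet, inter_module_mask, inter_module_penalty), and the penalty strength are all assumptions made for illustration.

import torch
import torch.nn as nn

class TwoLayerNet(nn.Module):
    """Hypothetical small fully connected network used only for this sketch."""
    def __init__(self, n_in=8, n_hidden=8, n_out=2):
        super().__init__()
        self.fc1 = nn.Linear(n_in, n_hidden)
        self.fc2 = nn.Linear(n_hidden, n_out)

    def forward(self, x):
        return self.fc2(torch.relu(self.fc1(x)))

def inter_module_mask(out_modules, in_modules):
    """Return a mask that is 1 where a weight connects two different modules, 0 inside a module."""
    out_m = torch.as_tensor(out_modules).unsqueeze(1)  # shape (n_out, 1)
    in_m = torch.as_tensor(in_modules).unsqueeze(0)    # shape (1, n_in)
    return (out_m != in_m).float()

def inter_module_penalty(weight, mask, strength=1e-3):
    # L1 penalty applied only to inter-module weight elements.
    return strength * (weight * mask).abs().sum()

# Assumed module labels: input and hidden units split into two modules of four units each.
in_modules = [0, 0, 0, 0, 1, 1, 1, 1]
hid_modules = [0, 0, 0, 0, 1, 1, 1, 1]

net = TwoLayerNet()
mask1 = inter_module_mask(hid_modules, in_modules)  # mask for fc1.weight (rows = hidden, cols = input)

# One illustrative training step on random data.
x = torch.randn(32, 8)
y = torch.randint(0, 2, (32,))
opt = torch.optim.SGD(net.parameters(), lr=0.1)
loss = nn.functional.cross_entropy(net(x), y) + inter_module_penalty(net.fc1.weight, mask1)
opt.zero_grad()
loss.backward()
opt.step()

In this sketch the extra restriction falls only on cross-module connections, which mirrors the property the abstract attributes to MVG-obtained networks: intra-module weights stay comparatively stable, so learning a new goal mainly reshapes the inter-module wiring.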
Pages: 24