A novel method to compute the weights of neural networks

Cited by: 10
Authors
Gao, Zhentao [1 ]
Chen, Yuanyuan [1 ]
Yi, Zhang [1 ]
Affiliations
[1] Sichuan Univ, Coll Comp Sci, Machine Intelligence Lab, Chengdu 610065, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Neural networks; Gradient free; Closed-form solution; White box models;
DOI
10.1016/j.neucom.2020.03.114
CLC Number
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Neural networks are the main strength of modern artificial intelligence; they have demonstrated revolutionary performance in a wide range of applications. In practice, the weights of neural networks are generally obtained indirectly using iterative training methods. Such methods are inefficient and problematic in many respects. Moreover, neural networks trained end-to-end by such methods are typical black box models that are hard to interpret. Thus, it would be significantly better if the weights of a neural network could be calculated directly. In this paper, we identified the key to calculating the weights of a neural network directly: assigning proper targets to the hidden units. Furthermore, once such targets are assigned, the neural network becomes a white box model that is easy to interpret. Accordingly, we propose a framework for solving the weights of a neural network and provide a sample implementation of the framework. The implementation was tested in various classification and regression experiments. Compared with neural networks trained using traditional methods, those constructed with solved weights had similar or better performance on many tasks, while remaining interpretable. Given the early stage of the proposed approach, many improvements are expected in future developments. (C) 2020 Elsevier B.V. All rights reserved.
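The core idea in the abstract, that once proper targets are assigned to the hidden units, the weights of each layer can be solved directly rather than learned by gradient descent, can be illustrated with a minimal sketch. This is not the paper's actual implementation: the hidden-unit targets here are produced by an arbitrary random projection (a placeholder for the paper's target-assignment step), and only the output weights are solved in closed form via linear least squares.

```python
# Hypothetical sketch of gradient-free weight solving (not the paper's exact
# method): given targets H for the hidden units, the output weights that
# minimize ||H @ W_out - y||^2 have a closed-form least-squares solution.
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data: 100 samples, 3 input features, 1 output.
X = rng.normal(size=(100, 3))
y = X[:, :1] ** 2 + X[:, 1:2]

# Step 1 (placeholder): assign targets to the 16 hidden units. Here we use a
# random projection through tanh; the paper's framework chooses these targets
# in a principled way.
W_hidden = rng.normal(size=(3, 16))
H = np.tanh(X @ W_hidden)

# Step 2: solve the output weights directly with the pseudo-inverse,
# with no iterative training involved.
W_out = np.linalg.pinv(H) @ y

# The solved weights fit the hidden targets as well as least squares allows.
residual = np.linalg.norm(H @ W_out - y)
print(W_out.shape, residual < np.linalg.norm(y))
```

Because every layer's weights come from an explicit linear problem against known targets, each unit's role can be read off directly, which is the sense in which the resulting network is a white box.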
Pages: 409-427
Number of pages: 19
Related Papers
50 records
  • [21] A new method in determining initial weights of feedforward neural networks for training enhancement
    Yam, YF
    Chow, TWS
    Leung, CT
    NEUROCOMPUTING, 1997, 16 (01) : 23 - 32
  • [22] A novel method for the evolution of artificial neural networks
    Dept. of Computer Engineering, University of Patras, Patras 26500, Hellas
    Adv. Intell. Syst. Comput. Sci., (35-40)
  • [23] A Novel Method for Function Smoothness in Neural Networks
    Lindqvist, Blerta
    IEEE ACCESS, 2022, 10 : 75354 - 75364
  • [24] DropWeak: A novel regularization method of neural networks
    El Korchi, Anas
    Ghanou, Youssf
    PROCEEDINGS OF THE FIRST INTERNATIONAL CONFERENCE ON INTELLIGENT COMPUTING IN DATA SCIENCES (ICDS2017), 2018, 127 : 102 - 108
  • [25] A novel cluster method in fuzzy neural networks
    Li, DQ
    Huang, SB
    2002 INTERNATIONAL CONFERENCE ON MACHINE LEARNING AND CYBERNETICS, VOLS 1-4, PROCEEDINGS, 2002, : 261 - 265
  • [26] An indirect variable weights method to compute fuzzy comprehensive evaluation values
    Zhonglin Chai
    Soft Computing, 2019, 23 : 12511 - 12519
  • [27] An indirect variable weights method to compute fuzzy comprehensive evaluation values
    Chai, Zhonglin
    SOFT COMPUTING, 2019, 23 (23) : 12511 - 12519
  • [28] Quantized Neural Networks: Training Neural Networks with Low Precision Weights and Activations
    Hubara, Itay
    Courbariaux, Matthieu
    Soudry, Daniel
    El-Yaniv, Ran
    Bengio, Yoshua
    JOURNAL OF MACHINE LEARNING RESEARCH, 2018, 18
  • [29] Some neural networks compute, others don't
    Piccinini, Gualtiero
    NEURAL NETWORKS, 2008, 21 (2-3) : 311 - 321
  • [30] Euclidean Contractivity of Neural Networks With Symmetric Weights
    Centorrino, Veronica
    Gokhale, Anand
    Davydov, Alexander
    Russo, Giovanni
    Bullo, Francesco
    IEEE CONTROL SYSTEMS LETTERS, 2023, 7 : 1724 - 1729