Using quasirandom weights in neural networks

Cited: 0
Authors
Anderson, PG [1 ]
Ge, M [1 ]
Raghavendra, S [1 ]
Lung, ML [1 ]
Affiliation
[1] Rochester Inst Technol, Dept Comp Sci & Imaging Sci, Rochester, NY 14623 USA
Keywords
neural networks; hidden layers; quasirandom weights
DOI
none available
Chinese Library Classification
TP [Automation & Computer Technology]
Discipline code
0812
Abstract
We present a novel training algorithm for a feed-forward neural network with a single hidden layer of nodes (i.e., two layers of connection weights). Our algorithm can train networks on hard problems, such as the classic two-spirals problem. The weights in the first layer are determined using a quasirandom number generator. These weights are frozen: they are never modified during the training process. The second layer of weights is trained as a simple linear discriminator using methods such as the pseudoinverse, with possible iterations. We also study the problem of reducing the hidden layer: pruning low-weight nodes and a genetic-algorithm search for good subsets.
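The two-step scheme in the abstract can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's implementation: the Halton bases, the weight scale, the hidden-layer size, the tanh activation, and the XOR toy task are all assumptions (the paper targets harder tasks such as two-spirals). Only the overall structure — quasirandom frozen first layer, pseudoinverse solve for the second — follows the abstract.

```python
import numpy as np

def halton(index, base):
    """One element of the Halton low-discrepancy (quasirandom) sequence."""
    f, r = 1.0, 0.0
    while index > 0:
        f /= base
        r += f * (index % base)
        index //= base
    return r

def quasirandom_weights(n_hidden, n_in, bases=(2, 3, 5, 7, 11, 13), scale=2.0):
    """First-layer weights from a Halton sequence, mapped to [-scale, scale].
    These weights are frozen and never updated during training."""
    W = np.array([[halton(i + 1, bases[j % len(bases)])
                   for j in range(n_in)] for i in range(n_hidden)])
    return scale * (2.0 * W - 1.0)

# Toy data: XOR (an illustrative stand-in for the paper's harder tasks).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0], dtype=float)

W1 = quasirandom_weights(n_hidden=20, n_in=2)   # frozen quasirandom layer
H = np.tanh(X @ W1.T)                           # hidden activations
H = np.hstack([H, np.ones((len(X), 1))])        # append a bias column
w2 = np.linalg.pinv(H) @ y                      # pseudoinverse solve, second layer
pred = (H @ w2 > 0.5).astype(int)               # linear discriminator output
```

Because only the second layer is learned, "training" reduces to one linear least-squares solve; the paper's optional iterations and hidden-layer pruning would operate on `H` and `w2` from this point.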
Pages: 61-71
Page count: 11
Related papers
50 items total
  • [31] Structural damage detection using the optimal weights of the approximating artificial neural networks
    Hung, SL
    Kao, CY
    EARTHQUAKE ENGINEERING & STRUCTURAL DYNAMICS, 2002, 31 (02): 217-234
  • [32] Leukocyte Image Segmentation Using Feed Forward Neural Networks with Random Weights
    Cao, Feilong
    Lu, Jing
    Chu, Jianjun
    Zhou, Zhenghua
    Zhao, Jianwei
    Chen, Guoqiang
    2015 11TH INTERNATIONAL CONFERENCE ON NATURAL COMPUTATION (ICNC), 2015: 736-742
  • [33] Euclidean Contractivity of Neural Networks With Symmetric Weights
    Centorrino, Veronica
    Gokhale, Anand
    Davydov, Alexander
    Russo, Giovanni
    Bullo, Francesco
    IEEE CONTROL SYSTEMS LETTERS, 2023, 7: 1724-1729
  • [34] Routes to chaos in neural networks with random weights
    Albers, DJ
    Sprott, JC
    Dechert, WD
    INTERNATIONAL JOURNAL OF BIFURCATION AND CHAOS, 1998, 8 (07): 1463-1478
  • [35] Learning Neural Networks without Lazy Weights
    Lee, Dong-gi
    Cho, Junhee
    Kim, Myungjun
    Park, Sunghong
    Shin, Hyunjung
    2022 IEEE INTERNATIONAL CONFERENCE ON BIG DATA AND SMART COMPUTING (IEEE BIGCOMP 2022), 2022: 82-87
  • [36] Determination of weights for relaxation recurrent neural networks
    Serpen, G
    Livingston, DL
    NEUROCOMPUTING, 2000, 34: 145-168
  • [37] Neural Networks Between Integer and Rational Weights
    Sima, Jiri
    2017 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2017: 154-161
  • [38] Hardness of Learning Neural Networks with Natural Weights
    Daniely, Amit
    Vardi, Gal
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 33, NEURIPS 2020, 2020, 33
  • [39] NEURAL NETWORKS WITH UNIPOLAR WEIGHTS AND NORMALIZED THRESHOLDS
    BRODKA, JS
    MACUKOW, B
    OPTICAL COMPUTING, 1995, 139: 463-466
  • [40] Neural Networks with Superexpressive Activations and Integer Weights
    Beknazaryan, Aleksandr
    INTELLIGENT COMPUTING, VOL 2, 2022, 507: 445-451