Backpropagation Training in Adaptive Quantum Networks

Cited by: 7
|
Authors
Altman, Christopher [2 ]
Zapatrin, Roman R. [1 ]
Affiliations
[1] State Russian Museum, Dept Informat, St Petersburg 191186, Russia
[2] Delft Univ Technol, Kavli Inst Nanosci, NL-2600 AA Delft, Netherlands
Keywords
Neural networks; Quantum topology; Adaptive learning
DOI
10.1007/s10773-009-0103-1
Chinese Library Classification
O4 [Physics]
Discipline Classification Code
0702
Abstract
We introduce a robust, error-tolerant adaptive training algorithm for generalized learning paradigms in high-dimensional superposed quantum networks, or adaptive quantum networks. The formalized procedure applies standard backpropagation training across a coherent ensemble of discrete topological configurations of individual neural networks, each of which is formally merged into an appropriate linear superposition within a predefined, decoherence-free subspace. Quantum parallelism facilitates simultaneous training and revision of the system within this coherent state space, resulting in accelerated convergence to a stable network attractor under consequent iteration of the implemented backpropagation algorithm. Parallel evolution of linearly superposed networks incorporating backpropagation training provides quantitative, numerical indications for optimization of both single-neuron activation functions and optimal reconfiguration of whole-network quantum structure.
Pages: 2991-2997
Number of pages: 7
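
The abstract describes running conventional backpropagation over an ensemble of distinct network topologies whose outputs are combined in superposition. As a rough classical analogy only, and not the authors' quantum formalism, the Python sketch below trains several small networks with different hidden widths by standard backpropagation and mixes their predictions with normalized weights. All names (init_net, backprop_step, hidden_sizes, mix), the toy data, and the re-weighting rule are illustrative assumptions, not details taken from the paper.

# Classical toy sketch (not the authors' quantum algorithm): standard backprop
# applied to an ensemble of networks with different topologies, whose outputs
# are combined with normalized "amplitude-like" mixture weights.
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data: y = sin(x)
X = rng.uniform(-np.pi, np.pi, size=(256, 1))
Y = np.sin(X)

def init_net(n_hidden):
    """One-hidden-layer network; the hidden width stands in for a
    discrete topological configuration."""
    return {
        "W1": rng.normal(0, 0.5, (1, n_hidden)),
        "b1": np.zeros(n_hidden),
        "W2": rng.normal(0, 0.5, (n_hidden, 1)),
        "b2": np.zeros(1),
    }

def forward(net, X):
    h = np.tanh(X @ net["W1"] + net["b1"])
    return h, h @ net["W2"] + net["b2"]

def backprop_step(net, X, Y, lr=0.05):
    """Plain gradient-descent backpropagation for the mean squared error."""
    h, out = forward(net, X)
    err = out - Y                                # dL/d(out), up to a constant
    grad_W2 = h.T @ err / len(X)
    grad_b2 = err.mean(axis=0)
    dh = (err @ net["W2"].T) * (1 - h ** 2)      # tanh'(z) = 1 - tanh(z)^2
    grad_W1 = X.T @ dh / len(X)
    grad_b1 = dh.mean(axis=0)
    net["W1"] -= lr * grad_W1
    net["b1"] -= lr * grad_b1
    net["W2"] -= lr * grad_W2
    net["b2"] -= lr * grad_b2
    return float((err ** 2).mean())

# Ensemble of topologies, trained in parallel and mixed with normalized weights.
hidden_sizes = [4, 8, 16]
ensemble = [init_net(n) for n in hidden_sizes]
mix = np.ones(len(ensemble)) / len(ensemble)     # "amplitude-like" weights

for epoch in range(2000):
    losses = np.array([backprop_step(net, X, Y) for net in ensemble])
    # Re-weight toward better-performing configurations (softmax on -loss).
    mix = np.exp(-5.0 * losses)
    mix /= mix.sum()

combined = sum(w * forward(net, X)[1] for w, net in zip(mix, ensemble))
print("per-topology losses:", np.round(losses, 4))
print("mixture weights:    ", np.round(mix, 3))
print("combined MSE:       ", round(float(((combined - Y) ** 2).mean()), 4))

The mixture weights here merely stand in for the amplitudes of a coherent superposition; a faithful treatment would require the decoherence-free-subspace construction developed in the paper itself.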