Efficient training and design of photonic neural network through neuroevolution

Cited by: 55
Authors
Zhang, Tian [1 ]
Wang, Jia [1 ]
Dan, Yihang [1 ]
Lanqiu, Yuxiang [1 ]
Dai, Jian [1 ]
Han, Xu [2 ]
Sun, Xiaojuan [3 ]
Xu, Kun [1 ]
Affiliations
[1] Beijing Univ Posts & Telecommun, State Key Lab Informat Photon & Opt Commun, Beijing 100876, Peoples R China
[2] Huawei Technol Co Ltd, Shenzhen 518129, Guangdong, Peoples R China
[3] Beijing Univ Posts & Telecommun, Sch Sci, Beijing 100876, Peoples R China
Source
OPTICS EXPRESS | 2019, Vol. 27, Issue 26
Funding
National Natural Science Foundation of China;
Keywords
GENETIC ALGORITHM; INVERSE DESIGN; OPTIMIZATION;
DOI
10.1364/OE.27.037150
Chinese Library Classification
O43 [Optics];
Discipline Codes
070207; 0803;
Abstract
Recently, optical neural networks (ONNs) integrated into photonic chips have received extensive attention because they are expected to implement the same pattern-recognition tasks as electronic platforms, but with higher efficiency and lower power consumption. However, there are no efficient learning algorithms for training ONNs on an on-chip integrated system. In this article, we propose a novel learning strategy based on neuroevolution to design and train ONNs. Two typical neuroevolution algorithms are used to determine the hyper-parameters of ONNs and to optimize the weights (phase shifters) in the connections. To demonstrate the effectiveness of the training algorithms, the trained ONNs are applied to classification tasks on an iris plants dataset, a wine recognition dataset, and modulation format recognition. The calculated results demonstrate that the accuracy and stability of the neuroevolution-based training algorithms are competitive with traditional learning algorithms. In comparison to previous works, we introduce an efficient training method for ONNs and demonstrate their broad application prospects in pattern recognition, reinforcement learning, and so on. (C) 2019 Optical Society of America under the terms of the OSA Open Access Publishing Agreement
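The core idea in the abstract — evolving phase-shifter settings instead of backpropagating through the photonic hardware — can be illustrated with a minimal sketch. This is not the authors' implementation: it assumes an idealized, lossless single Mach-Zehnder interferometer whose port-1 output power follows the standard sin^2 transfer function of its internal phase shift, and it uses a toy genetic algorithm (truncation selection plus Gaussian mutation) to evolve the phase toward a target routing.

```python
import math
import random

def mzi_port1_power(theta):
    # Idealized lossless MZI: fraction of input power emerging at output
    # port 1 for an internal phase shift theta (sin^2 transfer function).
    return math.sin(theta / 2) ** 2

def evolve_phase(target=1.0, pop_size=20, generations=50, seed=0):
    """Toy genetic algorithm: evolve a phase-shifter setting so the MZI
    routes `target` fraction of the input power to port 1."""
    rng = random.Random(seed)
    # Random initial population of phase settings in [0, 2*pi).
    pop = [rng.uniform(0, 2 * math.pi) for _ in range(pop_size)]
    for _ in range(generations):
        # Fitness: squared error against the routing target (lower is better).
        pop.sort(key=lambda th: (mzi_port1_power(th) - target) ** 2)
        parents = pop[: pop_size // 2]  # truncation selection, elitist
        children = [
            min(max(p + rng.gauss(0, 0.1), 0.0), 2 * math.pi)  # Gaussian mutation
            for p in parents
        ]
        pop = parents + children
    pop.sort(key=lambda th: (mzi_port1_power(th) - target) ** 2)
    return pop[0]

best = evolve_phase()
print(best, mzi_port1_power(best))
```

In the paper's setting, the "genome" would be the full vector of phase shifts in the interferometer mesh and the fitness a classification loss; because the fitness is only ever *evaluated*, not differentiated, the same loop works when the forward pass runs on the chip itself.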
Pages: 37150-37163
Page count: 14
Related Papers
50 records
  • [1] Efficient self-learning of photonic neural network through neuroevolution
    Wang, Jia
    Zhang, Tian
    Dan, Yihang
    Hu, Furong
    Lanqiu, Yuxiang
    Dai, Jian
    Xu, Kun
    [J]. 2019 ASIA COMMUNICATIONS AND PHOTONICS CONFERENCE (ACP), 2019,
  • [2] Neuroevolution Guided Hybrid Spiking Neural Network Training
    Lu, Sen
    Sengupta, Abhronil
    [J]. FRONTIERS IN NEUROSCIENCE, 2022, 16
  • [3] Designing neural networks through neuroevolution
    Stanley, Kenneth O.
    Clune, Jeff
    Lehman, Joel
    Miikkulainen, Risto
    [J]. NATURE MACHINE INTELLIGENCE, 2019, 1 (01) : 24 - 35
  • [5] Hybrid Neural Network for Efficient Training
    Hossain, Md. Billal
    Islam, Sayeed
    Zhumur, Noor-e-Hafsa
    Khanam, Najmoon Nahar
    Khan, Md. Imran
    Kabir, Md. Ahasan
    [J]. 2017 INTERNATIONAL CONFERENCE ON ELECTRICAL, COMPUTER AND COMMUNICATION ENGINEERING (ECCE), 2017, : 528 - 532
  • [6] A Review on Convolutional Neural Network Encodings for Neuroevolution
    Vargas-Hakim, Gustavo-Adolfo
    Mezura-Montes, Efren
    Acosta-Mesa, Hector-Gabriel
    [J]. IEEE TRANSACTIONS ON EVOLUTIONARY COMPUTATION, 2022, 26 (01) : 12 - 27
  • [7] Training of Photonic Neural Networks through In Situ Backpropagation
    Hughes, Tyler W.
    Minkov, Momchil
    Williamson, Ian A. D.
    Shi, Yu
    Fan, Shanhui
    [J]. 2019 CONFERENCE ON LASERS AND ELECTRO-OPTICS (CLEO), 2019,
  • [8] Data Throughput for Efficient Photonic Neural Network Accelerators
    Schwartz, Russell L. T.
    Jahannia, Belal
    Peserico, Nicola
    Dalir, Hamed
    Sorger, Volker J.
    [J]. 2024 IEEE SILICON PHOTONICS CONFERENCE, SIPHOTONICS, 2024,
  • [9] Memory Efficient Deep Neural Network Training
    Shilova, Alena
    [J]. EURO-PAR 2021: PARALLEL PROCESSING WORKSHOPS, 2022, 13098 : 515 - 519
  • [10] Mutual Information-Based Neural Network Distillation for Improving Photonic Neural Network Training
    Chariton, Alexandros
    Passalis, Nikolaos
    Pleros, Nikos
    Tefas, Anastasios
    [J]. NEURAL PROCESSING LETTERS, 2023, 55 (07) : 8589 - 8604