Cascading photonic reservoirs with deep neural networks increases computational performance

Cited: 0
|
Authors
Bauwens, Ian [1 ,2 ]
Van der Sande, Guy [1 ]
Bienstman, Peter [2 ]
Verschaffelt, Guy [1 ]
Affiliations
[1] Vrije Univ Brussel, Appl Phys Res Grp, Pl Laan 2, B-1050 Brussels, Belgium
[2] Univ Ghent, Dept Informat Technol, Photon Res Grp, IMEC, Technol Pk Zwijnaarde 126, B-9052 Ghent, Belgium
Source
MACHINE LEARNING IN PHOTONICS | 2024 / Vol. 13017
Keywords
Preprocessor; deep neural network; delay-based reservoir computing; machine learning; semiconductor laser; feedback
DOI
10.1117/12.3017209
CLC number
TP18 [Artificial intelligence theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Deep neural networks (DNNs) have been successfully applied to complex problems such as pattern recognition in big-data analysis. To achieve good computational performance, these networks are often designed with a large number of trainable parameters, which makes them energy-intensive and time-consuming to train. In this work, we propose to preprocess the input data with a photonic reservoir instead of injecting it directly into the DNN. A photonic reservoir consists of a network of many randomly connected nodes whose connections do not need to be trained. It acts as an additional layer in front of the deep neural network and transforms the input data into a state in a higher-dimensional state space. This allows us to reduce both the size of the DNN and the amount of training it requires. We test this approach using numerical simulations, which show that employing such a photonic reservoir as a preprocessor improves the performance of the DNN, yielding a lower test error on the one-step-ahead prediction task for the Santa Fe time series, a task on which the stand-alone DNN performs poorly and produces a high test error. As we also discuss in detail in [Bauwens et al., Frontiers in Physics 10, 1051941 (2022)], we conclude that photonic reservoirs are well suited as physical preprocessors for deep neural networks in time-dependent tasks, owing to their fast computation times and low energy consumption.
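The reservoir-as-preprocessor idea from the abstract can be summarized in a short software sketch. The snippet below is an illustrative stand-in only, not the paper's method: it uses a classical echo-state-network reservoir in NumPy (fixed random weights, tanh nodes) in place of the simulated semiconductor-laser reservoir, a logistic-map series in place of the Santa Fe laser data, and a ridge-regression readout in place of the trained DNN; all parameter values are assumptions.

import numpy as np

rng = np.random.default_rng(42)

# Fixed, untrained reservoir (software stand-in for the photonic reservoir).
N = 200                                   # number of reservoir nodes (illustrative)
W_in = rng.uniform(-0.5, 0.5, (N, 1))     # fixed random input weights
W = rng.normal(0.0, 1.0, (N, N))          # fixed random internal connections
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # scale spectral radius below 1

def reservoir_states(u, leak=0.3):
    """Expand a scalar time series into N-dimensional reservoir states."""
    x = np.zeros(N)
    states = np.empty((len(u), N))
    for t, u_t in enumerate(u):
        x = (1 - leak) * x + leak * np.tanh(W @ x + W_in[:, 0] * u_t)
        states[t] = x
    return states

# Toy chaotic input (placeholder for the Santa Fe laser time series).
u = np.empty(3000)
u[0] = 0.4
for t in range(len(u) - 1):
    u[t + 1] = 3.9 * u[t] * (1.0 - u[t])  # logistic map

X = reservoir_states(u[:-1])              # reservoir states become the readout's input
y = u[1:]                                 # one-step-ahead prediction target

# Trainable readout. The paper trains a (smaller) DNN on the reservoir
# states; a ridge-regression readout is used here to keep the sketch short.
split = 2000
X_tr, X_te, y_tr, y_te = X[:split], X[split:], y[:split], y[split:]
ridge = 1e-6
W_out = np.linalg.solve(X_tr.T @ X_tr + ridge * np.eye(N), X_tr.T @ y_tr)
nmse = np.mean((X_te @ W_out - y_te) ** 2) / np.var(y_te)
print(f"test NMSE: {nmse:.3e}")

In the cascaded setup described above, it is these reservoir states, rather than the raw series, that are fed to the (now smaller) trainable network; because the reservoir weights stay fixed, only the readout stage needs training.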
Pages: 5
Related papers
50 records in total
  • [1] Using photonic reservoirs as preprocessors for deep neural networks
    Bauwens, Ian
Van der Sande, Guy
    Bienstman, Peter
    Verschaffelt, Guy
    FRONTIERS IN PHYSICS, 2022, 10
  • [2] Learning with Deep Photonic Neural Networks
    Leelar, Bhawani Shankar
    Shivaleela, E. S.
    Srinivas, T.
    2017 IEEE WORKSHOP ON RECENT ADVANCES IN PHOTONICS (WRAP), 2017
  • [3] The Incoherence of Deep Isotropic Neural Networks Increases Their Performance in Image Classification
    Feng, Wenfeng
    Zhang, Xin
    Song, Qiushuang
    Sun, Guoying
    ELECTRONICS, 2022, 11 (21)
  • [4] Computational mammography using deep neural networks
    Dubrovina, A.
    Kisilev, P.
    Ginsburg, B.
    Hashoul, S.
    Kimmel, R.
    COMPUTER METHODS IN BIOMECHANICS AND BIOMEDICAL ENGINEERING-IMAGING AND VISUALIZATION, 2018, 6 (03): 243-247
  • [5] Deep neural networks for the evaluation and design of photonic devices
    Jiang, Jiaqi
    Chen, Mingkun
    Fan, Jonathan A.
    NATURE REVIEWS MATERIALS, 2021, 6 (08): 679-700
  • [6] History matching of petroleum reservoirs using deep neural networks
    Alguliyev, Rasim
    Aliguliyev, Ramiz
    Imamverdiyev, Yadigar
    Sukhostat, Lyudmila
    INTELLIGENT SYSTEMS WITH APPLICATIONS, 2022, 16
  • [7] On the Reduction of Computational Complexity of Deep Convolutional Neural Networks
    Maji, Partha
    Mullins, Robert
    ENTROPY, 2018, 20 (04)
  • [8] Outlook on deep neural networks in computational cognitive neuroscience
    Turner, Brandon M.
    Miletic, Steven
    Forstmann, Birte U.
    NEUROIMAGE, 2018, 180 : 117 - 118