Class-specific differential detection in diffractive optical neural networks improves inference accuracy

Cited by: 0
Authors
Jingxi Li [1 ,2 ,3 ]
Deniz Mengu [1 ,2 ,3 ]
Yi Luo [1 ,2 ,3 ]
Yair Rivenson [1 ,2 ,3 ]
Aydogan Ozcan [1 ,2 ,3 ]
Affiliations
[1] University of California at Los Angeles, Department of Electrical and Computer Engineering
[2] University of California at Los Angeles, Department of Bioengineering
[3] University of California at Los Angeles, California NanoSystems Institute
Keywords
DOI
Not available
CLC Classification
TP183 [Artificial Neural Networks and Computing]; O436.1 [Interference and Diffraction];
Subject Classification
Abstract
Optical computing provides unique opportunities in terms of parallelization, scalability, power efficiency, and computational speed and has attracted major interest for machine learning. Diffractive deep neural networks have been introduced earlier as an optical machine learning framework that uses task-specific diffractive surfaces designed by deep learning to all-optically perform inference, achieving promising performance for object classification and imaging. We demonstrate systematic improvements in diffractive optical neural networks, based on a differential measurement technique that mitigates the strict nonnegativity constraint of light intensity. In this differential detection scheme, each class is assigned to a separate pair of detectors, behind a diffractive optical network, and the class inference is made by maximizing the normalized signal difference between the photodetector pairs. Using this differential detection scheme, involving 10 photodetector pairs behind 5 diffractive layers with a total of 0.2 million neurons, we numerically achieved blind testing accuracies of 98.54%, 90.54%, and 48.51% for MNIST, Fashion-MNIST, and grayscale CIFAR-10 datasets, respectively. Moreover, by utilizing the inherent parallelization capability of optical systems, we reduced the cross-talk and optical signal coupling between the positive and negative detectors of each class by dividing the optical path into two jointly trained diffractive neural networks that work in parallel. We further made use of this parallelization approach and divided individual classes in a target dataset among multiple jointly trained diffractive neural networks. Using this class-specific differential detection in jointly optimized diffractive neural networks that operate in parallel, our simulations achieved blind testing accuracies of 98.52%, 91.48%, and 50.82% for MNIST, Fashion-MNIST, and grayscale CIFAR-10 datasets, respectively, coming close to the performance of some of the earlier generations of all-electronic deep neural networks, e.g., LeNet, which achieves classification accuracies of 98.77%, 90.27%, and 55.21% corresponding to the same datasets, respectively. In addition to these jointly optimized diffractive neural networks, we also independently optimized multiple diffractive networks and utilized them in a way that is similar to ensemble methods practiced in machine learning; using 3 independently optimized differential diffractive neural networks that optically project their light onto a common output/detector plane, we numerically achieved blind testing accuracies of 98.59%, 91.06%, and 51.44% for MNIST, Fashion-MNIST, and grayscale CIFAR-10 datasets, respectively. Through these systematic advances in designing diffractive neural networks, the reported classification accuracies set the state of the art for all-optical neural network design. The presented framework might be useful to bring optical neural network-based low-power solutions for various machine learning applications and help us design new computational cameras that are task-specific.
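As a concrete illustration of the differential detection readout described in the abstract, the following minimal Python sketch (not the authors' code; the function names and the random detector intensities are hypothetical) computes the normalized signal difference for each class's detector pair and infers the class that maximizes it.

import numpy as np

def differential_scores(i_plus, i_minus, eps=1e-12):
    # Normalized signal difference per class: (I+ - I-) / (I+ + I-).
    # i_plus and i_minus hold the non-negative optical intensities measured
    # by the positive and negative detectors of each class, shape (num_classes,).
    # eps only guards against division by zero when a detector pair is dark.
    return (i_plus - i_minus) / (i_plus + i_minus + eps)

def predict_class(i_plus, i_minus):
    # Class inference: the class whose detector pair maximizes the
    # normalized difference signal.
    return int(np.argmax(differential_scores(i_plus, i_minus)))

# Hypothetical detector readings for a 10-class task (e.g., MNIST digits).
rng = np.random.default_rng(0)
i_plus, i_minus = rng.random(10), rng.random(10)
print(predict_class(i_plus, i_minus))

In the scheme reported in the paper, the intensities would come from detector pairs placed behind the trained diffractive layers rather than from random numbers; the sketch only shows the readout rule, not the diffractive forward model or its training.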
Pages: 5-17
Number of pages: 13
Related papers
50 records in total
  • [31] Weight masking in image classification networks: class-specific machine unlearning
    Wang, Jiali
    Bie, Hongxia
    Jing, Zhao
    Zhi, Yichen
    Fan, Yongkai
    KNOWLEDGE AND INFORMATION SYSTEMS, 2025, : 3245 - 3265
  • [32] High accuracy inference by an optical neural network implementation
    Miura, Shun
    Otake, Mamoru
    Kusaka, Hiroyuki
    Kashiwagi, Masahiro
    Kunai, Yuichiro
    Nambara, Takahiro
    Yamada, Yumi
    SPIE FUTURE SENSING TECHNOLOGIES 2023, 2023, 12327
  • [33] A Class-Specific Intrusion Detection Model: Hierarchical Multi-class IDS Model
    Sarıkaya, A.
    Kılıç, B. G.
    SN Computer Science, 2020, 1 (4)
  • [34] Therapeutic class-specific signal detection of bradycardia associated with propranolol hydrochloride
    Gavali, Dhaval K.
    Kulkarni, Kala S.
    Kumar, Amal
    Chakraborty, Bhaswat S.
    INDIAN JOURNAL OF PHARMACOLOGY, 2009, 41 (04) : 162 - 166
  • [35] MINING HETEROGENEOUS CLASS-SPECIFIC CODEBOOK FOR CATEGORICAL OBJECT DETECTION AND CLASSIFICATION
    Pan, Hong
    Zhu, Yaping
    Qin, A. K.
    Xia, Liangzheng
    2013 20TH IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING (ICIP 2013), 2013, : 3132 - 3136
  • [36] Detection of aflatoxins in tea samples based on a class-specific monoclonal antibody
    Zhang, Xun
    Feng, Min
    Liu, Liqiang
    Xing, Changrui
    Kuang, Hua
    Peng, Chianfang
    Wang, Libing
    Xu, Chuanlai
    INTERNATIONAL JOURNAL OF FOOD SCIENCE AND TECHNOLOGY, 2013, 48 (06): : 1269 - 1274
  • [37] DETECTION BY IMMUNOFLUORESCENCE OF ANTIBODIES TO PARASITIC AGENTS - USE OF CLASS-SPECIFIC CONJUGATES
    HULDT, G
    LJUNGSTROM, I
    AUST-KETTIS, A
    ANNALS OF THE NEW YORK ACADEMY OF SCIENCES, 1975, 254 : 304 - 314
  • [38] Classification Matters: Improving Video Action Detection with Class-Specific Attention
    Lee, Jinsung
    Kim, Taeoh
    Lee, Inwoong
    Shim, Minho
    Wee, Dongyoon
    Cho, Minsu
    Kwak, Suha
    COMPUTER VISION - ECCV 2024, PT XX, 2025, 15078 : 450 - 467
  • [39] Generalized design of diffractive optical elements using neural networks
    Pasupuleti, A
    Gopalan, A
    Sahin, F
    Abushagur, MAG
    PHOTONICS NORTH: APPLICATIONS OF PHOTONIC TECHNOLOGY, PTS 1 AND 2: CLOSING THE GAP BETWEEN THEORY, DEVELOPMENT, AND APPLICATION, 2004, 5579 : 315 - 322
  • [40] Broad-spectrum protein biosensors for class-specific detection of antibiotics
    Weber, CC
    Link, N
    Fux, C
    Zisch, AH
    Weber, W
    Fussenegger, M
    BIOTECHNOLOGY AND BIOENGINEERING, 2005, 89 (01) : 9 - 17