Advancing Analog Reservoir Computing through Temporal Attention and MLP Integration

Cited by: 0
Authors
Sedki, Khalil [1 ]
Yi, Yang Cindy [1 ]
Affiliations
[1] Virginia Tech, Bradley Dept ECE, Blacksburg, VA 24061 USA
Funding
U.S. National Science Foundation (NSF)
Keywords
Delay-Feedback Reservoir (DFR); Mackey-Glass (MG) nonlinear function; temporal encoder; delay-feedback loop; time-to-first-spike (TTFS) encoding; interspike interval (ISI) encoding; neuromorphic computing; attention mechanism; Multilayer Perceptron (MLP); backpropagation; memory
DOI
10.1109/ISQED60706.2024.10528762
CLC classification number
TP3 [Computing Technology, Computer Technology]
Discipline classification code
0812
Abstract
This paper presents a novel approach to image classification that integrates an analog Delay-Feedback Reservoir (DFR), a temporal attention mechanism, a Multilayer Perceptron (MLP), and backpropagation training. The DFR system simplifies recurrent neural networks by confining training to the readout stage, offering enhanced performance and adaptability. The study details the design of an analog DFR system for low-power embedded applications, which uses a temporal encoder, a Mackey-Glass nonlinear module, and a dynamic delayed-feedback loop to process sequential inputs with minimal power consumption. Implemented in a standard GF 22 nm CMOS FD-SOI technology, the system achieves high energy efficiency and a compact footprint, consuming only 155 μW with a design area of 0.0044 mm², and shows promise in emulating mammalian brain behavior. In addition, this paper introduces a temporal attention mechanism that operates directly on continuous analog signals, enhancing the DFR system's ability to capture relevant temporal patterns. Furthermore, the approach incorporates an MLP for post-processing the DFR output. By integrating the DFR, the temporal attention mechanism, and the MLP via backpropagation, this work advances the development of computationally efficient Reservoir Computing (RC) systems for image classification, reaching 98.96% accuracy.
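The abstract describes a pipeline of a temporal encoder, a Mackey-Glass delay-feedback reservoir, a temporal attention stage, and an MLP readout trained with backpropagation. The sketch below is a minimal software analogue of that pipeline in NumPy, not the authors' analog circuit: the discretized reservoir update, the mask-based time multiplexing, and all parameter values (number of virtual nodes, eta, gamma, p, layer widths) are illustrative assumptions, and the TTFS/ISI spike encoders are omitted.

# Minimal software sketch of a DFR + temporal attention + MLP readout pipeline.
# All parameters and the simplified per-node feedback are assumptions for illustration.
import numpy as np

rng = np.random.default_rng(0)

def mackey_glass(a, p=7.0):
    """Mackey-Glass-style saturating nonlinearity used as the reservoir node."""
    return a / (1.0 + np.abs(a) ** p)

def dfr_states(u, n_virtual=50, eta=0.5, gamma=0.8):
    """Run a discretized delay-feedback reservoir over a 1-D input sequence u.

    Each input sample is time-multiplexed across n_virtual virtual nodes with a
    random binary mask; each node mixes the masked input with its own delayed
    (previous-cycle) state through the Mackey-Glass nonlinearity. This is a
    simplified per-node feedback, not a full serial delay line.
    Returns an array of shape (len(u), n_virtual).
    """
    mask = rng.choice([-1.0, 1.0], size=n_virtual)
    x = np.zeros(n_virtual)
    states = np.empty((len(u), n_virtual))
    for k, u_k in enumerate(u):
        x = mackey_glass(gamma * mask * u_k + eta * x)  # delayed feedback via previous x
        states[k] = x
    return states

def temporal_attention(states, w_score):
    """Softmax attention over time: weight each time step's reservoir state."""
    scores = states @ w_score                 # shape (T,)
    alphas = np.exp(scores - scores.max())
    alphas /= alphas.sum()
    return alphas @ states                    # context vector, shape (n_virtual,)

def mlp_readout(context, W1, b1, W2, b2):
    """Two-layer MLP readout; in the paper this stage is trained with backpropagation."""
    h = np.tanh(context @ W1 + b1)
    return h @ W2 + b2

# Toy usage: one flattened image row treated as a temporal sequence.
T, n_virtual, n_hidden, n_classes = 28, 50, 64, 10
u = rng.normal(size=T)
states = dfr_states(u, n_virtual=n_virtual)
w_score = rng.normal(size=n_virtual) * 0.1
W1 = rng.normal(size=(n_virtual, n_hidden)) * 0.1
b1 = np.zeros(n_hidden)
W2 = rng.normal(size=(n_hidden, n_classes)) * 0.1
b2 = np.zeros(n_classes)
context = temporal_attention(states, w_score)
print(mlp_readout(context, W1, b1, W2, b2).shape)   # (10,)

In this toy forward pass the attention weights collapse the per-time-step reservoir states into a single context vector before the MLP; in the full system described by the abstract, the attention and readout parameters would be learned jointly via backpropagation while the analog reservoir itself remains untrained.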
Pages: 8