A Gradient-Descent Calibration Method to Mitigate Process Variations in Analog Synapse Arrays

Cited by: 1
Authors
Baek, Seung-Heon [1]
Kim, Jaeha [1]
Affiliations
[1] Seoul Natl Univ, Dept Elect & Comp Engn, Seoul, South Korea
Keywords
neuron; synapse; accuracy; gradient descent; calibration
DOI
10.1109/ICEIC54506.2022.9748777
CLC Classification
TM [Electrical technology]; TN [Electronic technology, communication technology]
Discipline Codes
0808; 0809
Abstract
A technique to compensate for the effects of process variation in analog neural computation circuit arrays is presented. The proposed technique treats the process variation that degrades the accuracy of a neural network as a variable for optimization and adjusts the circuit parameters via gradient descent. The effects of process variation on the neural network are modeled in the TensorFlow framework, and the improvement in inference accuracy achieved by the proposed technique is analyzed. For a 4-layer CNN MNIST example evaluated over 1000 different variation sets, the technique restores the accuracy to its original level, which can otherwise degrade to between 17.3% and 95.2% due to process variations.
Pages: 4
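The abstract describes the method only at a high level. Below is a minimal, hypothetical TensorFlow sketch of the general idea it outlines: a sampled process-variation set is frozen into a trained layer's weight model, and a small set of calibration parameters is then tuned by gradient descent to recover inference accuracy. The gain/offset variation model, the per-column calibration knobs, and all names and shapes are illustrative assumptions, not taken from the paper.

```python
# Hypothetical sketch (not the authors' code): gradient-descent calibration of
# an analog synapse array modeled as a dense layer with fixed gain/offset errors.
import tensorflow as tf

tf.random.set_seed(0)

# Trained (ideal) weights of one fully connected layer, kept frozen.
w_ideal = tf.constant(tf.random.normal([784, 10], stddev=0.05))

# One sampled variation set: multiplicative gain error and additive offset per
# synapse (the variation model here is an assumption for illustration only).
gain_err = tf.constant(tf.random.normal([784, 10], mean=1.0, stddev=0.1))
offset_err = tf.constant(tf.random.normal([784, 10], stddev=0.01))

# Calibration knobs: per-column gain and offset trims, the only trainable variables.
cal_gain = tf.Variable(tf.ones([10]))
cal_offset = tf.Variable(tf.zeros([10]))

def effective_weights():
    # Circuit behavior under variation, with calibration applied at each column.
    return (w_ideal * gain_err + offset_err) * cal_gain + cal_offset

def forward(x):
    return tf.matmul(x, effective_weights())

# Small MNIST batch used as calibration data.
(x_train, y_train), _ = tf.keras.datasets.mnist.load_data()
x = tf.reshape(tf.cast(x_train[:1024], tf.float32) / 255.0, [-1, 784])
y = tf.cast(y_train[:1024], tf.int32)

opt = tf.keras.optimizers.SGD(learning_rate=0.1)
loss_fn = tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True)

# Gradient descent on the calibration parameters only; weights and the sampled
# variation stay fixed, mimicking post-fabrication trimming of the array.
for step in range(200):
    with tf.GradientTape() as tape:
        loss = loss_fn(y, forward(x))
    grads = tape.gradient(loss, [cal_gain, cal_offset])
    opt.apply_gradients(zip(grads, [cal_gain, cal_offset]))
```

In this sketch the calibration variables act as column-level trims; repeating the loop for many sampled `gain_err`/`offset_err` pairs would correspond to the paper's evaluation over 1000 variation sets.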
Related Papers
33 records in total
  • [21] Robustness of Iteratively Pre-Conditioned Gradient-Descent Method: The Case of Distributed Linear Regression Problem
    Chakrabarti, Kushal
    Gupta, Nirupam
    Chopra, Nikhil
    2021 AMERICAN CONTROL CONFERENCE (ACC), 2021, : 2248 - 2253
  • [22] LINEAR ANALOG OF THE GRADIENT DESCENT METHOD UNDER CONDITIONS OF CONSTRAINED OPTIMIZATION
    PANIN, VM
    DOKLADY AKADEMII NAUK SSSR, 1986, 291 (04): : 786 - 788
  • [23] Parameter estimation of various Hodgkin-Huxley-type neuronal models using a gradient-descent learning method
    Doi, SJ
    Onoda, Y
    Kumagai, S
    SICE 2002: PROCEEDINGS OF THE 41ST SICE ANNUAL CONFERENCE, VOLS 1-5, 2002, : 1685 - 1688
  • [24] Iterative pre-conditioning for expediting the distributed gradient-descent method: The case of linear least-squares problem
    Chakrabarti, Kushal
    Gupta, Nirupam
    Chopra, Nikhil
    AUTOMATICA, 2022, 137
  • [25] DVL error calibration method based on Gradient Descent Quaternion estimation theory
    Xu X.
    Yang Y.
    Li Y.
    Zhongguo Guanxing Jishu Xuebao/Journal of Chinese Inertial Technology, 2019, 27 (04): : 448 - 453
  • [26] Optimized spatially modulated polarimetry with an efficient calibration method and hybrid gradient descent reconstruction
    Ning, Tianlei
    Li, Yanqiu
    Zhou, Guodong
    Sun, Yiyu
    Liu, Ke
    APPLIED OPTICS, 2022, 61 (09) : 2267 - 2274
  • [27] Noise-Shaping Gradient Descent-Based Online Adaptation Algorithms for Digital Calibration of Analog Circuits
    Chakrabartty, Shantanu
    Shaga, Ravi K.
    Aono, Kenji
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2013, 24 (04) : 554 - 565
  • [28] Incorporating Derivative-Free Convexity with Trigonometric Simplex Designs for Learning-Rate Estimation of Stochastic Gradient-Descent Method
    Tokgoz, Emre
    Musafer, Hassan
    Faezipour, Miad
    Mahmood, Ausif
    ELECTRONICS, 2023, 12 (02)
  • [29] A Comparison of the Embedding Method With Multiparametric Programming, Mixed-Integer Programming, Gradient-Descent, and Hybrid Minimum Principle-Based Methods
    Meyer, Richard T.
    Zefran, Milos
    DeCarlo, Raymond A.
    IEEE TRANSACTIONS ON CONTROL SYSTEMS TECHNOLOGY, 2014, 22 (05) : 1784 - 1800
  • [30] An Efficient Method for Evaluating Analog Circuit Performance Bounds Under Process Variations
    Kuo, Po-Yu
    Saibua, Siwat
    Huang, Guanming
    Zhou, Dian
    IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS II-EXPRESS BRIEFS, 2012, 59 (06) : 351 - 355