Density Encoding Enables Resource-Efficient Randomly Connected Neural Networks

Cited: 19
Authors
Kleyko, Denis [1 ,2 ]
Kheffache, Mansour [3 ]
Frady, E. Paxon [1 ]
Wiklund, Urban [4 ]
Osipov, Evgeny [5 ]
Affiliations
[1] Univ Calif Berkeley, Redwood Ctr Theoret Neurosci, Berkeley, CA 94720 USA
[2] Res Inst Sweden, Intelligent Syst Lab, S-16440 Kista, Sweden
[3] Netlight Consulting AB, S-11153 Stockholm, Sweden
[4] Umea Univ, Dept Radiat Sci, Biomed Engn, S-90187 Umea, Sweden
[5] Lulea Univ Technol, Dept Comp Sci Elect & Space Engn, S-97187 Lulea, Sweden
Funding
EU Horizon 2020; Swedish Research Council;
Keywords
Density-based encoding; hyperdimensional computing; random vector functional link (RVFL) networks; APPROXIMATION;
DOI
10.1109/TNNLS.2020.3015971
CLC Number (Chinese Library Classification)
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104; 0812; 0835; 1405;
Abstract
The deployment of machine learning algorithms on resource-constrained edge devices is an important challenge from both theoretical and applied points of view. In this brief, we focus on resource-efficient randomly connected neural networks known as random vector functional link (RVFL) networks, since their simple design and extremely fast training make them attractive for many applied classification tasks. We propose to represent input features via the density-based encoding known from the area of stochastic computing, and to use the binding and bundling operations from hyperdimensional computing to obtain the activations of the hidden neurons. Using a collection of 121 real-world data sets from the UCI Machine Learning Repository, we empirically show that the proposed approach achieves higher average accuracy than the conventional RVFL network. We also demonstrate that the readout matrix can be represented using only integers in a limited range with minimal loss of accuracy. In this case, the proposed approach operates only on small n-bit integers, which results in a computationally efficient architecture. Finally, through field-programmable gate array (FPGA) implementations, we show that this approach consumes approximately 11 times less energy than the conventional RVFL network.
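To make the pipeline described in the abstract concrete, below is a minimal NumPy sketch under stated assumptions: features scaled to [0, 1] are density (thermometer) encoded as bipolar vectors, each feature's code is bound (elementwise multiplied) with a fixed random bipolar key vector, the bound vectors are bundled (summed) and clipped to obtain integer-valued hidden activations, and a ridge-regression readout is trained on top. The function names, the clipping threshold kappa, and the regularization value are illustrative choices, not taken from the paper.

```python
import numpy as np

def density_encode(x, n):
    """Density (thermometer) encoding: map a scalar x in [0, 1] to an
    n-dimensional bipolar vector whose share of +1 entries is
    proportional to x (the first q entries are +1, the rest -1)."""
    q = int(round(x * n))
    v = -np.ones(n)
    v[:q] = 1.0
    return v

def hidden_activations(X, keys, kappa):
    """Hidden layer built with hyperdimensional-computing operations:
    bind each feature's density code to its random bipolar key
    (elementwise product), bundle across features (elementwise sum),
    and clip the result to small integers in [-kappa, kappa]."""
    n_samples, n_features = X.shape
    n_hidden = keys.shape[1]
    H = np.zeros((n_samples, n_hidden))
    for i in range(n_samples):
        acc = np.zeros(n_hidden)
        for j in range(n_features):
            acc += keys[j] * density_encode(X[i, j], n_hidden)  # bind, then bundle
        H[i] = np.clip(acc, -kappa, kappa)  # clipping keeps activations integer-valued
    return H

# Illustrative usage: an RVFL-style ridge readout on the hidden activations.
rng = np.random.default_rng(0)
X = rng.random((100, 4))                        # features assumed scaled to [0, 1]
y = (X.sum(axis=1) > 2.0).astype(float)         # toy binary target
keys = rng.choice([-1.0, 1.0], size=(4, 256))   # one random bipolar key per feature
H = hidden_activations(X, keys, kappa=3)
lam = 1.0                                       # ridge regularization (illustrative)
W = np.linalg.solve(H.T @ H + lam * np.eye(256), H.T @ y)
print(((H @ W > 0.5) == (y > 0.5)).mean())      # training accuracy of the sketch
```

Note that since binding bipolar vectors yields entries of plus or minus one, the bundled activations are already small integers before clipping; the paper's quantized variant additionally constrains the readout matrix itself to integers in a limited range, which is what enables the integer-only FPGA implementation reported in the abstract.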
Pages: 3777-3783 (7 pages)
Related Papers
50 records in total
  • [41] Resource-efficient DAG Blockchain with Sharding for 6G Networks
    Xie, Jin
    Zhang, Ke
    Lu, YunLong
    Zhang, Yan
    IEEE NETWORK, 2022, 36(1): 189-196
  • [42] A Resource-Efficient Multiplierless Systolic Array Architecture for Convolutions in Deep Networks
    Parmar, Yashrajsinh
    Sridharan, K.
    IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS II-EXPRESS BRIEFS, 2020, 67(2): 370-374
  • [43] Resource-efficient authentic key establishment in heterogeneous wireless sensor networks
    Shi, Qi
    Zhang, Ning
    Merabti, Madjid
    Kifayat, Kashif
    JOURNAL OF PARALLEL AND DISTRIBUTED COMPUTING, 2013, 73(2): 235-249
  • [44] Intrinsic Cellular Properties and Connectivity Density Determine Variable Clustering Patterns in Randomly Connected Inhibitory Neural Networks
    Rich, Scott
    Booth, Victoria
    Zochowski, Michal
    FRONTIERS IN NEURAL CIRCUITS, 2016, 10
  • [45] Training Randomly Connected, Recurrent Artificial Neural Networks Using PSO
    Jancauskas, Vytautas
    SWARM INTELLIGENCE, 2010, 6234: 578-579
  • [46] Noise-induced coherent oscillations in randomly connected neural networks
    Pham, J.
    Pakdaman, K.
    Vibert, J. F.
    PHYSICAL REVIEW E, 1998, 58(3): 3610-3622
  • [47] Storage Properties of Randomly Connected Boolean Neural Networks for Associative Memory
    Wong, K. Y. M.
    Sherrington, D.
    EUROPHYSICS LETTERS, 1988, 7(3): 197-202
  • [48] Efficient densely connected convolutional neural networks
    Li, Guoqing
    Zhang, Meng
    Li, Jiaojie
    Lv, Feng
    Tong, Guodong
    PATTERN RECOGNITION, 2021, 109
  • [49] Efficient Spiking Neural Networks With Radix Encoding
    Wang, Zhehui
    Gu, Xiaozhe
    Goh, Rick Siow Mong
    Zhou, Joey Tianyi
    Luo, Tao
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024, 35(3): 3689-3701
  • [50] Stimulus number, duration and intensity encoding in randomly connected attractor networks with synaptic depression
    Miller, Paul
    FRONTIERS IN COMPUTATIONAL NEUROSCIENCE, 2013, 7