Brain-inspired Trustworthy Hyperdimensional Computing with Efficient Uncertainty Quantification

Cited by: 2
Authors
Ni, Yang [1 ]
Chen, Hanning [1 ]
Poduval, Prathyush [2 ]
Zou, Zhuowen [1 ]
Mercati, Pietro [3 ]
Imani, Mohsen [1 ]
Affiliations
[1] Univ Calif Irvine, Irvine, CA 92717 USA
[2] Univ Maryland, College Pk, MD 20742 USA
[3] Intel Labs, Boston, MA USA
Funding
National Science Foundation (USA)
Keywords
FPGA
DOI
10.1109/ICCAD57390.2023.10323657
Chinese Library Classification
TP301 [Theory and Methods]
Discipline Code
081202
Abstract
Recent advances in emerging brain-inspired computing have pointed to a promising path toward highly efficient Machine Learning (ML) algorithms. In particular, research in HyperDimensional Computing (HDC) brings orders-of-magnitude speedups to both ML model training and inference compared to their deep learning counterparts. However, despite showing good results in various practical applications and outstanding energy efficiency, current HDC-based ML algorithms generally lack uncertainty estimation. On the other hand, existing solutions such as Bayesian Neural Networks (BNNs) are generally much slower than regular neural networks and lead to high energy consumption. In this paper, we propose a hyperdimensional Bayesian framework called DiceHD, which enables uncertainty estimation for the HDC-based regression algorithm. The core of our framework is a specially designed HDC encoder that maps input features to the high-dimensional space with an extra layer of randomness, i.e., a small number of dimensions are randomly dropped for each input. Our key insight is that with this encoder, DiceHD implements Bayesian inference while maintaining the efficiency advantage of HDC. We verify our framework on both toy regression tasks and real-world datasets, and compare DiceHD to several widely used BNN baselines in terms of performance and efficiency. The results on CPU show that DiceHD provides comparable uncertainty estimates while achieving significant speedup over the BNN baseline. We also deploy DiceHD on two FPGA platforms with different acceleration capabilities, showing that DiceHD provides up to 84x (3740x) better energy efficiency for training (inference).
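To make the encoder idea concrete, the following Python sketch illustrates it under stated assumptions: it uses a generic random-projection-plus-cosine HDC encoding and zeroes out a small, randomly chosen subset of dimensions for each input. The class name RandomDropHDEncoder, the encoding scheme, and all parameter values are hypothetical; the sketch illustrates the per-input dimension-dropping mechanism described in the abstract rather than reproducing the paper's implementation.

import numpy as np

# Hypothetical sketch: an HDC-style encoder that maps input features to a
# D-dimensional hypervector and then randomly drops a small number of
# dimensions for each input, as the abstract describes. The random-projection
# plus cosine encoding, class name, and parameter values are illustrative
# assumptions, not the paper's implementation.
class RandomDropHDEncoder:
    def __init__(self, n_features, dim=10_000, drop_rate=0.05, seed=0):
        self.rng = np.random.default_rng(seed)
        self.dim = dim
        self.drop_rate = drop_rate
        # Fixed random projection from the input space to the hyperspace.
        self.proj = self.rng.standard_normal((n_features, dim))
        self.bias = self.rng.uniform(0.0, 2.0 * np.pi, size=dim)

    def encode(self, x):
        # Deterministic nonlinear mapping of the input to a hypervector.
        hv = np.cos(x @ self.proj + self.bias)
        # Extra layer of randomness: independently zero out a small fraction
        # of the dimensions for this particular input.
        keep = self.rng.random(self.dim) >= self.drop_rate
        return hv * keep

# Usage: encode one 8-feature sample into a 10,000-dimensional hypervector.
encoder = RandomDropHDEncoder(n_features=8)
hypervector = encoder.encode(np.random.rand(8))
print(hypervector.shape)  # (10000,)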
Pages: 9