B-LNN: Inference-time linear model for secure neural network inference

Cited by: 3
Authors
Wang, Qizheng [1,2]
Ma, Wenping [1]
Wang, Weiwei [1]
Affiliations
[1] Xidian Univ, Sch Commun Engn, Xian, Peoples R China
[2] Shandong Inspur Sci Res Inst Co Ltd, Jinan, Peoples R China
Keywords
Neural networks; Activation function; Privacy protection; Secure neural network inference
DOI
10.1016/j.ins.2023.118966
CLC number
TP [Automation and Computer Technology]
Discipline code
0812
Abstract
Machine Learning as a Service (MLaaS) provides clients with well-trained neural networks for making predictions on their private data. Conventional MLaaS prediction pipelines require either that clients send sensitive inputs to the server or that the proprietary model be stored on the client's device; the former compromises client privacy, while the latter harms the interests of the model provider. Existing work on privacy-preserving MLaaS introduces cryptographic primitives that allow the two parties to perform neural network inference without revealing either party's data. However, nonlinear activation functions impose high computational overhead and response delays on the inference process of these schemes. In this paper, we analyze the mechanism by which activation functions enhance model expressivity and design an activation function, S-cos, that is friendly to secure neural network inference. The proposed S-cos can be re-parameterized into a linear layer during the inference phase. Building on it, we propose an inference-time linear model, Beyond Linear Neural Network (B-LNN), equipped with S-cos, which exhibits promising performance on several benchmark datasets.
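The record does not give the definition of S-cos, so the following is only a minimal sketch of the general re-parameterization idea the abstract alludes to: a hypothetical channel-wise affine activation f(y) = a*y + b is folded into the preceding fully-connected layer, so the (linear layer, activation) pair collapses into a single linear layer at inference time. The activation form and all names below are illustrative assumptions, not the paper's construction.

    import numpy as np

    rng = np.random.default_rng(0)

    # Trained fully-connected layer: y = W x + c
    W = rng.normal(size=(4, 3))
    c = rng.normal(size=4)

    # Hypothetical channel-wise affine activation: f(y) = a * y + b
    # (a stand-in for an inference-time-linearizable activation, NOT the paper's S-cos)
    a = rng.normal(size=4)
    b = rng.normal(size=4)

    # Re-parameterization: f(W x + c) = (a[:, None] * W) x + (a * c + b),
    # so layer and activation merge into one linear layer.
    W_fused = a[:, None] * W
    c_fused = a * c + b

    x = rng.normal(size=3)
    original = a * (W @ x + c) + b   # linear layer followed by activation
    fused = W_fused @ x + c_fused    # single fused linear layer at inference
    assert np.allclose(original, fused)

In a secure-inference setting, such a fused layer can be evaluated with linear operations alone (e.g., under additively homomorphic encryption or secret sharing), avoiding the expensive nonlinear-activation sub-protocols that the abstract identifies as the main source of overhead.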
Pages: 14