Exploring the impact of post-training rounding in regression models

Cited by: 0
Authors
Kalina, Jan [1]
Affiliations
[1] Czech Acad Sci, Inst Comp Sci, Vodarenskou vezi 271-2, Prague 8, Czech Republic
Keywords
supervised learning; trained model; perturbations; effect of rounding; low-precision arithmetic; consistency; robust
DOI
10.21136/AM.2024.0090-23
CLC classification: O29 [Applied Mathematics]
Subject classification: 070104
Abstract
Post-training rounding, also known as quantization, of estimated parameters is a widely adopted technique for reducing energy consumption and latency in machine learning models. This theoretical work examines the impact of rounding estimated parameters in key regression methods from statistics and machine learning. The proposed approach models rounding as perturbing each parameter by an additive error taking values in a specified interval. The method is first elucidated for linear regression and is then extended, with a consistent approach throughout, to radial basis function networks, multilayer perceptrons, regularization networks, and logistic regression.
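The abstract's viewpoint, that rounding a trained parameter to a fixed precision is equivalent to adding an error bounded by half a unit in the last place, can be illustrated with a minimal sketch. This is a hypothetical example, not the paper's exact setup: it fits an ordinary least squares model, rounds the estimated coefficients after training, and measures the resulting shift in fitted values.

```python
import numpy as np

# Hypothetical illustration of post-training rounding in linear regression
# (assumed setup, not taken from the paper): fit OLS, then round the
# estimated coefficients to k decimal places. Rounding adds an error
# delta_j with |delta_j| <= 0.5 * 10**(-k) to each parameter, i.e. an
# additive perturbation with values in a specified interval.

rng = np.random.default_rng(0)
n, p = 200, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, p))])  # intercept + features
beta_true = np.array([1.0, 0.5, -2.0, 0.25])
y = X @ beta_true + rng.normal(scale=0.1, size=n)

# Ordinary least squares estimate
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)

# Post-training rounding to k decimal places
k = 2
beta_rounded = np.round(beta_hat, k)

# Each rounded parameter differs by at most half a unit in the last place
assert np.all(np.abs(beta_rounded - beta_hat) <= 0.5 * 10.0 ** (-k) + 1e-12)

# Largest resulting perturbation of the fitted values
pred_shift = np.max(np.abs(X @ beta_rounded - X @ beta_hat))
print(f"max parameter error:  {np.max(np.abs(beta_rounded - beta_hat)):.5f}")
print(f"max prediction shift: {pred_shift:.5f}")
```

The same experiment applies to any model whose prediction is a function of trained parameters; the paper extends this interval-perturbation analysis beyond linear regression to the network and logistic models listed above.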
Pages: 257-271
Page count: 15
Related papers (50 in total)
  • [1] Exploring the impact of post-training rounding in regression models
    Jan Kalina
    [J]. Applications of Mathematics, 2024, 69 : 257 - 271
  • [2] Post-training Quantization on Diffusion Models
    Shang, Yuzhang
    Yuan, Zhihang
    Xie, Bin
    Wu, Bingzhe
    Yan, Yan
    [J]. 2023 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION, CVPR, 2023, : 1972 - 1981
  • [3] Exploring the Data Efficiency of Cross-Lingual Post-Training in Pretrained Language Models
    Lee, Chanhee
    Yang, Kisu
    Whang, Taesun
    Park, Chanjun
    Matteson, Andrew
    Lim, Heuiseok
    [J]. APPLIED SCIENCES-BASEL, 2021, 11 (05): : 1 - 15
  • [4] Post-training Quantization Methods for Deep Learning Models
    Kluska, Piotr
    Zieba, Maciej
    [J]. INTELLIGENT INFORMATION AND DATABASE SYSTEMS (ACIIDS 2020), PT I, 2020, 12033 : 467 - 479
  • [5] PTQD: Accurate Post-Training Quantization for Diffusion Models
    He, Yefei
    Liu, Luping
    Liu, Jing
    Wu, Weijia
    Zhou, Hong
    Zhuang, Bohan
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023,
  • [6] WITH POST-TRAINING FOR THE QUALITY OF FOODS
    SZALOCZY, P
    [J]. ELELMEZESI IPAR, 1980, 34 (04): : B4 - B4
  • [7] EDUCATION AND POST-TRAINING IN BIOTECHNOLOGY
    HOLLO, J
    BIACS, P
    [J]. ELELMEZESI IPAR, 1981, 35 (04): : 126 - 128
  • [8] Post-training discriminative pruning for RBMs
    Sanchez-Gutierrez, Maximo
    Albornoz, Enrique M.
    Rufiner, Hugo L.
    Goddard Close, John
    [J]. SOFT COMPUTING, 2019, 23 (03) : 767 - 781
  • [9] SUCCESSFUL POST-TRAINING SKILL APPLICATION
    FELDMAN, M
    [J]. TRAINING AND DEVELOPMENT JOURNAL, 1981, 35 (09): : 72 - &