Adaptive weighted learning for linear regression problems via Kullback-Leibler divergence

Cited by: 7
Authors
Liang, Zhizheng [1 ]
Li, Youfu [2 ]
Xia, ShiXiong [1 ]
Affiliations
[1] China Univ Min & Technol, Sch Comp Sci & Technol, Xuzhou, Peoples R China
[2] City Univ Hong Kong, Dept Mfg Engn & Engn Management, Hong Kong, Hong Kong, Peoples R China
Keywords
Linear regression; KL divergence; Weighted learning; Alternative optimization; Image classification; FACE-RECOGNITION; CROSS-VALIDATION; ROBUST; REGULARIZATION; REPRESENTATION; COVARIANCE; MATRIX;
DOI
10.1016/j.patcog.2012.10.017
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104; 0812; 0835; 1405
Abstract
In this paper, we propose an adaptive weighted learning method for linear regression problems via the Kullback-Leibler (KL) divergence. An alternating optimization method is used to solve the proposed model, and we prove theoretically that the solution produced by the optimization algorithm converges to a stationary point of the model. In addition, we fuse global linear regression with class-oriented linear regression and discuss the problem of parameter selection. Experimental results on face and handwritten numeral databases show that the proposed method is effective for image classification, particularly when the training and test samples have different characteristics. (C) 2012 Elsevier Ltd. All rights reserved.
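The abstract describes the model only at a high level. As an illustration, the following is a minimal Python sketch of one plausible reading of the approach: a weighted least-squares objective in which the per-sample weights are regularized toward the uniform distribution by a KL-divergence term, solved by alternating between the regression coefficients and the weights. The function name adaptive_weighted_regression, the parameter lam, and the exact objective are assumptions made for illustration, not the paper's notation or formulation.

```python
import numpy as np

def adaptive_weighted_regression(X, y, lam=1.0, n_iter=50, tol=1e-8):
    """Alternating optimization for a KL-regularized weighted least-squares
    objective (illustrative sketch under assumed notation, not necessarily
    the paper's exact model):

        min_{beta, w in simplex}  sum_i w_i * (y_i - x_i^T beta)^2
                                  + lam * KL(w || uniform)
    """
    n, d = X.shape
    w = np.full(n, 1.0 / n)                 # start from uniform sample weights
    beta = np.zeros(d)
    for _ in range(n_iter):
        # Step 1: fix w, solve the weighted least-squares problem for beta.
        XtW = X.T * w                        # equivalent to X^T diag(w)
        beta_new = np.linalg.solve(XtW @ X + 1e-10 * np.eye(d), XtW @ y)

        # Step 2: fix beta, update w; the KL-regularized subproblem has the
        # closed-form solution w_i proportional to exp(-residual_i^2 / lam).
        r2 = (y - X @ beta_new) ** 2
        w = np.exp(-(r2 - r2.min()) / lam)   # shift by min for numerical stability
        w /= w.sum()

        if np.linalg.norm(beta_new - beta) < tol:
            beta = beta_new
            break
        beta = beta_new
    return beta, w

# Toy usage: a few large-noise samples should receive small weights.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
y = X @ np.array([1.0, -2.0, 0.5, 0.0, 3.0]) + 0.1 * rng.normal(size=100)
y[:5] += 10.0                                # inject outliers
beta_hat, w_hat = adaptive_weighted_regression(X, y, lam=0.5)
```

Under this assumed formulation, samples with large residuals receive exponentially smaller weights, while the KL term keeps the weight distribution from collapsing onto a few samples; this matches the adaptive weighting behavior the abstract describes, though the paper's actual objective and update rules may differ.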
Pages: 1209-1219
Number of pages: 11