Adaptive weighted learning for linear regression problems via Kullback-Leibler divergence

Citations: 7
Authors
Liang, Zhizheng [1 ]
Li, Youfu [2 ]
Xia, ShiXiong [1 ]
Affiliations
[1] China Univ Min & Technol, Sch Comp Sci & Technol, Xuzhou, Peoples R China
[2] City Univ Hong Kong, Dept Mfg Engn & Engn Management, Hong Kong, Hong Kong, Peoples R China
Keywords
Linear regression; KL divergence; Weighted learning; Alternative optimization; Image classification; Face recognition; Cross-validation; Robust; Regularization; Representation; Covariance; Matrix
DOI
10.1016/j.patcog.2012.10.017
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
In this paper, we propose adaptive weighted learning for linear regression problems via the Kullback-Leibler (KL) divergence. An alternating optimization method is used to solve the proposed model, and we theoretically demonstrate that the solution of the optimization algorithm converges to a stationary point of the model. In addition, we fuse global linear regression with class-oriented linear regression and discuss the problem of parameter selection. Experimental results on face and handwritten numerical character databases show that the proposed method is effective for image classification, particularly when the samples in the training and testing sets have different characteristics. (C) 2012 Elsevier Ltd. All rights reserved.
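The abstract describes weighting training samples adaptively, with a KL-divergence term controlling the weights and an alternating scheme solving the joint problem. As a minimal illustrative sketch only (the paper's exact objective and updates are not reproduced here), the code below alternates between a weighted least-squares fit and a closed-form weight update obtained from an assumed objective of the form sum_i p_i r_i^2 + lam * KL(p || uniform); the function name and this objective are assumptions, not the paper's formulation.

```python
import numpy as np

def adaptive_weighted_regression(X, y, lam=1.0, n_iter=20):
    """Hedged sketch of adaptive weighted linear regression.

    Assumed objective (NOT taken from the paper):
        min_{w, p in simplex}  sum_i p_i * (y_i - x_i @ w)**2
                               + lam * KL(p || uniform)
    Alternating optimization: each subproblem has a closed-form solution.
    """
    n, d = X.shape
    p = np.full(n, 1.0 / n)          # start from uniform sample weights
    w = np.zeros(d)
    for _ in range(n_iter):
        # Step 1: for fixed weights p, solve weighted least squares for w.
        W = np.diag(p)
        w = np.linalg.solve(X.T @ W @ X + 1e-8 * np.eye(d), X.T @ W @ y)
        # Step 2: for fixed w, the KL term gives a softmax-type weight
        # update: p_i proportional to exp(-residual_i^2 / lam).
        r2 = (y - X @ w) ** 2
        p = np.exp(-r2 / lam)
        p /= p.sum()
    return w, p
```

With this kind of objective, points with large residuals (e.g. outliers or mismatched test-like samples) receive exponentially small weights, which is consistent with the robustness behavior the abstract reports.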
Pages: 1209-1219 (11 pages)
Related Papers
50 in total
  • [11] Ellipticity and Circularity Measuring via Kullback-Leibler Divergence
    Misztal, Krzysztof
    Tabor, Jacek
    JOURNAL OF MATHEMATICAL IMAGING AND VISION, 2016, 55 (01) : 136 - 150
  • [12] ON WEIGHTED KULLBACK-LEIBLER DIVERGENCE FOR DOUBLY TRUNCATED RANDOM VARIABLES
    Moharana, Rajesh
    Kayal, Suchandan
    REVSTAT-STATISTICAL JOURNAL, 2019, 17 (03) : 297 - 320
  • [13] Blind Deblurring of Barcodes via Kullback-Leibler Divergence
    Rioux, Gabriel
    Scarvelis, Christopher
    Choksi, Rustum
    Hoheisel, Tim
    Marechal, Pierre
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2021, 43 (01) : 77 - 88
  • [14] Model parameter learning using Kullback-Leibler divergence
    Lin, Chungwei
    Marks, Tim K.
    Pajovic, Milutin
    Watanabe, Shinji
    Tung, Chih-kuan
    PHYSICA A-STATISTICAL MECHANICS AND ITS APPLICATIONS, 2018, 491 : 549 - 559
  • [15] Fault tolerant learning using Kullback-Leibler divergence
    Sum, John
    Leung, Chi-sing
    Hsu, Lipin
    TENCON 2007 - 2007 IEEE REGION 10 CONFERENCE, VOLS 1-3, 2007, : 1193 - +
  • [16] Unifying Computational Entropies via Kullback-Leibler Divergence
    Agrawal, Rohit
    Chen, Yi-Hsiu
    Horel, Thibaut
    Vadhan, Salil
    ADVANCES IN CRYPTOLOGY - CRYPTO 2019, PT II, 2019, 11693 : 831 - 858
  • [17] Model Fusion with Kullback-Leibler Divergence
    Claici, Sebastian
    Yurochkin, Mikhail
    Ghosh, Soumya
    Solomon, Justin
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 119, 2020, 119
  • [18] Nonparametric Estimation of Kullback-Leibler Divergence
    Zhang, Zhiyi
    Grabchak, Michael
    NEURAL COMPUTATION, 2014, 26 (11) : 2570 - 2593
  • [19] Use of Kullback-Leibler divergence for forgetting
    Karny, Miroslav
    Andrysek, Josef
    INTERNATIONAL JOURNAL OF ADAPTIVE CONTROL AND SIGNAL PROCESSING, 2009, 23 (10) : 961 - 975
  • [20] Kullback-Leibler divergence for evaluating bioequivalence
    Dragalin, V
    Fedorov, V
    Patterson, S
    Jones, B
    STATISTICS IN MEDICINE, 2003, 22 (06) : 913 - 930