Logistic Boosting Regression for Label Distribution Learning

Cited by: 55
Authors: Xing, Chao [1]; Geng, Xin [1]; Xue, Hui [1]
Affiliation: [1] Southeast Univ, Key Lab Comp Network & Informat Integrat, Minist Educ, Sch Comp Sci & Engn, Nanjing 211189, Jiangsu, Peoples R China
Funding: National Natural Science Foundation of China
Keywords: AGE ESTIMATION
DOI: 10.1109/CVPR.2016.486
Chinese Library Classification: TP18 [Artificial Intelligence Theory]
Discipline codes: 081104; 0812; 0835; 1405
Abstract
Label Distribution Learning (LDL) is a general learning framework that includes both single-label and multi-label learning as special cases. One of the main assumptions made in traditional LDL algorithms is that the parametric model takes the form of the maximum entropy model. While this is a reasonable assumption in the absence of additional information, there is no particular evidence supporting it in the LDL setting. Alternatively, approximating the target with a general LDL model family can avoid the potential bias of committing to a specific model. To learn this general model family, this paper uses a method called Logistic Boosting Regression (LogitBoost), which can be viewed as additive weighted function regression from the statistical viewpoint. At each step, an individual weighted regression function (base learner) is fitted so that the optimization proceeds gradually. The base learners are chosen as weighted regression trees and vector trees, yielding two algorithms, named LDLogitBoost and AOSO-LDLogitBoost in this paper. Experiments on facial expression recognition, crowd opinion prediction on movies, and apparent age estimation show that LDLogitBoost and AOSO-LDLogitBoost achieve better performance than traditional LDL algorithms as well as other LogitBoost algorithms.
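The additive, stagewise idea behind the boosting framework described above can be illustrated with a minimal sketch. This is a hypothetical gradient-boosting variant for label distributions, not the paper's actual LDLogitBoost or AOSO-LDLogitBoost: it fits one plain regression tree per label dimension to the negative gradient of the KL divergence under a softmax link, whereas the paper uses weighted regression trees and vector trees. The function names (`fit_ldl_boost`, `predict_ldl`) and hyperparameters are illustrative assumptions.

```python
# Hypothetical sketch of additive boosting for label distribution learning.
# Each step fits regression trees (base learners) to the negative gradient
# of KL(Y || P) under a softmax link, then adds them to the ensemble.
# NOT the paper's LDLogitBoost; a simplified stand-in for illustration.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def softmax(F):
    e = np.exp(F - F.max(axis=1, keepdims=True))  # numerically stable
    return e / e.sum(axis=1, keepdims=True)

def fit_ldl_boost(X, Y, n_steps=50, lr=0.1, depth=3):
    """X: (n, d) features; Y: (n, k) label distributions (rows sum to 1)."""
    n, k = Y.shape
    F = np.zeros((n, k))              # additive scores, one column per label
    ensemble = []
    for _ in range(n_steps):
        P = softmax(F)
        G = Y - P                     # negative gradient of KL(Y || P) w.r.t. F
        step = []
        for j in range(k):            # one tree per label dimension
            t = DecisionTreeRegressor(max_depth=depth).fit(X, G[:, j])
            F[:, j] += lr * t.predict(X)
            step.append(t)
        ensemble.append(step)
    return ensemble

def predict_ldl(ensemble, X, lr=0.1):
    k = len(ensemble[0])
    F = np.zeros((X.shape[0], k))
    for step in ensemble:
        for j, t in enumerate(step):
            F[:, j] += lr * t.predict(X)
    return softmax(F)                 # predicted label distributions
```

The key design point mirrored from the paper's statistical view of LogitBoost is that each boosting round performs a small regression fit to the current gradient, so the model family is built up additively rather than assumed in closed form.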
Pages: 4489-4497
Page count: 9