Regularized margin-based conditional log-likelihood loss for prototype learning

Cited by: 60
Authors
Jin, Xiao-Bo [1 ]
Liu, Cheng-Lin [1 ]
Hou, Xinwen [1 ]
Affiliations
[1] Chinese Acad Sci, Inst Automat, Natl Lab Pattern Recognit, Beijing 100190, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Prototype learning; Conditional log-likelihood loss; Log-likelihood of margin (LOGM); Regularization; Distance metric learning; VECTOR QUANTIZATION; CLASSIFICATION; CLASSIFIERS; ALGORITHMS; LVQ;
DOI
10.1016/j.patcog.2010.01.013
CLC classification
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The classification performance of nearest-prototype classifiers largely relies on the prototype learning algorithm. The minimum classification error (MCE) method and the soft nearest prototype classifier (SNPC) method are two important algorithms that use misclassification loss. This paper proposes a new prototype learning algorithm based on the conditional log-likelihood loss (CLL), derived from a discriminative model called log-likelihood of margin (LOGM). A regularization term is added to avoid over-fitting in training as well as to maximize the hypothesis margin. The CLL in the LOGM algorithm is a convex function of the margin and therefore shows better convergence than the MCE. In addition, we show the effects of distance metric learning with both prototype-dependent and prototype-independent weighting. Our empirical study on benchmark datasets demonstrates that the LOGM algorithm yields higher classification accuracies than MCE, generalized learning vector quantization (GLVQ), the soft nearest prototype classifier (SNPC), and robust soft learning vector quantization (RSLVQ); moreover, LOGM with prototype-dependent weighting achieves accuracies comparable to the support vector machine (SVM) classifier. Crown Copyright (C) 2010 Published by Elsevier Ltd. All rights reserved.
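The loss described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: the hypothesis margin is assumed to be the distance to the nearest rival-class prototype minus the distance to the nearest same-class prototype, the conditional probability of correct classification is modeled with a sigmoid of scale `xi`, and the regularizer `alpha * d_plus` is an assumed stand-in for the paper's regularization term.

```python
import numpy as np

def logm_loss(x, y, prototypes, proto_labels, xi=1.0, alpha=0.01):
    """Sketch of a margin-based conditional log-likelihood (LOGM-style) loss.

    Assumed formulation (for illustration only):
      margin(x) = d(x, m_minus) - d(x, m_plus), where m_plus is the nearest
      prototype of the correct class y and m_minus the nearest prototype of
      any other class; P(correct | x) is modeled as sigmoid(xi * margin),
      and the loss is -log P(correct | x) plus a regularization term
      penalizing the distance to the correct-class prototype.
    """
    d = np.sum((prototypes - x) ** 2, axis=1)       # squared Euclidean distances
    same = proto_labels == y
    d_plus = d[same].min()                          # nearest same-class prototype
    d_minus = d[~same].min()                        # nearest rival-class prototype
    margin = d_minus - d_plus                       # hypothesis margin
    p_correct = 1.0 / (1.0 + np.exp(-xi * margin))  # sigmoid link
    return -np.log(p_correct) + alpha * d_plus      # CLL + regularizer

# A correctly classified sample (large positive margin) incurs a small loss;
# a misclassified one (negative margin) incurs a large loss.
protos = np.array([[0.0, 0.0], [2.0, 2.0]])
labels = np.array([0, 1])
loss_easy = logm_loss(np.array([0.1, 0.0]), 0, protos, labels)
loss_hard = logm_loss(np.array([1.9, 2.0]), 0, protos, labels)
```

Because `-log(sigmoid(·))` is convex and decreasing in the margin, this loss is a convex function of the margin, which is the convergence property the abstract contrasts against MCE's non-convex misclassification loss.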
Pages: 2428-2438
Page count: 11
Related Papers
50 records in total
  • [1] Prototype Learning with Margin-Based Conditional Log-likelihood Loss
    Jin, Xiaobo
    Liu, Cheng-Lin
    Hou, Xinwen
    [J]. 19TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION, VOLS 1-6, 2008, : 22 - 25
  • [2] Discriminative Learning of Bayesian Networks via Factorized Conditional Log-Likelihood
    Carvalho, Alexandra M.
    Roos, Teemu
    Oliveira, Arlindo L.
    Myllymaki, Petri
    [J]. JOURNAL OF MACHINE LEARNING RESEARCH, 2011, 12 : 2181 - 2210
  • [3] Deformation of log-likelihood loss function for multiclass boosting
    Kanamori, Takafumi
    [J]. NEURAL NETWORKS, 2010, 23 (07) : 843 - 864
  • [4] Hebbian Descent: A Unified View on Log-Likelihood Learning
    Melchior, Jan
    Schiewer, Robin
    Wiskott, Laurenz
    [J]. NEURAL COMPUTATION, 2024, 36 (09) : 1669 - 1712
  • [5] Learning Deep Embeddings via Margin-Based Discriminate Loss
    Sun, Peng
    Tang, Wenzhong
    Bai, Xiao
    [J]. STRUCTURAL, SYNTACTIC, AND STATISTICAL PATTERN RECOGNITION, S+SSPR 2018, 2018, 11004 : 107 - 115
  • [6] Margin-Based Transfer Learning
    Su, Bai
    Xu, Wei
    Shen, Yidong
    [J]. SIXTH INTERNATIONAL SYMPOSIUM ON NEURAL NETWORKS (ISNN 2009), 2009, 56 : 223 - +
  • [7] Median Based Adaptive Quantization of Log-Likelihood Ratios
    Liu, Xiaoran
    Wang, Jian
    Gu, Fanglin
    Xiong, Jun
    Wei, Jibo
    [J]. 2018 IEEE 87TH VEHICULAR TECHNOLOGY CONFERENCE (VTC SPRING), 2018,
  • [8] Generalized selection combining based on the log-likelihood ratio
    Kim, SW
    Kim, YG
    Simon, MK
    [J]. 2003 IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS, VOLS 1-5: NEW FRONTIERS IN TELECOMMUNICATIONS, 2003, : 2789 - 2794
  • [9] A Log-Likelihood Ratio based Generalized Belief Propagation
    Amaricai, Alexandru
    Bahrami, Mohsem
    Vasic, Bane
    [J]. PROCEEDINGS OF 18TH INTERNATIONAL CONFERENCE ON SMART TECHNOLOGIES (IEEE EUROCON 2019), 2019,
  • [10] Generalized selection combining based on the log-likelihood ratio
    Kim, SW
    Kim, YG
    Simon, MK
    [J]. IEEE TRANSACTIONS ON COMMUNICATIONS, 2004, 52 (04) : 521 - 524