Learning a Mahalanobis metric from equivalence constraints

Cited by: 0
Authors
Bar-Hillel, A. [1]
Hertz, T
Shental, N
Weinshall, D
Affiliations
[1] Hebrew Univ Jerusalem, Sch Engn & Comp Sci, IL-91904 Jerusalem, Israel
[2] Hebrew Univ Jerusalem, Ctr Neural Computat, IL-91904 Jerusalem, Israel
Keywords
clustering; metric learning; dimensionality reduction; equivalence constraints; side information
DOI
Not available
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology]
Discipline Classification Code
0812
Abstract
Many learning algorithms use a metric defined over the input space as a principal tool, and their performance critically depends on the quality of this metric. We address the problem of learning metrics using side-information in the form of equivalence constraints. Unlike labels, we demonstrate that this type of side-information can sometimes be obtained automatically, without the need for human intervention. We show how such side-information can be used to modify the representation of the data, leading to improved clustering and classification. Specifically, we present the Relevant Component Analysis (RCA) algorithm, a simple and efficient method for learning a Mahalanobis metric. We show that RCA is the solution of an interesting optimization problem, founded on an information-theoretic basis. If dimensionality reduction is allowed within RCA, we show that it is optimally accomplished by a version of Fisher's linear discriminant that uses constraints. Moreover, under certain Gaussian assumptions, RCA can be viewed as a Maximum Likelihood estimate of the within-class covariance matrix. We conclude with extensive empirical evaluations of RCA, showing its advantage over alternative methods.
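The RCA computation described in the abstract reduces to pooling the within-chunklet covariance of the equivalence-constrained points and whitening the data with its inverse square root. Below is a minimal Python sketch under that reading; the function and variable names (rca_transform, chunklets) are illustrative, not taken from the paper or its code.

    import numpy as np

    def rca_transform(chunklets):
        # chunklets: list of (n_j, d) arrays; each array holds points known to
        # belong to the same (unknown) class via equivalence constraints.
        centered = [X - X.mean(axis=0) for X in chunklets]  # remove each chunklet mean
        pooled = np.vstack(centered)
        n = pooled.shape[0]
        C = pooled.T @ pooled / n                    # pooled within-chunklet covariance
        vals, vecs = np.linalg.eigh(C)               # C is symmetric positive definite here
        W = vecs @ np.diag(vals ** -0.5) @ vecs.T    # whitening transform C^(-1/2)
        M = vecs @ np.diag(1.0 / vals) @ vecs.T      # Mahalanobis matrix C^(-1)
        return C, M, W

    # Toy usage with two synthetic chunklets in 2-D.
    rng = np.random.default_rng(0)
    chunklets = [rng.normal(size=(6, 2)), rng.normal(size=(5, 2)) + 3.0]
    C, M, W = rca_transform(chunklets)
    X_new = rng.normal(size=(4, 2))
    X_rca = X_new @ W.T   # squared Euclidean distances in this space equal
                          # squared Mahalanobis distances under M

When dimensionality reduction is desired, the abstract notes that RCA first projects onto a subspace given by a constraint-based version of Fisher's linear discriminant; that step is omitted from this sketch.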
Pages: 937-965
Number of pages: 29
Related papers
50 records in total (first 10 shown)
  • [1] Large Scale Metric Learning from Equivalence Constraints
    Koestinger, Martin
    Hirzer, Martin
    Wohlhart, Paul
    Roth, Peter M.
    Bischof, Horst
    2012 IEEE CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2012: 2288-2295
  • [2] DISTANCE METRIC LEARNING BY QUADRATIC PROGRAMMING BASED ON EQUIVALENCE CONSTRAINTS
    Cevikalp, Hakan
    INTERNATIONAL JOURNAL OF INNOVATIVE COMPUTING INFORMATION AND CONTROL, 2012, 8 (10B): 7017-7030
  • [3] Category learning from equivalence constraints
    Hammer, Rubi
    Hertz, Tomer
    Hochstein, Shaul
    Weinshall, Daphna
    COGNITIVE PROCESSING, 2009, 10 (03): 211-232
  • [4] Category learning from equivalence constraints
    Hammer, R
    Hertz, T
    Hochstein, S
    Weinshall, D
    REVIEWS IN THE NEUROSCIENCES, 2005, 16: S28-S29
  • [5] Category learning from equivalence constraints
    Rubi Hammer
    Tomer Hertz
    Shaul Hochstein
    Daphna Weinshall
    Cognitive Processing, 2009, 10: 211-232
  • [6] Beyond Mahalanobis Metric: Cayley-Klein Metric Learning
    Bi, Yanhong
    Fan, Bin
    Wu, Fuchao
    2015 IEEE CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2015: 2339-2347
  • [7] A Scalable Algorithm for Learning a Mahalanobis Distance Metric
    Kim, Junae
    Shen, Chunhua
    Wang, Lei
    COMPUTER VISION - ACCV 2009, PT III, 2010, 5996: 299-310
  • [8] A boosting approach for supervised Mahalanobis distance metric learning
    Chang, Chin-Chun
    PATTERN RECOGNITION, 2012, 45 (02): 844-862
  • [9] Learning a Mahalanobis distance metric for data clustering and classification
    Xiang, Shiming
    Nie, Feiping
    Zhang, Changshui
    PATTERN RECOGNITION, 2008, 41 (12): 3600-3612
  • [10] Learning Mahalanobis Distance Metric: Considering Instance Disturbance Helps
    Ye, Han-Jia
    Zhan, De-Chuan
    Si, Xue-Min
    Jiang, Yuan
    PROCEEDINGS OF THE TWENTY-SIXTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2017: 3315-3321