Smooth approximation method for non-smooth empirical risk minimization based distance metric learning

Cited by: 3
Authors
Shi, Ya [1 ]
Ji, Hongbing [1 ]
Affiliations
[1] Xidian Univ, Sch Elect Engn, Xian 710071, Peoples R China
Keywords
Distance metric learning; Empirical risk minimization; Smooth approximation; Nesterov's optimal first-order method;
DOI
10.1016/j.neucom.2013.08.030
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Distance metric learning (DML) has become a very active research field in recent years. Bian and Tao (IEEE Trans. Neural Netw. Learn. Syst. 23(8) (2012) 1194-1205) presented a constrained empirical risk minimization (ERM) framework for DML. In this paper, we utilize a smooth approximation method to make their algorithm applicable to the non-differentiable hinge loss function. We show that the objective function with hinge loss is equivalent to a non-smooth min-max representation, from which an approximate objective function is derived. In contrast to the original objective function, the approximate one is differentiable with a Lipschitz-continuous gradient. Consequently, Nesterov's optimal first-order method can be applied directly. Finally, the effectiveness of our method is evaluated on various UCI datasets. (C) 2013 Elsevier B.V. All rights reserved.
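The smoothing step the abstract describes can be sketched in code. The sketch below uses the standard Nesterov smoothing of the hinge loss, h_mu(z) = max_{u in [0,1]} [u(1-z) - mu*u^2/2], whose closed form is piecewise quadratic and whose gradient is Lipschitz with constant 1/mu; the function names and the choice of mu are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

def smoothed_hinge(z, mu=0.1):
    """Nesterov-smoothed hinge loss h_mu(z) = max_{u in [0,1]} u*(1-z) - mu*u**2/2.

    Closed form (piecewise), with t = 1 - z:
      0                 if t <= 0
      t**2 / (2 * mu)   if 0 < t < mu
      t - mu / 2        if t >= mu
    Uniformly close to the hinge: |max(0, t) - h_mu| <= mu/2.
    """
    t = 1.0 - np.asarray(z, dtype=float)
    return np.where(t <= 0, 0.0,
           np.where(t >= mu, t - mu / 2, t ** 2 / (2 * mu)))

def smoothed_hinge_grad(z, mu=0.1):
    """Gradient of h_mu; Lipschitz-continuous with constant 1/mu."""
    t = 1.0 - np.asarray(z, dtype=float)
    u = np.clip(t / mu, 0.0, 1.0)  # maximizer u* of the min-max representation
    return -u
```

Because the gradient is 1/mu-Lipschitz, an accelerated first-order scheme such as Nesterov's optimal method can be run on the smoothed objective without subgradient machinery.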
Pages: 135-143 (9 pages)