A bi-level metric learning framework via self-paced learning weighting

Cited: 0
Authors
Yan, Jing [1 ]
Wei, Wei [1 ]
Guo, Xinyao [1 ]
Dang, Chuangyin [2 ,3 ]
Liang, Jiye [1 ]
Affiliations
[1] Shanxi Univ, Sch Comp & Informat Technol, Minist Educ, Key Lab Computat Intelligence & Chinese Informat P, Taiyuan, Shanxi, Peoples R China
[2] City Univ Hong Kong, Dept Syst Engn & Engn Management, Hong Kong, Peoples R China
[3] City Univ Hong Kong, Shenzhen Res Inst, Shenzhen, Guangdong, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Metric learning; Self-paced learning; Adaptive neighborhood; Weighting tuples;
DOI
10.1016/j.patcog.2023.109446
CLC number
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Distance metric learning (DML) has achieved great success in many real-world applications. However, most existing DML models characterize the quality of tuples on the tuple level while ignoring the anchor level. As a result, these models portray the quality of tuples less accurately and tend to overfit when anchors are noisy samples. In this paper, we devise a bi-level metric learning framework (BMLF), which characterizes the quality of tuples more finely on both levels, enhancing the generalization performance of the DML model. Furthermore, we present an implementation of BMLF based on a self-paced learning regularization term and design the corresponding optimization algorithm. By weighting tuples on the anchor level and training the model preferentially on tuples with higher weights, the side effect of low-quality noisy samples is alleviated. We empirically demonstrate that the proposed method outperforms state-of-the-art methods in effectiveness and robustness on several benchmark datasets. (c) 2023 Elsevier Ltd. All rights reserved.
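The abstract's "train on higher-weight tuples preferentially" idea follows the standard self-paced learning recipe: each sample (here, a tuple) receives a weight from its current loss, and an age parameter gradually admits harder samples. The sketch below shows only the classic hard-weighting rule, not the paper's exact BMLF formulation; the function name and the threshold parameter `lam` are illustrative assumptions.

```python
import numpy as np

def self_paced_weights(losses, lam):
    """Classic hard self-paced weighting.

    A tuple gets weight 1 if its current loss is below the age
    parameter lam, and weight 0 otherwise, so easy (low-loss)
    tuples dominate early training. Increasing lam over epochs
    gradually admits harder, possibly noisy tuples.
    """
    losses = np.asarray(losses, dtype=float)
    return (losses < lam).astype(float)

# Toy example: three tuples, the last one is "hard" (e.g. a noisy anchor).
tuple_losses = [0.2, 0.5, 3.0]
weights = self_paced_weights(tuple_losses, lam=1.0)  # easy tuples get weight 1
```

In a full training loop, these weights would multiply each tuple's loss term before the gradient step, and `lam` would be raised on a schedule so the model is exposed to progressively harder tuples.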
Pages: 12
Related Papers
(50 in total)
  • [31] Informed Pair Selection for Self-paced Metric Learning in Siamese Neural Networks
    Martin, Kyle
    Wiratunga, Nirmalie
    Massie, Stewart
    Clos, Jeremie
    ARTIFICIAL INTELLIGENCE XXXV (AI 2018), 2018, 11311 : 34 - 49
  • [32] Task-aware world model learning with meta weighting via bi-level optimization
    Yuan, Huining
    Dou, Hongkun
    Jiang, Xingyu
    Deng, Yue
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023,
  • [33] Video-based Person Re-identification via Self-Paced Learning and Deep Reinforcement Learning Framework
    Ouyang, Deqiang
    Shao, Jie
    Zhang, Yonghui
    Yang, Yang
    Shen, Heng Tao
    PROCEEDINGS OF THE 2018 ACM MULTIMEDIA CONFERENCE (MM'18), 2018, : 1562 - 1570
  • [34] PIXEL-LEVEL SELF-PACED LEARNING FOR SUPER-RESOLUTION
    Lin, Wei
    Gao, Junyu
    Wang, Qi
    Li, Xuelong
    2020 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH, AND SIGNAL PROCESSING, 2020, : 2538 - 2542
  • [35] Robust sparse coding via self-paced learning for data representation
    Feng, Xiaodong
    Wu, Sen
    INFORMATION SCIENCES, 2021, 546 : 448 - 468
  • [36] Provable Representation Learning for Imitation Learning via Bi-level Optimization
    Arora, Sanjeev
    Du, Simon S.
    Kakade, Sham
    Luo, Yuping
    Saunshi, Nikunj
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 119, 2020, 119
  • [37] A probabilistic interpretation of self-paced learning with applications to reinforcement learning
    Klink, Pascal
    Abdulsamad, Hany
    Belousov, Boris
    D'Eramo, Carlo
    Peters, Jan
    Pajarinen, Joni
    JOURNAL OF MACHINE LEARNING RESEARCH, 2021, 22
  • [38] Extreme Learning Machine for Supervised Classification with Self-paced Learning
    Li, Li
    Zhao, Kaiyi
    Li, Sicong
    Sun, Ruizhi
    Cai, Saihua
    NEURAL PROCESSING LETTERS, 2020, 52 (03) : 1723 - 1744
  • [39] The Efficacy of Self-Paced Study in Multitrial Learning
    de Jonge, Mario
    Tabbers, Huib K.
    Pecher, Diane
    Jang, Yoonhee
    Zeelenberg, Rene
    JOURNAL OF EXPERIMENTAL PSYCHOLOGY-LEARNING MEMORY AND COGNITION, 2015, 41 (03) : 851 - 858
  • [40] Self-paced deep clustering with learning loss
    Zhang, Kai
    Song, Chengyun
    Qiu, Lianpeng
    PATTERN RECOGNITION LETTERS, 2023, 171 : 8 - 14