A bi-level metric learning framework via self-paced learning weighting

Cited by: 0
Authors
Yan, Jing [1 ]
Wei, Wei [1 ]
Guo, Xinyao [1 ]
Dang, Chuangyin [2 ,3 ]
Liang, Jiye [1 ]
Affiliations
[1] Shanxi Univ, Sch Comp & Informat Technol, Minist Educ, Key Lab Computat Intelligence & Chinese Informat Proc, Taiyuan, Shanxi, Peoples R China
[2] City Univ Hong Kong, Dept Syst Engn & Engn Management, Hong Kong, Peoples R China
[3] City Univ Hong Kong, Shenzhen Res Inst, Shenzhen, Guangdong, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Metric learning; Self-paced learning; Adaptive neighborhood; Weighting tuples;
DOI
10.1016/j.patcog.2023.109446
CLC number
TP18 [Artificial Intelligence Theory];
Discipline codes
081104; 0812; 0835; 1405;
Abstract
Distance metric learning (DML) has achieved great success in many real-world applications. However, most existing DML models characterize the quality of tuples at the tuple level while ignoring the anchor level. As a result, these models portray the quality of tuples less accurately and tend to overfit when anchors are noisy samples. In this paper, we devise a bi-level metric learning framework (BMLF), which characterizes the quality of tuples more finely on both levels, enhancing the generalization performance of the DML model. Furthermore, we present an implementation of BMLF based on a self-paced learning regularization term and design the corresponding optimization algorithm. By weighting tuples at the anchor level and preferentially training the model on tuples with higher weights, the side effects of low-quality noisy samples are alleviated. Empirical results on several benchmark datasets demonstrate that the proposed method outperforms state-of-the-art methods in both effectiveness and robustness. (c) 2023 Elsevier Ltd. All rights reserved.
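As a rough illustration of the self-paced weighting idea sketched in the abstract (not the authors' BMLF algorithm), the following minimal Python/NumPy example weights triplet tuples with a hard self-paced rule at both the tuple level and the anchor level, then trains a Mahalanobis metric preferentially on the high-weight tuples. The functions tuple_level_weights, anchor_level_weights, and metric_step, the per-anchor loss aggregation, and the pace schedule are illustrative assumptions, not details taken from the paper.

import numpy as np

def triplet_losses(X, triplets, M, margin=1.0):
    # Hinge loss d_M(a,p) - d_M(a,n) + margin for each (anchor, positive, negative) tuple.
    a, p, n = X[triplets[:, 0]], X[triplets[:, 1]], X[triplets[:, 2]]
    d_ap = np.einsum('ij,jk,ik->i', a - p, M, a - p)
    d_an = np.einsum('ij,jk,ik->i', a - n, M, a - n)
    return np.maximum(0.0, d_ap - d_an + margin)

def tuple_level_weights(losses, lam):
    # Hard self-paced rule: admit a tuple only if its own loss is below the pace threshold.
    return (losses < lam).astype(float)

def anchor_level_weights(losses, anchors, lam):
    # Anchor-level weighting (illustrative): trust an anchor only if the mean loss
    # of all tuples built around it is below the pace threshold.
    w = np.zeros_like(losses)
    for a in np.unique(anchors):
        idx = anchors == a
        w[idx] = 1.0 if losses[idx].mean() < lam else 0.0
    return w

def metric_step(X, triplets, M, weights, lr=0.01):
    # One weighted (sub)gradient step on the Mahalanobis matrix M, then a PSD projection.
    losses = triplet_losses(X, triplets, M)
    grad = np.zeros_like(M)
    for (i, j, k), w, l in zip(triplets, weights, losses):
        if w == 0.0 or l == 0.0:
            continue  # skipped tuple or inactive hinge contributes no gradient
        dp, dn = X[i] - X[j], X[i] - X[k]
        grad += w * (np.outer(dp, dp) - np.outer(dn, dn))
    M = M - lr * grad
    vals, vecs = np.linalg.eigh((M + M.T) / 2)
    return vecs @ np.diag(np.clip(vals, 0.0, None)) @ vecs.T  # keep M a valid metric

rng = np.random.default_rng(0)
X = rng.normal(size=(30, 5))                   # toy data
triplets = rng.integers(0, 30, size=(100, 3))  # toy (anchor, positive, negative) indices
M, lam = np.eye(5), 1.0
for epoch in range(5):
    losses = triplet_losses(X, triplets, M)
    # Combine tuple-level and anchor-level weights; train on high-weight tuples first.
    w = tuple_level_weights(losses, lam) * anchor_level_weights(losses, triplets[:, 0], lam)
    M = metric_step(X, triplets, M, w)
    lam *= 1.2                                 # grow the pace so harder tuples enter later

Growing the pace parameter lam lets harder tuples (and less reliable anchors) enter training only in later epochs, which is the general mechanism by which self-paced weighting limits the influence of noisy anchors.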
Pages: 12
Related papers
50 records in total
  • [1] A bi-level formulation for multiple kernel learning via self-paced training
    Alavi, Fatemeh
    Hashemi, Sattar
    PATTERN RECOGNITION, 2022, 129
  • [2] Self-paced hierarchical metric learning (SPHML)
    Al-taezi, Mohammed
    Zhu, Pengfei
    Hu, Qinghua
    Wang, Yu
    Al-badwi, Abdulrahman
    INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS, 2021, 12 (09) : 2529 - 2541
  • [3] Self-paced Learning Method with Adaptive Mixture Weighting
    Li H.
    Zhao Y.
    Gong M.-G.
    Wu Y.
    Liu J.-Y.
    Ruan Jian Xue Bao/Journal of Software, 2023, 34 (05) : 2337 - 2349
  • [4] A Self-Paced Regularization Framework for Multilabel Learning
    Li, Changsheng
    Wei, Fan
    Yan, Junchi
    Zhang, Xiaoyu
    Liu, Qingshan
    Zha, Hongyuan
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2018, 29 (06) : 2660 - 2666
  • [5] Improving Robustness for Tag Recommendation via Self-Paced Adversarial Metric Learning
    Fei, Zhengshun
    Chen, Jianxin
    Chen, Gui
    Xiang, Xinjian
    CMC-COMPUTERS MATERIALS & CONTINUA, 2025, 82 (03) : 4237 - 4261
  • [6] Ensemble Self-Paced Learning Based on Adaptive Mixture Weighting
    Liu, Liwen
    Wang, Zhong
    Bai, Jianbin
    Yang, Xiangfeng
    Yang, Yunchuan
    Zhou, Jianbo
    ELECTRONICS, 2022, 11 (19)
  • [7] On the effectiveness of self-paced learning
    Tullis, Jonathan G.
    Benjamin, Aaron S.
    JOURNAL OF MEMORY AND LANGUAGE, 2011, 64 (02) : 109 - 118
  • [8] Self-Paced Curriculum Learning
    Jiang, Lu
    Meng, Deyu
    Zhao, Qian
    Shan, Shiguang
    Hauptmann, Alexander G.
    PROCEEDINGS OF THE TWENTY-NINTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2015, : 2694 - 2700
  • [9] Self-Paced Learning with Diversity
    Jiang, Lu
    Meng, Deyu
    Yu, Shoou-I
    Lan, Zhenzhong
    Shan, Shiguang
    Hauptmann, Alexander G.
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 27 (NIPS 2014), 2014, 27