Residual k-Nearest Neighbors Label Distribution Learning

Cited by: 1
Authors
Wang, Jing [1 ,2 ]
Feng, Fu [1 ,2 ]
Lv, Jianhui [3 ]
Geng, Xin
Affiliations
[1] Southeast Univ, Sch Comp Sci & Engn, Nanjing 210096, Peoples R China
[2] Southeast Univ, Key Lab New Generat Artificial Intelligence Techno, Minist Educ, Nanjing, Peoples R China
[3] Jinzhou Med Univ, Affiliated Hosp 1, Jinzhou 121012, Peoples R China
Funding
National Natural Science Foundation of China; China Postdoctoral Science Foundation
Keywords
Label Distribution Learning (LDL); Label ambiguity; Generalization; Manifold; Neighborhood
DOI
10.1016/j.patcog.2024.111006
CLC Number
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Label Distribution Learning (LDL) is a novel learning paradigm that assigns a label distribution to each instance. It aims to learn the label distributions of training instances and predict those of unseen ones. Algorithm Adaptation (AA)-kNN, which adapts the kNN algorithm to LDL, is one of the most representative LDL baselines. Its generalization risk has been proven to approach zero given infinite training data. Despite this theoretical advantage, AA-kNN generally performs poorly in practice because real-world LDL problems offer only finite, often small, training sets. In this paper, we improve AA-kNN and propose a novel method called Residual k-Nearest Neighbors Label Distribution Learning (RkNN-LDL). First, RkNN-LDL introduces residual label distribution learning. Second, RkNN-LDL exploits the neighborhood structure of label distributions. In theoretical analysis, we prove that RkNN-LDL has a tighter generalization bound than AA-kNN. Moreover, extensive experiments validate that RkNN-LDL outperforms several state-of-the-art LDL methods and statistically outperforms AA-kNN.
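
For intuition, the following is a minimal Python/NumPy sketch of the AA-kNN baseline that the abstract builds on, not the paper's RkNN-LDL implementation; the names aa_knn_ldl, X_train, and D_train are illustrative assumptions. AA-kNN predicts a test instance's label distribution as the mean of the label distributions of its k nearest training neighbors.

    import numpy as np

    def aa_knn_ldl(X_train, D_train, X_test, k=5):
        """Sketch of the AA-kNN baseline for LDL: predict each test
        instance's label distribution as the mean distribution of its
        k nearest training neighbors."""
        preds = []
        for x in X_test:
            dists = np.linalg.norm(X_train - x, axis=1)  # Euclidean distance to every training point
            nn = np.argsort(dists)[:k]                   # indices of the k nearest neighbors
            preds.append(D_train[nn].mean(axis=0))       # mean of valid distributions is itself a valid distribution
        return np.array(preds)

According to the abstract, RkNN-LDL improves on this baseline by learning the residual of such neighborhood-based predictions and by exploiting the neighborhood structure of the label distributions themselves; the precise formulation is given in the paper.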
Pages: 13
Related Papers
50 records in total
  • [1] Large Margin Weighted k-Nearest Neighbors Label Distribution Learning for Classification
    Wang, Jing
    Geng, Xin
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024, 35 (11) : 16720 - 16732
  • [2] Classification with learning k-nearest neighbors
    Laaksonen, J
    Oja, E
    ICNN - 1996 IEEE INTERNATIONAL CONFERENCE ON NEURAL NETWORKS, VOLS. 1-4, 1996, : 1480 - 1483
  • [3] Introduction to machine learning: k-nearest neighbors
    Zhang, Zhongheng
    ANNALS OF TRANSLATIONAL MEDICINE, 2016, 4 (11)
  • [4] K-Nearest Neighbors Hashing
    He, Xiangyu
    Wang, Peisong
    Cheng, Jian
    2019 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR 2019), 2019, : 2834 - 2843
  • [5] Modernizing k-nearest neighbors
    Elizabeth Yancey, Robin
    Xin, Bochao
    Matloff, Norm
STAT, 2021, 10 (01)
  • [6] Learning k-nearest neighbors classifier from distributed data
    Khedr, Ahmed M.
    COMPUTING AND INFORMATICS, 2008, 27 (03) : 355 - 376
  • [7] METHOD FOR DETERMINING K-NEAREST NEIGHBORS
    KITTLER, J
    KYBERNETES, 1978, 7 (04) : 313 - 315
  • [8] K-nearest neighbors in uncertain graph
    Zhang, Yinglong
    Li, Cuiping
    Chen, Hong
    Du, Lingxia
Jisuanji Yanjiu yu Fazhan/Computer Research and Development, 2011, 48 (10): 1850 - 1858
  • [9] K-nearest neighbors clustering algorithm
    Gauza, Dariusz
    Zukowska, Anna
    Nowak, Robert
    PHOTONICS APPLICATIONS IN ASTRONOMY, COMMUNICATIONS, INDUSTRY, AND HIGH-ENERGY PHYSICS EXPERIMENTS 2014, 2014, 9290