Residual k-Nearest Neighbors Label Distribution Learning

Cited by: 1
Authors
Wang, Jing [1 ,2 ]
Feng, Fu [1 ,2 ]
Lv, Jianhui [3 ]
Geng, Xin
Affiliations
[1] Southeast Univ, Sch Comp Sci & Engn, Nanjing 210096, Peoples R China
[2] Southeast Univ, Key Lab New Generat Artificial Intelligence Techno, Minist Educ, Nanjing, Peoples R China
[3] Jinzhou Med Univ, Affiliated Hosp 1, Jinzhou 121012, Peoples R China
Funding
National Natural Science Foundation of China; China Postdoctoral Science Foundation;
Keywords
Label Distribution Learning (LDL); Label ambiguity; Generalization; Manifold; Neighborhood;
DOI
10.1016/j.patcog.2024.111006
CLC Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Label Distribution Learning (LDL) is a learning paradigm that assigns a label distribution to each instance; the goal is to learn the label distributions of training instances and predict those of unseen ones. Algorithm Adaptation kNN (AA-kNN) is one of the most representative LDL baselines, adapting the kNN algorithm to LDL. Its generalization risk has been proven to approach zero given infinite training data. Despite this theoretical advantage, AA-kNN generally performs poorly because real-world LDL problems offer only finite, often small, training sets. In this paper, we improve AA-kNN and propose a novel method called Residual k-Nearest Neighbors Label Distribution Learning (RkNN-LDL). First, RkNN-LDL introduces residual label distribution learning. Second, it exploits the neighborhood structure of label distributions. In theoretical analysis, we prove that RkNN-LDL has a tighter generalization bound than AA-kNN. Extensive experiments further validate that RkNN-LDL beats several state-of-the-art LDL methods and statistically outperforms AA-kNN.
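The AA-kNN baseline that the abstract builds on can be sketched in a few lines. This is a minimal illustration assuming the standard kNN adaptation to LDL (predict an instance's label distribution by averaging the label distributions of its k nearest training instances); the paper's RkNN-LDL residual step and neighborhood-structure exploitation are not reproduced here, and all function and variable names are illustrative:

```python
import numpy as np

def aa_knn_predict(X_train, D_train, x, k=3):
    """AA-kNN-style LDL prediction: return the mean of the label
    distributions of the k training instances nearest to x.
    X_train: (n, d) feature matrix; D_train: (n, c) label distributions
    whose rows each sum to 1; x: (d,) query instance."""
    dists = np.linalg.norm(X_train - x, axis=1)  # Euclidean distances to x
    idx = np.argsort(dists)[:k]                  # indices of k nearest neighbors
    return D_train[idx].mean(axis=0)             # averaged label distribution

# Toy data: 4 instances, 2 features, 3 labels; rows of D sum to 1.
X = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0], [5.1, 5.0]])
D = np.array([[0.7, 0.2, 0.1],
              [0.6, 0.3, 0.1],
              [0.1, 0.2, 0.7],
              [0.2, 0.2, 0.6]])
pred = aa_knn_predict(X, D, np.array([0.05, 0.0]), k=2)
# pred averages the first two rows of D and still sums to 1
```

Because the prediction is a convex combination of valid label distributions, it is itself a valid label distribution, which is part of why this adaptation needs no post-hoc normalization.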
Pages: 13