Hypersphere anchor loss for K-Nearest neighbors

Cited by: 0
Authors
Ye, Xiang [1 ]
He, Zihang [1 ]
Wang, Heng [1 ]
Li, Yong [1 ]
Affiliations
[1] Beijing Univ Posts & Commun, Sch Elect Engn, Beijing 100876, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
K-Nearest neighbors; Convolutional neural network; Image classification; Loss function; Classification
DOI
10.1007/s10489-023-05148-5
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Learning effective feature spaces for KNN (K-Nearest Neighbor) classifiers is critical to their performance. Existing KNN loss functions, designed to optimize CNNs in R-n feature spaces for specific KNN classifiers, greatly boost performance. However, these loss functions must compute the pairwise distances within each batch, which requires large computational resources (GPU time and memory). This paper aims to develop lightweight KNN loss functions that reduce the computational cost while achieving performance comparable to or better than existing KNN loss functions. To this end, an anchor loss function is proposed that assigns each category an anchor vector in the KNN feature space and introduces the distances between training samples and anchor vectors into the NCA (Neighborhood Component Analysis) objective. The proposed anchor loss function largely avoids the computation required by existing KNN loss functions. In addition, instead of optimizing CNNs in R-n feature spaces, this paper proposes optimizing them in hypersphere feature spaces for faster convergence and better performance. The proposed anchor loss optimized in the hypersphere feature space is called HAL (Hypersphere Anchor Loss). Experiments on various image classification benchmarks show that HAL reduces the computational cost and achieves better performance: on the CIFAR-10 and Fashion-MNIST datasets, compared with existing KNN loss functions, HAL improves accuracy by over 1% while the computational cost drops to less than 10%.
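The abstract's core idea can be sketched as follows: project features and per-class anchor vectors onto the unit hypersphere, then apply an NCA-style softmax over negative distances to the anchors rather than over all pairwise distances in the batch, reducing the cost from O(N^2) to O(N x C) for N samples and C classes. The paper's exact formulation is not given in the abstract, so the function name, the use of squared Euclidean distance, and the `temperature` parameter below are illustrative assumptions, not the authors' definitive implementation.

```python
import numpy as np

def hal_loss(features, labels, anchors, temperature=1.0):
    """Sketch of an NCA-style anchor loss on the unit hypersphere.

    features: (N, d) raw CNN embeddings for a batch
    labels:   (N,) integer class labels
    anchors:  (C, d) one learnable anchor vector per class
    """
    # Project features and anchors onto the unit hypersphere.
    f = features / np.linalg.norm(features, axis=1, keepdims=True)
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)

    # Squared Euclidean distances from each sample to each anchor: (N, C).
    # This is O(N*C), versus O(N^2) for pairwise distances within the batch.
    d2 = ((f[:, None, :] - a[None, :, :]) ** 2).sum(axis=2)

    # NCA-style soft assignment: softmax over negative distances.
    logits = -d2 / temperature
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    p = np.exp(logits)
    p /= p.sum(axis=1, keepdims=True)

    # Mean negative log-probability of each sample's own class anchor.
    n = features.shape[0]
    return -np.log(p[np.arange(n), labels] + 1e-12).mean()
```

In training, the anchors would be optimized jointly with the CNN so that each sample's embedding is pulled toward its class anchor; at test time a KNN classifier operates in the learned hypersphere space.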
Pages: 30319-30328
Page count: 10
Related papers
50 items in total
  • [21] AutoML for Stream k-Nearest Neighbors Classification
    Bahri, Maroua
    Veloso, Bruno
    Bifet, Albert
    Gama, Joao
    2020 IEEE INTERNATIONAL CONFERENCE ON BIG DATA (BIG DATA), 2020, : 597 - 602
  • [22] An Interval Valued K-Nearest Neighbors Classifier
    Derrac, Joaquin
    Chiclana, Francisco
    Garcia, Salvador
    Herrera, Francisco
    PROCEEDINGS OF THE 2015 CONFERENCE OF THE INTERNATIONAL FUZZY SYSTEMS ASSOCIATION AND THE EUROPEAN SOCIETY FOR FUZZY LOGIC AND TECHNOLOGY, 2015, 89 : 378 - 384
  • [23] Ensembles of K-Nearest Neighbors and Dimensionality Reduction
    Okun, Oleg
    Priisalu, Helen
    2008 IEEE INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, VOLS 1-8, 2008, : 2032 - +
  • [24] Forecasting Earnings Using k-Nearest Neighbors
    Easton, Peter D.
    Kapons, Martin M.
    Monahan, Steven J.
    Schutt, Harm H.
    Weisbrod, Eric H.
    ACCOUNTING REVIEW, 2024, 99 (03): : 115 - 140
  • [25] Heuristics for Computing k-Nearest Neighbors Graphs
    Chavez, Edgar
    Luduena, Veronica
    Reyes, Nora
    COMPUTER SCIENCE - CACIC 2019, 2020, 1184 : 234 - 249
  • [26] Distributed architecture for k-nearest neighbors recommender systems
    Formoso, Vreixo
    Fernandez, Diego
    Cacheda, Fidel
    Carneiro, Victor
    WORLD WIDE WEB-INTERNET AND WEB INFORMATION SYSTEMS, 2015, 18 (04): : 997 - 1017
  • [27] Parallel Search of k-Nearest Neighbors with Synchronous Operations
    Sismanis, Nikos
    Pitsianis, Nikos
    Sun, Xiaobai
    2012 IEEE CONFERENCE ON HIGH PERFORMANCE EXTREME COMPUTING (HPEC), 2012,
  • [28] Human Sleep Scoring Based on K-Nearest Neighbors
    Qureshi, Shahnawaz
    Karrila, Seppo
    Vanichayobon, Sirirut
    TURKISH JOURNAL OF ELECTRICAL ENGINEERING AND COMPUTER SCIENCES, 2018, 26 (06) : 2802 - +
  • [29] Compressed kNN: K-Nearest Neighbors with Data Compression
    Salvador-Meneses, Jaime
    Ruiz-Chavez, Zoila
    Garcia-Rodriguez, Jose
    ENTROPY, 2019, 21 (03)
  • [30] A hashing strategy for efficient k-nearest neighbors computation
    Vanco, M
    Brunnett, G
    Schreiber, T
    COMPUTER GRAPHICS INTERNATIONAL, PROCEEDINGS, 1999, : 120 - 128