Hypersphere anchor loss for K-Nearest neighbors

Cited by: 0
Authors
Xiang Ye
Zihang He
Heng Wang
Yong Li
Affiliations
[1] Beijing University of Posts and Telecommunications, School of Electronic Engineering
Source
Applied Intelligence | 2023, Volume 53
Keywords
K-Nearest neighbors; Convolutional neural network; Image classification; Loss function
DOI
Not available
Abstract
Learning effective feature spaces for KNN (K-Nearest Neighbor) classifiers is critical to their performance. Existing KNN loss functions, designed to optimize CNNs in $\mathbb{R}^n$ feature spaces for specific KNN classifiers, greatly boost performance. However, these loss functions must compute the pairwise distances within each batch, which requires substantial computational resources (GPU and memory). This paper aims to develop lightweight KNN loss functions that reduce the computational cost while achieving performance comparable to or better than existing KNN loss functions. To this end, an anchor loss function is proposed that assigns each category an anchor vector in the KNN feature space and introduces the distances between training samples and anchor vectors into the NCA (Neighborhood Component Analysis) function. The proposed anchor loss function largely reduces the computation required by existing KNN loss functions. In addition, instead of optimizing CNNs in $\mathbb{R}^n$ feature spaces, this paper proposes to optimize them in hypersphere feature spaces for faster convergence and better performance. The anchor loss optimized in the hypersphere feature space is called HAL (Hypersphere Anchor Loss). Experiments on image classification benchmarks show that HAL reduces the computational cost and achieves better performance: on the CIFAR-10 and Fashion-MNIST datasets, compared with existing KNN loss functions, HAL improves accuracy by over 1%, while the computational cost decreases to less than 10%.
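The abstract outlines the core idea: replace batch-wise pairwise distances with distances between samples and per-class anchor vectors inside an NCA-style softmax, computed on the unit hypersphere. The following is a minimal PyTorch sketch of that idea, not the authors' exact HAL formulation; the anchor parameterization, the squared-Euclidean distance, and the temperature `tau` are illustrative assumptions.

```python
# Minimal sketch of an anchor-based, NCA-style loss on the unit hypersphere.
# NOTE: this is an illustrative reconstruction, not the published HAL definition;
# the distance measure and temperature `tau` are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class HypersphereAnchorLoss(nn.Module):
    def __init__(self, num_classes: int, feat_dim: int, tau: float = 0.1):
        super().__init__()
        # One learnable anchor vector per category.
        self.anchors = nn.Parameter(torch.randn(num_classes, feat_dim))
        self.tau = tau

    def forward(self, features: torch.Tensor, labels: torch.Tensor) -> torch.Tensor:
        # Project features and anchors onto the unit hypersphere.
        z = F.normalize(features, dim=1)      # (B, D)
        a = F.normalize(self.anchors, dim=1)  # (C, D)
        # Squared distances between samples and anchors: (B, C).
        # Only B*C distances are needed, instead of B*B pairwise distances per batch.
        d2 = torch.cdist(z, a, p=2).pow(2)
        # NCA-style soft assignment: softmax over negative distances,
        # trained with cross-entropy against the true-class anchor.
        logits = -d2 / self.tau
        return F.cross_entropy(logits, labels)


# Usage sketch (names are hypothetical): `backbone` is any CNN producing
# D-dimensional features for a batch of images.
# feats = backbone(images)                                  # (B, D)
# loss = HypersphereAnchorLoss(10, feats.size(1))(feats, labels)
```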
Pages: 30319-30328 (9 pages)