Active Nearest Neighbor Regression Through Delaunay Refinement

Cited: 0
Authors
Kravberg, Alexander [1 ]
Marchetti, Giovanni Luca [1 ]
Polianskii, Vladislav [1 ]
Varava, Anastasiia
Pokorny, Florian T. [1 ]
Kragic, Danica [1 ]
Affiliations
[1] Royal Inst Technol KTH, Sch Elect Engn & Comp Sci, Stockholm, Sweden
Funding
European Research Council; Swedish Research Council
Keywords
Big Data
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
We introduce an algorithm for active function approximation based on nearest neighbor regression. Our Active Nearest Neighbor Regressor (ANNR) relies on the Voronoi-Delaunay framework from computational geometry to subdivide the space into cells with constant estimated function value and select novel query points in a way that takes the geometry of the function graph into account. We consider the recent state-of-the-art active function approximator called DEFER, which is based on incremental rectangular partitioning of the space, as the main baseline. The ANNR addresses a number of limitations that arise from the space subdivision strategy used in DEFER. We provide a computationally efficient implementation of our method, as well as theoretical halting guarantees. Empirical results show that ANNR outperforms the baseline for both closed-form functions and real-world examples, such as gravitational wave parameter inference and exploration of the latent space of a generative model.
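The following is a minimal, illustrative Python sketch of the idea described in the abstract, not the authors' ANNR implementation: it rebuilds a Delaunay triangulation of the queried points at every step (ANNR uses an efficient incremental Voronoi-Delaunay scheme with halting guarantees), and as an assumed stand-in for the paper's graph-aware selection criterion it refines the simplex whose volume is largest after lifting its vertices onto the function graph. All function names and the numerical example are hypothetical.

```python
import numpy as np
from math import factorial
from scipy.spatial import Delaunay

def lifted_simplex_volume(pts, vals):
    # Volume of the simplex spanned by the lifted vertices (x_i, f(x_i)) on the
    # function graph; a large volume indicates a coarsely sampled or steep region.
    lifted = np.hstack([pts, vals[:, None]])        # (d+1) vertices in R^(d+1)
    edges = lifted[1:] - lifted[0]
    gram = edges @ edges.T
    return np.sqrt(max(np.linalg.det(gram), 0.0)) / factorial(len(pts) - 1)

def active_nn_sampler(f, init_pts, n_queries):
    # Toy active sampler: triangulate the queried points, pick the simplex with
    # the largest graph-lifted volume, and query the function at its centroid.
    pts = np.asarray(init_pts, dtype=float)
    vals = np.array([f(p) for p in pts])
    for _ in range(n_queries):
        tri = Delaunay(pts)                         # rebuilt from scratch for simplicity
        volumes = [lifted_simplex_volume(pts[s], vals[s]) for s in tri.simplices]
        worst = tri.simplices[int(np.argmax(volumes))]
        query = pts[worst].mean(axis=0)             # centroid of the selected simplex
        pts = np.vstack([pts, query])
        vals = np.append(vals, f(query))
    return pts, vals

# Hypothetical usage: actively sample f(x, y) = sin(3x) * cos(3y) on the unit square;
# the nearest-neighbor regressor then predicts the value of the closest queried point.
rng = np.random.default_rng(0)
initial = rng.uniform(size=(8, 2))
points, values = active_nn_sampler(lambda p: np.sin(3 * p[0]) * np.cos(3 * p[1]), initial, 50)
```

The centroid-of-the-largest-lifted-simplex rule above is only a simple proxy for taking the geometry of the function graph into account; the paper's actual query-selection rule and computational shortcuts are not reproduced here.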
Pages: 15
Related Papers
50 records in total
  • [41] Active Nearest-Neighbor Learning in Metric Spaces
    Kontorovich, Aryeh
    Sabato, Sivan
    Urner, Ruth
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 29 (NIPS 2016), 2016, 29
  • [42] Adaptive active learning through k-nearest neighbor optimized local density clustering
    Ji, Xia
    Ye, WanLi
    Li, XueJun
    Zhao, Peng
    Yao, Sheng
    APPLIED INTELLIGENCE, 2023, 53 (12) : 14892 - 14902
  • [44] Delaunay tessellation of proteins: Four body nearest-neighbor propensities of amino acid residues
    Singh, RK
    Tropsha, A
    Vaisman, II
    JOURNAL OF COMPUTATIONAL BIOLOGY, 1996, 3 (02) : 213 - 221
  • [45] The nearest neighbor
    Alt, H
    COMPUTATIONAL DISCRETE MATHEMATICS: ADVANCED LECTURES, 2001, 2122 : 13 - 24
  • [46] POINTWISE CONVERGENCE PROPERTIES OF THE K NEAREST NEIGHBOR REGRESSION FUNCTION ESTIMATOR
    COLLOMB, G
    COMPTES RENDUS HEBDOMADAIRES DES SEANCES DE L ACADEMIE DES SCIENCES SERIE A, 1979, 289 (03): 245 - 247
  • [47] Twin neural network improved k-nearest neighbor regression
    Wetzel, Sebastian J.
    INTERNATIONAL JOURNAL OF DATA SCIENCE AND ANALYTICS, 2024,
  • [48] On the nearest neighbor of the nearest neighbor in multidimensional continuous and quantized space
    Rovatti, Riccardo
    Mazzini, Gianluca
    IEEE TRANSACTIONS ON INFORMATION THEORY, 2008, 54 (09) : 4069 - 4080
  • [49] Nearest Neighbor outperforms Kernel-Kernel Methods for Distribution Regression
    Ramazanli, Ilqar
    2022 ASIA CONFERENCE ON ALGORITHMS, COMPUTING AND MACHINE LEARNING (CACML 2022), 2022: 1 - 6
  • [50] Nearest neighbor search with locally weighted linear regression for heartbeat classification
    Juyoung Park
    Md Zakirul Alam Bhuiyan
    Mingon Kang
    Junggab Son
    Kyungtae Kang
    Soft Computing, 2018, 22 : 1225 - 1236