Efficient Task-Specific Data Valuation for Nearest Neighbor Algorithms

Cited by: 81
Authors
Jia, Ruoxi [1 ]
Dao, David [2 ]
Wang, Boxin [3 ]
Hubis, Frances Ann [2 ]
Gurel, Nezihe Merve [2 ]
Li, Bo [4 ]
Zhang, Ce [2 ]
Spanos, Costas J. [1 ]
Song, Dawn [1 ]
Affiliations
[1] Univ Calif Berkeley, Berkeley, CA 94720 USA
[2] Swiss Fed Inst Technol, Zurich, Switzerland
[3] Zhejiang Univ, Hangzhou, Zhejiang, Peoples R China
[4] UIUC, Champaign, IL USA
Source
PROCEEDINGS OF THE VLDB ENDOWMENT | 2019, Vol. 12, No. 11
Keywords
DOI
10.14778/3342263.3342637
Chinese Library Classification (CLC)
TP [automation technology, computer technology];
Discipline code
0812;
Abstract
Given a data set D containing millions of data points and a data consumer who is willing to pay $X to train a machine learning (ML) model over D, how should we distribute this $X to each data point to reflect its "value"? In this paper, we define the "relative value of data" via the Shapley value, as it uniquely possesses properties with appealing real-world interpretations, such as fairness, rationality and decentralizability. For general, bounded utility functions, the Shapley value is known to be challenging to compute: to get Shapley values for all N data points, it requires O(2^N) model evaluations for exact computation and O(N log N) for (ε, δ)-approximation. In this paper, we focus on one popular family of ML models relying on K-nearest neighbors (KNN). The most surprising result is that for unweighted KNN classifiers and regressors, the Shapley value of all N data points can be computed, exactly, in O(N log N) time - an exponential improvement in computational complexity! Moreover, for (ε, δ)-approximation, we develop an algorithm based on Locality Sensitive Hashing (LSH) with only sublinear complexity O(N^{h(ε, K)} log N) when ε is not too small and K is not too large. We empirically evaluate our algorithms on up to 10 million data points, and even our exact algorithm is up to three orders of magnitude faster than the baseline approximation algorithm. The LSH-based approximation algorithm can accelerate the value calculation process even further. We then extend our algorithm to other scenarios, such as (1) weighted KNN classifiers, (2) settings where different data points are clustered by different data curators, and (3) settings where data analysts providing computation also require proper valuation. Some of these extensions, although also improved exponentially, are less practical for exact computation (e.g., O(N^K) complexity for weighted KNN). We thus propose a Monte Carlo approximation algorithm, which is O(N (log N)^2 / (log K)^2) times more efficient than the baseline approximation algorithm.
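The O(N log N) exact computation for unweighted KNN rests on the observation that, once the training points are sorted by distance to a test point, each point's Shapley value follows a simple backward recursion over its neighbors. The sketch below illustrates this idea for a single test point. It is a minimal illustration rather than the authors' reference implementation: the utility is assumed to be the fraction of the test point's K nearest neighbors whose label matches the test label, and all names (knn_shapley_single_test, x_test, y_test, K) are illustrative, not taken from the paper.

# Minimal sketch (assumed, not the paper's code) of the exact Shapley-value
# recursion for an unweighted KNN classifier and a single test point.
# Utility of a coalition is assumed to be the fraction of the test point's
# K nearest neighbors in the coalition whose label matches the test label.
import numpy as np

def knn_shapley_single_test(X_train, y_train, x_test, y_test, K):
    """Return one Shapley value per training point for one test point."""
    N = len(y_train)
    # Sort training points by distance to the test point: O(N log N).
    order = np.argsort(np.linalg.norm(X_train - x_test, axis=1))
    match = (y_train[order] == y_test).astype(float)  # label agreement flags

    s = np.zeros(N)
    # Backward recursion over the distance-sorted points: O(N).
    s[N - 1] = match[N - 1] / N
    for i in range(N - 2, -1, -1):
        s[i] = s[i + 1] + (match[i] - match[i + 1]) / K * min(K, i + 1) / (i + 1)

    # Undo the sorting so values align with the original training order.
    values = np.zeros(N)
    values[order] = s
    return values

Averaging such per-test-point values over a validation set would give a task-specific value in the spirit of the abstract; the weighted-KNN, LSH-based, and Monte Carlo variants discussed above are not sketched here.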
Pages: 1610 - 1623
Page count: 14
Related Papers
50 records in total
  • [32] An efficient nearest neighbor search in high-dimensional data spaces
    Lee, DH
    Kim, HJ
    INFORMATION PROCESSING LETTERS, 2002, 81 (05) : 239 - 246
  • [33] Efficient and secure k-nearest neighbor query on outsourced data
    Lian, Huijuan
    Qiu, Weidong
    Yan, Di
    Huang, Zheng
    Tang, Peng
    PEER-TO-PEER NETWORKING AND APPLICATIONS, 2020, 13 (06) : 2324 - 2333
  • [34] Tandem fusion of nearest neighbor editing and condensing algorithms -: Data dimensionality effects
    Dasarathy, BV
    Sánchez, JS
    15TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION, VOL 2, PROCEEDINGS: PATTERN RECOGNITION AND NEURAL NETWORKS, 2000, : 692 - 695
  • [35] Parallel Algorithms for Constructing Range and Nearest-Neighbor Searching Data Structures
    Agarwal, Pankaj K.
    Fox, Kyle
    Munagala, Kamesh
    Nath, Abhinandan
    PODS'16: PROCEEDINGS OF THE 35TH ACM SIGMOD-SIGACT-SIGAI SYMPOSIUM ON PRINCIPLES OF DATABASE SYSTEMS, 2016, : 429 - 440
  • [36] A nearest neighbor method for efficient ICP
    Greenspan, M
    Godin, G
    THIRD INTERNATIONAL CONFERENCE ON 3-D DIGITAL IMAGING AND MODELING, PROCEEDINGS, 2001, : 161 - 168
  • [37] Efficient implementation of nearest neighbor classification
    Herrero, JR
    Navarro, JJ
    Computer Recognition Systems, Proceedings, 2005: 177 - 186
  • [38] An Efficient Pseudo Nearest Neighbor Classifier
    Chai, Zheng
    Li, Yanying
    Wang, Aili
    Li, Chen
    Zhang, Baoshuang
    Gong, Huanhuan
    International Association of Engineers, 2021, 48
  • [39] Toward optimal ε-approximate nearest neighbor algorithms
    Cary, M
    JOURNAL OF ALGORITHMS-COGNITION INFORMATICS AND LOGIC, 2001, 41 (02): : 417 - 428
  • [40] Error Minimizing Algorithms for Nearest Neighbor Classifiers
    Porter, Reid B.
    Hush, Don
    Zimmer, G. Beate
    IMAGE PROCESSING: ALGORITHMS AND SYSTEMS IX, 2011, 7870