Machine learning friendly set version of Johnson-Lindenstrauss lemma

Cited: 0
Authors
Klopotek, Mieczyslaw A. [1 ]
Institutions
[1] Polish Acad Sci, Inst Comp Sci, Ul Jana Kazimierza 5, PL-01248 Warsaw, Poland
Keywords
Johnson-Lindenstrauss lemma; Random projection; Sample distortion; Dimensionality reduction; Linear JL transform; k-means algorithm; Clusterability retention; RANDOM-PROJECTION; PROOF;
DOI
10.1007/s10115-019-01412-8
CLC classification number
TP18 [Artificial intelligence theory];
Discipline classification codes
081104; 0812; 0835; 1405;
Abstract
The widely discussed and applied Johnson-Lindenstrauss (JL) Lemma has an existential form: for each set of data points Q in n-dimensional space, there exists a transformation f into an n'-dimensional space (n' < n) such that for each pair u, v ∈ Q, (1 − δ)‖u − v‖² ≤ ‖f(u) − f(v)‖² ≤ (1 + δ)‖u − v‖² for a user-defined error parameter δ. Furthermore, it is asserted that with some finite probability the transformation f may be found as a random projection (with scaling) onto the n'-dimensional subspace, so that after sufficiently many repetitions of random projection, f will be found with user-defined success rate 1 − ε. In this paper, we make a novel use of the JL Lemma. We prove a theorem stating that we can choose the target dimensionality in a random-projection-type JL linear transformation in such a way that, with probability 1 − ε, all data points from Q fall into a predefined error range δ for any user-predefined failure probability ε when performing a single random projection. This result is important for applications such as data clustering, where we want an a priori dimensionality-reducing transformation instead of attempting a (large) number of them, as with the traditional Johnson-Lindenstrauss Lemma. Furthermore, we investigate the important issue of whether the projection according to the JL Lemma is really useful when conducting data processing, that is, whether solutions to the clustering problem in the projected space apply to the original space. In particular, we take a closer look at the k-means algorithm and prove that a good solution in the projected space is also a good solution in the original space. Furthermore, under proper assumptions, local optima in the original space are also local optima in the projected space. We also investigate the broader issue of preserving clusterability under a JL Lemma projection.
We define the conditions under which the clusterability property of the original space is transmitted to the projected space, so that a broad class of clustering algorithms for the original space is applicable in the projected space.
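The scaled random projection described in the abstract can be sketched numerically. The following is a minimal illustration (not the paper's construction): it projects a set Q of points through a Gaussian matrix scaled by 1/√n' and measures the empirical pairwise squared-distance distortion δ. The dimensions and point counts are illustrative assumptions, not values from the paper.

```python
import numpy as np
from itertools import combinations

# Illustrative sizes (not from the paper): n = original dimension,
# n_prime = target dimension n', m = number of data points in Q.
n, n_prime, m = 1000, 200, 50
rng = np.random.default_rng(0)
Q = rng.normal(size=(m, n))

# Random projection with scaling: entries ~ N(0, 1/n'), so squared
# pairwise distances are preserved in expectation.
R = rng.normal(size=(n, n_prime)) / np.sqrt(n_prime)
fQ = Q @ R

# Empirical distortion delta over all pairs u, v in Q, i.e. the smallest
# delta with (1 - delta)||u - v||^2 <= ||f(u) - f(v)||^2 <= (1 + delta)||u - v||^2.
ratios = [
    np.sum((fQ[i] - fQ[j]) ** 2) / np.sum((Q[i] - Q[j]) ** 2)
    for i, j in combinations(range(m), 2)
]
delta = max(max(ratios) - 1.0, 1.0 - min(ratios))
print(f"empirical distortion delta = {delta:.3f}")
```

With a single projection the observed δ is modest at this target dimension; shrinking n' increases the distortion, which is the trade-off the paper's choice of target dimensionality controls.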
Pages: 1961-2009 (49 pages)