On a Connection Between Fast and Sparse Oblivious Subspace Embeddings

Cited by: 0
|
Authors
Wang, Rui [1 ]
Xu, Wangli
Affiliations
[1] Renmin Univ China, Ctr Appl Stat, Beijing 100872, Peoples R China
Funding
National Natural Science Foundation of China; Beijing Natural Science Foundation;
Keywords
JOHNSON-LINDENSTRAUSS; ALGORITHMS; REDUCTION;
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Classification Code
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The Fast Johnson-Lindenstrauss Transform (FJLT) and the Sparse Johnson-Lindenstrauss Transform (SJLT) are two important oblivious subspace embeddings. To date, the two methods have developed largely independently of each other. In this work, we propose an iterative algorithm for oblivious subspace embedding that connects the two. The proposed method builds on an iterative implementation of the FJLT and is equipped with several theoretically motivated modifications, the most important of which is an early stopping strategy. Early stopping makes the algorithm fast and, at the same time, yields a sparse embedding matrix. As a result, the proposed algorithm is faster not only than the FJLT but also than the SJLT with the same degree of sparsity. We also present a general theoretical framework for analyzing the embedding property of sparse embedding methods, which we use to prove the embedding property of the proposed method and which is of independent interest. Finally, numerical experiments confirm the good performance of the proposed algorithm.
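For concreteness, the following is a minimal sketch in Python (NumPy/SciPy) of the two classical constructions named in the abstract: an SJLT-style sparse embedding with s nonzeros per column, and a simplified FJLT that mixes the input with a randomized Walsh-Hadamard transform and then samples rows. The function names sjlt and fjlt, the parameters m and s, and the use of a dense Hadamard matrix (for brevity, in place of a fast O(n log n) transform) are illustrative assumptions; this is not the paper's proposed iterative algorithm.

# Illustrative sketch of SJLT- and FJLT-style embeddings (not the paper's method).
import numpy as np
from scipy.linalg import hadamard


def sjlt(A, m, s, rng=None):
    """SJLT-style embedding: each column of the m x n sketching matrix S
    has exactly s nonzero entries equal to +-1/sqrt(s); returns S @ A."""
    rng = rng or np.random.default_rng(0)
    n = A.shape[0]
    S = np.zeros((m, n))
    for j in range(n):
        rows = rng.choice(m, size=s, replace=False)      # s random rows per column
        S[rows, j] = rng.choice([-1.0, 1.0], size=s) / np.sqrt(s)
    return S @ A


def fjlt(A, m, rng=None):
    """Simplified FJLT: random sign flips (D), Walsh-Hadamard mixing (H),
    then uniform sampling of m rows. A dense Hadamard matrix is used here
    for clarity; a fast transform would cost O(n log n) per column."""
    rng = rng or np.random.default_rng(0)
    n = A.shape[0]
    assert n & (n - 1) == 0, "pad A with zero rows so n is a power of two"
    D = rng.choice([-1.0, 1.0], size=n)                  # random signs
    mixed = hadamard(n) @ (D[:, None] * A) / np.sqrt(n)  # orthonormal mixing
    rows = rng.choice(n, size=m, replace=False)          # sample m mixed rows
    return np.sqrt(n / m) * mixed[rows]

Both functions return an m x d sketch of an n x d input A; in the subspace-embedding setting the goal is that such a sketch preserves the norms of all vectors in the column space of A up to a small distortion.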
Pages: 25