On the Benefits of Progressively Increasing Sampling Sizes in Stochastic Greedy Weak Submodular Maximization

Cited by: 3
Authors
Hashemi, Abolfazl [1 ]
Vikalo, Haris [2 ]
de Veciana, Gustavo [2 ]
Affiliations
[1] Purdue Univ, Sch Elect & Comp Engn, W Lafayette, IN 47907 USA
[2] Univ Texas Austin, Dept Elect & Comp Engn, Austin, TX 78712 USA
Keywords
Weak submodular optimization; greedy algorithms; randomized algorithms; subset selection; ORTHOGONAL MATCHING PURSUIT; PERFORMANCE EVALUATION; SENSOR SELECTION; SIGNAL RECOVERY; LEAST-SQUARES;
DOI
10.1109/TSP.2022.3195089
CLC Classification
TM [Electrical Engineering]; TN [Electronics and Communication Technology]
Subject Classification Codes
0808; 0809
Abstract
Many problems in signal processing and machine learning can be formalized as weak submodular optimization tasks. For such problems, a simple greedy algorithm (GREEDY) is guaranteed to find a solution whose objective value is at least a 1 - e^(-1/c) fraction of the optimal, where c is the multiplicative weak-submodularity constant. Due to the high cost of querying large-scale systems, the complexity of GREEDY becomes prohibitive in contemporary applications. In this work, we study the tradeoff between performance and complexity when one resorts to random sampling strategies to reduce the query complexity of GREEDY. Specifically, we quantify the effect of uniform sampling strategies on GREEDY's performance through two metrics: (i) asymptotic probability of identifying an optimal subset, and (ii) suboptimality with respect to the optimal solution. The latter implies that uniform sampling strategies with a fixed sampling size achieve a non-trivial approximation factor; however, we show that with overwhelming probability, these methods fail to find the optimal subset. Our analysis shows that the failure of uniform sampling strategies with a fixed sample size can be circumvented by successively increasing the size of the search space. Building upon this insight, we propose a simple progressive stochastic greedy algorithm and study its approximation guarantees. Moreover, we demonstrate the effectiveness of the proposed method in dimensionality reduction applications and feature selection tasks for clustering and object tracking.
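To make the idea concrete, below is a minimal Python sketch of a progressive stochastic greedy routine in the spirit of the abstract: at each of the k iterations it draws a random candidate pool from the remaining ground set and greedily adds the element with the largest marginal gain, with the pool size growing across iterations. The function name, the `gain` oracle signature, and the default linearly growing sampling schedule are illustrative assumptions, not the paper's exact algorithm.

```python
import math
import random

def progressive_stochastic_greedy(ground_set, gain, k, eps=0.1, schedule=None):
    """Greedy subset selection over progressively larger random candidate pools.

    gain(x, S) should return the marginal gain f(S + [x]) - f(S) of the
    (weakly) submodular objective f. The default schedule, which grows the
    pool from the usual fixed stochastic-greedy size toward the full ground
    set, is an illustrative assumption rather than the paper's exact choice.
    """
    n = len(ground_set)
    assert 1 <= k <= n
    # Fixed pool size used by standard stochastic greedy: (n / k) * log(1 / eps).
    base = max(1, math.ceil((n / k) * math.log(1.0 / eps)))
    selected, remaining = [], list(ground_set)
    for i in range(k):
        # Progressively enlarge the search space as iterations proceed.
        target = schedule(i) if schedule is not None else base + math.ceil(i * (n - base) / k)
        size = max(1, min(len(remaining), target))
        pool = random.sample(remaining, size)
        # Standard greedy step restricted to the sampled pool.
        best = max(pool, key=lambda x: gain(x, selected))
        selected.append(best)
        remaining.remove(best)
    return selected
```

The design point, per the abstract, is that a fixed pool size already yields a non-trivial approximation factor but fails with overwhelming probability to identify the optimal subset, whereas letting the pool grow across iterations restores that ability while keeping the query cost well below that of full GREEDY.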
Pages: 3978-3992
Page count: 15