Exploring high-dimensional optimization by sparse and low-rank evolution strategy

Cited by: 0
Authors
Li, Zhenhua [1 ,2 ]
Wu, Wei [1 ]
Zhang, Qingfu [3 ]
Cai, Xinye [4 ]
Affiliations
[1] Nanjing Univ Aeronaut & Astronaut, Coll Comp Sci & Technol, Nanjing 211106, Peoples R China
[2] MIIT Key Lab Pattern Anal & Machine Intelligence, Nanjing, Peoples R China
[3] City Univ Hong Kong, Dept Comp Sci, Hong Kong 999077, Peoples R China
[4] Dalian Univ Technol, Sch Control Sci & Engn, Dalian 116024, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Black-box optimization; Large-scale optimization; Evolution strategies; Sparse plus low-rank model; LOCAL SEARCH; SCALE; ADAPTATION; CMA;
DOI
10.1016/j.swevo.2024.101828
CLC Number
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Evolution strategies (ESs) are a robust family of algorithms for black-box optimization, yet their applicability to high-dimensional problems remains constrained by computational challenges. To address this, we propose a novel evolution strategy, SLR-ES, leveraging a sparse plus low-rank covariance matrix model. The sparse component utilizes a diagonal matrix to exploit separability along coordinate axes, while the low-rank component identifies promising subspaces and parameter dependencies. To maintain distribution fidelity, we introduce a decoupled update mechanism for the model parameters. Comprehensive experiments demonstrate that SLR-ES achieves state-of-the-art performance on both separable and non-separable functions. Furthermore, evaluations on the CEC'2010 and CEC'2013 large-scale global optimization benchmarks reveal consistent superiority in average ranking, highlighting the algorithm's robustness across diverse problem conditions. These results establish SLR-ES as a scalable and versatile solution for high-dimensional optimization.
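The sparse plus low-rank covariance model described in the abstract can be illustrated with a minimal sampling sketch. This assumes a covariance of the form C = diag(d) + V Vᵀ, with a diagonal (sparse) vector d and a rank-k factor V; the function name and sampling scheme below are our own illustration, not the paper's exact SLR-ES update rules:

```python
import numpy as np

def sample_sparse_plus_low_rank(m, sigma, d, V, rng):
    """Draw one candidate from N(m, sigma^2 * (diag(d) + V @ V.T)).

    Hypothetical sketch of sampling under a sparse (diagonal) plus
    low-rank covariance model; d is the diagonal, V is n-by-k.
    """
    n, k = V.shape
    z = rng.standard_normal(n)   # drives the diagonal (separable) part
    u = rng.standard_normal(k)   # drives the low-rank subspace
    # Cov[sqrt(d)*z + V@u] = diag(d) + V @ V.T, so the sample has the
    # target covariance at O(n*k) cost instead of O(n^2).
    return m + sigma * (np.sqrt(d) * z + V @ u)

rng = np.random.default_rng(0)
n, k = 1000, 5
m = np.zeros(n)                          # distribution mean
d = np.ones(n)                           # diagonal component
V = 0.1 * rng.standard_normal((n, k))    # low-rank component
x = sample_sparse_plus_low_rank(m, 1.0, d, V, rng)
```

Sampling this way never forms the full n-by-n covariance matrix, which is the key to scaling evolution strategies to high dimensions.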
Pages: 16