Exploring high-dimensional optimization by sparse and low-rank evolution strategy

Cited by: 0
Authors
Li, Zhenhua [1 ,2 ]
Wu, Wei [1 ]
Zhang, Qingfu [3 ]
Cai, Xinye [4 ]
Affiliations
[1] Nanjing Univ Aeronaut & Astronaut, Coll Comp Sci & Technol, Nanjing 211106, Peoples R China
[2] MIIT Key Lab Pattern Anal & Machine Intelligence, Nanjing, Peoples R China
[3] City Univ Hong Kong, Dept Comp Sci, Hong Kong 999077, Peoples R China
[4] Dalian Univ Technol, Sch Control Sci & Engn, Dalian 116024, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Black-box optimization; Large-scale optimization; Evolution strategies; Sparse plus low-rank model; LOCAL SEARCH; SCALE; ADAPTATION; CMA;
DOI
10.1016/j.swevo.2024.101828
Chinese Library Classification (CLC)
TP18 [Theory of artificial intelligence];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Evolution strategies (ESs) are a robust family of algorithms for black-box optimization, yet their applicability to high-dimensional problems remains constrained by computational challenges. To address this, we propose a novel evolution strategy, SLR-ES, which leverages a sparse plus low-rank covariance matrix model. The sparse component uses a diagonal matrix to exploit separability along the coordinate axes, while the low-rank component identifies promising subspaces and parameter dependencies. To maintain distribution fidelity, we introduce a decoupled update mechanism for the model parameters. Comprehensive experiments demonstrate that SLR-ES achieves state-of-the-art performance on both separable and non-separable functions. Furthermore, evaluations on the CEC'2010 and CEC'2013 large-scale global optimization benchmarks show consistent superiority in average ranking, highlighting the algorithm's robustness across diverse problem conditions. These results establish SLR-ES as a scalable and versatile solution for high-dimensional optimization.
Pages: 16
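
The abstract describes a sampling distribution whose covariance is the sum of a diagonal (sparse) part and a low-rank part. The following minimal Python/NumPy sketch illustrates how offspring can be drawn from such a model, N(m, sigma^2 * (diag(d)^2 + V V^T)), together with a simple rank-weighted mean update. The parameterization, variable names (m, sigma, d, V), and update rule are illustrative assumptions for exposition only; they are not the published SLR-ES procedure, and the paper's decoupled adaptation of d, V, and sigma is omitted here.

import numpy as np

def sample_offspring(m, sigma, d, V, lam, rng):
    # Draw lam candidates from N(m, sigma^2 * (diag(d)^2 + V V^T)).
    # m: (n,) mean, sigma: step size, d: (n,) diagonal scales, V: (n, k) low-rank factor.
    n, k = V.shape
    z_diag = rng.standard_normal((lam, n))   # noise for the diagonal (separable) part
    z_low = rng.standard_normal((lam, k))    # noise for the low-rank subspace
    steps = z_diag * d + z_low @ V.T         # covariance of steps: diag(d)^2 + V V^T
    return m + sigma * steps, steps

def rank_weighted_mean(m, sigma, steps, fitness, mu):
    # Move the mean toward the best mu offspring using log-rank weights (minimization).
    order = np.argsort(fitness)
    w = np.log(mu + 0.5) - np.log(np.arange(1, mu + 1))
    w /= w.sum()
    return m + sigma * (w @ steps[order[:mu]])

# Toy usage on the sphere function in n = 1000 dimensions (fixed d, V, sigma).
rng = np.random.default_rng(0)
n, k, lam, mu = 1000, 5, 20, 10
m, sigma = rng.standard_normal(n), 0.5
d, V = np.ones(n), 0.01 * rng.standard_normal((n, k))
for _ in range(100):
    X, steps = sample_offspring(m, sigma, d, V, lam, rng)
    f = np.sum(X**2, axis=1)                 # sphere objective
    m = rank_weighted_mean(m, sigma, steps, f, mu)
print(float(np.sum(m**2)))                   # objective value at the mean after 100 iterations

The key cost property this sketch illustrates: sampling and updating touch only an n-vector and an n-by-k factor, so per-iteration cost scales as O(lam * n * k) rather than the O(n^2) of a full covariance matrix.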