Distributed sparsity constrained optimization over the Stiefel manifold

Cited by: 0
Authors
Qu, Wentao [1 ]
Chen, Huangyue [2 ,3 ]
Xiu, Xianchao [4 ]
Liu, Wanquan [5 ]
Affiliations
[1] Beijing Jiaotong Univ, Sch Math & Stat, Beijing 100044, Peoples R China
[2] Guangxi Univ, Sch Math & Informat Sci, Nanning 530004, Peoples R China
[3] Chinese Acad Sci, Inst Appl Math, Acad Math & Syst Sci, Beijing 100190, Peoples R China
[4] Shanghai Univ, Sch Mechatron Engn & Automat, Shanghai 200444, Peoples R China
[5] Sun Yat Sen Univ, Sch Intelligent Syst Engn, Shenzhen 518107, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Distributed optimization; Sparsity constrained optimization; Stiefel manifold; Newton method; CONSENSUS;
DOI
10.1016/j.neucom.2024.128267
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Distributed optimization aims to effectively complete specified tasks through cooperation among multi-agent systems, and it has achieved great success in large-scale optimization problems. However, developing an effective distributed algorithm with theoretical guarantees remains challenging, especially in the presence of nonconvex constraints. More importantly, high-dimensional data often exhibit inherent structures such as sparsity, which, if exploited accurately, can significantly improve how their intrinsic characteristics are captured. In this paper, we introduce a novel distributed sparsity constrained optimization framework over the Stiefel manifold, abbreviated as DREAM. DREAM innovatively integrates the ℓ2,0-norm constraint and the Stiefel manifold constraint within a distributed optimization setting, which has not been investigated in the existing literature. Unlike existing distributed methods, the proposed DREAM can not only extract the similarity information among samples, but also determine more flexibly the number of features to be extracted. We then develop an efficient Newton augmented Lagrangian-based algorithm. In theory, we delve into the relationship between the minimizer, the Karush-Kuhn-Tucker point, and the stationary point, and rigorously demonstrate that the sequence generated by our algorithm converges to a stationary point. Extensive numerical experiments verify its superiority over state-of-the-art distributed methods.
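For intuition only, the following NumPy sketch illustrates the two constraints the abstract combines, namely the ℓ2,0-norm (row-sparsity) constraint and the Stiefel manifold constraint, on a toy sparse-PCA-style problem. It is a minimal alternating projection scheme under assumed names, dimensions, and step size; it is not the DREAM algorithm, its Newton augmented Lagrangian solver, or a distributed implementation.

import numpy as np

# Minimal sketch (not the DREAM algorithm): illustrates the two constraints the
# abstract combines on a toy objective max tr(X^T A X). All names, the step
# size eta, and the alternating projection scheme are illustrative assumptions.

def stiefel_retraction(X):
    # Map X to the Stiefel manifold {X : X^T X = I} via the polar factor of its SVD.
    U, _, Vt = np.linalg.svd(X, full_matrices=False)
    return U @ Vt

def row_sparse_projection(X, s):
    # Enforce the l_{2,0}-norm constraint ||X||_{2,0} <= s by keeping the s rows
    # with the largest Euclidean norm and zeroing out the rest.
    keep = np.argsort(np.linalg.norm(X, axis=1))[-s:]
    Z = np.zeros_like(X)
    Z[keep] = X[keep]
    return Z

rng = np.random.default_rng(0)
n, k, s, eta = 20, 3, 5, 0.1
A = rng.standard_normal((n, n))
A = A @ A.T                                   # symmetric PSD data matrix
X = stiefel_retraction(rng.standard_normal((n, k)))

for _ in range(100):
    G = 2.0 * A @ X                           # Euclidean gradient of tr(X^T A X)
    X = stiefel_retraction(X + eta * G)       # ascent step followed by retraction
    X = row_sparse_projection(X, s)           # then enforce at most s nonzero rows

print("objective tr(X^T A X):", np.trace(X.T @ A @ X))
print("nonzero rows:", int(np.count_nonzero(np.linalg.norm(X, axis=1) > 1e-12)))

The ℓ2,0 projection above simply keeps the rows with the largest norms, whereas DREAM, as described in the abstract, handles the sparsity and orthogonality constraints jointly in a distributed multi-agent setting, with convergence of the generated sequence to a stationary point.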
Pages: 14