Penalty hyperparameter optimization with diversity measure for nonnegative low-rank approximation

Cited by: 1
Authors
Del Buono, Nicoletta [1 ]
Esposito, Flavia [1 ]
Selicato, Laura [1 ,2 ]
Zdunek, Rafal [3 ]
Affiliations
[1] Univ Bari Aldo Moro, Dept Math, Via Orabona 4, I-70125 Bari, Italy
[2] Natl Res Council IRSA CNR, Water Res Inst, Vle Francesco Blasio 5, I-70132 Bari, Italy
[3] Wroclaw Univ Sci & Technol, Fac Elect Photon & Microsyst, 27 Wybrzeze Wyspianskiego St, PL-50370 Wroclaw, Poland
Keywords
Hyperparameter optimization; Penalty coefficient; Sparseness; Low-rank approximation; DIVERGENCE; ALGORITHMS; POWER
DOI
10.1016/j.apnum.2024.10.002
Chinese Library Classification: O29 [Applied Mathematics]
Discipline code: 070104
Abstract
Learning tasks are often based on penalized optimization problems in which a sparse solution is desired. Sparsity can lead to more interpretable results by identifying a smaller subset of important features or components, as well as reducing the dimensionality of the data representation. In this study, we propose a new method that simultaneously solves a constrained Frobenius norm-based nonnegative low-rank approximation problem and tunes the associated penalty hyperparameter. The penalty term added is a particular diversity measure that is more effective for sparseness purposes than other classical norm-based penalties (e.g., the ℓ1 or ℓ2,1 norms). As is well known, setting the hyperparameters of an algorithm is not an easy task. Our work develops an optimization method, and the corresponding algorithm, that simultaneously solves the sparsity-constrained nonnegative approximation problem and optimizes its associated penalty hyperparameters. We test the proposed method in numerical experiments and show promising results on several synthetic and real datasets.
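To make the setting concrete, the sketch below shows a generic sparsity-penalized nonnegative matrix factorization solved by multiplicative updates with an ℓ1 penalty on one factor. This is an illustrative baseline only: the paper's actual method uses a diversity-measure penalty and tunes the penalty hyperparameter `lam` jointly with the factorization, neither of which is reproduced here; the function `sparse_nmf` and all parameter names are assumptions for this sketch.

```python
import numpy as np

def sparse_nmf(X, rank, lam=0.1, n_iter=200, eps=1e-9, seed=0):
    """Multiplicative-update NMF with an l1 sparsity penalty on H.

    Approximately minimizes ||X - W H||_F^2 + lam * sum(H)
    subject to W >= 0, H >= 0.

    Note: a generic sparse-NMF baseline, not the paper's
    diversity penalty or its simultaneous hyperparameter tuning.
    """
    rng = np.random.default_rng(seed)
    m, n = X.shape
    W = rng.random((m, rank))
    H = rng.random((rank, n))
    for _ in range(n_iter):
        # l1 penalty on H enters the denominator as the constant lam
        H *= (W.T @ X) / (W.T @ W @ H + lam + eps)
        W *= (X @ H.T) / (W @ H @ H.T + eps)
    return W, H

# Usage: rank-3 approximation of a nonnegative random matrix
X = np.random.default_rng(1).random((20, 15))
W, H = sparse_nmf(X, rank=3, lam=0.05)
err = np.linalg.norm(X - W @ H, "fro") / np.linalg.norm(X, "fro")
```

Larger values of `lam` push more entries of `H` toward zero at the cost of a higher reconstruction error; the paper's contribution is precisely to automate this trade-off rather than grid-search over `lam`.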
Pages: 189-204 (16 pages)
Related Papers
(items [31]-[40] of 50)
  • [31] Structured Low-Rank Approximation: Optimization on Matrix Manifold Approach
    Saha T.
    Khare S.
    International Journal of Applied and Computational Mathematics, 2021, 7 (6)
  • [32] Low-Rank Approximation: Algorithms, Implementation, Approximation
    Khoromskij, Boris N.
    SIAM REVIEW, 2021, 63 (04) : 870 - 871
  • [33] LOW-RANK PHYSICAL MODEL RECOVERY FROM LOW-RANK SIGNAL APPROXIMATION
    Hayes, Charles Ethan
    McClellan, James H.
    Scott, Waymond R., Jr.
    2017 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2017, : 3131 - 3135
  • [34] Similarity Measure based on Low-Rank Approximation for Highly Scalable Recommender Systems
    Seifzadeh, Sepideh
    Miri, Ali
    2015 IEEE TRUSTCOM/BIGDATASE/ISPA, VOL 2, 2015, : 66 - 71
  • [35] NysADMM: faster composite convex optimization via low-rank approximation
    Zhao, Shipu
    Frangella, Zachary
    Udell, Madeleine
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 162, 2022,
  • [36] Constrained Optimization Based Low-Rank Approximation of Deep Neural Networks
    Li, Chong
    Shi, C. J. Richard
    COMPUTER VISION - ECCV 2018, PT X, 2018, 11214 : 746 - 761
  • [37] Multiscale Decomposition in Low-Rank Approximation
    Abdolali, Maryam
    Rahmati, Mohammad
    IEEE SIGNAL PROCESSING LETTERS, 2017, 24 (07) : 1015 - 1019
  • [38] SIMPLICIAL APPROXIMATION AND LOW-RANK TREES
    GILLET, H
    SHALEN, PB
    SKORA, RK
    COMMENTARII MATHEMATICI HELVETICI, 1991, 66 (04) : 521 - 540
  • [39] Enhanced Low-Rank Matrix Approximation
    Parekh, Ankit
    Selesnick, Ivan W.
    IEEE SIGNAL PROCESSING LETTERS, 2016, 23 (04) : 493 - 497
  • [40] Modifiable low-rank approximation to a matrix
    Barlow, Jesse L.
    Erbay, Hasan
    NUMERICAL LINEAR ALGEBRA WITH APPLICATIONS, 2009, 16 (10) : 833 - 860