Sparse nonnegative tensor decomposition using proximal algorithm and inexact block coordinate descent scheme

Cited by: 0
Authors
Deqing Wang
Zheng Chang
Fengyu Cong
Affiliations
[1] Dalian University of Technology, School of Biomedical Engineering, Faculty of Electronic Information and Electrical Engineering
[2] University of Jyväskylä, Faculty of Information Technology
[3] University of Electronic Science and Technology of China, School of Computer Science and Engineering
[4] Dalian University of Technology, School of Artificial Intelligence, Faculty of Electronic Information and Electrical Engineering
[5] Dalian University of Technology, Key Laboratory of Integrated Circuit and Biomedical Electronic System, Liaoning Province
Keywords
Tensor decomposition; Nonnegative CANDECOMP/PARAFAC decomposition; Sparse regularization; Proximal algorithm; Inexact block coordinate descent
DOI: Not available
Abstract
Nonnegative tensor decomposition is a versatile tool for multiway data analysis, by which the extracted components are nonnegative and usually sparse. Nevertheless, the sparsity is only a side effect and cannot be explicitly controlled without additional regularization. In this paper, we investigate the nonnegative CANDECOMP/PARAFAC (NCP) decomposition with a sparse regularization term based on the $l_1$-norm (sparse NCP). When high sparsity is imposed, the factor matrices contain more zero components and are no longer of full column rank, so sparse NCP is prone to rank deficiency and its algorithms may fail to converge. To address this, we propose a novel sparse NCP model with the proximal algorithm, whose subproblems are strongly convex in the block coordinate descent (BCD) framework. The new sparse NCP therefore preserves the full column rank condition and is guaranteed to converge to a stationary point. In addition, we propose an inexact BCD scheme for sparse NCP, in which each subproblem is updated multiple times to speed up the computation. To demonstrate the effectiveness and efficiency of the sparse NCP with the proximal algorithm, we employ two optimization algorithms to solve the model: inexact alternating nonnegative quadratic programming and inexact hierarchical alternating least squares. We evaluate the proposed methods in experiments on synthetic and real-world tensor data of both small and large scale. The results demonstrate that our algorithms can efficiently impose sparsity on the factor matrices, extract meaningful sparse components, and outperform state-of-the-art methods.
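To make the idea concrete, below is a minimal sketch (not the authors' implementation) of sparse NCP for a 3-way tensor in NumPy: each mode is updated by HALS column steps with an $l_1$ penalty, a proximal term that keeps the block subproblem strongly convex, and a few inner repetitions per block in the spirit of inexact BCD. The function name sparse_ncp_hals and the parameters lam, mu, and inner_iters are illustrative assumptions, not names from the paper.

```python
import numpy as np

def sparse_ncp_hals(X, rank, lam=0.1, mu=1e-3, outer_iters=100, inner_iters=3, seed=0):
    """Sketch of sparse NCP for a 3-way tensor: HALS column updates with an
    l1 penalty (lam), a proximal term (mu) that keeps each block subproblem
    strongly convex, and inner_iters repetitions per block (inexact BCD)."""
    rng = np.random.default_rng(seed)
    factors = [rng.random((d, rank)) for d in X.shape]
    modes = 'ijk'

    for _ in range(outer_iters):
        for mode in range(3):
            A = factors[mode]
            A_prev = A.copy()                       # proximal anchor for this block
            others = [factors[m] for m in range(3) if m != mode]
            # MTTKRP: mode-n unfolding of X times the Khatri-Rao product of the other factors
            subs = ','.join(modes[m] + 'r' for m in range(3) if m != mode)
            M = np.einsum(f'ijk,{subs}->{modes[mode]}r', X, *others)
            # Gram matrix of the Khatri-Rao product via elementwise products of small Grams
            G = np.multiply.reduce([F.T @ F for F in others])
            for _ in range(inner_iters):            # inexact BCD: repeat the same block update
                for r in range(rank):
                    num = (M[:, r] - A @ G[:, r] + G[r, r] * A[:, r]
                           + mu * A_prev[:, r] - lam)
                    # mu > 0 keeps the denominator positive even if column r degenerates
                    A[:, r] = np.maximum(0.0, num / (G[r, r] + mu))
    return factors

# Toy usage on a random nonnegative tensor
X = np.random.rand(20, 15, 10)
A, B, C = sparse_ncp_hals(X, rank=5, lam=0.2)
print([float(np.mean(F == 0)) for F in (A, B, C)])  # fraction of exact zeros per factor
```

With mu > 0 the denominator G[r, r] + mu stays strictly positive even when a factor column collapses to zero, which is one way to see how the proximal term guards against the rank deficiency discussed above.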
Pages: 17369-17387 (18 pages)