Machine Learning With Tree Tensor Networks, CP Rank Constraints, and Tensor Dropout

Cited by: 1
Authors:
Chen, Hao [1]
Barthel, Thomas [2,3]
Affiliations:
[1] Swiss Fed Inst Technol, Dept Phys, CH-8093 Zurich, Switzerland
[2] Duke Univ, Dept Phys, Durham, NC 27708 USA
[3] Duke Univ, Duke Quantum Ctr, Durham, NC 27708 USA
Keywords:
Machine learning; image classification; tensor networks; tree tensor networks; CP rank; tensor dropout; MATRIX RENORMALIZATION-GROUP; STATES; APPROXIMATION; MODELS;
DOI:
10.1109/TPAMI.2024.3396386
CLC number (Chinese Library Classification): TP18 [Theory of artificial intelligence]
Subject classification codes: 081104; 0812; 0835; 1405
Abstract:
Tensor networks developed in the context of condensed matter physics try to approximate order-N tensors with a reduced number of degrees of freedom that is only polynomial in N and arranged as a network of partially contracted smaller tensors. As we have recently demonstrated in the context of quantum many-body physics, computation costs can be further substantially reduced by imposing constraints on the canonical polyadic (CP) rank of the tensors in such networks. Here, we demonstrate how tree tensor networks (TTN) with CP rank constraints and tensor dropout can be used in machine learning. The approach is found to outperform other tensor-network-based methods in Fashion-MNIST image classification. A low-rank TTN classifier with branching ratio b = 4 reaches a test set accuracy of 90.3% with low computation costs. Consisting of mostly linear elements, tensor network classifiers avoid the vanishing gradient problem of deep neural networks. The CP rank constraints have additional advantages: The number of parameters can be decreased and tuned more freely to control overfitting, improve generalization properties, and reduce computation costs. They allow us to employ trees with large branching ratios, substantially improving the representation power.
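To make the construction described in the abstract concrete, the minimal NumPy sketch below shows how a single tree-tensor-network node whose weight tensor is kept in CP (canonical polyadic) form could be contracted, with tensor dropout applied to the CP components. The function name, variable shapes, and the inverted-dropout scaling are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def cp_ttn_node(inputs, factors, out_factor, drop_prob=0.0, rng=None):
    """Contract one TTN node whose weight tensor is stored in CP form.

    inputs     : list of b child vectors, each of length d_in
    factors    : list of b matrices, factors[k] of shape (d_in, R)
    out_factor : matrix of shape (d_out, R)
    drop_prob  : probability of dropping each CP component (tensor dropout)

    The full node tensor would be
        A[i_1, ..., i_b, j] = sum_r factors[0][i_1, r] * ... * factors[b-1][i_b, r] * out_factor[j, r],
    but it is never formed explicitly; the contraction costs only
    O(b * d_in * R + d_out * R) operations.
    """
    R = out_factor.shape[1]
    comp = np.ones(R)
    for x, U in zip(inputs, factors):
        comp *= x @ U                      # project each branch onto the R CP components
    if drop_prob > 0.0:                    # tensor dropout: zero whole CP components
        rng = rng or np.random.default_rng()
        mask = rng.random(R) >= drop_prob
        comp = comp * mask / (1.0 - drop_prob)
    return out_factor @ comp               # output vector of length d_out

# Toy usage: one node with branching ratio b = 4 and CP rank R = 6 (values chosen for illustration).
rng = np.random.default_rng(0)
b, d_in, d_out, R = 4, 2, 8, 6
children = [rng.normal(size=d_in) for _ in range(b)]
U = [rng.normal(size=(d_in, R)) for _ in range(b)]
W = rng.normal(size=(d_out, R))
print(cp_ttn_node(children, U, W, drop_prob=0.2, rng=rng).shape)  # (8,)
```

Because the node never materializes its order-(b+1) tensor, large branching ratios such as b = 4 remain affordable, which is the point of the CP rank constraint highlighted in the abstract.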
Pages: 7825-7832
Page count: 8