Machine Learning With Tree Tensor Networks, CP Rank Constraints, and Tensor Dropout

Cited: 1
Authors
Chen, Hao [1 ]
Barthel, Thomas [2 ,3 ]
Affiliations
[1] Swiss Fed Inst Technol, Dept Phys, CH-8093 Zurich, Switzerland
[2] Duke Univ, Dept Phys, Durham, NC 27708 USA
[3] Duke Univ, Duke Quantum Ctr, Durham, NC 27708 USA
Keywords
Machine learning; image classification; tensor networks; tree tensor networks; CP rank; tensor dropout; MATRIX RENORMALIZATION-GROUP; STATES; APPROXIMATION; MODELS;
DOI
10.1109/TPAMI.2024.3396386
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Tensor networks developed in the context of condensed matter physics try to approximate order-N tensors with a reduced number of degrees of freedom that is only polynomial in N and arranged as a network of partially contracted smaller tensors. As we have recently demonstrated in the context of quantum many-body physics, computation costs can be further substantially reduced by imposing constraints on the canonical polyadic (CP) rank of the tensors in such networks. Here, we demonstrate how tree tensor networks (TTN) with CP rank constraints and tensor dropout can be used in machine learning. The approach is found to outperform other tensor-network-based methods in Fashion-MNIST image classification. A low-rank TTN classifier with branching ratio b = 4 reaches a test set accuracy of 90.3% with low computation costs. Consisting of mostly linear elements, tensor network classifiers avoid the vanishing gradient problem of deep neural networks. The CP rank constraints have additional advantages: The number of parameters can be decreased and tuned more freely to control overfitting, improve generalization properties, and reduce computation costs. They allow us to employ trees with large branching ratios, substantially improving the representation power.
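The computational advantage described in the abstract comes from never materializing the full node tensors of the tree: a node with branching ratio b is stored in canonical polyadic (CP) form, so contracting it with its child feature vectors reduces to elementwise products over the shared rank index, and tensor dropout amounts to randomly zeroing CP components during training. The following is a minimal illustrative sketch of this idea, not the authors' implementation; all shapes, names, and the dropout rate are assumptions chosen for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

d, b, rank = 2, 4, 8  # local feature dim, branching ratio b=4, CP rank (illustrative)
d_out = 5             # output dimension of this node (hypothetical)

# A dense TTN node contracting b=4 child vectors would hold d**b * d_out
# parameters; the CP-constrained node stores only b input factor matrices
# plus one output factor.
factors = [rng.standard_normal((d, rank)) for _ in range(b)]
out_factor = rng.standard_normal((d_out, rank))

def contract_node(inputs, drop_mask=None):
    """Contract b child feature vectors through a CP-rank-constrained node.

    Cost is O(b*d*rank + d_out*rank) instead of O(d**b * d_out) for the
    dense tensor. `drop_mask` sketches tensor dropout by zeroing
    individual CP components during training.
    """
    z = np.ones(rank)
    for x, a in zip(inputs, factors):
        z *= x @ a            # (d,) @ (d, rank) -> (rank,); product over rank index
    if drop_mask is not None:
        z = z * drop_mask     # drop random CP components
    return out_factor @ z     # (d_out,)

inputs = [rng.standard_normal(d) for _ in range(b)]
y = contract_node(inputs)                        # deterministic forward pass
mask = (rng.random(rank) > 0.2).astype(float)    # tensor dropout, rate 0.2 (assumed)
y_drop = contract_node(inputs, mask)
```

The contraction is exactly equivalent to applying the dense tensor reconstructed from the CP factors, which is what lets the rank parameter trade representation power against cost and overfitting, as the abstract describes.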
Pages: 7825-7832
Page count: 8