Approximation Theory of Tree Tensor Networks: Tensorized Univariate Functions

Cited by: 2
Authors
Ali, Mazen [1 ]
Nouy, Anthony [2 ]
Affiliations
[1] Fraunhofer ITWM, D-67663 Kaiserslautern, Germany
[2] Nantes Univ, Cent Nantes, LMJL UMR CNRS 6629, Nantes, France
Keywords
Tensor networks; Tensor trains; Matrix product states; Neural networks; Approximation spaces; Besov spaces; Direct (Jackson) and inverse (Bernstein) inequalities; Hackbusch conjecture
DOI
10.1007/s00365-023-09620-w
Chinese Library Classification
O1 [Mathematics]
Discipline classification codes
0701; 070101
Abstract
We study the approximation of univariate functions by combining tensorization of functions with tensor trains (TTs), a commonly used type of tensor network (TN). Lebesgue L^p-spaces in one dimension can be identified with tensor product spaces of arbitrary order through tensorization. We use this tensor product structure to define different approximation tools and corresponding approximation spaces of TTs, associated with different measures of complexity. The approximation tools are shown to have (near-)optimal approximation rates for functions with classical Besov smoothness. We then use classical interpolation theory to show that a scale of interpolated smoothness spaces is continuously embedded into the scale of TT approximation spaces. Conversely, we show that the TT approximation spaces are, in a sense, much larger than smoothness spaces when the depth of the tensor network is unrestricted, but are embedded into a scale of interpolated smoothness spaces if the depth is restricted. The results of this work can be read both as an analysis of the approximation spaces of a type of TN and as a study of the expressivity of a particular type of neural network (NN), namely feed-forward sum-product networks with sparse architecture. We point out interesting parallels to recent results on the expressivity of rectifier networks.
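The tensorization step described in the abstract can be illustrated with a small numerical sketch. The idea (in simplified form, not the paper's exact construction) is to identify a point x in [0, 1) with its first d binary digits, so that samples of f on a dyadic grid of 2^d points reshape into a tensor of order d with mode sizes 2, which can then be decomposed into a tensor train. The `tt_svd` routine below is the standard TT-SVD via sequential truncated SVDs; all names and the tolerance are illustrative choices, not taken from the paper.

```python
import numpy as np

def tt_svd(tensor, tol=1e-10):
    """Decompose a (2, ..., 2) tensor into tensor-train cores via
    sequential truncated SVDs (the standard TT-SVD algorithm)."""
    d = tensor.ndim
    cores, ranks = [], [1]
    mat = tensor.reshape(1, -1)
    for _ in range(d - 1):
        # Unfold: split off one mode of size 2, keep the rest as columns.
        mat = mat.reshape(ranks[-1] * 2, -1)
        u, s, vt = np.linalg.svd(mat, full_matrices=False)
        # Truncate at relative tolerance to get the TT-rank of this cut.
        r = max(1, int(np.sum(s > tol * s[0])))
        cores.append(u[:, :r].reshape(ranks[-1], 2, r))
        ranks.append(r)
        mat = s[:r, None] * vt[:r]
    cores.append(mat.reshape(ranks[-1], 2, 1))
    ranks.append(1)
    return cores, ranks

# Tensorize f(x) = x on [0, 1): samples on a dyadic grid of 2^d points
# reshape into a 2 x 2 x ... x 2 tensor indexed by binary digits.
d = 10
x = np.arange(2 ** d) / 2 ** d
tensor = x.reshape((2,) * d)
cores, ranks = tt_svd(tensor)
print(ranks)  # the identity function tensorizes with low TT-ranks
```

Smooth functions such as polynomials tensorize with uniformly bounded TT-ranks (here the affine function x yields ranks at most 2 at every cut), which is one reason this representation is an effective approximation tool.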
Pages: 463-544 (82 pages)