Improved Bounds on Neural Complexity for Representing Piecewise Linear Functions

Cited by: 0
Authors
Chen, Kuan-Lin [1 ]
Garudadri, Harinath [2 ]
Rao, Bhaskar D. [1 ]
Affiliations
[1] Univ Calif San Diego, Dept Elect & Comp Engn, La Jolla, CA 92093 USA
[2] Univ Calif San Diego, Qualcomm Inst, La Jolla, CA 92093 USA
Keywords
POLYNOMIAL-TIME ALGORITHM; NETWORKS
DOI
Not available
CLC classification number
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
A deep neural network using rectified linear units represents a continuous piecewise linear (CPWL) function, and vice versa. Recent results in the literature estimated that the number of neurons needed to exactly represent any CPWL function grows exponentially with the number of pieces, or exponentially in the factorial of the number of distinct linear components. Moreover, such growth is amplified linearly by the input dimension. These existing results suggest that representing a CPWL function exactly is expensive. In this paper, we propose much tighter bounds and establish a polynomial-time algorithm that finds a network satisfying these bounds for any given CPWL function. We prove that the number of hidden neurons required to exactly represent any CPWL function is at most quadratic in the number of pieces. In contrast to all previous results, this upper bound is invariant to the input dimension. Besides the number of pieces, we also study the number of distinct linear components of a CPWL function. When this number is also given, we prove that the quadratic complexity becomes bilinear, implying a lower neural complexity because the number of distinct linear components never exceeds the minimum number of pieces of a CPWL function. When the number of pieces is unknown, we prove that, in terms of the number of distinct linear components, the neural complexity of any CPWL function grows at most polynomially for low-dimensional inputs and at most factorially in the worst case, both of which are significantly better than existing results in the literature.
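To make the exact-representation claim concrete, here is a minimal NumPy sketch (an illustration only, not the paper's construction; the function and variable names are invented for this example). It shows that the two-piece CPWL function max(a·x, b·x) is computed exactly, not approximately, by one hidden ReLU neuron plus a linear skip path, via the identity max(u, v) = v + max(u − v, 0).

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

def two_piece_max(x, a, b):
    # f(x) = max(a @ x, b @ x), a 2-piece CPWL function, written as a
    # ReLU network: max(u, v) = v + relu(u - v). One hidden ReLU neuron
    # suffices; the linear term v enters through a skip (linear) path.
    # In a pure ReLU hidden layer, v = relu(v) - relu(-v) would add two
    # more hidden neurons, still constant in the input dimension.
    u = a @ x
    v = b @ x
    return v + relu(u - v)

rng = np.random.default_rng(0)
a, b = rng.normal(size=3), rng.normal(size=3)
for _ in range(5):
    x = rng.normal(size=3)
    assert np.isclose(two_piece_max(x, a, b), max(a @ x, b @ x))
print("ReLU network reproduces max(a@x, b@x) exactly")
```

The paper's bounds generalize this idea: the hidden-neuron count needed for an arbitrary CPWL function is controlled by the number of pieces (or distinct linear components), independent of the input dimension.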
Pages: 14