- [31] Deep ReLU Networks Have Surprisingly Few Activation Patterns. Advances in Neural Information Processing Systems 32 (NeurIPS 2019), 2019, 32.
- [35] How Do Noise Tails Impact on Deep ReLU Networks? Annals of Statistics, 2024, 52(4): 1845-1871.
- [36] Drawing Robust Scratch Tickets: Subnetworks with Inborn Robustness Are Found within Randomly Initialized Networks. Advances in Neural Information Processing Systems 34 (NeurIPS 2021), 2021, 34.
- [37] Learning Functions Generated by Randomly Initialized MLPs and SRNs. CICA: 2009 IEEE Symposium on Computational Intelligence in Control and Automation, 2009: 62-69.
- [38] Gradient Descent Optimizes Over-Parameterized Deep ReLU Networks. Machine Learning, 2020, 109: 467-492.
- [39] On the CVP for the Root Lattices via Folding with Deep ReLU Neural Networks. 2019 IEEE International Symposium on Information Theory (ISIT), 2019: 1622-1626.
- [40] Deep ReLU Networks Overcome the Curse of Dimensionality for Generalized Bandlimited Functions. Journal of Computational Mathematics, 2021, 39(6): 801-815.