50 entries in total
- [1] Annihilation of Spurious Minima in Two-Layer ReLU Networks [J]. Advances in Neural Information Processing Systems 35 (NeurIPS 2022), 2022.
- [2] Spurious Local Minima are Common in Two-Layer ReLU Neural Networks [J]. International Conference on Machine Learning, Vol 80, 2018, 80.
- [3] Analytic Study of Families of Spurious Minima in Two-Layer ReLU Neural Networks: A Tale of Symmetry II [J]. Advances in Neural Information Processing Systems 34 (NeurIPS 2021), 2021, 34.
- [4] Convergence Analysis of Two-layer Neural Networks with ReLU Activation [J]. Advances in Neural Information Processing Systems 30 (NIPS 2017), 2017, 30.
- [7] Training Two-Layer ReLU Networks with Gradient Descent is Inconsistent [J]. Journal of Machine Learning Research, 2022, 23.
- [8] Expressive Numbers of Two or More Hidden Layer ReLU Neural Networks [J]. 2019 Seventh International Symposium on Computing and Networking Workshops (CANDARW 2019), 2019: 129-135.
- [9] Learning behavior and temporary minima of two-layer neural networks [J]. Neural Networks, 1994, 7(9): 1387-1404.