- [22] TensorDash: Exploiting Sparsity to Accelerate Deep Neural Network Training. In *53rd Annual IEEE/ACM International Symposium on Microarchitecture (MICRO 2020)*, 2020, pp. 781–795.
- [23] Sparsity-Aware Caches to Accelerate Deep Neural Networks. In *Proceedings of the 2020 Design, Automation & Test in Europe Conference & Exhibition (DATE 2020)*, 2020, pp. 85–90.
- [24] Chordal Sparsity for Lipschitz Constant Estimation of Deep Neural Networks. In *2022 IEEE 61st Conference on Decision and Control (CDC)*, 2022, pp. 3389–3396.
- [25] POSTER: Exploiting the Input Sparsity to Accelerate Deep Neural Networks. In *Proceedings of the 24th Symposium on Principles and Practice of Parallel Programming (PPoPP '19)*, 2019, pp. 401–402.
- [26] Variance-Guided Structured Sparsity in Deep Neural Networks. *IEEE Transactions on Artificial Intelligence*, 2023, 4(6): 1714–1723.
- [27] Acorns: A Framework for Accelerating Deep Neural Networks with Input Sparsity. In *2019 28th International Conference on Parallel Architectures and Compilation Techniques (PACT 2019)*, 2019, pp. 178–191.
- [28] Sparsity-Aware Generalization Theory for Deep Neural Networks. In *Thirty-Sixth Annual Conference on Learning Theory (COLT)*, Vol. 195, 2023.
- [30] Compressing Deep Neural Networks using a Rank-Constrained Topology. In *16th Annual Conference of the International Speech Communication Association (INTERSPEECH 2015)*, Vols. 1–5, 2015, pp. 1473–1477.