50 items in total
- [1] FReLU: Flexible Rectified Linear Units for Improving Convolutional Neural Networks [J]. 2018 24TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION (ICPR), 2018: 1223-1228
- [2] Elastic exponential linear units for convolutional neural networks [J]. NEUROCOMPUTING, 2020, 406: 253-266
- [3] Understanding and Improving Convolutional Neural Networks via Concatenated Rectified Linear Units [J]. INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 48, 2016, 48
- [4] Improved Learning in Convolutional Neural Networks with Shifted Exponential Linear Units (ShELUs) [J]. 2018 24TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION (ICPR), 2018: 517-522
- [5] Understanding Weight Normalized Deep Neural Networks with Rectified Linear Units [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 31 (NIPS 2018), 2018, 31
- [6] Spam Filtering Using Regularized Neural Networks with Rectified Linear Units [J]. AI*IA 2016: ADVANCES IN ARTIFICIAL INTELLIGENCE, 2016, 10037: 65-75
- [8] One-Dimensional Convolutional Neural Networks Based on Exponential Linear Units for Bearing Fault Diagnosis [J]. 2018 CHINESE AUTOMATION CONGRESS (CAC), 2018: 1052-1057
- [10] IMPROVING DEEP NEURAL NETWORKS FOR LVCSR USING RECTIFIED LINEAR UNITS AND DROPOUT [J]. 2013 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2013: 8609-8613