50 entries in total
- [1] FReLU: Flexible Rectified Linear Units for Improving Convolutional Neural Networks [J]. 2018 24TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION (ICPR), 2018, : 1223 - 1228
- [2] Rectified Exponential Units for Convolutional Neural Networks [J]. IEEE ACCESS, 2019, 7 : 101633 - 101640
- [3] Understanding Weight Normalized Deep Neural Networks with Rectified Linear Units [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 31 (NIPS 2018), 2018, 31
- [4] Exploring Normalization in Deep Residual Networks with Concatenated Rectified Linear Units [J]. THIRTY-FIRST AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2017, : 1509 - 1516
- [5] Improving Deep Neural Networks for LVCSR Using Rectified Linear Units and Dropout [J]. 2013 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2013, : 8609 - 8613
- [6] Spam Filtering Using Regularized Neural Networks with Rectified Linear Units [J]. AI*IA 2016: ADVANCES IN ARTIFICIAL INTELLIGENCE, 2016, 10037 : 65 - 75
- [8] Hyperbolic Linear Units for Deep Convolutional Neural Networks [J]. 2016 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2016, : 353 - 359
- [9] Elastic exponential linear units for convolutional neural networks [J]. NEUROCOMPUTING, 2020, 406 : 253 - 266
- [10] Improving deep convolutional neural networks with mixed maxout units [J]. PLOS ONE, 2017, 12 (07)