Dynamic Convolutional Neural Networks as Efficient Pre-Trained Audio Models

Cited: 0
Authors
Schmid, Florian [1 ]
Koutini, Khaled [1 ,2 ]
Widmer, Gerhard [1 ,2 ]
Institutions
[1] Johannes Kepler Univ Linz, Inst Computat Percept CP JKU, A-4040 Linz, Austria
[2] Johannes Kepler Univ Linz, LIT Artificial Intelligence Lab, A-4040 Linz, Austria
Funding
European Research Council
Keywords
Dynamic convolutional neural networks; dynamic convolution; dynamic ReLU; coordinate attention; audio spectrogram transformer; audio classification; pre-trained audio models; knowledge distillation;
DOI
10.1109/TASLP.2024.3376984
CLC number
O42 [Acoustics]
Discipline codes
070206; 082403
Abstract
The introduction of large-scale audio datasets, such as AudioSet, paved the way for Transformers to conquer the audio domain and replace CNNs as the state-of-the-art neural network architecture for many tasks. Audio Spectrogram Transformers are excellent at exploiting large datasets, creating powerful pre-trained models that surpass CNNs when fine-tuned on downstream tasks. However, current popular Audio Spectrogram Transformers are demanding in terms of computational complexity compared to CNNs. Recently, we have shown that, by employing Transformer-to-CNN Knowledge Distillation, efficient CNNs can catch up with and even outperform Transformers on large datasets. In this work, we extend this line of research and increase the capacity of efficient CNNs by introducing dynamic CNN blocks constructed of dynamic convolutions, a dynamic ReLU activation function, and Coordinate Attention. We show that these dynamic CNNs outperform traditional efficient CNNs, such as MobileNets, in terms of the performance-complexity trade-off at the task of audio tagging on the large-scale AudioSet. Our experiments further indicate that the proposed dynamic CNNs achieve competitive performance with Transformer-based models for end-to-end fine-tuning on downstream tasks while being much more computationally efficient.
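The central mechanism behind the dynamic convolutions mentioned in the abstract is that the convolution kernel is no longer fixed: K candidate kernels are mixed into a single effective kernel using attention weights computed from the input itself. The toy pure-Python sketch below illustrates that mixing step on a 1-D signal; the pooling choice, the per-kernel projection weights, and the temperature value are illustrative assumptions, not the paper's actual implementation.

```python
import math

def softmax(xs, temperature=1.0):
    """Numerically stable softmax with a temperature that softens the attention."""
    m = max(xs)
    exps = [math.exp((x - m) / temperature) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def dynamic_conv1d(signal, kernels, attn_weights):
    """Mix K candidate kernels into one input-dependent kernel, then convolve (valid mode)."""
    ksize = len(kernels[0])
    # Weighted sum of the K kernels, tap by tap.
    mixed = [sum(w * k[i] for w, k in zip(attn_weights, kernels)) for i in range(ksize)]
    return [sum(mixed[i] * signal[t + i] for i in range(ksize))
            for t in range(len(signal) - ksize + 1)]

signal = [0.5, 1.0, -0.2, 0.3, 0.8]

# Input-dependent attention: global average pooling followed by a (hypothetical)
# linear projection per kernel, then a tempered softmax over the K=3 kernels.
pooled = sum(signal) / len(signal)
logits = [pooled * w for w in (1.0, -1.0, 0.5)]
pi = softmax(logits, temperature=4.0)

kernels = [[1.0, 0.0, -1.0],   # edge-like kernel
           [0.5, 0.5, 0.5],    # smoothing kernel
           [0.0, 1.0, 0.0]]    # identity kernel
y = dynamic_conv1d(signal, kernels, pi)
```

Because the attention weights depend on the pooled input statistics, two different inputs are effectively filtered by two different kernels, which is how dynamic CNN blocks add capacity at a small computational overhead: only the kernel mixing is extra, while the convolution itself costs the same as a static layer.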
Pages: 2227-2241 (15 pages)