Thank you for attention: A survey on attention-based artificial neural networks for automatic speech recognition

Cited: 0
Authors
Karmakar, Priyabrata [1 ]
Teng, Shyh Wei [1 ]
Lu, Guojun [2 ]
Affiliations
[1] Federation University, Institute of Innovation, Science and Sustainability, Ballarat, Australia
[2] Federation University, Global Professional School, Ballarat, Australia
Keywords
Automatic speech recognition (ASR); Attention mechanism; Recurrent neural network (RNN); Transformer; Offline ASR; Streaming ASR; Self-attention
DOI
10.1016/j.iswa.2024.200406
Chinese Library Classification
TP18 [Artificial intelligence theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Attention is a popular and effective mechanism in artificial neural network-based sequence-to-sequence models. This survey provides a comprehensive review of the attention models used to build automatic speech recognition (ASR) systems, focusing on how attention has evolved for offline and streaming recognition in both recurrent neural network-based and Transformer-based systems.
Pages: 12