RETRACTION: COVID-19 Forecast and Bank Credit Decision Model Based on BiLSTM-Attention Network (Retraction of Vol 16, art no 159, 2023)

Cited by: 0
Authors
Zhang, Beiqin [1 ]
Affiliations
[1] Cardiff Univ, Business Sch, 1-6 St Andrews Pl, Crown Pl-112, Cardiff CF10 3BE, Wales
Funding
National Natural Science Foundation of China;
Keywords
Keyword spotting; Swin-Transformer; Temporal Convolutional Network; Window self-attention mechanism;
DOI
10.1007/s44196-024-00473-0
Chinese Library Classification (CLC) Number
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
With the rapid advancement of deep learning, Transformer-based attention networks have shown promising performance in keyword spotting (KWS). However, these methods suffer from high computational cost, owing to the excessive number of parameters in the Transformer model and the burden of computing global attention, which limits their applicability in resource-constrained KWS scenarios. To overcome this issue, we propose a novel Swin-Transformer-based KWS method. In this approach, dynamic features are first extracted from the input Mel-Frequency Cepstral Coefficients (MFCCs) using a Temporal Convolutional Network (TCN). The Swin-Transformer is then employed to capture hierarchical multi-scale features, with a window self-attention mechanism designed to grasp dynamic time-frequency features. Furthermore, a frame-level shifted-window attention mechanism is proposed to enhance inter-window interaction, thereby extracting more contextual information from the spectrogram. Experimental results on the Speech Commands V1 dataset verify the effectiveness of the proposed method, which achieves a recognition accuracy of 98.01% with fewer model parameters, outperforming existing KWS methods. © The Author(s) 2024.
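
For illustration, the pipeline the abstract describes (MFCC input -> TCN feature extraction -> windowed self-attention with a frame-level shift) could be sketched roughly as below. This is a minimal PyTorch sketch: the module names (TCNBlock, FrameWindowAttention, KWSSketch) and all hyperparameters (feature dimension, window size, class count) are illustrative assumptions, not the paper's implementation, and details such as Swin's hierarchical patch merging and shift masking are omitted.

    # Minimal sketch of the described pipeline; every name and dimension
    # here is an assumption for illustration, not the paper's architecture.
    import torch
    import torch.nn as nn

    class TCNBlock(nn.Module):
        """Dilated 1-D convolution with a residual connection (a common
        TCN building block, assumed here)."""
        def __init__(self, channels, dilation):
            super().__init__()
            self.conv = nn.Conv1d(channels, channels, kernel_size=3,
                                  padding=dilation, dilation=dilation)
            self.act = nn.ReLU()

        def forward(self, x):                    # x: (batch, channels, frames)
            return x + self.act(self.conv(x))

    class FrameWindowAttention(nn.Module):
        """Self-attention restricted to fixed-size frame windows; an optional
        cyclic shift offsets the windows so neighbouring windows interact,
        loosely following the shifted-window idea of Swin-Transformer."""
        def __init__(self, dim, window, heads=4, shift=False):
            super().__init__()
            self.window, self.shift = window, shift
            self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
            self.norm = nn.LayerNorm(dim)

        def forward(self, x):                    # x: (batch, frames, dim)
            b, t, d = x.shape
            s = self.window // 2 if self.shift else 0
            if s:
                x = torch.roll(x, shifts=-s, dims=1)          # shift frames
            x = x.reshape(b * (t // self.window), self.window, d)  # partition
            y, _ = self.attn(x, x, x)                          # per-window attention
            x = self.norm(x + y).reshape(b, t, d)              # merge windows
            if s:
                x = torch.roll(x, shifts=s, dims=1)            # undo the shift
            return x

    class KWSSketch(nn.Module):
        """MFCC frames -> TCN features -> plain then shifted window
        attention -> keyword logits."""
        def __init__(self, n_mfcc=40, dim=64, window=8, n_classes=12):
            super().__init__()
            self.proj = nn.Conv1d(n_mfcc, dim, kernel_size=1)
            self.tcn = nn.Sequential(TCNBlock(dim, 1), TCNBlock(dim, 2))
            self.blocks = nn.Sequential(
                FrameWindowAttention(dim, window, shift=False),
                FrameWindowAttention(dim, window, shift=True),  # frame-level shift
            )
            self.head = nn.Linear(dim, n_classes)

        def forward(self, mfcc):                 # mfcc: (batch, n_mfcc, frames)
            x = self.tcn(self.proj(mfcc)).transpose(1, 2)  # (batch, frames, dim)
            x = self.blocks(x)
            return self.head(x.mean(dim=1))      # pool over frames

    logits = KWSSketch()(torch.randn(2, 40, 64))  # 64 frames, window of 8
    print(logits.shape)                           # torch.Size([2, 12])

The frame count must be divisible by the window size in this sketch; a real implementation would pad or mask ragged final windows.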
Pages: 1