SELF-ATTENTION BASED MODEL FOR PUNCTUATION PREDICTION USING WORD AND SPEECH EMBEDDINGS

Times Cited: 0
Authors
Yi, Jiangyan [1 ]
Tao, Jianhua [1 ,2 ,3 ]
Affiliations
[1] Chinese Acad Sci, Inst Automat, Natl Lab Pattern Recognit, Beijing, Peoples R China
[2] Chinese Acad Sci, CAS Ctr Excellence Brain Sci & Intelligence Techn, Beijing, Peoples R China
[3] Univ Chinese Acad Sci, Beijing, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Self-attention; transfer learning; word embedding; speech embedding; punctuation prediction; RECOGNITION; SYSTEM;
DOI
Not available
CLC Classification
O42 [Acoustics];
Subject Classification Codes
070206 ; 082403 ;
Abstract
This paper proposes a self-attention based model to predict punctuation marks for word sequences. The model is trained on word and speech embedding features obtained from pre-trained Word2Vec and Speech2Vec models, respectively. The model can therefore exploit any kind of textual and speech data. Experiments are conducted on the English IWSLT2011 dataset. The results show that the self-attention based model trained on word and speech embedding features outperforms the previous state-of-the-art single model by up to 7.8% absolute overall F1-score. The results also show an improvement of up to 4.7% absolute overall F1-score over the previous best ensemble model.
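The pipeline the abstract describes (concatenate pre-trained word and speech embeddings per token, contextualise them with self-attention, then classify each position into a punctuation label) can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the weights are randomly initialised stand-ins for trained parameters, and the function names, dimensions, and label set are illustrative assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """Single-head scaled dot-product self-attention over a sequence X of shape (T, d)."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])     # pairwise token affinities
    return softmax(scores, axis=-1) @ V          # contextualised token states

rng = np.random.default_rng(0)
T, d_word, d_speech, d_model = 5, 8, 4, 16
labels = ["O", "COMMA", "PERIOD", "QUESTION"]    # illustrative punctuation tag set

# Hypothetical pre-trained embeddings for a 5-token utterance
word_emb = rng.normal(size=(T, d_word))          # e.g. from Word2Vec
speech_emb = rng.normal(size=(T, d_speech))      # e.g. from Speech2Vec

# Fuse the two modalities by concatenating per-token feature vectors
X = np.concatenate([word_emb, speech_emb], axis=-1)

# Randomly initialised parameters stand in for trained weights
Wq = rng.normal(size=(d_word + d_speech, d_model))
Wk = rng.normal(size=(d_word + d_speech, d_model))
Wv = rng.normal(size=(d_word + d_speech, d_model))
Wout = rng.normal(size=(d_model, len(labels)))

H = self_attention(X, Wq, Wk, Wv)                # (T, d_model)
probs = softmax(H @ Wout, axis=-1)               # per-token punctuation distribution
pred = [labels[i] for i in probs.argmax(axis=-1)]
print(pred)                                      # one punctuation decision per token
```

Because attention compares every token against every other token, the predicted punctuation at each position can depend on both lexical context (word embeddings) and prosodic cues (speech embeddings) anywhere in the sequence.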
Pages: 7270-7274
Page Count: 5
Related Papers
50 records
  • [21] Acoustic model training using self-attention for low-resource speech recognition
    Park, Hosung
    Kim, Ji-Hwan
    JOURNAL OF THE ACOUSTICAL SOCIETY OF KOREA, 2020, 39 (05): : 483 - 489
  • [22] Efficient Self-Attention Model for Speech Recognition-Based Assistive Robots Control
    Poirier, Samuel
    Cote-Allard, Ulysse
    Routhier, Francois
    Campeau-Lecours, Alexandre
    SENSORS, 2023, 23 (13)
  • [23] Contextualized Word Representations for Self-Attention Network
    Essam, Mariam
    Eldawlatly, Seif
    Abbas, Hazem
    PROCEEDINGS OF 2018 13TH INTERNATIONAL CONFERENCE ON COMPUTER ENGINEERING AND SYSTEMS (ICCES), 2018, : 116 - 121
  • [24] Punctuation Restoration for Ukrainian Broadcast Speech Recognition System based on Bidirectional Recurrent Neural Network and Word Embeddings
    Sazhok, Mykola
    Poltieva, Anna
    Robeiko, Valentyna
    Seliukh, Ruslan
    Fedoryn, Dmytro
    COLINS 2021: COMPUTATIONAL LINGUISTICS AND INTELLIGENT SYSTEMS, VOL I, 2021, 2870
  • [25] Self-Attention ConvLSTM for Spatiotemporal Prediction
    Lin, Zhihui
    Li, Maomao
    Zheng, Zhuobin
    Cheng, Yangyang
    Yuan, Chun
    THIRTY-FOURTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THE THIRTY-SECOND INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE AND THE TENTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2020, 34 : 11531 - 11538
  • [26] Pedestrian Trajectory Prediction Model Based on Self-Attention Mechanism and Group Behavior Characteristics
    Zhou Y.
    Wu H.
    Cheng H.
    Zheng J.
    Li X.
    Wuhan Daxue Xuebao (Xinxi Kexue Ban)/Geomatics and Information Science of Wuhan University, 2020, 45 (12): : 1989 - 1996
  • [27] Research on seismic hydrocarbon prediction based on a self-attention semi-supervised model
    Jiang, Wenbin
    Zhang, Dongmei
    Kang, Zhijiang
    Hui, Gang
    Jiang, Xinwei
    GEOENERGY SCIENCE AND ENGINEERING, 2023, 226
  • [28] Design Resources Recommendation Based on Word Vectors and Self-Attention Mechanisms
    Sun Q.
    Deng C.
    Gu Z.
    Jisuanji Fuzhu Sheji Yu Tuxingxue Xuebao/Journal of Computer-Aided Design and Computer Graphics, 2024, 36 (01): : 63 - 72
  • [29] A Cross-Project Defect Prediction Model Based on Deep Learning With Self-Attention
    Wen, Wanzhi
    Zhang, Ruinian
    Wang, Chuyue
    Shen, Chenqiang
    Yu, Meng
    Zhang, Suchuan
    Gao, Xinxin
    IEEE ACCESS, 2022, 10 : 110385 - 110401
  • [30] Industrial Compressor-Monitoring Data Prediction Based on LSTM and Self-Attention Model
    Pu, Liming
    Zhang, Lin
    Liu, Jie
    Qiu, Limin
    PROCESSES, 2025, 13 (02)