A Window-Based Self-Attention approach for sentence encoding

Cited by: 14
|
Authors
Huang, Ting [1 ]
Deng, Zhi-Hong [1 ]
Shen, Gehui [1 ]
Chen, Xi [1 ]
Affiliations
[1] Peking Univ, Sch Elect Engn & Comp Sci, Minist Educ, Key Lab Machine Percept, Beijing, Peoples R China
Keywords
Window; Attention; Word embedding; Sentence encoding;
DOI
10.1016/j.neucom.2019.09.024
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Recently, much progress has been made in sentence representation for natural language processing (NLP) tasks. Most existing methods use deep learning with RNNs/CNNs to capture contextual information. These models typically treat each word in a sentence equally, ignoring the fact that keywords play a leading role in expressing sentence semantics. In particular, on sentence classification and semantic relatedness prediction tasks, a few keywords often suffice for the judgment, with no need for the whole sentence. To this end, we propose a window-based intra-weighing approach that weighs the words in a sentence. To compute the attentive weights, we take multi-window n-grams as input and use max-pooling for feature extraction. We evaluate our model on 6 benchmark data sets. The experimental results demonstrate the superiority of our model in estimating word importance. Although our model has fewer parameters and much lower computational complexity than state-of-the-art models, it achieves comparable results. (C) 2019 Published by Elsevier B.V.
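The mechanism described in the abstract (multi-window n-gram features, max-pooling, attentive weighing of words) can be sketched as follows. This is a minimal NumPy illustration under stated assumptions, not the paper's exact parameterization: the window sizes, the mean-pooled n-gram features, and the scoring vector `w` are hypothetical choices introduced here for clarity.

```python
import numpy as np

def ngram_features(E, n):
    """Mean-pool each length-n window around every position (edge padding).
    E: (L, d) word embeddings -> (L, d) n-gram features."""
    pad = n // 2
    Ep = np.pad(E, ((pad, pad), (0, 0)), mode="edge")
    return np.stack([Ep[i:i + n].mean(axis=0) for i in range(len(E))])

def window_attention_encode(E, w, windows=(1, 3, 5)):
    """Window-based intra-weighing (sketch): score each word under several
    window sizes, max-pool the scores across windows, turn the pooled scores
    into attention weights, and return the weighted sum of word embeddings."""
    scores = np.stack([ngram_features(E, n) @ w for n in windows])  # (|windows|, L)
    pooled = scores.max(axis=0)                # max-pooling over window sizes
    alpha = np.exp(pooled - pooled.max())
    alpha /= alpha.sum()                       # softmax: weights sum to 1
    return alpha @ E, alpha                    # (d,) sentence vector, (L,) weights
```

Because each word's weight depends only on its local windows and a dot product, this computation is cheap relative to full pairwise self-attention, which matches the abstract's claim of lower computational complexity.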
Pages: 25 - 31
Page count: 7
Related Papers
50 records in total
  • [31] Revolutionizing sentiment classification: A deep learning approach using self-attention based encoding-decoding transformers with feature fusion
    Tejashwini, S. G.
    Aradhana, D.
    ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE, 2023, 125
  • [32] GCNSA: DNA storage encoding with a graph convolutional network and self-attention
    Cao, Ben
    Wang, Bin
    Zhang, Qiang
    ISCIENCE, 2023, 26 (03)
  • [33] Progressive Self-Attention Network with Unsymmetrical Positional Encoding for Sequential Recommendation
    Zhu, Yuehua
    Huang, Bo
    Jiang, Shaohua
    Yang, Muli
    Yang, Yanhua
    Zhong, Wenliang
    PROCEEDINGS OF THE 45TH INTERNATIONAL ACM SIGIR CONFERENCE ON RESEARCH AND DEVELOPMENT IN INFORMATION RETRIEVAL (SIGIR '22), 2022, : 2029 - 2033
  • [34] A Novel Cross Channel Self-Attention based Approach for Facial Attribute Editing
    Xu, Meng
    Jin, Rize
    Lu, Liangfu
    Chung, Tae-Sun
    KSII TRANSACTIONS ON INTERNET AND INFORMATION SYSTEMS, 2021, 15 (06) : 2115 - 2127
  • [35] Explanatory prediction of traffic congestion propagation mode: A self-attention based approach
    Liu, Qingchao
    Liu, Tao
    Cai, Yingfeng
    Xiong, Xiaoxia
    Jiang, Haobin
    Wang, Hai
    Hu, Ziniu
    PHYSICA A-STATISTICAL MECHANICS AND ITS APPLICATIONS, 2021, 573
  • [36] Unsupervised Pansharpening Based on Self-Attention Mechanism
    Qu, Ying
    Baghbaderani, Razieh Kaviani
    Qi, Hairong
    Kwan, Chiman
    IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, 2021, 59 (04): : 3192 - 3208
  • [37] Keyphrase Generation Based on Self-Attention Mechanism
    Yang, Kehua
    Wang, Yaodong
    Zhang, Wei
    Yao, Jiqing
    Le, Yuquan
    CMC-COMPUTERS MATERIALS & CONTINUA, 2019, 61 (02): : 569 - 581
  • [38] Self-Attention Based Network for Punctuation Restoration
    Wang, Feng
    Chen, Wei
    Yang, Zhen
    Xu, Bo
    2018 24TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION (ICPR), 2018, : 2803 - 2808
  • [39] Session-Based Recommendation with Self-Attention
    Anh, Pham Hoang
    Bach, Ngo Xuan
    Phuong, Tu Minh
    SOICT 2019: PROCEEDINGS OF THE TENTH INTERNATIONAL SYMPOSIUM ON INFORMATION AND COMMUNICATION TECHNOLOGY, 2019, : 1 - 8
  • [40] A window-based multi-scale attention model for slope collapse detection
    Pan, Yuchen
    Xu, Hao
    Qian, Kui
    Li, Zhengyan
    Yan, Hong
    EARTH SCIENCE INFORMATICS, 2024, 17 (01) : 181 - 191