A Window-Based Self-Attention approach for sentence encoding

Cited by: 14
Authors
Huang, Ting [1 ]
Deng, Zhi-Hong [1 ]
Shen, Gehui [1 ]
Chen, Xi [1 ]
Affiliations
[1] Peking Univ, Sch Elect Engn & Comp Sci, Minist Educ, Key Lab Machine Percept, Beijing, Peoples R China
Keywords
Window; Attention; Word embedding; Sentence encoding
DOI
10.1016/j.neucom.2019.09.024
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Recently, much progress has been made in sentence representation for natural language processing (NLP) tasks. Most existing methods use RNN- or CNN-based deep learning models to capture contextual information. These models typically treat every word in a sentence equally, ignoring the fact that keywords play the leading role in expressing a sentence's semantics. In particular, on sentence classification and semantic relatedness prediction tasks, a judgment can often be made from a few keywords, without reading the whole sentence. To this end, we propose a window-based intra-weighing approach that weighs the words in a sentence. When calculating attentive weights, the model takes multi-window n-grams as input and uses max-pooling for feature extraction. To evaluate our model, we conduct experiments on six benchmark datasets. The experimental results demonstrate the superiority of our model in calculating word importance: although it has fewer parameters and much lower computational complexity than state-of-the-art models, it achieves comparable results. (C) 2019 Published by Elsevier B.V.
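The mechanism described in the abstract lends itself to a compact implementation. Below is a minimal PyTorch sketch of the window-based intra-weighing idea: one 1-D convolution per window size produces an n-gram feature for every word, the features are max-pooled across windows, and a learned scalar score per word is softmax-normalized into attentive weights. All layer names, sizes, and the exact way the n-gram features are combined are illustrative assumptions, not the authors' released code.

```python
# A minimal sketch of window-based intra-weighing for sentence encoding.
# Hypothetical names and hyperparameters; not the paper's official code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class WindowAttentionEncoder(nn.Module):
    def __init__(self, emb_dim=300, window_sizes=(1, 3, 5)):
        super().__init__()
        # One 1-D convolution per window size: each produces, for every word,
        # a feature vector summarizing the n-gram centered on that word.
        self.convs = nn.ModuleList(
            nn.Conv1d(emb_dim, emb_dim, k, padding=k // 2)
            for k in window_sizes
        )
        # Maps the pooled n-gram feature to one scalar attentive weight per word.
        self.score = nn.Linear(emb_dim, 1)

    def forward(self, emb):            # emb: (batch, seq_len, emb_dim)
        x = emb.transpose(1, 2)        # (batch, emb_dim, seq_len) for Conv1d
        # Multi-window n-gram features, max-pooled element-wise across windows.
        feats = torch.stack([F.relu(conv(x)) for conv in self.convs], dim=0)
        pooled = feats.max(dim=0).values.transpose(1, 2)    # (batch, seq_len, emb_dim)
        weights = F.softmax(self.score(pooled).squeeze(-1), dim=1)  # (batch, seq_len)
        # Sentence vector: attention-weighted sum of the original embeddings.
        return (weights.unsqueeze(-1) * emb).sum(dim=1)     # (batch, emb_dim)

# Usage: encode a batch of 2 sentences of length 10 with 300-d embeddings.
enc = WindowAttentionEncoder()
sent = enc(torch.randn(2, 10, 300))   # -> torch.Size([2, 300])
```

With window sizes (1, 3, 5), each word's weight depends only on the unigram, trigram, and 5-gram context around it, consistent with the abstract's observation that a few keyword-centered windows often suffice; since the only parameters are the convolutions and a single linear scorer, such a model stays far smaller than a full self-attention encoder.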
Pages: 25-31
Number of pages: 7
Related Papers
50 items in total
  • [1] Phrase-level Self-Attention Networks for Universal Sentence Encoding
    Wu, Wei
    Wang, Houfeng
    Liu, Tianyu
    Ma, Shuming
    2018 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP 2018), 2018: 3729-3738
  • [2] WiSeBE: Window-Based Sentence Boundary Evaluation
    Gonzalez-Gallardo, Carlos-Emiliano
    Torres-Moreno, Juan-Manuel
    ADVANCES IN COMPUTATIONAL INTELLIGENCE, MICAI 2018, PT II, 2018, 11289: 119-131
  • [3] A sentence sentiment classification method based on Self-supervised and Self-attention
    Xiao, Jianqiong
    Zhou, Zhiyong
    PROCEEDINGS OF 2020 IEEE 5TH INFORMATION TECHNOLOGY AND MECHATRONICS ENGINEERING CONFERENCE (ITOEC 2020), 2020: 1139-1143
  • [4] Enhanced Window-Based Self-Attention with Global and Multi-Scale Representations for Remote Sensing Image Super-Resolution
    Lu, Yuting
    Wang, Shunzhou
    Wang, Binglu
    Zhang, Xin
    Wang, Xiaoxu
    Zhao, Yongqiang
    REMOTE SENSING, 2024, 16 (15)
  • [5] Integrating Dependency Tree into Self-Attention for Sentence Representation
    Ma, Junhua
    Li, Jiajun
    Liu, Yuxuan
    Zhou, Shangbo
    Li, Xue
    2022 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2022: 8137-8141
  • [6] Sentence Matching with Deep Self-attention and Co-attention Features
    Wang, Zhipeng
    Yan, Danfeng
    KNOWLEDGE SCIENCE, ENGINEERING AND MANAGEMENT, KSEM 2021, PT II, 2021, 12816: 550-561
  • [7] Self-Attention Encoding and Pooling for Speaker Recognition
    Safari, Pooyan
    India, Miquel
    Hernando, Javier
    INTERSPEECH 2020, 2020: 941-945
  • [8] The Devil Is in the Details: Window-based Attention for Image Compression
    Zou, Renjie
    Song, Chunfeng
    Zhang, Zhaoxiang
    2022 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR 2022), 2022: 17471-17480
  • [9] Natural Image Matting with Shifted Window Self-Attention
    Wang, Zhikun
    Liu, Yang
    Li, Zonglin
    Wang, Chenyang
    Zhang, Shengping
    2022 IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING, ICIP, 2022: 2911-2915
  • [10] Self-Attention Enhanced Recurrent Neural Networks for Sentence Classification
    Kumar, Ankit
    Rastogi, Reshma
    2018 IEEE SYMPOSIUM SERIES ON COMPUTATIONAL INTELLIGENCE (IEEE SSCI), 2018: 905-911