A Window-Based Self-Attention approach for sentence encoding

Cited by: 14
Authors:
Huang, Ting [1 ]
Deng, Zhi-Hong [1 ]
Shen, Gehui [1 ]
Chen, Xi [1 ]
Affiliations:
[1] Peking Univ, Sch Elect Engn & Comp Sci, Minist Educ, Key Lab Machine Percept, Beijing, Peoples R China
Keywords:
Window; Attention; Word embedding; Sentence encoding
DOI:
10.1016/j.neucom.2019.09.024
Chinese Library Classification: TP18 [Artificial Intelligence Theory]
Subject classification codes: 081104; 0812; 0835; 1405
Abstract:
Recently, much progress has been made in sentence representation for natural language processing (NLP) tasks. Most existing methods use deep learning models such as RNNs and CNNs to capture contextual information. These models typically treat every word in a sentence equally, ignoring the fact that keywords play a leading role in expressing sentence semantics. In particular, for sentence classification and semantic relatedness prediction, a judgment can often be made from a few keywords rather than the whole sentence. To this end, we propose a window-based intra-weighing approach that weighs the words in a sentence. To calculate attentive weights, it takes multi-window n-grams as input and uses max-pooling for feature extraction. We evaluate our model on 6 benchmark data sets, and the experimental results demonstrate its superiority in calculating word importance. Although our model has fewer parameters and much lower computational complexity than state-of-the-art models, it achieves comparable results. (C) 2019 Published by Elsevier B.V.
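The abstract describes the mechanism concretely enough to sketch: n-gram features are extracted around each word with several window sizes, max-pooling collapses them into a per-word attention logit, and the resulting softmax weights pool the word embeddings into a sentence vector. Below is a minimal PyTorch sketch of that idea; the class name WindowAttentionEncoder, the convolutional scoring layers, and all sizes are illustrative assumptions, not the authors' released implementation.

```python
# Minimal sketch of window-based intra-weighing as described in the
# abstract: score each word from multi-window n-gram features, max-pool
# over the window sizes, then softmax-pool the word embeddings.
# Layer choices and sizes are assumptions, not the paper's exact model.
import torch
import torch.nn as nn
import torch.nn.functional as F

class WindowAttentionEncoder(nn.Module):
    def __init__(self, embed_dim: int, window_sizes=(1, 3, 5)):
        super().__init__()
        # One 1-D convolution per window size; each maps the n-gram
        # context around a word to a single attention logit.
        self.convs = nn.ModuleList(
            nn.Conv1d(embed_dim, 1, kernel_size=w, padding=w // 2)
            for w in window_sizes
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, embed_dim) word embeddings
        h = x.transpose(1, 2)                 # (batch, embed_dim, seq_len)
        # One logit per word per window size: (batch, n_windows, seq_len)
        logits = torch.cat([conv(h) for conv in self.convs], dim=1)
        # Max-pool across window sizes, as the abstract describes.
        logits, _ = logits.max(dim=1)         # (batch, seq_len)
        weights = F.softmax(logits, dim=-1)   # intra-sentence word weights
        # Weighted sum of word embeddings -> sentence vector.
        return torch.bmm(weights.unsqueeze(1), x).squeeze(1)

# Usage: encode a batch of 2 sentences of length 7 with 300-d embeddings.
enc = WindowAttentionEncoder(embed_dim=300)
sent = enc(torch.randn(2, 7, 300))            # -> shape (2, 300)
```

Note the parameter economy consistent with the abstract's claim: the only learned parameters are the small scoring convolutions, far fewer than an RNN or multi-head attention encoder over the same embeddings.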
Pages: 25-31
Number of pages: 7
Related papers (showing 41-50 of 50):
  • [41] Hybrid self-attention NEAT: a novel evolutionary self-attention approach to improve the NEAT algorithm in high dimensional inputs
    Khamesian, Saman
    Malek, Hamed
    EVOLVING SYSTEMS, 2024, 15 (02): 489 - 503
  • [42] CopyBERT: A Unified Approach to Question Generation with Self-Attention
    Varanasi, Stalin
    Amin, Saadullah
    Neumann, Guenter
    NLP FOR CONVERSATIONAL AI, 2020: 25 - 31
  • [44] Individualized tourism recommendation based on self-attention
    Liu, Guangjie
    Ma, Xin
    Zhu, Jinlong
    Zhang, Yu
    Yang, Danyang
    Wang, Jianfeng
    Wang, Yi
    PLOS ONE, 2022, 17 (08)
  • [45] SHYNESS AND SELF-ATTENTION
    CROZIER, WR
    BULLETIN OF THE BRITISH PSYCHOLOGICAL SOCIETY, 1983, 36 (FEB): A5 - A5
  • [46] CONSTRUCTION AND EVALUATION OF A SELF-ATTENTION MODEL FOR SEMANTIC UNDERSTANDING OF SENTENCE-FINAL PARTICLES
    Mandokoro, Shuhei
    Oka, Natsuki
    Matsushima, Akane
    Fukada, Chie
    Yoshimura, Yuko
    Kawahara, Koji
    Tanaka, Kazuaki
    arXiv, 2022
  • [47] WINDOW-BASED SURVEILLANCE STRATEGIES
    KRISHNA, CM
    GANZ, A
    WANG, X
    IEE PROCEEDINGS-COMPUTERS AND DIGITAL TECHNIQUES, 1995, 142 (03): 233 - 236
  • [48] WSAFormer-DFFN: A model for rotating machinery fault diagnosis using 1D window-based multi-head self-attention and deep feature fusion network
    Wei, Qingzhe
    Tian, Xincheng
    Cui, Long
    Zheng, Fuquan
    Liu, Lida
    ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE, 2023, 124
  • [49] Named Entity Recognition in Persian Language based on Self-attention Mechanism with Weighted Relational Position Encoding
    Ganjalipour, Ebrahim
    Sheikhani, Amir Hossein Refahi
    Kordrostami, Sohrab
    Hosseinzadeh, Ali Asghar
    ACM TRANSACTIONS ON ASIAN AND LOW-RESOURCE LANGUAGE INFORMATION PROCESSING, 2023, 22 (12)
  • [50] Performance analysis and optimization of the window-based least significant bits encoding technique of ROHC
    Kalyanasundaram, Suresh
    Ramachandran, Vinod
    Collins, Laurange Mariasoosei
    GLOBECOM 2007: 2007 IEEE GLOBAL TELECOMMUNICATIONS CONFERENCE, VOLS 1-11, 2007: 4681 - 4686