Sentiment Analysis with Contextual Embeddings and Self-attention

Cited by: 1
Authors
Biesialska, Katarzyna [1 ]
Biesialska, Magdalena [1 ]
Rybinski, Henryk [2 ]
Affiliations
[1] Univ Politecn Cataluna, Barcelona, Spain
[2] Warsaw Univ Technol, Warsaw, Poland
Keywords
Sentiment classification; Deep learning; Word embeddings
DOI
10.1007/978-3-030-59491-6_4
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
In natural language, the intended meaning of a word or phrase is often implicit and depends on its context. In this work, we propose a simple yet effective method for sentiment analysis that combines contextual embeddings with a self-attention mechanism. Experimental results for three languages, including the morphologically rich Polish and German, show that our model is comparable to, or even outperforms, state-of-the-art models. In all cases, models leveraging contextual embeddings prove superior. Finally, this work is intended as a step towards a universal, multilingual sentiment classifier.
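The abstract names the two ingredients (contextual embeddings, self-attention) but not the exact architecture. As a rough illustration of that general recipe, and not the authors' model, the PyTorch sketch below applies multi-head self-attention over pre-computed contextual token embeddings (e.g., from BERT or ELMo) and feeds a pooled sentence vector to a linear sentiment head. The class name, layer sizes, pooling choice, and three-class output are all illustrative assumptions.

```python
import torch
import torch.nn as nn
from typing import Optional


class AttentionSentimentClassifier(nn.Module):
    """Hypothetical sketch: self-attention over contextual token embeddings."""

    def __init__(self, embed_dim: int = 768, num_heads: int = 8, num_classes: int = 3):
        super().__init__()
        # Multi-head self-attention lets every token weigh every other token.
        self.attn = nn.MultiheadAttention(embed_dim, num_heads, batch_first=True)
        self.norm = nn.LayerNorm(embed_dim)
        # Linear head mapping the pooled sentence vector to sentiment classes.
        self.head = nn.Linear(embed_dim, num_classes)

    def forward(self, embeddings: torch.Tensor,
                padding_mask: Optional[torch.Tensor] = None) -> torch.Tensor:
        # embeddings: (batch, seq_len, embed_dim), e.g. the output of a
        # pre-trained contextual encoder such as BERT or ELMo.
        attended, _ = self.attn(embeddings, embeddings, embeddings,
                                key_padding_mask=padding_mask)
        attended = self.norm(attended + embeddings)  # residual connection
        pooled = attended.mean(dim=1)                # average over tokens
        return self.head(pooled)                     # sentiment logits


# Smoke test with random stand-ins for embeddings: 2 sentences, 16 tokens, 768 dims.
model = AttentionSentimentClassifier()
logits = model(torch.randn(2, 16, 768))
print(logits.shape)  # torch.Size([2, 3])
```

Because the encoder is frozen out of the sketch, the same classifier can sit on top of any contextual embedding model, which is one plausible route to the multilingual classifier the abstract mentions.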
Pages: 32-41
Page count: 10