Contextualized Word Representations for Self-Attention Network

Cited by: 0
Authors
Essam, Mariam [1]
Eldawlatly, Seif [1]
Abbas, Hazem [1]
Affiliations
[1] Ain Shams Univ, Comp & Syst Engn Dept, Cairo, Egypt
Keywords: none listed
DOI: not available
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Discipline Code
0812;
Abstract
Transfer learning is one approach to training deep neural networks more effectively. It plays a key role in initializing networks in computer vision applications, as opposed to training a network from scratch, which can be time-consuming. Natural Language Processing (NLP) shares a similar concept of transferring knowledge learned from large-scale data. Recent studies have demonstrated that pretrained language models can achieve state-of-the-art results on a multitude of NLP tasks such as sentiment analysis, machine translation, and text summarization. In this paper, we demonstrate that an RNN/CNN-free self-attention model for sentiment analysis can be improved by 2.53% by using contextualized word representations learned in a language modeling task.
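The abstract describes the approach only at a high level. As a minimal sketch of the general idea it names (self-attention over precomputed contextualized word embeddings for sentiment classification), not the authors' actual architecture, the following PyTorch snippet may help; the class name, embedding size, head count, pooling choice, and the random tensor standing in for pretrained language-model outputs (e.g., ELMo-style embeddings) are all assumptions.

import torch
import torch.nn as nn

class SelfAttentionSentimentClassifier(nn.Module):
    # Hypothetical RNN/CNN-free classifier: one multi-head self-attention
    # layer over contextualized word embeddings, mean pooling, linear head.
    def __init__(self, embed_dim=1024, num_heads=8, num_classes=2):
        super().__init__()
        self.attn = nn.MultiheadAttention(embed_dim, num_heads, batch_first=True)
        self.norm = nn.LayerNorm(embed_dim)
        self.classifier = nn.Linear(embed_dim, num_classes)

    def forward(self, embeddings, padding_mask=None):
        # embeddings: (batch, seq_len, embed_dim) vectors that would come
        # from a pretrained language model; random data stands in below.
        attn_out, _ = self.attn(embeddings, embeddings, embeddings,
                                key_padding_mask=padding_mask)
        hidden = self.norm(embeddings + attn_out)  # residual connection
        pooled = hidden.mean(dim=1)                # average over tokens
        return self.classifier(pooled)             # sentiment logits

model = SelfAttentionSentimentClassifier()
fake_contextual_embeddings = torch.randn(4, 12, 1024)  # 4 sentences, 12 tokens
print(model(fake_contextual_embeddings).shape)         # torch.Size([4, 2])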
Pages: 116-121
Page count: 6
Related Papers
50 entries in total (items [41]-[50] shown below)
  • [41] Diversifying Search Results using Self-Attention Network
    Qin, Xubo
    Dou, Zhicheng
    Wen, Ji-Rong
    CIKM '20: PROCEEDINGS OF THE 29TH ACM INTERNATIONAL CONFERENCE ON INFORMATION & KNOWLEDGE MANAGEMENT, 2020: 1265-1274
  • [42] Hierarchical Self-Attention Network for Action Localization in Videos
    Pramono, Rizard Renanda Adhi
    Chen, Yie-Tarng
    Fang, Wen-Hsien
    2019 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV 2019), 2019: 61-70
  • [43] Self-attention Based Collaborative Neural Network for Recommendation
    Ma, Shengchao
    Zhu, Jinghua
    WIRELESS ALGORITHMS, SYSTEMS, AND APPLICATIONS, WASA 2019, 2019, 11604: 235-246
  • [44] A pagerank self-attention network for traffic flow prediction
    Kang, Ting
    Wang, Huaizhi
    Wu, Ting
    Peng, Jianchun
    Jiang, Hui
    FRONTIERS IN ENERGY RESEARCH, 2022, 10
  • [45] Contextualized word senses: from attention to compositionality
    Gamallo, Pablo
    LINGUISTICS VANGUARD, 2024, 9 (01): 191-203
  • [46] Multilayer self-attention residual network for code search
    Hu, Haize
    Liu, Jianxun
    Zhang, Xiangping
    CONCURRENCY AND COMPUTATION-PRACTICE & EXPERIENCE, 2023, 35 (09)
  • [47] Multiple Self-attention Network for Intracranial Vessel Segmentation
    Li, Yang
    Ni, Jiajia
    Elazab, Ahmed
    Wu, Jianhuang
    2021 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2021
  • [48] Self-attention feature fusion network for semantic segmentation
    Zhou, Zhen
    Zhou, Yan
    Wang, Dongli
    Mu, Jinzhen
    Zhou, Haibin
    NEUROCOMPUTING, 2021, 453: 50-59
  • [49] Self-Attention based Network For Medical Query Expansion
    Chen, Su
    Hu, Qinmin Vivian
    Song, Yang
    He, Yun
    Wu, Huaying
    He, Liang
    2019 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2019
  • [50] Episodic Memory Network with Self-attention for Emotion Detection
    Huang, Jiangping
    Lin, Zhong
    Liu, Xin
    DATABASE SYSTEMS FOR ADVANCED APPLICATIONS, 2019, 11448: 220-224