Contextualized Word Representations for Self-Attention Network

Cited by: 0
Authors:
Essam, Mariam [1]
Eldawlatly, Seif [1]
Abbas, Hazem [1]
Affiliation:
[1] Ain Shams Univ, Comp & Syst Engn Dept, Cairo, Egypt
Keywords: (none listed)
DOI: not available
Chinese Library Classification: TP [Automation Technology, Computer Technology]
Discipline code: 0812
Abstract
Transfer learning is one approach that can be used to better train deep neural networks. It plays a key role in initializing networks for computer vision applications, as opposed to training a network from scratch, which can be time-consuming. Natural Language Processing (NLP) shares a similar concept of transferring knowledge from large-scale data. Recent studies have demonstrated that pretrained language models can achieve state-of-the-art results on a multitude of NLP tasks such as sentiment analysis, machine translation, and text summarization. In this paper, we demonstrate that an RNN/CNN-free self-attention model used for sentiment analysis can be improved by 2.53% by using contextualized word representations learned in a language modeling task.
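As a minimal illustration of the idea in the abstract (not the authors' implementation), the sketch below applies scaled dot-product self-attention over a sequence of contextualized token vectors. The random embeddings stand in for the output of a pretrained language model, which in practice would supply the contextualized representations:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X):
    """Scaled dot-product self-attention over a sequence of
    contextualized token vectors X with shape (seq_len, dim)."""
    d = X.shape[-1]
    scores = X @ X.T / np.sqrt(d)        # pairwise token similarities
    weights = softmax(scores, axis=-1)   # each row sums to 1
    return weights @ X                   # context-mixed token representations

# Hypothetical contextualized embeddings for a 4-token sentence
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
out = self_attention(X)
print(out.shape)  # (4, 8)
```

In a sentiment classifier along these lines, the attention output would typically be pooled into a single sentence vector and passed to a classification layer.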
Pages: 116-121 (6 pages)