Long Short-Term Attention

Cited by: 3
Authors
Zhong, Guoqiang [1 ]
Lin, Xin [1 ]
Chen, Kang [1 ]
Li, Qingyang [1 ]
Huang, Kaizhu [2 ]
Affiliations
[1] Ocean Univ China, Dept Comp Sci & Technol, Qingdao 266100, Peoples R China
[2] Xian Jiaotong Liverpool Univ, Dept Elect & Elect Engn, SIP, Suzhou 215123, Peoples R China
Funding
National Key R&D Program of China; National Natural Science Foundation of China
Keywords
Machine learning; Sequence learning; Attention mechanism; Long short-term memory; Long short-term attention; BIDIRECTIONAL LSTM; SALIENCY DETECTION; BOTTOM-UP; FRAMEWORK;
DOI
10.1007/978-3-030-39431-8_5
CLC number (Chinese Library Classification)
TP18 [Theory of artificial intelligence];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Attention is an important cognitive process of humans, which helps humans concentrate on critical information during their perception and learning. However, although many machine learning models can remember information in data, they have no attention mechanism. For example, the long short-term memory (LSTM) network is able to remember sequential information, but it cannot pay special attention to parts of the sequences. In this paper, we present a novel model called long short-term attention (LSTA), which seamlessly integrates the attention mechanism into the inner cell of LSTM. Beyond processing long short-term dependencies, LSTA can focus on important information in the sequences with the attention mechanism. Extensive experiments demonstrate that LSTA outperforms LSTM and related models on sequence learning tasks.
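To make the idea of integrating attention into the inner cell of an LSTM concrete, the sketch below shows one plausible way to do it: a standard LSTM cell step augmented with an extra sigmoid "attention" gate that re-weights the candidate content before it enters the cell state. This is a minimal illustrative sketch, not the authors' exact LSTA formulation (which is given in the paper, DOI above); the weight names (Wa, Ua, ba), the gate placement, and all parameter shapes are assumptions for illustration only.

    # Illustrative sketch (assumed formulation, not the published LSTA equations):
    # an LSTM cell step with an extra attention gate inside the cell.
    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def attention_lstm_step(x_t, h_prev, c_prev, params):
        """One time step of a hypothetical attention-augmented LSTM cell."""
        W, U, b = params["W"], params["U"], params["b"]          # stacked gate weights
        Wa, Ua, ba = params["Wa"], params["Ua"], params["ba"]    # assumed attention-gate weights

        z = W @ x_t + U @ h_prev + b       # pre-activations for i, f, o, g gates
        H = h_prev.shape[0]
        i = sigmoid(z[:H])                 # input gate
        f = sigmoid(z[H:2*H])              # forget gate
        o = sigmoid(z[2*H:3*H])            # output gate
        g = np.tanh(z[3*H:])               # candidate cell content

        # Assumed attention gate: soft weights that emphasize the important
        # parts of the new candidate information before it is stored.
        a = sigmoid(Wa @ x_t + Ua @ h_prev + ba)

        c_t = f * c_prev + i * (a * g)     # attention-modulated cell update
        h_t = o * np.tanh(c_t)
        return h_t, c_t

    # Tiny usage example with random parameters and a short random sequence.
    rng = np.random.default_rng(0)
    D, H = 8, 4
    params = {
        "W": rng.normal(size=(4*H, D)), "U": rng.normal(size=(4*H, H)), "b": np.zeros(4*H),
        "Wa": rng.normal(size=(H, D)),  "Ua": rng.normal(size=(H, H)),  "ba": np.zeros(H),
    }
    h, c = np.zeros(H), np.zeros(H)
    for t in range(5):
        h, c = attention_lstm_step(rng.normal(size=D), h, c, params)
    print(h)

The design point the sketch is meant to convey is that the attention weights act inside the state update rather than over a separate context vector, so the cell itself decides which parts of each step's information deserve emphasis.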
Pages: 45-54
Page count: 10
Related papers
50 records in total
  • [21] A SHORT-TERM IN LONG-TERM
    KELLY, FL
    AMERICAN JOURNAL OF NURSING, 1988, 88 (11) : 1479 - 1480
  • [22] Short-Term Photovoltaic Power Forecasting Based on Long Short Term Memory Neural Network and Attention Mechanism
    Zhou, Hangxia
    Zhang, Yujin
    Yang, Lingfan
    Liu, Qian
    Yan, Ke
    Du, Yang
    IEEE ACCESS, 2019, 7 : 78063 - 78074
  • [23] Hierarchical attention based long short-term memory for Chinese lyric generation
    Wu, Xing
    Du, Zhikang
    Guo, Yike
    Fujita, Hamido
    APPLIED INTELLIGENCE, 2019, 49 (01) : 44 - 52
  • [24] Long- and short-term self-attention network for sequential recommendation
    Xu, Chengfeng
    Feng, Jian
    Zhao, Pengpeng
    Zhuang, Fuzhen
    Wang, Deqing
    Liu, Yanchi
    Sheng, Victor S.
    NEUROCOMPUTING, 2021, 423 : 580 - 589
  • [25] Predicting maintenance through an attention long short-term memory projected model
    Shih-Hsien Tseng
    Khoa-Dang Tran
    Journal of Intelligent Manufacturing, 2024, 35 : 807 - 824
  • [26] Predicting maintenance through an attention long short-term memory projected model
    Tseng, Shih-Hsien
    Tran, Khoa-Dang
    JOURNAL OF INTELLIGENT MANUFACTURING, 2024, 35 (02) : 807 - 824
  • [27] Short-term Load Forecasting with Distributed Long Short-Term Memory
    Dong, Yi
    Chen, Yang
    Zhao, Xingyu
    Huang, Xiaowei
    2023 IEEE POWER & ENERGY SOCIETY INNOVATIVE SMART GRID TECHNOLOGIES CONFERENCE, ISGT, 2023,
  • [28] Temporal Precursor Discovery Using Long Short-Term Memory with Feature Attention
    Deng, Chuhao
    Choi, Hong-Cheol
    Park, Hyunsang
    Hwang, Inseok
    JOURNAL OF AEROSPACE INFORMATION SYSTEMS, 2024, 21 (02): : 178 - 191
  • [29] Hierarchical attention based long short-term memory for Chinese lyric generation
    Xing Wu
    Zhikang Du
    Yike Guo
    Hamido Fujita
    Applied Intelligence, 2019, 49 : 44 - 52
  • [30] MALICIOUS LOGIN DETECTION USING LONG SHORT-TERM MEMORY WITH AN ATTENTION MECHANISM
    Wu, Yanna
    Liu, Fucheng
    Wen, Yu
    ADVANCES IN DIGITAL FORENSICS XVII, 2021, 612 : 157 - 173