SELF-ATTENTION AND RETRIEVAL ENHANCED NEURAL NETWORKS FOR ESSAY GENERATION

Cited by: 0
Authors
Wang, Wei [1 ]
Zheng, Hai-Tao [1 ]
Lin, Zibo [1 ]
Affiliations
[1] Tsinghua Univ, Grad Sch Shenzhen, Shenzhen, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
essay generation; natural language generation;
DOI
10.1109/icassp40776.2020.9052954
Chinese Library Classification (CLC)
O42 [Acoustics];
Discipline Codes
070206; 082403;
Abstract
In this paper, we focus on essay generation, which aims to generate an essay (a paragraph) from a set of topic words. Automatic essay generation can be applied in many scenarios to reduce human workload. Recently, recurrent neural network (RNN) based methods have been proposed for this task; however, they suffer from incoherence and duplication problems. To overcome these shortcomings, we propose a self-attention and retrieval enhanced neural network for essay generation. To alleviate the duplication problem, we retrieve sentences relevant to the topic words from a corpus and use them as material to assist generation. To improve coherence, self-attention based encoders encode the topic and the material, and a self-attention based decoder generates the essay under the guidance of both. Experimental results on a real essay dataset show that our model outperforms state-of-the-art baselines in both automatic and human evaluation.
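The abstract describes a retrieve-then-generate architecture: sentences relevant to the topic words are retrieved from a corpus as material, two self-attention encoders encode the topic and the material, and a self-attention decoder produces the essay guided by both. The Python sketch below illustrates that pipeline with standard components; it is not the authors' code, and the TF-IDF retrieval, the module names (retrieve_material, EssayGenerator), and all hyperparameters are illustrative assumptions.

# Minimal sketch (assumed, not the authors' code) of the retrieve-then-generate
# pipeline described in the abstract. TF-IDF retrieval and all names/sizes are
# illustrative stand-ins for the paper's unspecified components.
import torch
import torch.nn as nn
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def retrieve_material(topic_words, corpus_sentences, k=5):
    # Rank corpus sentences by TF-IDF cosine similarity to the topic words and
    # return the top k as "material" to guide generation (assumed retriever).
    vectorizer = TfidfVectorizer()
    sent_vecs = vectorizer.fit_transform(corpus_sentences)
    topic_vec = vectorizer.transform([" ".join(topic_words)])
    scores = cosine_similarity(topic_vec, sent_vecs)[0]
    top_k = scores.argsort()[::-1][:k]
    return [corpus_sentences[i] for i in top_k]

class EssayGenerator(nn.Module):
    # Two self-attention encoders (topic, material) and a self-attention
    # decoder whose cross-attention sees both encoded sources, so the essay
    # is generated under the guidance of topic and material.
    def __init__(self, vocab_size, d_model=512, nhead=8, num_layers=4):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)  # positional encodings omitted for brevity
        enc_layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.topic_encoder = nn.TransformerEncoder(enc_layer, num_layers)
        self.material_encoder = nn.TransformerEncoder(enc_layer, num_layers)
        dec_layer = nn.TransformerDecoderLayer(d_model, nhead, batch_first=True)
        self.decoder = nn.TransformerDecoder(dec_layer, num_layers)
        self.out = nn.Linear(d_model, vocab_size)

    def forward(self, topic_ids, material_ids, essay_ids):
        topic_mem = self.topic_encoder(self.embed(topic_ids))
        material_mem = self.material_encoder(self.embed(material_ids))
        # Concatenate the two memories along the sequence axis so the decoder
        # attends jointly to topic and material (one plausible fusion scheme).
        memory = torch.cat([topic_mem, material_mem], dim=1)
        tgt = self.embed(essay_ids)
        seq_len = essay_ids.size(1)
        # Standard causal mask for left-to-right decoding.
        causal_mask = torch.triu(
            torch.full((seq_len, seq_len), float("-inf"), device=essay_ids.device),
            diagonal=1)
        decoded = self.decoder(tgt, memory, tgt_mask=causal_mask)
        return self.out(decoded)  # next-token logits, shape (batch, seq, vocab)

In use, retrieve_material would supply the material sentences for a given topic, and the model would be trained with cross-entropy between the logits and the next-token targets of reference essays.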
Pages: 8199-8203
Page count: 5
Related Papers (50 records)
  • [1] Self-Attention Enhanced Recurrent Neural Networks for Sentence Classification
    Kumar, Ankit
    Rastogi, Reshma
    2018 IEEE SYMPOSIUM SERIES ON COMPUTATIONAL INTELLIGENCE (IEEE SSCI), 2018, : 905 - 911
  • [2] Original Music Generation using Recurrent Neural Networks with Self-Attention
    Jagannathan, Akash
    Chandrasekaran, Bharathi
    Dutta, Shubham
    Patil, Uma Rameshgouda
    Eirinaki, Magdalini
    2022 FOURTH IEEE INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE TESTING (AITEST 2022), 2022, : 56 - 63
  • [3] Continuous Self-Attention Models with Neural ODE Networks
    Zhang, Jing
    Zhang, Peng
    Kong, Baiwen
    Wei, Junqiu
    Jiang, Xin
    THIRTY-FIFTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THIRTY-THIRD CONFERENCE ON INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE AND THE ELEVENTH SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2021, 35 : 14393 - 14401
  • [4] Quantum self-attention neural networks for text classification
    Li, Guangxi
    Zhao, Xuanqiang
    Wang, Xin
    SCIENCE CHINA-INFORMATION SCIENCES, 2024, 67 (04)
  • [5] Paragraph-level Neural Question Generation with Maxout Pointer and Gated Self-attention Networks
    Zhao, Yao
    Ni, Xiaochuan
    Ding, Yuanyuan
    Ke, Qifa
    2018 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP 2018), 2018, : 3901 - 3910
  • [6] Deep relational self-attention networks for scene graph generation
    Li, Ping
    Yu, Zhou
    Zhan, Yibing
    PATTERN RECOGNITION LETTERS, 2022, 153 : 200 - 206
  • [7] Untangling tradeoffs between recurrence and self-attention in neural networks
    Kerg, Giancarlo
    Kanuparthi, Bhargav
    Goyal, Anirudh
    Goyette, Kyle
    Bengio, Yoshua
    Lajoie, Guillaume
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 33, NEURIPS 2020, 2020, 33
  • [8] Convolutional Self-Attention Networks
    Yang, Baosong
    Wang, Longyue
    Wong, Derek F.
    Chao, Lidia S.
    Tu, Zhaopeng
    2019 CONFERENCE OF THE NORTH AMERICAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS: HUMAN LANGUAGE TECHNOLOGIES (NAACL HLT 2019), VOL. 1, 2019, : 4040 - 4045