SELF-ATTENTION AND RETRIEVAL ENHANCED NEURAL NETWORKS FOR ESSAY GENERATION

Cited: 0
Authors
Wang, Wei [1 ]
Zheng, Hai-Tao [1 ]
Lin, Zibo [1 ]
Affiliations
[1] Tsinghua Univ, Grad Sch Shenzhen, Beijing, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
essay generation; natural language generation;
DOI
10.1109/icassp40776.2020.9052954
Chinese Library Classification
O42 [Acoustics];
Discipline codes
070206 ; 082403 ;
Abstract
In this paper, we focus on essay generation, which aims at generating an essay (a paragraph) from a set of topic words. Automatic essay generation can be applied in many scenarios to reduce human workload. Recently, recurrent neural network (RNN) based methods have been proposed for this task. However, RNN-based methods suffer from incoherence and duplication problems. To overcome these shortcomings, we propose a self-attention and retrieval enhanced neural network for essay generation. We retrieve sentences relevant to the topic words from a corpus as material to assist generation, which alleviates the duplication problem. To improve the coherence of essays, self-attention based encoders are applied to encode the topic and the material, and a self-attention based decoder is used to generate the essay. The final essay is generated under the guidance of both the topic and the material. Experimental results on a real essay dataset show that our model outperforms state-of-the-art baselines in both automatic and human evaluation.
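The abstract's retrieval step (selecting corpus sentences relevant to the topic words as generation material) can be illustrated with a minimal sketch. This is not the authors' implementation; it is a bag-of-words cosine-similarity ranker over a toy corpus, and the function name `retrieve_material` and the sample sentences are hypothetical:

```python
from collections import Counter
import math

def retrieve_material(topic_words, corpus, k=2):
    """Rank corpus sentences by cosine similarity between the bag-of-words
    vector of the topic set and each sentence; return the top-k as material."""
    topic_vec = Counter(topic_words)

    def cosine(a, b):
        dot = sum(a[t] * b[t] for t in a)
        na = math.sqrt(sum(v * v for v in a.values()))
        nb = math.sqrt(sum(v * v for v in b.values()))
        return dot / (na * nb) if na and nb else 0.0

    scored = [(cosine(topic_vec, Counter(s.split())), s) for s in corpus]
    scored.sort(key=lambda x: x[0], reverse=True)
    return [s for score, s in scored[:k] if score > 0]

corpus = [
    "autumn leaves fall on the quiet river",
    "the stock market closed higher today",
    "rain in autumn makes the river rise",
]
material = retrieve_material(["autumn", "river"], corpus, k=2)
```

In the actual model, the retrieved material and the topic words would each be fed through self-attention encoders, with the decoder attending to both during generation; a learned retriever or TF-IDF over a large essay corpus would replace this toy scorer.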
Pages: 8199-8203
Page count: 5