Natural Language Generation Using Sequential Models: A Survey

Cited by: 0
Authors
Abhishek Kumar Pandey
Sanjiban Sekhar Roy
Institution
[1] School of Computer Science and Engineering, Vellore Institute of Technology, Vellore
Source
Neural Processing Letters | 2023 / Volume 55
Keywords
Natural language processing; Long short-term memory; Natural language generation; Recurrent neural network; Sequential generative model; Story generation
Abstract
Natural Language Generation (NLG) is one of the most critical yet challenging tasks among Natural Language Processing applications. It automates the production of text whose meaning humans can understand. A handful of research articles in the literature have described how NLG can produce understandable text in various languages. Sequence-to-sequence modeling powered by deep learning techniques such as Long Short-Term Memory, Recurrent Neural Networks, and Gated Recurrent Units has become popular for text generation. This survey provides a comprehensive overview of text generation and its related techniques: statistical, traditional, and neural-network-based. Generating text with a sequence-to-sequence model is not a simple task, as the model must handle both continuous data, such as images, and discrete data, such as text. Therefore, in this study we identify crucial areas for further research on text generation, such as incorporating large text datasets, identifying and resolving grammatical errors, and generating long sentences or paragraphs. This work also presents a detailed overview of the activation functions used in deep-learning-based models and of the evaluation metrics used for text generation.
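The recurrent generation loop the abstract refers to can be sketched with a minimal untrained Elman RNN cell in pure Python: the hidden state is updated as h_t = tanh(W_xh·x_t + W_hh·h_{t-1}), and the next token is chosen greedily from the output logits. All names, the toy vocabulary, and the random weights below are illustrative assumptions, not taken from the surveyed paper, and a real generator would use trained LSTM/GRU weights.

```python
import math
import random

# Toy vocabulary and one-hot encoding (hypothetical, for illustration only)
VOCAB = ["<s>", "the", "cat", "sat", "</s>"]

def one_hot(i, n):
    v = [0.0] * n
    v[i] = 1.0
    return v

def matvec(W, x):
    # Matrix-vector product over plain Python lists
    return [sum(w * xi for w, xi in zip(row, x)) for row in W]

def add(a, b):
    return [x + y for x, y in zip(a, b)]

class TinyRNN:
    """Elman RNN cell: h_t = tanh(W_xh x_t + W_hh h_{t-1}); logits = W_hy h_t."""

    def __init__(self, vocab_size, hidden_size, seed=0):
        rng = random.Random(seed)
        def mat(rows, cols):
            return [[rng.uniform(-0.5, 0.5) for _ in range(cols)] for _ in range(rows)]
        self.W_xh = mat(hidden_size, vocab_size)   # input-to-hidden weights
        self.W_hh = mat(hidden_size, hidden_size)  # hidden-to-hidden (recurrent) weights
        self.W_hy = mat(vocab_size, hidden_size)   # hidden-to-output weights
        self.hidden_size = hidden_size

    def step(self, x, h):
        # tanh is the activation discussed in the survey's activation-function overview
        h_new = [math.tanh(v) for v in add(matvec(self.W_xh, x), matvec(self.W_hh, h))]
        logits = matvec(self.W_hy, h_new)
        return logits, h_new

def greedy_generate(rnn, max_len=5):
    """Feed the start symbol, then repeatedly pick the argmax token."""
    h = [0.0] * rnn.hidden_size
    token = 0  # index of "<s>"
    out = []
    for _ in range(max_len):
        logits, h = rnn.step(one_hot(token, len(VOCAB)), h)
        token = max(range(len(VOCAB)), key=lambda i: logits[i])
        if VOCAB[token] == "</s>":
            break
        out.append(VOCAB[token])
    return out

rnn = TinyRNN(len(VOCAB), hidden_size=4)
print(greedy_generate(rnn))  # untrained weights, so the tokens are arbitrary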
Pages: 7709 - 7742
Number of pages: 33
Related Papers
50 items in total
  • [21] Pre-trained models for natural language processing: A survey
    Qiu, XiPeng
    Sun, TianXiang
    Xu, YiGe
    Shao, YunFan
    Dai, Ning
    Huang, XuanJing
    Science China Technological Sciences, 2020, 63 (10) : 1872 - 1897
  • [23] Application and Evaluation of Large Language Models for the Generation of Survey Questions
    Maiorino, Antonio
    Padgett, Zoe
    Wang, Chun
    Yakubovskiy, Misha
    Jiang, Peng
    PROCEEDINGS OF THE 32ND ACM INTERNATIONAL CONFERENCE ON INFORMATION AND KNOWLEDGE MANAGEMENT, CIKM 2023, 2023, : 5244 - 5245
  • [24] Pre-Trained Language Models for Text Generation: A Survey
    Li, Junyi
    Tang, Tianyi
    Zhao, Wayne Xin
    Nie, Jian-Yun
    Wen, Ji-Rong
    ACM COMPUTING SURVEYS, 2024, 56 (09)
  • [25] Automatic Generation of UTP Models from Requirements in Natural Language
    Masuda, Satoshi
    Matsuodani, Tohru
    Tsuda, Kazuhiko
    2016 IEEE NINTH INTERNATIONAL CONFERENCE ON SOFTWARE TESTING, VERIFICATION AND VALIDATION WORKSHOPS (ICSTW), 2016, : 1 - 6
  • [26] Stochastic Language Generation in Dialogue Using Factored Language Models
    Mairesse, Francois
    Young, Steve
    COMPUTATIONAL LINGUISTICS, 2014, 40 (04) : 763 - 799
  • [27] A review of the generation of requirements specification in natural language using objects UML models and domain ontology
    Abdalazeim, Alaa
    Meziane, Farid
    AI IN COMPUTATIONAL LINGUISTICS, 2021, 189 : 328 - 334
  • [28] Survey on Spell Checker for Tamil Language Using Natural Language Processing
    Selvaraj, P. A.
    Jagadeesan, M.
    Harikrishnan, M.
    Vijayapriya, R.
    Jayasudha, K.
    JOURNAL OF PHARMACEUTICAL NEGATIVE RESULTS, 2022, 13 : 170 - 174
  • [29] Neural Natural Language Generation: A Survey on Multilinguality, Multimodality, Controllability and Learning
    Erdem, Erkut
    Kuyu, Menekse
    Yagcioglu, Semih
    Frank, Anette
    Parcalabescu, Letitia
    Babii, Andrii
    Turuta, Oleksii
    Erdem, Aykut
    Calixto, Iacer
    Plank, Barbara
    Lloret, Elena
    Apostol, Elena-Simona
    Truicǎ, Ciprian-Octavian
    Šandrih, Branislava
    Martinčić-Ipšić, Sanda
    Berend, Gábor
    Gatt, Albert
    Korvel, Gražina
    Journal of Artificial Intelligence Research, 2022, 73 : 1131 - 1207