Natural Language Generation Using Sequential Models: A Survey

Cited by: 0
Authors
Abhishek Kumar Pandey
Sanjiban Sekhar Roy
Affiliations
[1] School of Computer Science and Engineering, Vellore Institute of Technology, Vellore
Source
Neural Processing Letters | 2023, Vol. 55
Keywords
Natural language processing; Long short-term memory; Natural language generation; Recurrent neural network; Sequential generative model; Story generation
DOI
Not available
Abstract
Natural Language Generation (NLG) is one of the most critical yet challenging tasks among Natural Language Processing applications. It automates the production of text whose meaning humans can understand. A handful of research articles in the literature have described how NLG can produce understandable texts in various languages. Sequence-to-sequence modeling powered by deep learning techniques such as Long Short-Term Memory, Recurrent Neural Networks, and Gated Recurrent Units has gained considerable popularity for text generation. This survey provides a comprehensive overview of text generation and its related techniques, including statistical, traditional, and neural network-based approaches. Generating text with a sequence-to-sequence model is not a simple task, as the model may need to handle both continuous data, such as images, and discrete data, such as text. Therefore, this study identifies several crucial areas for further research on text generation, such as incorporating large text datasets, identifying and resolving grammatical errors, and generating long sentences or paragraphs. This work also presents a detailed overview of the activation functions used in deep learning-based models and of the evaluation metrics used for text generation.
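The recurrent generators the survey discusses share a common shape: embed the previous tokens, pass them through an LSTM or GRU, and map the hidden state to a probability distribution over the vocabulary with a softmax activation. The sketch below is an illustration only, not code from the paper; the layer sizes, toy vocabulary, and greedy decoding loop are assumptions made purely for demonstration.

    # Minimal word-level LSTM generator (illustrative sketch, not from the paper).
    import torch
    import torch.nn as nn

    class LSTMGenerator(nn.Module):
        def __init__(self, vocab_size, embed_dim=64, hidden_dim=128):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, embed_dim)
            self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
            self.out = nn.Linear(hidden_dim, vocab_size)  # logits over the vocabulary

        def forward(self, tokens, state=None):
            emb = self.embed(tokens)                # (batch, seq, embed_dim)
            output, state = self.lstm(emb, state)   # (batch, seq, hidden_dim)
            return self.out(output), state

    # Toy vocabulary and greedy decoding loop (assumed for illustration only).
    vocab = ["<bos>", "the", "cat", "sat", "on", "mat", "<eos>"]
    model = LSTMGenerator(vocab_size=len(vocab))

    tokens, state, generated = torch.tensor([[0]]), None, []   # start from <bos>
    for _ in range(10):
        logits, state = model(tokens, state)
        probs = torch.softmax(logits[:, -1], dim=-1)            # softmax activation
        tokens = probs.argmax(dim=-1, keepdim=True)             # greedy decoding
        if tokens.item() == vocab.index("<eos>"):
            break
        generated.append(vocab[tokens.item()])
    print(" ".join(generated))

In practice such a model is trained on a large corpus with a cross-entropy objective, and its generated text is scored with metrics such as BLEU or ROUGE, which the survey reviews alongside the activation functions used in these architectures.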
Pages: 7709 - 7742
Number of pages: 33
Related papers (50 total)
  • [31] RECENT ADVANCES IN NATURAL LANGUAGE GENERATION: A SURVEY AND CLASSIFICATION OF THE EMPIRICAL LITERATURE
    Perera, Rivindu
    Nand, Parma
    COMPUTING AND INFORMATICS, 2017, 36 (01) : 1 - 32
  • [32] Bridging the Gap: A Survey on Integrating (Human) Feedback for Natural Language Generation
    Fernandes, Patrick
    Madaan, Aman
    Liu, Emmy
    Farinhas, Antonio
    Martins, Pedro Henrique
    Bertsch, Amanda
    de Souza, Jose G. C.
    Zhou, Shuyan
    Wu, Tongshuang
    Neubig, Graham
    Martins, Andre F. T.
    TRANSACTIONS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, 2023, 11 : 1643 - 1668
  • [33] A Survey on Using Gaze Behaviour for Natural Language Processing
    Mathias, Sandeep
    Kanojia, Diptesh
    Mishra, Abhijit
    Bhattacharyya, Pushpak
    PROCEEDINGS OF THE TWENTY-NINTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2020, : 4907 - 4913
  • [34] Using natural language generation in automatic route description
    Dale, R
    Geldof, S
    Prost, JP
    JOURNAL OF RESEARCH AND PRACTICE IN INFORMATION TECHNOLOGY, 2005, 37 (01): 89 - 105
  • [35] Route description using natural language generation technology
    Zhang, XueYing
    INFORMATION RETRIEVAL TECHNOLOGY, 2008, 4993 : 454 - 459
  • [36] Validation of concept representation using natural language generation
    Baud, RH
    Rodrigues, JM
    Wagner, JC
    Rassinoux, AM
    Lovis, C
    Rush, P
    Trombert-Paviot, B
    Scherrer, JR
    JOURNAL OF THE AMERICAN MEDICAL INFORMATICS ASSOCIATION, 1997, : 841 - 841
  • [37] A Survey of Models for Constructing Text Features to Classify Texts in Natural Language
    Lagutina, Ksenia
    Lagutina, Nadezhda
    PROCEEDINGS OF THE 2021 29TH CONFERENCE OF OPEN INNOVATIONS ASSOCIATION (FRUCT), VOL 1, 2021, : 222 - 233
  • [38] A Survey of Controllable Text Generation Using Transformer-based Pre-trained Language Models
    Zhang, Hanqing
    Song, Haolin
    Li, Shaoyu
    Zhou, Ming
    Song, Dawei
    ACM COMPUTING SURVEYS, 2024, 56 (03)
  • [39] Using Language Models for Enhancing the Completeness of Natural-Language Requirements
    Luitel, Dipeeka
    Hassani, Shabnam
    Sabetzadeh, Mehrdad
    REQUIREMENTS ENGINEERING: FOUNDATION FOR SOFTWARE QUALITY, REFSQ 2023, 2023, 13975 : 87 - 104
  • [40] Highly-Inflected Language Generation Using Factored Language Models
    de Novais, Eder Miranda
    Paraboni, Ivandre
    Ferreira, Diogo Takaki
    COMPUTATIONAL LINGUISTICS AND INTELLIGENT TEXT PROCESSING, PT I, 2011, 6608 : 429 - 438