Natural Language Generation Using Sequential Models: A Survey

Cited by: 7
Authors
Pandey, Abhishek Kumar [1 ]
Roy, Sanjiban Sekhar [1 ]
Affiliations
[1] Vellore Inst Technol, Sch Comp Sci & Engn, Vellore 632014, Vellore, India
Keywords
Natural language processing; Long short-term memory; Natural language generation; Recurrent neural network; Sequential generative model; Story generation; Text generation
DOI
10.1007/s11063-023-11281-6
CLC Classification
TP18 [Artificial intelligence theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Natural Language Generation (NLG) is one of the most critical yet challenging tasks among Natural Language Processing applications. It automates the production of text whose meaning humans can understand. A number of research articles in the literature have described how NLG can produce understandable text in various languages. Sequence-to-sequence modeling powered by deep learning techniques such as Long Short-Term Memory networks, Recurrent Neural Networks, and Gated Recurrent Units has become popular for text generation. This survey provides a comprehensive overview of text generation and its related techniques, including statistical, traditional, and neural network-based approaches. Generating text with a sequence-to-sequence model is not a simple task, as the model must handle both continuous data, such as images, and discrete information, such as text. Therefore, in this study we identify crucial areas for further research on text generation, such as incorporating large text datasets, identifying and resolving grammatical errors, and generating long sentences or paragraphs. This work also presents a detailed overview of the activation functions used in deep learning-based models and of the evaluation metrics used for text generation.
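The abstract highlights LSTM-based generators and the activation functions used inside them. As a minimal illustrative sketch (not the survey's own formulation), the snippet below shows one scalar LSTM cell step and where the sigmoid and tanh activations appear; all weight values are arbitrary assumptions chosen for demonstration.

```python
import math

# Activation functions commonly used in deep-learning text generators.
def sigmoid(x):
    # Squashes input to (0, 1); used for the LSTM gates.
    return 1.0 / (1.0 + math.exp(-x))

def relu(x):
    # max(0, x); common in feed-forward layers.
    return max(0.0, x)

def lstm_cell_step(x, h_prev, c_prev, w):
    """One scalar LSTM step.

    `w` maps a gate name to an (input weight, recurrent weight, bias)
    triple. All weights here are illustrative assumptions, not trained
    values.
    """
    f = sigmoid(w["f"][0] * x + w["f"][1] * h_prev + w["f"][2])    # forget gate
    i = sigmoid(w["i"][0] * x + w["i"][1] * h_prev + w["i"][2])    # input gate
    g = math.tanh(w["g"][0] * x + w["g"][1] * h_prev + w["g"][2])  # candidate state
    o = sigmoid(w["o"][0] * x + w["o"][1] * h_prev + w["o"][2])    # output gate
    c = f * c_prev + i * g        # new cell state
    h = o * math.tanh(c)          # new hidden state
    return h, c

# Toy weights: same (0.5, 0.5, 0.0) triple for every gate.
weights = {k: (0.5, 0.5, 0.0) for k in ("f", "i", "g", "o")}
h, c = lstm_cell_step(1.0, 0.0, 0.0, weights)
```

In a real generator this step runs over weight matrices rather than scalars, once per token, and the hidden state `h` feeds a softmax over the vocabulary to pick the next word.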
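The abstract also mentions evaluation metrics for generated text. A simplified single-reference BLEU-style score (clipped n-gram precision combined by geometric mean, with a brevity penalty) can be sketched as follows; this is a teaching approximation, not the full multi-reference corpus-level BLEU.

```python
import math
from collections import Counter

def ngrams(tokens, n):
    # All contiguous n-grams of a token list, as tuples.
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def bleu(candidate, reference, max_n=2):
    """Simplified sentence-level BLEU against a single reference."""
    log_prec = 0.0
    for n in range(1, max_n + 1):
        cand = Counter(ngrams(candidate, n))
        ref = Counter(ngrams(reference, n))
        # Clip each candidate n-gram count by its count in the reference.
        clipped = sum(min(cnt, ref[g]) for g, cnt in cand.items())
        total = max(sum(cand.values()), 1)
        if clipped == 0:
            return 0.0
        log_prec += math.log(clipped / total) / max_n
    # Brevity penalty discourages overly short candidates.
    bp = 1.0 if len(candidate) > len(reference) else math.exp(
        1 - len(reference) / len(candidate))
    return bp * math.exp(log_prec)

cand = "the cat sat on the mat".split()
ref = "the cat is on the mat".split()
score = bleu(cand, ref)  # clipped 1- and 2-gram precision, geometric mean
```

Metrics of this family reward n-gram overlap with a reference, which is one reason the survey also discusses metrics beyond BLEU for open-ended generation such as stories.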
Pages: 7709-7742
Number of pages: 34
Related Papers
50 records in total
  • [11] Planning-Based Models of Natural Language Generation
    Garoufi, Konstantina
    [J]. LANGUAGE AND LINGUISTICS COMPASS, 2014, 8 (01): 1 - 10
  • [12] Automatic Generation of SBML Kinetic Models from Natural Language Texts Using GPT
    Maeda, Kazuhiro
    Kurata, Hiroyuki
    [J]. INTERNATIONAL JOURNAL OF MOLECULAR SCIENCES, 2023, 24 (08)
  • [13] A Survey of Natural Language-Based Editing of Low-Code Applications Using Large Language Models
    Gorissen, Simon Cornelius
    Sauer, Stefan
    Beckmann, Wolf G.
    [J]. HUMAN-CENTERED SOFTWARE ENGINEERING, HCSE 2024, 2024, 14793 : 243 - 254
  • [14] Natural language generation from Universal Dependencies using data augmentation and pre-trained language models
    Nguyen D.T.
    Tran T.
    [J]. International Journal of Intelligent Information and Database Systems, 2023, 16 (01) : 89 - 105
  • [15] Generation of Oracles using Natural Language Processing
    Leong, Iat Tou
    Barbosa, Raul
    [J]. 2021 28TH ASIA-PACIFIC SOFTWARE ENGINEERING CONFERENCE WORKSHOPS (APSECW 2021), 2021, : 25 - 31
  • [16] An Interactive Scene Generation Using Natural Language
    Cheng, Yu
    Shi, Yan
    Sun, Zhiyong
    Feng, Dezhi
    Dong, Lixin
    [J]. 2019 INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA), 2019, : 6957 - 6963
  • [17] Natural Language Dataset Generation Framework for Visualizations Powered by Large Language Models
    Ko, Hyung-Kwon
    Jeon, Hyeon
    Park, Gwanmo
    Kim, Dae Hyun
    Kim, Nam Wook
    Kim, Juho
    Seo, Jinwook
    [J]. PROCEEDINGS OF THE 2024 CHI CONFERENCE ON HUMAN FACTORS IN COMPUTING SYSTEMS (CHI 2024), 2024,
  • [18] Pre-trained models for natural language processing: A survey
    Qiu XiPeng
    Sun TianXiang
    Xu YiGe
    Shao YunFan
    Dai Ning
    Huang XuanJing
    [J]. SCIENCE CHINA-TECHNOLOGICAL SCIENCES, 2020, 63 (10) : 1872 - 1897