Plan-then-Generate: Controlled Data-to-Text Generation via Planning

Cited by: 0
Authors
Su, Yixuan [1 ,2 ]
Vandyke, David [2 ]
Wang, Sihui [2 ]
Fang, Yimai [2 ]
Collier, Nigel [1 ]
Affiliations
[1] Univ Cambridge, Language Technol Lab, Cambridge, England
[2] Apple, Cupertino, CA USA
Keywords
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Recent developments in neural networks have advanced data-to-text generation. However, the limited ability of neural models to control the structure of the generated output can be restrictive in certain real-world applications. In this study, we propose a novel Plan-then-Generate (PlanGen) framework to improve the controllability of neural data-to-text models. Extensive experiments and analyses are conducted on two benchmark datasets, ToTTo and WebNLG. The results show that our model is able to control both the intra-sentence and inter-sentence structure of the generated output. Furthermore, empirical comparisons against previous state-of-the-art methods show that our model improves generation quality as well as output diversity, as judged by both human and automatic evaluations.
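The abstract describes the framework only at a high level. As a rough illustration of the plan-then-generate idea, the minimal Python sketch below shows how an explicit content plan (an ordered list of attributes) can control the order in which a data record is verbalised. The toy record, the rule-based planner, and the template-based generator are illustrative assumptions for this sketch only, not the paper's neural models.

from typing import Dict, List

# Toy WebNLG-style record: attribute -> value (illustrative, not taken from the paper's data).
record: Dict[str, str] = {
    "name": "Blue Spice",
    "food": "French",
    "area": "riverside",
}

def make_plan(record: Dict[str, str], order: List[str]) -> List[str]:
    """Stand-in for a content planner: decide which attributes to mention and in what order."""
    return [attr for attr in order if attr in record]

def generate(record: Dict[str, str], plan: List[str]) -> str:
    """Stand-in for a plan-conditioned generator: the plan dictates the clause order,
    i.e. the structure of the realised output."""
    phrases = {
        "name": "is called {}",
        "food": "serves {} food",
        "area": "is located in the {} area",
    }
    clauses = [phrases[attr].format(record[attr]) for attr in plan]
    return "The restaurant " + " and ".join(clauses) + "."

plan = make_plan(record, order=["name", "food", "area"])
print(generate(record, plan))
# The restaurant is called Blue Spice and serves French food and is located in the riverside area.

alt_plan = make_plan(record, order=["area", "name", "food"])
print(generate(record, alt_plan))
# The restaurant is located in the riverside area and is called Blue Spice and serves French food.

Reordering the plan reorders the realisation, which is the kind of structural control the abstract refers to; in PlanGen itself both stages are realised with trained neural models rather than the hand-written rules used here.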
Pages: 895-909
Number of pages: 15
Related Papers (50 in total)
  • [1] Puduppully, Ratish; Lapata, Mirella. Data-to-text Generation with Macro Planning. Transactions of the Association for Computational Linguistics, 2021, 9: 510-527.
  • [2] Puduppully, Ratish; Fu, Yao; Lapata, Mirella. Data-to-text Generation with Variational Sequential Planning. Transactions of the Association for Computational Linguistics, 2022, 10: 697-715.
  • [3] Puduppully, Ratish; Dong, Li; Lapata, Mirella. Data-to-Text Generation with Content Selection and Planning. Thirty-Third AAAI Conference on Artificial Intelligence / Thirty-First Innovative Applications of Artificial Intelligence Conference / Ninth AAAI Symposium on Educational Advances in Artificial Intelligence, 2019: 6908-6915.
  • [4] Gao, Hanning; Wei, Zhihua. Neural Data-to-Text Generation Guided by Predicted Plan. 2022 IEEE 2nd International Conference on Information Communication and Software Engineering (ICICSE 2022), 2022: 53-59.
  • [5] Li, Tongliang; Fang, Lei; Lou, Jian-Guang; Li, Zhoujun. TWT: Table with Written Text for Controlled Data-to-Text Generation. Findings of the Association for Computational Linguistics: EMNLP 2021, 2021: 1244-1254.
  • [6] Chen, Kai; Li, Fayuan; Hu, Baotian; Peng, Weihua; Chen, Qingcai; Yu, Hong; Xiang, Yang. Neural data-to-text generation with dynamic content planning. Knowledge-Based Systems, 2021, 215.
  • [7] Wang, Mengda; Cao, Jianjun; Yu, Xu; Nie, Zibo. A Data-to-Text Generation Model with Deduplicated Content Planning. Big Data (BigData 2022), 2022, 1709: 92-103.
  • [8] Iso, Hayate; Uehara, Yui; Ishigaki, Tatsuya; Noji, Hiroshi; Aramaki, Eiji; Kobayashi, Ichiro; Miyao, Yusuke; Okazaki, Naoaki; Takamura, Hiroya. Learning to Select, Track, and Generate for Data-to-Text. 57th Annual Meeting of the Association for Computational Linguistics (ACL 2019), 2019: 2102-2113.
  • [9] Gardent, Claire. Syntax and Data-to-Text Generation. Statistical Language and Speech Processing (SLSP 2014), 2014, 8791: 3-20.
  • [10] Upadhyay, Ashish; Massie, Stewart. A Case-Based Approach for Content Planning in Data-to-Text Generation. Case-Based Reasoning Research and Development (ICCBR 2022), 2022, 13405: 380-394.