Sequence-to-sequence Models for Cache Transition Systems

Cited by: 0
Authors
Peng, Xiaochang [1 ]
Song, Linfeng [1 ]
Gildea, Daniel [1 ]
Satta, Giorgio [2 ]
Affiliations
[1] Univ Rochester, Rochester, NY 14627 USA
[2] Univ Padua, Padua, Italy
DOI: none available
CLC classification: TP39 [Applications of computers]
Discipline codes: 081203; 0835
Abstract
In this paper, we present a sequence-to-sequence approach for mapping natural language sentences to AMR semantic graphs. We transform the sequence-to-graph mapping problem into a word-sequence-to-transition-action-sequence problem using a special transition system called a cache transition system. To address the sparsity issue of neural AMR parsing, we feed feature embeddings from the transition state into the decoder to provide relevant local information at each decoder step. We present a monotonic hard attention model for the transition framework to handle the strictly left-to-right alignment between each transition state and the current buffer input focus. We evaluate our neural transition model on the AMR parsing task; our parser outperforms other sequence-to-sequence approaches and achieves competitive results in comparison with the best-performing models.
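The monotonic hard attention described in the abstract can be illustrated with a minimal sketch (this is an assumption-laden toy, not the authors' implementation): instead of soft attention over all encoder states, each decoder step reads exactly one encoder state, the one at the current buffer focus, and the focus advances strictly left to right, only when a shift-style action consumes a word. The names `decode`, `choose_action`, and the action label `"SHIFT"` are hypothetical.

```python
# Hedged sketch of monotonic hard attention in a transition-based decoder.
# encoder_states: one vector (here, any object) per input word.
# choose_action: a stand-in for the trained decoder; maps the attended
#                state to a transition action string.

def decode(encoder_states, choose_action):
    focus = 0          # buffer focus pointer; moves strictly left to right
    actions = []
    while focus < len(encoder_states):
        context = encoder_states[focus]   # hard attention: a single state
        action = choose_action(context)
        actions.append(action)
        if action == "SHIFT":             # only SHIFT advances the focus
            focus += 1
    return actions
```

With a dummy policy that always shifts, the decoder emits one SHIFT per input word and terminates; a real policy would interleave arc-building actions between shifts while the focus stays put.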
Pages: 1842-1852 (11 pages)