LEARNING LATENT VARIABLE GRAMMARS FROM COMPLEMENTARY PERSPECTIVES

Cited: 0
Authors
Li, Dongchen [1 ]
Zhang, Xiantao [1 ]
Wu, Xihong [1 ]
Affiliations
[1] Peking Univ, Speech & Hearing Res Ctr, Key Lab Machine Percept & Intelligence, Beijing 100871, Peoples R China
Keywords
Parsing; sparsity; PCFGLA
DOI: none
Chinese Library Classification: TP (Automation and Computer Technology)
Discipline code: 0812
Abstract
The corpus used to train a parser consists of sentences with heterogeneous grammar usages. Previous work on parser domain adaptation has concentrated on shifts in vocabulary rather than in grammar usage. In this paper, we focus on exploiting the diversity of the training data separately and then accumulating the resulting advantages. We propose an approach in which each grammar is biased toward a relevant syntactic style, and the complementary grammar usages are combined for inference. Multiple grammars with partly complementary strengths are induced individually. They capture complementary representations of the data, and we accumulate their advantages in a joint model that assembles their complementary descriptive power. Despite its compatibility with many other methods, our product model achieves an 85.20% F1 score on the Penn Chinese Treebank, higher than previous systems.
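The product model described in the abstract scores each candidate parse under every individually trained grammar and multiplies the probabilities (equivalently, sums the log-probabilities), so a parse must be plausible under all complementary grammars to win. A minimal sketch of that combination step, with hypothetical names and toy log-probability tables standing in for the actual PCFG-LA grammars:

```python
import math

def product_model_score(candidate, grammars):
    # Product of per-grammar probabilities = sum of log-probabilities.
    # Each grammar is modeled here as a function mapping a candidate
    # parse to its log-probability under that grammar.
    return sum(g(candidate) for g in grammars)

def best_parse(candidates, grammars):
    # Select the candidate with the highest joint (product-model) score.
    return max(candidates, key=lambda c: product_model_score(c, grammars))

# Toy demo: two "grammars" as log-probability lookups over parse labels.
g1 = {"parse_a": math.log(0.6), "parse_b": math.log(0.4)}.get
g2 = {"parse_a": math.log(0.3), "parse_b": math.log(0.7)}.get
print(best_parse(["parse_a", "parse_b"], [g1, g2]))  # → parse_b
```

Here "parse_b" wins because its joint probability (0.4 × 0.7 = 0.28) exceeds that of "parse_a" (0.6 × 0.3 = 0.18), even though the first grammar alone prefers "parse_a"; this is how complementary strengths are accumulated at inference time.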
Pages: 124-128 (5 pages)