Character-Level Quantum Mechanical Approach for a Neural Language Model

Cited by: 0
Authors
Zhihao Wang
Min Ren
Xiaoyan Tian
Xia Liang
Institutions
[1] Shandong University of Finance and Economics,School of Management Science and Engineering
[2] Shandong University of Finance and Economics,School of Mathematics and Quantitative Economics
[3] Shandong Police College
Keywords
Character-level; Quantum theory; Network-in-network; Language model;
DOI: not available
Abstract
This article proposes a character-level neural language model (NLM) based on quantum theory. The model's input is a character-level encoding represented in a quantum semantic space model. Our model integrates a convolutional neural network (CNN) based on the network-in-network (NIN) architecture. We assessed the model's effectiveness through extensive experiments on the English-language Penn Treebank dataset. The experimental results confirm that the quantum semantic inputs work well for language models; for example, the perplexity (PPL) of our model is 10%–30% lower than that of state-of-the-art models, while it retains a relatively small number of parameters (i.e., 6M).
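Perplexity (PPL), the metric cited in the abstract, is the exponential of the average negative log-likelihood a model assigns to each ground-truth character. A minimal sketch in NumPy (the function name and toy probabilities are illustrative, not taken from the paper):

```python
import numpy as np

def perplexity(probs):
    """Perplexity from the probabilities a model assigned to each
    ground-truth character: exp of the mean negative log-likelihood."""
    probs = np.asarray(probs, dtype=float)
    return float(np.exp(-np.mean(np.log(probs))))

# A model assigning probability 0.25 to every character scores PPL 4;
# a lower PPL means the model is less "surprised" by the test text.
print(perplexity([0.25, 0.25, 0.25, 0.25]))  # → 4.0
```

Lower PPL is better, which is why the abstract reports a 10%–30% reduction relative to prior models.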
Pages: 1613–1621
Page count: 8
Related Papers
50 results
  • [1] Character-Level Quantum Mechanical Approach for a Neural Language Model
    Wang, Zhihao
    Ren, Min
    Tian, Xiaoyan
    Liang, Xia
    [J]. INTERNATIONAL JOURNAL OF COMPUTATIONAL INTELLIGENCE SYSTEMS, 2019, 12 (02) : 1613 - 1621
  • [2] Correcting writing errors in Turkish with a character-level neural language model
    Benligiray, Burak
    [J]. 2018 26TH SIGNAL PROCESSING AND COMMUNICATIONS APPLICATIONS CONFERENCE (SIU), 2018,
  • [3] Character-Level Neural Language Modelling in the Clinical Domain
    Kreuzthaler, Markus
    Oleynik, Michel
    Schulz, Stefan
    [J]. DIGITAL PERSONALIZED HEALTH AND MEDICINE, 2020, 270 : 83 - 87
  • [4] CHARACTER-LEVEL LANGUAGE MODELING WITH HIERARCHICAL RECURRENT NEURAL NETWORKS
    Hwang, Kyuyeon
    Sung, Wonyong
    [J]. 2017 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2017, : 5720 - 5724
  • [5] Character-level Language Modeling with Gated Hierarchical Recurrent Neural Networks
    Choi, Iksoo
    Park, Jinhwan
    Sung, Wonyong
    [J]. 19TH ANNUAL CONFERENCE OF THE INTERNATIONAL SPEECH COMMUNICATION ASSOCIATION (INTERSPEECH 2018), VOLS 1-6: SPEECH RESEARCH FOR EMERGING MARKETS IN MULTILINGUAL SOCIETIES, 2018, : 411 - 415
  • [6] A Character-Level Restoration of Sukhothai Inscriptions Using The Masked Language Model
    Tongkhum, Sujitra
    Sinthupinyo, Sukree
    [J]. 2023 18TH INTERNATIONAL JOINT SYMPOSIUM ON ARTIFICIAL INTELLIGENCE AND NATURAL LANGUAGE PROCESSING, ISAI-NLP, 2023,
  • [7] Character Eyes: Seeing Language through Character-Level Taggers
    Pinter, Yuval
    Marone, Marc
    Eisenstein, Jacob
    [J]. BLACKBOXNLP WORKSHOP ON ANALYZING AND INTERPRETING NEURAL NETWORKS FOR NLP AT ACL 2019, 2019, : 95 - 102
  • [8] Neural Character-Level Syntactic Parsing for Chinese
    Li, Zuchao
    Zhou, Junru
    Zhao, Hai
    Zhang, Zhisong
    Li, Haonan
    Ju, Yuqi
    [J]. JOURNAL OF ARTIFICIAL INTELLIGENCE RESEARCH, 2022, 73 : 461 - 509
  • [9] Neural Character-Level Dependency Parsing for Chinese
    Li, Haonan
    Zhang, Zhisong
    Ju, Yuqi
    Zhao, Hai
    [J]. THIRTY-SECOND AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE / THIRTIETH INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE / EIGHTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2018, : 5205 - 5212