Character-Level Quantum Mechanical Approach for a Neural Language Model

Cited: 0
Authors
Wang, Zhihao [1 ]
Ren, Min [2 ]
Tian, Xiaoyan [3 ]
Liang, Xia [1 ]
Affiliations
[1] Shandong Univ Finance & Econ, Sch Management Sci & Engn, Jinan 250014, Peoples R China
[2] Shandong Univ Finance & Econ, Sch Math & Quantitat Econ, Jinan 250014, Peoples R China
[3] Shandong Police Coll, Jinan 250014, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Character-level; Quantum theory; Network-in-network; Language model; Semantic analysis; Representations
DOI
10.2991/ijcis.d.191114.001
CLC number
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
This article proposes a character-level neural language model (NLM) based on quantum theory. The model's input is a character-level encoding represented by a quantum semantic space model, and the model integrates a convolutional neural network (CNN) based on the network-in-network (NIN) architecture. We assessed the model's effectiveness through extensive experiments on the English-language Penn Treebank dataset. The experimental results confirm that quantum semantic inputs work well for language models: for example, the perplexity (PPL) of our model is 10%-30% lower than that of state-of-the-art models, while keeping a relatively small number of parameters (about 6M). (C) 2019 The Authors. Published by Atlantis Press SARL.
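The perplexity figure quoted in the abstract is the standard language-model metric: the exponential of the mean negative log-likelihood per token. As a minimal sketch (illustrative only, not the paper's evaluation code), it can be computed as follows:

```python
import math

def perplexity(neg_log_likelihoods):
    """Per-token perplexity: exp of the mean negative log-likelihood.

    `neg_log_likelihoods` holds -ln p(token | context) for each token
    in the evaluation set, as produced by any language model.
    """
    mean_nll = sum(neg_log_likelihoods) / len(neg_log_likelihoods)
    return math.exp(mean_nll)

# Sanity check: a uniform model over a 4-symbol alphabet assigns
# p = 0.25 to every token, so its perplexity equals the alphabet size.
uniform_nll = [-math.log(0.25)] * 10
print(perplexity(uniform_nll))  # → 4.0
```

Lower is better: a 10%-30% reduction in PPL, as reported here, means the model assigns substantially higher average probability to the held-out text.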
Pages: 1613-1621
Page count: 9
Related Papers
50 records in total
  • [41] Character-level neural network model based on Nadam optimization and its application in clinical concept extraction
    Li, Lantian
    Xu, Weizhi
    Yu, Hui
    NEUROCOMPUTING, 2020, 414 : 182 - 190
  • [42] Deep learning based Character-level approach for Morphological Inflection Generation
    Prasad, Vidya K.
    Premjith, B.
    Chandran, Chandni, V
    Soman, K. P.
    Poornachandran, Prabaharan
    PROCEEDINGS OF THE 2019 INTERNATIONAL CONFERENCE ON INTELLIGENT COMPUTING AND CONTROL SYSTEMS (ICCS), 2019, : 1423 - 1427
  • [43] Character-Level Chinese Dependency Parsing
    Zhang, Meishan
    Zhang, Yue
    Che, Wanxiang
    Liu, Ting
    PROCEEDINGS OF THE 52ND ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, VOL 1, 2014, : 1326 - 1336
  • [44] Applying the Transformer to Character-level Transduction
    Wu, Shijie
    Cotterell, Ryan
    Hulden, Mans
    16TH CONFERENCE OF THE EUROPEAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (EACL 2021), 2021, : 1901 - 1907
  • [45] Quasi Character-Level Transformers to Improve Neural Machine Translation on Small Datasets
    Carrion, Salvador
    Casacuberta, Francisco
    2021 EIGHTH INTERNATIONAL CONFERENCE ON SOCIAL NETWORK ANALYSIS, MANAGEMENT AND SECURITY (SNAMS), 2021, : 184 - 189
  • [46] Character-level Adversarial Examples in Arabic
    Alshemali, Basemah
    Kalita, Jugal
    20TH IEEE INTERNATIONAL CONFERENCE ON MACHINE LEARNING AND APPLICATIONS (ICMLA 2021), 2021, : 9 - 14
  • [47] Character-Level Attention Convolutional Neural Networks for Short-Text Classification
    Yin, Feiyang
    Yao, Zhilin
    Liu, Jia
    HUMAN CENTERED COMPUTING, 2019, 11956 : 560 - 567
  • [48] Character-level recurrent neural networks in practice: comparing training and sampling schemes
    De Boom, Cedric
    Demeester, Thomas
    Dhoedt, Bart
    NEURAL COMPUTING & APPLICATIONS, 2019, 31 (8): 4001 - 4017
  • [49] Enhanced character-level deep convolutional neural networks for cardiovascular disease prediction
    Zhang, Zhichang
    Qiu, Yanlong
    Yang, Xiaoli
    Zhang, Minyu
    BMC MEDICAL INFORMATICS AND DECISION MAKING, 2020, 20 (Suppl 3)