Classification of Traditional Chinese Medicine Cases based on Character-level Bert and Deep Learning

Cited: 0
Authors
Song, Zihao [1 ]
Xie, Yonghong [1 ]
Huang, Wen [1 ]
Wang, Haoyu [1 ]
Affiliations
[1] Univ Sci & Technol Beijing, Sch Comp & Commun Engn, Beijing, Peoples R China
Keywords
Text Classification; Medical Cases; Bert; Convolutional Neural Network; Recurrent Neural Network;
DOI
10.1109/itaic.2019.8785612
CLC Number
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104; 0812; 0835; 1405;
Abstract
As part of China's traditional culture, Traditional Chinese Medicine (TCM) has received increasing attention. TCM medical cases, a valuable asset inherited from ancient times, carry the core knowledge of TCM. Accurate medical case classification is essential both for establishing a correct case-based diagnosis and treatment system and for building medical assistance systems. This paper proposes a new model to classify medical cases effectively. First, a multi-layer semantic expansion method enriches the semantic information of TCM medical cases at the instance layer and the attribute layer. Then, a character-level Bidirectional Encoder Representations from Transformers (Bert) model is used as the language model for text representation of the medical cases, and its output serves as the input to the deep learning models. Finally, an optimized Text-Convolutional Neural Network (Text-CNN) model classifies the TCM medical cases, and the reliability and accuracy of the whole model are verified by comparison with other text representation and classification methods.
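The pipeline the abstract describes (character-level Bert embeddings feeding a Text-CNN classifier) can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes the Hugging Face bert-base-chinese checkpoint (which tokenizes Chinese text one character per token) as the character-level encoder, and the filter sizes, filter count, and NUM_CLASSES are placeholder values.

# Minimal sketch: character-level BERT encoder feeding a Text-CNN classifier.
# bert-base-chinese stands in for the paper's character-level Bert model;
# filter sizes, channel counts, and NUM_CLASSES are illustrative assumptions.
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizer

NUM_CLASSES = 5  # assumed number of medical-case categories

class BertTextCNN(nn.Module):
    def __init__(self, num_classes=NUM_CLASSES, filter_sizes=(2, 3, 4), n_filters=100):
        super().__init__()
        self.bert = BertModel.from_pretrained("bert-base-chinese")
        hidden = self.bert.config.hidden_size  # 768
        # One 1-D convolution per filter size over the character sequence.
        self.convs = nn.ModuleList(
            [nn.Conv1d(hidden, n_filters, k) for k in filter_sizes]
        )
        self.fc = nn.Linear(n_filters * len(filter_sizes), num_classes)

    def forward(self, input_ids, attention_mask):
        # (batch, seq_len, hidden): contextual embedding per character
        h = self.bert(input_ids=input_ids, attention_mask=attention_mask).last_hidden_state
        h = h.transpose(1, 2)  # (batch, hidden, seq_len) as Conv1d expects
        # Convolve, apply ReLU, then max-pool over time for each filter size.
        pooled = [torch.relu(conv(h)).max(dim=2).values for conv in self.convs]
        return self.fc(torch.cat(pooled, dim=1))  # class logits

tokenizer = BertTokenizer.from_pretrained("bert-base-chinese")
model = BertTextCNN()
batch = tokenizer(["患者咳嗽三日，舌苔薄白，脉浮紧。"],  # example TCM case text
                  return_tensors="pt", padding=True, truncation=True)
logits = model(batch["input_ids"], batch["attention_mask"])
print(logits.shape)  # torch.Size([1, NUM_CLASSES])

Training such a model end to end with a cross-entropy loss over these logits would reproduce the general setup the abstract compares against other text representation and classification methods.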
Pages: 1383 - 1387
Page count: 5
Related Papers
50 entries in total
  • [41] Hybrid Attention for Chinese Character-Level Neural Machine Translation
    Wang, Feng
    Chen, Wei
    Yang, Zhen
    Xu, Shuang
    Xu, Bo
    NEUROCOMPUTING, 2019, 358 : 44 - 52
  • [42] CharCaps: Character-Level Text Classification Using Capsule Networks
    Wu, Yujia
    Guo, Xin
    Zhan, Kangning
    ADVANCED INTELLIGENT COMPUTING TECHNOLOGY AND APPLICATIONS, ICIC 2023, PT II, 2023, 14087 : 187 - 198
  • [43] Off-line Chinese writer identification based on character-level decision combination
    Deng, Wei
    Chen, Qinghu
    Yan, Yucheng
    Wan, Chunxiao
    2008 INTERNATIONAL SYMPOSIUM ON INFORMATION PROCESSING AND 2008 INTERNATIONAL PACIFIC WORKSHOP ON WEB MINING AND WEB-BASED APPLICATION, 2008, : 762 - 765
  • [44] An Improved Deep Learning Model: S-TextBLCNN for Traditional Chinese Medicine Formula Classification
    Cheng, Ning
    Chen, Yue
    Gao, Wanqing
    Liu, Jiajun
    Huang, Qunfu
    Yan, Cheng
    Huang, Xindi
    Ding, Changsong
    FRONTIERS IN GENETICS, 2021, 12
  • [45] Investigating an effective character-level embedding in Korean sentence classification
    Cho, Won Ik
    Kim, Seok Min
    Kim, Nam Soo
arXiv, 2019
  • [46] Improving Named Entity Recognition in Vietnamese Texts by a Character-Level Deep Lifelong Learning Model
    Ngoc-Vu Nguyen
    Thi-Lan Nguyen
    Cam-Van Nguyen Thi
    Mai-Vu Tran
    Tri-Thanh Nguyen
    Quang-Thuy Ha
    VIETNAM JOURNAL OF COMPUTER SCIENCE, 2019, 6 (04) : 471 - 487
  • [47] Crowdsourcing the character of a place: Character-level convolutional networks for multilingual geographic text classification
    Adams, Benjamin
    McKenzie, Grant
    TRANSACTIONS IN GIS, 2018, 22 (02) : 394 - 408
  • [48] Learning Character-level Representations for Part-of-Speech Tagging
    dos Santos, Cicero Nogueira
    Zadrozny, Bianca
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 32 (CYCLE 2), 2014, 32 : 1818 - 1826
  • [49] A Character-Level Sequence-to-Sequence Method for Subtitle learning
    Zhang, Haijun
    Li, Jingxuan
    Ji, Yuzhu
    Yue, Heng
    2016 IEEE 14TH INTERNATIONAL CONFERENCE ON INDUSTRIAL INFORMATICS (INDIN), 2016, : 780 - 783
  • [50] Natural Language Enhancement for English Teaching Using Character-Level Recurrent Neural Network with Back Propagation Neural Network based Classification by Deep Learning Architectures
    Yang, Zhiling
    JOURNAL OF UNIVERSAL COMPUTER SCIENCE, 2022, 28 (08) : 984 - 1000