Research on Chinese Intent Recognition Based on BERT pre-trained model

Cited by: 1
Authors
Zhang, Pan [1 ]
Huang, Li [1 ]
Affiliations
[1] Wuhan Univ Sci & Technol, Sch Comp Sci & Technol, Key Lab Intelligent Informat Proc & Real Time Ind, Wuhan, Hubei, Peoples R China
Keywords
Intent recognition; BERT pre-trained model; Deep learning;
DOI
10.1145/3395260.3395274
CLC classification number
TP18 [Artificial intelligence theory];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Intent recognition is a sub-task of natural language understanding and plays an important role in it. The accuracy of intent recognition directly affects the performance of semantic slot filling and the choice of dataset, and it influences subsequent research on dialogue systems. Because text representations are highly diverse, traditional machine learning methods have been unable to accurately capture the deep meaning of user texts. This paper uses a deep-learning BERT model pre-trained on Chinese text and adds a linear classification layer on top of it; the pre-trained model is then fine-tuned on the downstream classification task so that the entire model jointly maximizes downstream performance. Domain intent classification experiments are performed on the Chinese THUCNews text dataset. Compared with recurrent neural network (RNN) and convolutional neural network (CNN) methods, this method improves performance by about 3 percentage points. Experimental results show that the BERT pre-trained model provides better accuracy and recall for domain intent classification of Chinese news text.
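The setup described in the abstract, a Chinese BERT encoder with a single linear classification layer fine-tuned end to end on the downstream classification task, can be sketched as follows. This is a minimal illustration, not the authors' code: it assumes the Hugging Face Transformers and PyTorch libraries, the public bert-base-chinese checkpoint, and placeholder texts, labels, label count, and hyperparameters rather than the paper's actual THUCNews preprocessing.

import torch
from transformers import BertTokenizer, BertForSequenceClassification

NUM_CLASSES = 10  # assumption: number of THUCNews domain categories used

tokenizer = BertTokenizer.from_pretrained("bert-base-chinese")
# BertForSequenceClassification places one linear layer over the pooled [CLS]
# vector, matching the "BERT + linear classifier" setup in the abstract.
model = BertForSequenceClassification.from_pretrained(
    "bert-base-chinese", num_labels=NUM_CLASSES
)

# Illustrative mini-batch of Chinese news snippets with placeholder domain labels.
texts = ["央行宣布下调存款准备金率", "国足公布世预赛大名单"]
labels = torch.tensor([0, 1])  # e.g. 0 = finance, 1 = sports (hypothetical ids)

batch = tokenizer(texts, padding=True, truncation=True,
                  max_length=128, return_tensors="pt")

optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)

# One fine-tuning step: cross-entropy loss over the intent classes updates
# the BERT encoder and the linear head jointly.
model.train()
outputs = model(**batch, labels=labels)
outputs.loss.backward()
optimizer.step()
optimizer.zero_grad()

# At inference time the predicted domain intent is the argmax over the logits.
model.eval()
with torch.no_grad():
    predicted_intents = model(**batch).logits.argmax(dim=-1)

In this sketch the linear head and the number of classes are supplied through num_labels; whether the encoder is fully fine-tuned or partially frozen is a training choice the abstract does not specify, so all parameters are updated here.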
Pages: 128 - 132
Number of pages: 5
Related papers
50 records in total
  • [1] Chinese Grammatical Correction Using BERT-based Pre-trained Model
    Wang, Hongfei
    Kurosawa, Michiki
    Katsumata, Satoru
    Komachi, Mamoru
    [J]. 1ST CONFERENCE OF THE ASIA-PACIFIC CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS AND THE 10TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING (AACL-IJCNLP 2020), 2020, : 163 - 168
  • [2] Patent classification with pre-trained Bert model
    Kahraman, Selen Yuecesoy
    Durmusoglu, Alptekin
    Dereli, Tuerkay
    [J]. JOURNAL OF THE FACULTY OF ENGINEERING AND ARCHITECTURE OF GAZI UNIVERSITY, 2024, 39 (04): : 2485 - 2496
  • [3] Predictive Recognition of DNA-binding Proteins Based on Pre-trained Language Model BERT
    Ma, Yue
    Pei, Yongzhen
    Li, Changguo
    [J]. JOURNAL OF BIOINFORMATICS AND COMPUTATIONAL BIOLOGY, 2023, 21 (06)
  • [4] miProBERT: identification of microRNA promoters based on the pre-trained model BERT
    Wang, Xin
    Gao, Xin
    Wang, Guohua
    Li, Dan
    [J]. BRIEFINGS IN BIOINFORMATICS, 2023, 24 (03)
  • [5] CANCN-BERT: A Joint Pre-Trained Language Model for Classical and Modern Chinese
    Ji, Zijing
    Wang, Xin
    Shen, Yuxin
    Rao, Guozheng
    [J]. PROCEEDINGS OF THE 30TH ACM INTERNATIONAL CONFERENCE ON INFORMATION & KNOWLEDGE MANAGEMENT, CIKM 2021, 2021, : 3112 - 3116
  • [6] Using Pre-trained Deeply Contextual Model BERT for Russian Named Entity Recognition
    Mukhin, Eugeny
    [J]. ANALYSIS OF IMAGES, SOCIAL NETWORKS AND TEXTS (AIST 2019), 2020, 1086 : 167 - 173
  • [7] BERT-siRNA: siRNA target prediction based on BERT pre-trained interpretable model
    Xu, Jiayu
    Xu, Nan
    Xie, Weixin
    Zhao, Chengkui
    Yu, Lei
    Feng, Weixing
    [J]. GENE, 2024, 910
  • [8] A Hybrid Neural Network BERT-Cap Based on Pre-Trained Language Model and Capsule Network for User Intent Classification
    Liu, Hai
    Liu, Yuanxia
    Wong, Leung-Pun
    Lee, Lap-Kei
    Hao, Tianyong
    [J]. COMPLEXITY, 2020, 2020
  • [9] Entity Recognition for Chinese Hazardous Chemical Accident Data Based on Rules and a Pre-Trained Model
    Dai, Hui
    Zhu, Mu
    Yuan, Guan
    Niu, Yaowei
    Shi, Hongxing
    Chen, Boxuan
    [J]. APPLIED SCIENCES-BASEL, 2023, 13 (01):
  • [10] Leveraging Pre-trained BERT for Audio Captioning
    Liu, Xubo
    Mei, Xinhao
    Huang, Qiushi
    Sun, Jianyuan
    Zhao, Jinzheng
    Liu, Haohe
    Plumbley, Mark D.
    Kilic, Volkan
    Wang, Wenwu
    [J]. 2022 30TH EUROPEAN SIGNAL PROCESSING CONFERENCE (EUSIPCO 2022), 2022, : 1145 - 1149