Research on Chinese Intent Recognition Based on BERT pre-trained model

Cited by: 1
Authors
Zhang, Pan [1 ]
Huang, Li [1 ]
Affiliations
[1] Wuhan Univ Sci & Technol, Sch Comp Sci & Technol, Key Lab Intelligent Informat Proc & Real Time Ind, Wuhan, Hubei, Peoples R China
Keywords
Intent recognition; BERT pre-trained model; Deep learning;
DOI
10.1145/3395260.3395274
CLC Classification Number
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Intent recognition, as a sub-task of natural language understanding, plays an important role in it. The accuracy of intent recognition directly affects semantic slot filling, the choice of dataset, and the performance of subsequent dialogue systems. Because of the diversity of text representations, traditional machine learning methods cannot accurately capture the deep meaning of user texts. This paper applies a BERT pre-trained model from deep learning to Chinese text and adds a linear classification layer on top of it; the pre-trained model is fine-tuned on the downstream classification task so that the model as a whole maximizes performance on that task. Domain intent classification experiments are conducted on the Chinese THUCNews text dataset. Compared with recurrent neural network (RNN) and convolutional neural network (CNN) methods, this method improves performance by 3 percentage points. Experimental results show that the BERT pre-trained model yields better accuracy and recall for domain intent classification on Chinese news text.
Pages: 128-132
Number of pages: 5
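
The abstract describes fine-tuning a BERT pre-trained model with a linear classification layer for Chinese domain intent classification. Below is a minimal sketch of that setup, assuming the HuggingFace transformers library and the bert-base-chinese checkpoint; the paper does not specify its implementation, and the label count, hyperparameters, and example sentences here are illustrative only.

```python
# Minimal sketch: BERT with a linear classification head fine-tuned for
# Chinese domain intent classification (assumed libraries: torch, transformers;
# assumed checkpoint: bert-base-chinese; labels and texts are illustrative).
import torch
from transformers import BertTokenizer, BertForSequenceClassification

NUM_CLASSES = 10  # hypothetical number of THUCNews domain labels

tokenizer = BertTokenizer.from_pretrained("bert-base-chinese")
# BertForSequenceClassification places a single linear layer on top of the
# pooled [CLS] representation, matching the "BERT + linear classifier" setup.
model = BertForSequenceClassification.from_pretrained(
    "bert-base-chinese", num_labels=NUM_CLASSES
)

# Toy training batch; real experiments would iterate over the THUCNews corpus.
texts = ["央行宣布下调存款准备金率", "国足在世预赛中获胜"]
labels = torch.tensor([0, 1])  # e.g. 0 = finance, 1 = sports (illustrative)

encodings = tokenizer(
    texts, padding=True, truncation=True, max_length=128, return_tensors="pt"
)

optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)

model.train()
for _ in range(3):  # a few fine-tuning steps for illustration
    optimizer.zero_grad()
    outputs = model(**encodings, labels=labels)  # cross-entropy loss built in
    outputs.loss.backward()
    optimizer.step()

# Inference: predicted domain intent for a new headline.
model.eval()
with torch.no_grad():
    batch = tokenizer(["新款手机芯片性能曝光"], return_tensors="pt")
    pred = model(**batch).logits.argmax(dim=-1)
print(pred.item())
```

In practice the full THUCNews corpus would be batched with a DataLoader and trained for several epochs, but the fine-tuning pattern above (encode, forward with labels, backpropagate the built-in classification loss) is the core of the described approach.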