Research and Exploration on Chinese Natural Language Processing in Era of Large Language Models

Original title: 大模型时代下的汉语自然语言处理研究与探索
Cited by: 0
Authors: Huang, Shiyang [1]; Xi, Xuefeng [1,2]; Cui, Zhiming [1,2]
Keywords: Graph neural networks
DOI: 10.3778/j.issn.1002-8331.2405-0348
Abstract
Natural language processing is a key step toward realizing human-computer interaction, and Chinese natural language processing (CNLP) is an important part of it. With the development of large-model technology, CNLP has entered a new stage: Chinese large models offer stronger generalization ability and faster task adaptation. However, compared with English large models, Chinese large models still fall short in logical reasoning and text comprehension. This paper introduces the advantages of graph neural networks in specific CNLP tasks and surveys the development potential of quantum machine learning for CNLP. It summarizes the basic principles and technical architectures of large models, organizes in detail the typical datasets and evaluation metrics used in large-model evaluation tasks, and evaluates and compares the performance of current mainstream large models on CNLP tasks. Finally, it analyzes the current challenges facing CNLP and outlines future research directions, with the aim of helping to address these challenges and providing references for the proposal of new methods. © 2025 Journal of Computer Engineering and Applications Beijing Co., Ltd.; Science Press. All rights reserved.
Pages: 80-97