共 50 条
- [1] PhoBERT: Pre-trained language models for Vietnamese [J]. FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, EMNLP 2020, 2020, : 1037 - 1042
- [3] Neural Transfer Learning For Vietnamese Sentiment Analysis Using Pre-trained Contextual Language Models [J]. 2021 IEEE INTERNATIONAL CONFERENCE ON MACHINE LEARNING AND APPLIED NETWORK TECHNOLOGIES (ICMLANT II), 2021, : 84 - 88
- [4] ViHealthBERT: Pre-trained Language Models for Vietnamese in Health Text Mining [J]. LREC 2022: THIRTEEN INTERNATIONAL CONFERENCE ON LANGUAGE RESOURCES AND EVALUATION, 2022, : 328 - 337
- [5] Enhancing Turkish Sentiment Analysis Using Pre-Trained Language Models [J]. 29TH IEEE CONFERENCE ON SIGNAL PROCESSING AND COMMUNICATIONS APPLICATIONS (SIU 2021), 2021,
- [6] A Comparative Study of Using Pre-trained Language Models for Toxic Comment Classification [J]. WEB CONFERENCE 2021: COMPANION OF THE WORLD WIDE WEB CONFERENCE (WWW 2021), 2021, : 500 - 507
- [7] Pre-trained Language Models with Limited Data for Intent Classification [J]. 2020 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2020,
- [9] Issue Report Classification Using Pre-trained Language Models [J]. 2022 IEEE/ACM 1ST INTERNATIONAL WORKSHOP ON NATURAL LANGUAGE-BASED SOFTWARE ENGINEERING (NLBSE 2022), 2022, : 29 - 32
- [10] Error Investigation of Pre-trained BERTology Models on Vietnamese Natural Language Inference [J]. RECENT CHALLENGES IN INTELLIGENT INFORMATION AND DATABASE SYSTEMS, ACIIDS 2022, 2022, 1716 : 176 - 188