Using of Transformers Models for Text Classification to Mobile Educational Applications

Cited by: 2
Authors
Garrido, Anabel Pilicita [1 ]
Arias, Enrique Barra [1 ]
Affiliations
[1] Univ Politecn Madrid, Madrid, Spain
Keywords
Bit error rate; Transformers; Internet; Training; Text categorization; Recurrent neural networks; IEEE transactions; Natural Language Processing; Multiclass Text Classification; Bidirectional Encoder Representations from Transformers;
DOI
10.1109/TLA.2023.10172138
CLC number
TP [Automation Technology, Computer Technology]
Subject classification code
0812
Abstract
In Q2 2022, educational apps were the second most popular category on the Google Play Store, accounting for 10.47% of the apps available worldwide. This work explores the application of five BERT-based pre-trained models built on the Transformer architecture to classify mobile educational applications: bert-base-cased, bert-base-uncased, roberta-base, albert-base-v2 and distilbert-base-uncased. The study uses a dataset of educational apps from Google Play; because the dataset lacked app descriptions and categories, it was enriched with this information. For each model, a tokenizer was applied and fine-tuning was performed for the classification task. After training, the models were evaluated on a test set; each model required four training epochs to obtain its best results: roberta-base reached 81% accuracy, bert-base-cased 80%, bert-base-uncased 79%, albert-base-v2 78% and distilbert-base-uncased 76%.
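The fine-tuning procedure the abstract describes (tokenizer plus fine-tuning of a BERT-family checkpoint over four epochs) can be sketched roughly as below. This is a minimal illustration, not the authors' code: it assumes the Hugging Face `transformers` and `datasets` libraries, and the CSV file names, column names (`description`, `label`) and label count are hypothetical placeholders.

```python
"""Hedged sketch of fine-tuning a BERT-family model to classify
Google Play app descriptions, assuming Hugging Face transformers."""


def accuracy(predictions, labels):
    """Fraction of correct predictions, the metric reported per model."""
    correct = sum(int(p == y) for p, y in zip(predictions, labels))
    return correct / len(labels)


def fine_tune(checkpoint="distilbert-base-uncased", num_labels=5):
    """num_labels is an assumption; the paper does not fix it here."""
    # Imports are local so the stdlib helper above works without the libraries.
    from transformers import (AutoTokenizer,
                              AutoModelForSequenceClassification,
                              Trainer, TrainingArguments)
    from datasets import load_dataset

    tokenizer = AutoTokenizer.from_pretrained(checkpoint)
    model = AutoModelForSequenceClassification.from_pretrained(
        checkpoint, num_labels=num_labels)

    # Assumed CSVs with "description" (text) and "label" (category id) columns.
    data = load_dataset("csv", data_files={"train": "apps_train.csv",
                                           "test": "apps_test.csv"})
    data = data.map(lambda batch: tokenizer(batch["description"],
                                            truncation=True,
                                            padding="max_length"),
                    batched=True)

    args = TrainingArguments(output_dir="out",
                             num_train_epochs=4,  # four epochs, as in the paper
                             per_device_train_batch_size=16)
    trainer = Trainer(model=model, args=args,
                      train_dataset=data["train"],
                      eval_dataset=data["test"])
    trainer.train()
    return trainer


if __name__ == "__main__":
    fine_tune()
```

Swapping the `checkpoint` string for any of the other four model names (e.g. `"roberta-base"`) reuses the same pipeline, which is what makes this kind of comparison across BERT variants cheap to run.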
Pages: 730 - 736 (7 pages)