Text Classification with Transformers and Reformers for Deep Text Data

Cited: 0
Authors
Soleymani, Roghayeh [1]
Farret, Jeremie [1]
Affiliations
[1] Inmind Technol Inc, Montreal, PQ, Canada
Keywords
Natural language processing; Text classification; Transformers; Reformers; Trax; Mind in a box
DOI
Not available
CLC number
TP18 [Artificial intelligence theory]
Discipline codes
081104 ; 0812 ; 0835 ; 1405
Abstract
In this paper, we present an experimental analysis of Transformers and Reformers for text classification applications in natural language processing. Transformers and Reformers yield state-of-the-art performance and use attention scores to capture the relationships between words in a sentence, which can be computed in parallel on GPU clusters. Reformers improve on Transformers by lowering time and memory complexity. We present our evaluation and analysis of the architectures applicable for such improved performance. The experiments in this paper are done in Trax on Mind in a Box with three different datasets and under different hyperparameter tunings. We observe that Transformers achieve better accuracy and training speed than Reformers for text classification. However, Reformers allow training bigger models, which cause memory failures for Transformers.
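The attention mechanism the abstract refers to is, in its standard form, scaled dot-product attention: each word's query is compared against every other word's key to produce pairwise relationship scores, and all scores can be computed in one parallel matrix multiplication. A minimal NumPy sketch of that standard formulation (not the paper's Trax implementation; the toy shapes and random inputs are illustrative assumptions):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Standard attention: softmax(Q K^T / sqrt(d)) V.

    Q, K, V: (n_words, d) arrays. Returns the attended values and the
    (n_words, n_words) attention-weight matrix of word-word scores.
    """
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)                    # all pairwise scores in one matmul
    scores -= scores.max(axis=-1, keepdims=True)     # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over the key axis
    return weights @ V, weights

# Toy example: a "sentence" of 4 words with embedding dimension 8.
rng = np.random.default_rng(0)
Q = rng.standard_normal((4, 8))
K = rng.standard_normal((4, 8))
V = rng.standard_normal((4, 8))
out, w = scaled_dot_product_attention(Q, K, V)
```

The full score matrix is quadratic in sentence length, which is the memory cost the Reformer reduces (via techniques such as locality-sensitive-hashing attention), consistent with the abstract's observation that Reformers can train bigger models than Transformers.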
Pages: 239-243
Page count: 5