Text Classification with Transformers and Reformers for Deep Text Data

Cited by: 0
Authors
Soleymani, Roghayeh [1 ]
Farret, Jeremie [1 ]
Affiliations
[1] Inmind Technol Inc, Montreal, PQ, Canada
Keywords
Natural language processing; Text classification; Transformers; Reformers; Trax; Mind in a box;
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
In this paper, we present an experimental analysis of Transformers and Reformers for text classification applications in natural language processing. Transformers and Reformers achieve state-of-the-art performance by using attention scores to capture the relationships between words in a sentence, and these scores can be computed in parallel on GPU clusters. Reformers improve on Transformers by lowering time and memory complexity. We present our evaluation and analysis of applicable architectures for such improved performance. The experiments in this paper are run in Trax on Mind in a Box with three different datasets and under different hyperparameter tunings. We observe that Transformers achieve better accuracy and training speed than Reformers for text classification. However, Reformers allow training larger models that cause memory failures for Transformers.
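The attention scores the abstract refers to can be illustrated with a minimal NumPy sketch of scaled dot-product self-attention. This is an illustrative reconstruction of the standard mechanism, not the paper's Trax code; the toy shapes and random input are assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the chosen axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    # Attention scores: pairwise dot products between query and key
    # vectors, scaled by sqrt(d_k) to keep the softmax well-behaved.
    # The score matrix is (tokens x tokens), so every token attends
    # to every other token, and all rows can be computed in parallel.
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)
    weights = softmax(scores)   # one attention distribution per token
    return weights @ V          # weighted sum of value vectors

# Toy example: 4 tokens with embedding dimension 8; using the same
# matrix for Q, K, and V gives self-attention.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
out = scaled_dot_product_attention(x, x, x)
print(out.shape)  # (4, 8)
```

The quadratic (tokens x tokens) score matrix is exactly the cost that Reformer reduces, by hashing similar query/key vectors into buckets so each token attends only within its bucket.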
Pages: 239 - 243
Page count: 5