Adaptation of Transformer-Based Models for Depression Detection

Times Cited: 0
Authors
Adebanji, Olaronke O. [1 ]
Ojo, Olumide E. [1 ]
Calvo, Hiram [1 ]
Gelbukh, Irina [1 ]
Sidorov, Grigori [1 ]
Affiliations
[1] Inst Politecn Nacl, Ctr Invest Comp, Mexico City, Mexico
Source
COMPUTACION Y SISTEMAS | 2024, Vol. 28, No. 01
Keywords
Depression; bag-of-words; word2vec; GloVe; machine learning; deep learning; transformers; sentiment analysis;
DOI
10.13053/CyS-28-1-4691
CLC Classification Number
TP [Automation Technology, Computer Technology];
Discipline Classification Code
0812 ;
Abstract
Pre-trained language models capture a broad range of knowledge and language patterns in text and can be fine-tuned for specific tasks. In this paper, we evaluate the effectiveness of traditional machine learning models and pre-trained language models in identifying depression from social media text. We examined different feature representations with the traditional machine learning models, explored the impact of pre-training on the transformer models, and compared their performance. Using BoW, Word2Vec, and GloVe representations, the machine learning models we experimented with achieved impressive accuracies in detecting depression. However, the pre-trained language models performed best, consistently achieving accuracy, precision, recall, and F1 scores of approximately 0.98 or higher.
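The bag-of-words (BoW) representation that the abstract compares against the transformer models can be sketched as follows. This is a minimal, hypothetical illustration using only the Python standard library; the function names and toy corpus are ours, not the paper's actual pipeline.

```python
# Minimal bag-of-words sketch: each document becomes a vector of token counts
# over a shared vocabulary. All names and data here are illustrative.
from collections import Counter

def build_vocab(texts):
    """Map each distinct token (lowercased, whitespace-split) to a column index."""
    vocab = {}
    for text in texts:
        for tok in text.lower().split():
            if tok not in vocab:
                vocab[tok] = len(vocab)
    return vocab

def bow_vector(text, vocab):
    """Count-based BoW vector; tokens outside the vocabulary are ignored."""
    counts = Counter(text.lower().split())
    return [counts.get(tok, 0) for tok in vocab]

corpus = ["i feel hopeless today", "today was a good day"]
vocab = build_vocab(corpus)
vectors = [bow_vector(t, vocab) for t in corpus]
```

In practice these count vectors (or Word2Vec/GloVe embeddings) would be fed to a classifier such as logistic regression, whereas the transformer models operate directly on the raw token sequences.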
Pages: 151-165 (15 pages)