50 entries in total
- [21] Subsampling of Frequent Words in Text for Pre-training a Vision-Language Model [J]. PROCEEDINGS OF THE 1ST WORKSHOP ON LARGE GENERATIVE MODELS MEET MULTIMODAL APPLICATIONS, LGM3A 2023, 2023, : 61 - 67
- [22] Framework for Sentiment Analysis of Arabic Text [J]. PROCEEDINGS OF THE 27TH ACM CONFERENCE ON HYPERTEXT AND SOCIAL MEDIA (HT'16), 2016, : 315 - 317
- [23] MPNet-GRUs: Sentiment Analysis With Masked and Permuted Pre-Training for Language Understanding and Gated Recurrent Units [J]. IEEE ACCESS, 2024, 12 : 74069 - 74080
- [24] How does the pre-training objective affect what large language models learn about linguistic properties? [J]. PROCEEDINGS OF THE 60TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2022), VOL 2 (SHORT PAPERS), 2022, : 131 - 147
- [25] Learning Implicit Sentiment in Aspect-based Sentiment Analysis with Supervised Contrastive Pre-Training [J]. 2021 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP 2021), 2021, : 246 - 256
- [26] AraXLNet: pre-trained language model for sentiment analysis of Arabic [J]. JOURNAL OF BIG DATA, 9
- [28] Towards Adversarial Attack on Vision-Language Pre-training Models [J]. PROCEEDINGS OF THE 30TH ACM INTERNATIONAL CONFERENCE ON MULTIMEDIA, MM 2022, 2022, : 5005 - 5013
- [29] Pre-training and Evaluating Transformer-based Language Models for Icelandic [J]. LREC 2022: THIRTEENTH INTERNATIONAL CONFERENCE ON LANGUAGE RESOURCES AND EVALUATION, 2022, : 7386 - 7391
- [30] Pre-training Universal Language Representation [J]. 59TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS AND THE 11TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING (ACL-IJCNLP 2021), VOL 1, 2021, : 5122 - 5133