Total: 50 entries
- [41] Finding and Editing Multi-Modal Neurons in Pre-Trained Transformers. Findings of the Association for Computational Linguistics: ACL 2024, 2024, pp. 1012-1037.
- [43] Selecting Better Samples from Pre-trained LLMs: A Case Study on Question Generation. Findings of the Association for Computational Linguistics: ACL 2023, 2023, pp. 12952-12965.
- [46] Weakly Supervised Deep Learning for Arabic Tweet Sentiment Analysis on Education Reforms: Leveraging Pre-Trained Models and LLMs With Snorkel. IEEE Access, 2025, 13: 30523-30542.
- [47] Harnessing Pre-Trained Sentence Transformers for Offensive Language Detection in Indian Languages. CEUR Workshop Proceedings, pp. 427-434.
- [49] Detecting Propaganda Techniques in English News Articles using Pre-trained Transformers. 2022 13th International Conference on Information and Communication Systems (ICICS), 2022, pp. 301-308.