50 records in total
- [32] Textual Pre-Trained Models for Age Screening Across Community Question-Answering [J]. IEEE ACCESS, 2024, 12: 30030-30038
- [33] A Study of Pre-trained Language Models in Natural Language Processing [J]. 2020 IEEE INTERNATIONAL CONFERENCE ON SMART CLOUD (SMARTCLOUD 2020), 2020: 116-121
- [34] VQAttack: Transferable Adversarial Attacks on Visual Question Answering via Pre-trained Models [J]. THIRTY-EIGHTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 38, NO 7, 2024: 6755-6763
- [35] DeFormer: Decomposing Pre-trained Transformers for Faster Question Answering [J]. 58TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2020), 2020: 4487-4497
- [36] An Empirical Study on Hyperparameter Optimization for Fine-Tuning Pre-trained Language Models [J]. 59TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS AND THE 11TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING, VOL 1 (ACL-IJCNLP 2021), 2021: 2286-2300
- [38] An Empirical Survey of the Effectiveness of Debiasing Techniques for Pre-trained Language Models [J]. PROCEEDINGS OF THE 60TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2022), VOL 1 (LONG PAPERS), 2022: 1878-1898
- [40] NMT Enhancement based on Knowledge Graph Mining with Pre-trained Language Model [J]. 2020 22ND INTERNATIONAL CONFERENCE ON ADVANCED COMMUNICATION TECHNOLOGY (ICACT): DIGITAL SECURITY GLOBAL AGENDA FOR SAFE SOCIETY!, 2020: 185-189