50 records in total
- [1] Multilingual Constituency Parsing with Self-Attention and Pre-Training [J]. 57TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2019), 2019, : 3499 - 3505
- [3] MolXPT: Wrapping Molecules with Text for Generative Pre-training [J]. 61ST ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, ACL 2023, VOL 2, 2023, : 1606 - 1616
- [4] A Text Sentiment Analysis Model Based on Self-Attention Mechanism [J]. 2019 THE 3RD INTERNATIONAL CONFERENCE ON HIGH PERFORMANCE COMPILATION, COMPUTING AND COMMUNICATIONS (HP3C 2019), 2019, : 33 - 37
- [5] A Self-attention Based Model for Offline Handwritten Text Recognition [J]. PATTERN RECOGNITION, ACPR 2021, PT II, 2022, 13189 : 356 - 369
- [6] Probing Inter-modality: Visual Parsing with Self-Attention for Vision-Language Pre-training [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34
- [7] POINTER: Constrained Progressive Text Generation via Insertion-based Generative Pre-training [J]. PROCEEDINGS OF THE 2020 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP), 2020, : 8649 - 8670
- [9] Self-attention based Text Knowledge Mining for Text Detection [J]. 2021 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION, CVPR 2021, 2021, : 5979 - 5988
- [10] Cross-Modal Self-Attention with Multi-Task Pre-Training for Medical Visual Question Answering [J]. PROCEEDINGS OF THE 2021 INTERNATIONAL CONFERENCE ON MULTIMEDIA RETRIEVAL (ICMR '21), 2021, : 456 - 460