50 entries in total
- [41] The Case for Translation-Invariant Self-Attention in Transformer-Based Language Models [J]. ACL-IJCNLP 2021: The 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing, Vol 2, 2021: 130-140
- [42] Lane Detection Method Based on Improved Multi-Head Self-Attention [J]. Computer Engineering and Applications, 60(02): 264-271
- [45] Solar irradiance prediction based on self-attention recursive model network [J]. Frontiers in Energy Research, 2022, 10
- [48] An Improved Hierarchical Phrase Based Machine Translation Model [J]. 2011 AASRI Conference on Applied Information Technology (AASRI-AIT 2011), Vol 1, 2011: 239-242
- [49] Cascade Prediction Model Based on Dynamic Graph Representation and Self-Attention [J]. Dianzi Keji Daxue Xuebao/Journal of the University of Electronic Science and Technology of China, 2022, 51(01): 83-90
- [50] Deep Learning-Based Identification of Maize Leaf Diseases Is Improved by an Attention Mechanism: Self-Attention [J]. Frontiers in Plant Science, 2022, 13