50 entries in total
- [41] Latent Positional Information is in the Self-Attention Variance of Transformer Language Models Without Positional Embeddings. 61st Annual Meeting of the Association for Computational Linguistics (ACL 2023), Vol 2, 2023: 1183-1193
- [42] SparseBERT: Rethinking the Importance Analysis in Self-attention. International Conference on Machine Learning (ICML), Vol 139, 2021
- [44] Shyness and Self-Attention. Bulletin of the British Psychological Society, 1983, 36 (Feb): A5
- [45] Multi-entity sentiment analysis using self-attention based hierarchical dilated convolutional neural network. Future Generation Computer Systems: The International Journal of eScience, 2020, 112: 116-125
- [48] Attention and self-attention in random forests. Progress in Artificial Intelligence, 2023, 12: 257-273
- [49] Learning Contextual Features with Multi-head Self-attention for Fake News Detection. Cognitive Computing (ICCC 2019), 2019, 11518: 132-142
- [50] Combining Contextual Information by Self-attention Mechanism in Convolutional Neural Networks for Text Classification. Web Information Systems Engineering (WISE 2018), Pt I, 2018, 11233: 453-467