- [1] Multi-hop Attention Graph Neural Networks. Proceedings of the Thirtieth International Joint Conference on Artificial Intelligence (IJCAI 2021), 2021: 3089-3096.
- [2] Towards Understanding Neural Machine Translation with Attention Heads' Importance. Applied Sciences-Basel, 2024, 14(7).
- [3] Explainable Neural Subgraph Matching With Learnable Multi-Hop Attention. IEEE Access, 2024, 12: 130474-130492.
- [4] Losing Heads in the Lottery: Pruning Transformer Attention in Neural Machine Translation. Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), 2020: 2664-2674.
- [6] Attention-via-Attention Neural Machine Translation. Thirty-Second AAAI Conference on Artificial Intelligence / Thirtieth Innovative Applications of Artificial Intelligence Conference / Eighth AAAI Symposium on Educational Advances in Artificial Intelligence (AAAI-18), 2018: 563-570.
- [7] Multi-hop Attention GNN with Answer-Evidence Contrastive Loss for Multi-hop QA. 2023 International Joint Conference on Neural Networks (IJCNN), 2023.
- [8] Ruminating Reader: Reasoning with Gated Multi-Hop Attention. Machine Reading for Question Answering, 2018: 1-11.
- [9] Recurrent Attention for Neural Machine Translation. 2021 Conference on Empirical Methods in Natural Language Processing (EMNLP 2021), 2021: 3216-3225.