50 records in total
- [1] Domain-Aware Self-Attention for Multi-Domain Neural Machine Translation [J]. INTERSPEECH 2021, 2021: 2047-2051
- [2] Toward Dependency-Aware Live Virtual Machine Migration [J]. Third International Workshop on Virtualization Technologies in Distributed Computing (VTDC-09), 2009: 59-66
- [3] Dependency-Aware Attention Model for Emotion Analysis for Online News [J]. Advances in Knowledge Discovery and Data Mining, PAKDD 2019, Pt I, 2019, 11439: 172-184
- [4] Multi-Granularity Self-Attention for Neural Machine Translation [J]. Proceedings of EMNLP-IJCNLP 2019, 2019: 887-897
- [6] Dependency-aware Form Understanding [J]. 2021 IEEE 32nd International Symposium on Software Reliability Engineering (ISSRE 2021), 2021: 139-149
- [7] Dependency-Aware Attention Control for Unconstrained Face Recognition with Image Sets [J]. Computer Vision - ECCV 2018, Pt XI, 2018, 11215: 573-590
- [8] Why Self-Attention? A Targeted Evaluation of Neural Machine Translation Architectures [J]. 2018 Conference on Empirical Methods in Natural Language Processing (EMNLP 2018), 2018: 4263-4272
- [9] Re-Transformer: A Self-Attention Based Model for Machine Translation [J]. AI in Computational Linguistics, 2021, 189: 3-10