50 items in total
- [41] Logical Transformers: Infusing Logical Structures into Pre-Trained Language Models. Findings of the Association for Computational Linguistics: ACL 2023, 2023: 1762-1773
- [42] Finding and Editing Multi-Modal Neurons in Pre-Trained Transformers. Findings of the Association for Computational Linguistics: ACL 2024, 2024: 1012-1037
- [44] PTW: Pivotal Tuning Watermarking for Pre-Trained Image Generators. Proceedings of the 32nd USENIX Security Symposium, 2023: 2241-2258
- [45] Underwater Image Enhancement Using Pre-trained Transformer. Image Analysis and Processing, ICIAP 2022, Pt III, 2022, 13233: 480-488
- [47] Image Hashing by Pre-Trained Deep Neural Network. 2022 Asia Conference on Algorithms, Computing and Machine Learning (CACML 2022), 2022: 468-471
- [48] FEGAN: Flexible and Efficient Face Editing With Pre-Trained Generator. IEEE Access, 2020, 8: 65340-65350
- [50] ZeroI2V: Zero-Cost Adaptation of Pre-trained Transformers from Image to Video. Computer Vision - ECCV 2024, Pt LXXXIII, 2025, 15141: 425-443