Exploiting Pre-Trained Language Models for Black-Box Attack against Knowledge Graph Embeddings

Cited by: 0
Authors
Yang, Guangqian [1 ]
Zhang, Lei [1 ]
Liu, Yi [2 ]
Xie, Hongtao [1 ]
Mao, Zhendong [1 ]
Affiliations
[1] University of Science and Technology of China, Hefei, China
[2] People's Daily Online, Beijing, China
DOI: 10.1145/3688850
Related Papers (50 total)
  • [1] Integrating Knowledge Graph Embeddings and Pre-trained Language Models in Hypercomplex Spaces
    Nayyeri, Mojtaba
    Wang, Zihao
    Akter, Mst. Mahfuja
    Alam, Mirza Mohtashim
    Rony, Md Rashad Al Hasan
    Lehmann, Jens
    Staab, Steffen
    [J]. SEMANTIC WEB, ISWC 2023, PART I, 2023, 14265 : 388 - 407
  • [2] PLMmark: A Secure and Robust Black-Box Watermarking Framework for Pre-trained Language Models
    Li, Peixuan
    Cheng, Pengzhou
    Li, Fangqi
    Du, Wei
    Zhao, Haodong
    Liu, Gongshen
    [J]. THIRTY-SEVENTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 37 NO 12, 2023, : 14991 - 14999
  • [3] Robotic environmental state recognition with pre-trained vision-language models and black-box optimization
    Kawaharazuka, Kento
    Obinata, Yoshiki
    Kanazawa, Naoaki
    Okada, Kei
    Inaba, Masayuki
    [J]. ADVANCED ROBOTICS, 2024, : 1255 - 1264
  • [4] On the Sentence Embeddings from Pre-trained Language Models
    Li, Bohan
    Zhou, Hao
    He, Junxian
    Wang, Mingxuan
    Yang, Yiming
    Li, Lei
    [J]. PROCEEDINGS OF THE 2020 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP), 2020, : 9119 - 9130
  • [5] Knowledge Inheritance for Pre-trained Language Models
    Qin, Yujia
    Lin, Yankai
    Yi, Jing
    Zhang, Jiajie
    Han, Xu
    Zhang, Zhengyan
    Su, Yusheng
    Liu, Zhiyuan
    Li, Peng
    Sun, Maosong
    Zhou, Jie
    [J]. NAACL 2022: THE 2022 CONFERENCE OF THE NORTH AMERICAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS: HUMAN LANGUAGE TECHNOLOGIES, 2022, : 3921 - 3937
  • [6] Distilling Relation Embeddings from Pre-trained Language Models
    Ushio, Asahi
    Camacho-Collados, Jose
    Schockaert, Steven
    [J]. 2021 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP 2021), 2021, : 9044 - 9062
  • [7] SimKGC: Simple Contrastive Knowledge Graph Completion with Pre-trained Language Models
    Wang, Liang
    Zhao, Wei
    Wei, Zhuoyu
    Liu, Jingming
    [J]. PROCEEDINGS OF THE 60TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2022), VOL 1: (LONG PAPERS), 2022, : 4281 - 4294
  • [8] Assisted Process Knowledge Graph Building Using Pre-trained Language Models
    Bellan, Patrizio
    Dragoni, Mauro
    Ghidini, Chiara
    [J]. AIXIA 2022 - ADVANCES IN ARTIFICIAL INTELLIGENCE, 2023, 13796 : 60 - 74
  • [9] Evaluating Embeddings from Pre-Trained Language Models and Knowledge Graphs for Educational Content Recommendation
    Li, Xiu
    Henriksson, Aron
    Duneld, Martin
    Nouri, Jalal
    Wu, Yongchao
    [J]. FUTURE INTERNET, 2024, 16 (01)
  • [10] Probing Pre-Trained Language Models for Disease Knowledge
    Alghanmi, Israa
    Espinosa-Anke, Luis
    Schockaert, Steven
    [J]. FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, ACL-IJCNLP 2021, 2021, : 3023 - 3033