A CMA-ES-Based Adversarial Attack Against Black-Box Object Detectors

Cited by: 0
Authors
LYU Haoran [1 ]
TAN Yu'an [1 ]
XUE Yuan [2 ]
WANG Yajie [1 ]
XUE Jingfeng [1 ]
Affiliations
[1] School of Computer Science and Technology, Beijing Institute of Technology
[2] Academy of Military Science
Funding
National Natural Science Foundation of China;
Keywords
DOI
Not available
CLC Classification Number
TP309 [Security and Secrecy]; TP391.41
Discipline Classification Code
080203; 081201; 0839; 1402
Abstract
Object detection is one of the essential tasks of computer vision. Object detectors based on deep neural networks are increasingly used in safety-sensitive applications such as face recognition, video surveillance, and autonomous driving. It has been shown that object detectors are vulnerable to adversarial attacks. We propose a novel black-box attack method that can successfully attack both regression-based and region-based object detectors. Using the Covariance Matrix Adaptation Evolution Strategy (CMA-ES) as the primary method for generating adversarial examples, we introduce techniques that reduce the search dimension of the optimization problem and thereby reduce the number of queries. Our method adds adversarial perturbations only inside the object's bounding box to achieve a precise attack. The proposed attack can hide a specified object with an attack success rate of 86% and an average of 5,124 queries, and can hide all objects with a success rate of 74% and an average of 6,154 queries. Our work illustrates the effectiveness of CMA-ES for generating adversarial examples and demonstrates the vulnerability of object detectors to adversarial attacks.
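The abstract describes confining a CMA-ES-optimized perturbation to the detected object's bounding box and shrinking the search dimension to keep the query count low. The sketch below illustrates that general idea with the open-source `cma` package; it is not the authors' implementation, and `detector_confidence` (a black-box scoring function whose every call counts as one query), the 16x16 latent grid, and the perturbation budget `eps` are assumptions made for illustration.

```python
import numpy as np
import cma  # pip install cma


def make_fitness(image, box, detector_confidence, latent_shape=(16, 16, 3), eps=16.0):
    """Build a fitness function: a lower detector confidence is better."""
    x1, y1, x2, y2 = box
    h, w = y2 - y1, x2 - x1
    # Precompute nearest-neighbour upsampling indices from the small latent
    # grid to the full box size (this is the search-dimension reduction).
    rows = (np.arange(h) * latent_shape[0]) // h
    cols = (np.arange(w) * latent_shape[1]) // w

    def fitness(z):
        latent = np.asarray(z).reshape(latent_shape)
        patch = latent[np.ix_(rows, cols)]            # upsampled to (h, w, 3)
        adv = image.astype(np.float64)                # copy of the clean image
        # Perturb only inside the object's bounding box, bounded by eps.
        adv[y1:y2, x1:x2] = np.clip(adv[y1:y2, x1:x2] + eps * np.tanh(patch), 0, 255)
        # One black-box query per candidate; minimise the confidence the
        # detector assigns to the object inside `box`.
        return float(detector_confidence(adv.astype(np.uint8)))

    return fitness


def cma_es_box_attack(image, box, detector_confidence, max_queries=6000):
    """Search a low-dimensional perturbation that suppresses the detection."""
    fitness = make_fitness(image, box, detector_confidence)
    dim = 16 * 16 * 3
    es = cma.CMAEvolutionStrategy(dim * [0.0], 0.5,
                                  {"maxfevals": max_queries, "verbose": -9})
    while not es.stop():
        candidates = es.ask()                                    # sample a population
        es.tell(candidates, [fitness(c) for c in candidates])    # rank by confidence
    return es.result.xbest                                       # best latent found
```

In this sketch the fitness is simply the detector's confidence for the target object; driving it below the detection threshold hides the object, which mirrors the object-hiding goal reported in the abstract.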
Pages: 406-412
Number of pages: 7
Related Papers
50 records in total
  • [1] A CMA-ES-Based Adversarial Attack Against Black-Box Object Detectors
    Lyu Haoran
    Tan Yu'an
    Xue Yuan
    Wang Yajie
    Xue Jingfeng
    CHINESE JOURNAL OF ELECTRONICS, 2021, 30 (03) : 406 - 412
  • [2] A CMA-ES-Based Adversarial Attack on Black-Box Deep Neural Networks
    Kuang, Xiaohui
    Liu, Hongyi
    Wang, Ye
    Zhang, Qikun
    Zhang, Quanxin
    Zheng, Jun
    IEEE ACCESS, 2019, 7 : 172938 - 172947
  • [3] An adversarial attack on DNN-based black-box object detectors
    Wang, Yajie
    Tan, Yu-an
    Zhang, Wenjiao
    Zhao, Yuhang
    Kuang, Xiaohui
    JOURNAL OF NETWORK AND COMPUTER APPLICATIONS, 2020, 161
  • [4] Object-Aware Transfer-Based Black-Box Adversarial Attack on Object Detector
    Leng, Zhuo
    Cheng, Zesen
    Wei, Pengxu
    Chen, Jie
    PATTERN RECOGNITION AND COMPUTER VISION, PRCV 2023, PT XII, 2024, 14436 : 278 - 289
  • [5] A General Black-box Adversarial Attack on Graph-based Fake News Detectors
    School of Artificial Intelligence, Optics and Electronics, Northwestern Polytechnical University, China
    arXiv
  • [6] SIMULATOR ATTACK+ FOR BLACK-BOX ADVERSARIAL ATTACK
    Ji, Yimu
    Ding, Jianyu
    Chen, Zhiyu
    Wu, Fei
    Zhang, Chi
    Sun, Yiming
    Sun, Jing
    Liu, Shangdong
    2022 IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING, ICIP, 2022, : 636 - 640
  • [7] Reinforcement learning based adversarial malware example generation against black-box detectors
    Zhong, Fangtian
    Hu, Pengfei
    Zhang, Guoming
    Li, Hong
    Cheng, Xiuzhen
    COMPUTERS & SECURITY, 2022, 121
  • [8] Restricted Black-Box Adversarial Attack Against DeepFake Face Swapping
    Dong, Junhao
    Wang, Yuan
    Lai, Jianhuang
    Xie, Xiaohua
    IEEE TRANSACTIONS ON INFORMATION FORENSICS AND SECURITY, 2023, 18 : 2596 - 2608
  • [9] Black-box adversarial patch attacks using differential evolution against aerial imagery object detectors
    Tang, Guijian
    Yao, Wen
    Li, Chao
    Jiang, Tingsong
    Yang, Shaowu
    ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE, 2024, 137
  • [10] Dual stage black-box adversarial attack against vision transformer
    Wang, Fan
    Shao, Mingwen
    Meng, Lingzhuang
    Liu, Fukang
    INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS, 2024, 15 (08) : 3367 - 3378