An empirical study on the robustness of the segment anything model (SAM)

Cited by: 0
Authors
Wang, Yuqing [1 ,3 ]
Zhao, Yun [2 ]
Petzold, Linda [1 ]
Affiliations
[1] Univ Calif Santa Barbara, Comp Sci Dept, Santa Barbara, CA USA
[2] Meta Platforms Inc, Sunnyvale, CA 94089 USA
[3] Univ Calif Santa Barbara, Santa Barbara, CA 93106 USA
Keywords
Segment anything model; Model robustness; Prompting techniques
DOI
10.1016/j.patcog.2024.110685
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
The Segment Anything Model (SAM) is a foundation model for general image segmentation. Although it exhibits impressive performance predominantly on natural images, understanding its robustness against various image perturbations and domains is critical for real-world applications, where such challenges frequently arise. In this study, we conduct a comprehensive robustness investigation of SAM under diverse real-world conditions. Our experiments encompass a wide range of image perturbations, and the results demonstrate that SAM's performance generally declines on perturbed images, with varying degrees of vulnerability across different perturbations. By customizing prompting techniques and leveraging domain knowledge based on the unique characteristics of each dataset, the model's resilience to these perturbations can be enhanced, addressing dataset-specific challenges. This work sheds light on the limitations and strengths of SAM in real-world applications, promoting the development of more robust and versatile image segmentation solutions. Our code is available at https://github.com/EternityYW/SAM-Robustness/.
Pages: 12
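The abstract describes evaluating SAM's robustness by running it on perturbed images with point or box prompts and measuring how much the predicted masks degrade. The following is a minimal sketch of that kind of evaluation, not the paper's exact protocol: it assumes Meta's `segment_anything` package with a downloaded ViT-H checkpoint, and the file names (`sam_vit_h_4b8939.pth`, `example.jpg`), the Gaussian-noise perturbation, the single center-point prompt, and the noise level are all illustrative choices.

```python
# Sketch: compare SAM masks on a clean image and a Gaussian-noise-perturbed copy.
# Assumes Meta's `segment_anything` package; file names and settings are illustrative.
import numpy as np
import cv2
from segment_anything import sam_model_registry, SamPredictor

# Load a SAM checkpoint (hypothetical local path) and build a promptable predictor.
sam = sam_model_registry["vit_h"](checkpoint="sam_vit_h_4b8939.pth")
predictor = SamPredictor(sam)


def predict_mask(image, point):
    """Return the highest-scoring SAM mask for a single foreground point prompt."""
    predictor.set_image(image)  # expects an RGB uint8 image
    masks, scores, _ = predictor.predict(
        point_coords=np.array([point]),   # (x, y) pixel coordinates
        point_labels=np.array([1]),       # 1 = foreground point
        multimask_output=True,
    )
    return masks[np.argmax(scores)]


def add_gaussian_noise(image, sigma=25.0):
    """Perturb an RGB uint8 image with additive Gaussian noise (illustrative corruption)."""
    noisy = image.astype(np.float32) + np.random.normal(0.0, sigma, image.shape)
    return np.clip(noisy, 0, 255).astype(np.uint8)


def iou(mask_a, mask_b):
    """Intersection-over-union between two boolean masks."""
    inter = np.logical_and(mask_a, mask_b).sum()
    union = np.logical_or(mask_a, mask_b).sum()
    return inter / union if union > 0 else 1.0


# Load an example image (hypothetical file) and prompt SAM at the image center.
image = cv2.cvtColor(cv2.imread("example.jpg"), cv2.COLOR_BGR2RGB)
point = (image.shape[1] // 2, image.shape[0] // 2)

clean_mask = predict_mask(image, point)
noisy_mask = predict_mask(add_gaussian_noise(image), point)
print(f"Mask IoU (clean vs. noisy): {iou(clean_mask, noisy_mask):.3f}")
```

A lower IoU between the clean and perturbed predictions indicates reduced robustness to that corruption; in practice one would sweep multiple perturbation types and severities and, as the abstract notes, adapt the prompting strategy to each dataset.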