SoFTNet: A concept-controlled deep learning architecture for interpretable image classification

Cited by: 8
Authors
Zia, Tehseen [1 ]
Bashir, Nauman [1 ]
Ullah, Mirza Ahsan [1 ]
Murtaza, Shakeeb [1 ]
Affiliations
[1] COMSATS Univ Islamabad, Islamabad, Pakistan
Keywords
Interpretability; Concepts; KNN; Explanation satisfaction;
DOI
10.1016/j.knosys.2021.108066
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Interpreting deep learning (DL)-based computer vision models is challenging due to the complexity of their internal representations. Most recent techniques for rendering DL learning outcomes interpretable operate on low-level features rather than high-level concepts. Methods that explicitly incorporate high-level concepts do so either by determining the relevancy of user-defined concepts or by extracting concepts directly from the data. However, they do not leverage the potential of concepts to explain model predictions. To overcome this challenge, we introduce a novel DL architecture - the Slow/Fast Thinking Network (SoFTNet) - enabling users to define/control high-level features and utilize them to perform image classification predictively. We draw inspiration from the dual-process theory of human thought processes, decoupling low-level, fast & non-transparent processing from high-level, slow & transparent processing. SoFTNet hence uses a shallow convolutional neural network for low-level processing in conjunction with a memory network for high-level concept-based reasoning. We conduct experiments on the CUB-200-2011 and STL-10 datasets and also present a novel concept-based deep K-nearest neighbor approach for baseline comparisons. Our experiments show that SoFTNet achieves performance comparable to state-of-the-art non-interpretable models and outperforms comparable interpretative methods. (c) 2021 Elsevier B.V. All rights reserved.
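The paper itself does not provide code, but the concept-based K-nearest-neighbor baseline mentioned in the abstract can be sketched in outline: images are mapped to concept-activation vectors (by whatever concept extractor is used), and a query is classified by majority vote among its k nearest training examples in that concept space. All function and variable names below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def concept_knn_predict(train_concepts, train_labels, query_concepts, k=3):
    """Classify each query by majority vote among the k nearest
    training examples in concept-activation space (Euclidean distance)."""
    preds = []
    for q in query_concepts:
        # distance from the query to every training concept vector
        dists = np.linalg.norm(train_concepts - q, axis=1)
        # labels of the k nearest neighbours
        nearest = train_labels[np.argsort(dists)[:k]]
        # majority vote
        vals, counts = np.unique(nearest, return_counts=True)
        preds.append(vals[np.argmax(counts)])
    return np.array(preds)

# Toy example: two concepts, two classes.
train_c = np.array([[1.0, 0.0], [0.9, 0.1], [0.0, 1.0], [0.1, 0.9]])
train_y = np.array([0, 0, 1, 1])
query_c = np.array([[0.95, 0.05], [0.05, 0.95]])
print(concept_knn_predict(train_c, train_y, query_c, k=3))  # → [0 1]
```

Because the vote happens over human-interpretable concept activations rather than raw CNN features, each prediction can be explained by pointing to the neighbouring examples and the concepts on which they agree.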
Pages: 14