How can SHAP (SHapley Additive exPlanations) interpretations improve deep learning based urban cellular automata model?

Cited: 2
Authors
Yang, Changlan [1 ]
Guan, Xuefeng [2 ,3 ]
Xu, Qingyang [2 ]
Xing, Weiran [2 ]
Chen, Xiaoyu [2 ]
Chen, Jinguo [4 ]
Jia, Peng [1 ,3 ,5 ,6 ]
Affiliations
[1] Wuhan Univ, Sch Resource & Environm Sci, Wuhan 430079, Peoples R China
[2] Wuhan Univ, State Key Lab Informat Engn Surveying Mapping & Re, Wuhan 430079, Peoples R China
[3] Hubei Luojia Lab, Wuhan 430079, Peoples R China
[4] Third Team Hubei Geol Bur, Huanggang 438000, Peoples R China
[5] Wuhan Univ, Sch Publ Hlth, Wuhan 430079, Peoples R China
[6] Wuhan Univ, Int Inst Spatial Lifecourse Hlth ISLE, Wuhan 430079, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Model interpretability; SHapley Additive exPlanations; Feature selection; Cellular automata; Convolution neural network; Urban growth simulation; LAND-USE-CHANGE; SENSITIVITY-ANALYSIS; EXPANSION;
DOI
10.1016/j.compenvurbsys.2024.102133
CLC number
TP39 [Computer applications];
Discipline codes
081203; 0835;
Abstract
Interpretations of urban cellular automata (CA) models aim to ensure that their predictive behaviors are consistent with real-world processes. Current urban CA interpretations have revealed the impacts of driving factors on land development suitability, or of neighborhood effects and random perturbation on simulation results. However, three limitations remain unresolved: (1) interpretations of deep learning (DL)-based urban CA are seldom integrated with the prerequisite feature selection, (2) input features from different urban CA modules are still explained by separate approaches, and (3) interpretation results are rarely derived at the cell level to uncover spatially varying urban land development patterns. This study proposes a SHapley Additive exPlanations (SHAP)-based urban CA interpretation framework to address these challenges and improve urban CA. The framework uses model-level SHAP importance to identify dominant features from different modules for constructing the final simulation model; cell-level SHAP importance is then used to uncover spatially varying driving forces of urban expansion. The framework's effectiveness is rigorously tested and confirmed using a convolutional neural network CA (CNN-CA) model for Dongguan City. The experimental results demonstrate that (1) SHAP-based model interpretation improves feature selection for DL-based urban CA: the figure of merit of a CNN-CA calibrated with SHAP-selected important features improves by 3%, outperforming the tested baseline methods; (2) SHAP measures the impacts of features from different CA modules as a whole: in this case, physical factors are much more important at the model level than proximity and accessibility factors, while the neighborhood effect is the second most crucial factor; and (3) cell-level SHAP interpretations uncover spatially different urban land development patterns. For example, owing to the extensive industrial land development in the northern Songshan Lake Zone, proximity to major roads within this region makes a positive SHAP-based contribution to cell-level urban expansion in the CNN-CA model.
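The two uses of SHAP described in the abstract — model-level importance for feature selection and cell-level contributions for spatial patterns — can be illustrated with a minimal sketch. This is not the paper's implementation: the feature names and SHAP values below are hypothetical toy data, standing in for values that would in practice be computed from the CNN-CA's development-probability output (e.g. with an explainer such as `shap.DeepExplainer`).

```python
import numpy as np

# Hypothetical SHAP value matrix: rows are cells, columns are candidate
# features from different CA modules (toy numbers for illustration only).
feature_names = ["slope", "dist_major_road", "neighborhood", "dist_town_center"]
shap_values = np.array([
    [ 0.30, -0.05,  0.20,  0.01],
    [ 0.25,  0.10, -0.15,  0.02],
    [-0.40,  0.02,  0.25, -0.01],
])

# Model-level importance: mean absolute SHAP value of each feature
# across all cells, comparable across modules on one common scale.
model_level = np.abs(shap_values).mean(axis=0)

# Feature selection: rank features and keep the top-k dominant ones
# for calibrating the final simulation model.
k = 2
top_k = [feature_names[i] for i in np.argsort(model_level)[::-1][:k]]

# Cell-level interpretation: each feature's signed share of the total
# absolute contribution within a single cell, revealing which drivers
# push that particular cell toward (or away from) development.
cell = 0
share = shap_values[cell] / np.abs(shap_values[cell]).sum()

print("dominant features:", top_k)
print("cell-level contribution shares:", share.round(3))
```

Because model-level and cell-level scores are both derived from the same SHAP matrix, a feature kept at selection time can still show locally reversed, spatially varying effects at the cell level — the pattern the Songshan Lake example describes.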
Pages: 16