Comparison of general kernel, multiple kernel, infinite ensemble and semi-supervised support vector machines for landslide susceptibility prediction

Cited by: 14
Authors
Fang, Zhice [1 ]
Wang, Yi [1 ]
Duan, Hexiang [1 ]
Niu, Ruiqing [1 ]
Peng, Ling [2 ]
Affiliations
[1] China Univ Geosci, Inst Geophys & Geomat, Wuhan 430074, Peoples R China
[2] China Inst Geoenvironm Monitoring, Beijing 100081, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Landslide susceptibility mapping; Support vector machine; Multiple kernel learning; Semi-supervised method; Infinite ensemble learning; RAINFALL-INDUCED LANDSLIDES; LOGISTIC-REGRESSION LR; SPATIAL PREDICTION; FREQUENCY RATIO; NEURAL-NETWORKS; RIVER-BASIN; MODELS; AREA; OPTIMIZATION; CLASSIFIER;
DOI
10.1007/s00477-022-02208-z
Chinese Library Classification (CLC)
X [Environmental Science, Safety Science];
Subject classification codes
08; 0830;
Abstract
Landslide susceptibility prediction is a key step in preventing and managing landslide hazards. As a classical supervised non-parametric machine learning model, the support vector machine (SVM) has been widely used for landslide susceptibility prediction in recent years. However, most studies focus on applying general SVM methods or use SVMs only as benchmark models; SVMs with different kernel functions are rarely explored in this field. In this study, we apply the general SVM and its popular variants (i.e., multiple kernel learning, infinite ensemble SVM and semi-supervised SVM) to predict landslide susceptibility and compare their prediction performance. The experimental results show that the Laplacian SVM achieves the highest prediction performance (AUC = 0.8815) among the SVM-based methods. SVMs with the RBF kernel achieve higher performance than SVMs with the linear kernel, indicating that the RBF kernel is better suited to susceptibility prediction problems. Furthermore, the SVM-based methods have higher sensitivity (0.8543-0.9288) than the deep learning methods (0.8237-0.8271), which demonstrates the advantage of SVMs in identifying potential landslide areas.
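The linear-versus-RBF kernel comparison summarized above can be illustrated with a minimal scikit-learn sketch. This is not the authors' implementation: the synthetic dataset, the twelve stand-in conditioning factors, and the hyper-parameters (C, gamma) below are illustrative assumptions only, and real studies would use mapped landslide inventories and conditioning-factor rasters instead.

    # Minimal sketch (assumed setup, not the paper's code): compare linear- and
    # RBF-kernel SVMs by AUC on a synthetic binary landslide / non-landslide set.
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC
    from sklearn.metrics import roc_auc_score

    # Stand-in for conditioning factors (slope, lithology, rainfall, ...) and labels.
    X, y = make_classification(n_samples=1000, n_features=12, n_informative=8,
                               random_state=42)
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.3, stratify=y, random_state=42)

    for kernel in ("linear", "rbf"):
        # Feature scaling matters for SVMs; probability=True enables AUC from scores.
        model = make_pipeline(StandardScaler(),
                              SVC(kernel=kernel, C=1.0, gamma="scale",
                                  probability=True))
        model.fit(X_train, y_train)
        scores = model.predict_proba(X_test)[:, 1]  # susceptibility scores in [0, 1]
        print(f"{kernel:>6} kernel AUC = {roc_auc_score(y_test, scores):.4f}")

On the paper's reported data the RBF kernel outperformed the linear kernel; on a different dataset either could win, which is exactly what such a comparison is meant to reveal.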
Pages: 3535-3556
Page count: 22
Related Papers
50 records in total
  • [1] Comparison of general kernel, multiple kernel, infinite ensemble and semi-supervised support vector machines for landslide susceptibility prediction
    Zhice Fang
    Yi Wang
    Hexiang Duan
    Ruiqing Niu
    Ling Peng
    [J]. Stochastic Environmental Research and Risk Assessment, 2022, 36 : 3535 - 3556
  • [2] The responsibility weighted Mahalanobis kernel for semi-supervised training of support vector machines for classification
    Reitmaier, Tobias
    Sick, Bernhard
    [J]. INFORMATION SCIENCES, 2015, 323 : 179 - 198
  • [3] Semi-supervised support vector machines
    Bennett, KP
    Demiriz, A
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 11, 1999, 11 : 368 - 374
  • [4] Robust semi-supervised support vector machines with Laplace kernel-induced correntropy loss functions
    Dong, Hongwei
    Yang, Liming
    Wang, Xue
    [J]. APPLIED INTELLIGENCE, 2021, 51 (02) : 819 - 833
  • [5] Robust semi-supervised support vector machines with Laplace kernel-induced correntropy loss functions
    Hongwei Dong
    Liming Yang
    Xue Wang
    [J]. Applied Intelligence, 2021, 51 : 819 - 833
  • [6] Semi-Supervised Multiclass Kernel Machines with Probabilistic Constraints
    Melacci, Stefano
    Gori, Marco
    [J]. AI*IA 2011: ARTIFICIAL INTELLIGENCE AROUND MAN AND BEYOND, 2011, 6934 : 21 - 32
  • [7] Semi-supervised Support Vector Machines Regression
    Zhu, Dingzhen
    Wang, Xin
    Chen, Heng
    Wu, Rui
    [J]. PROCEEDINGS OF THE 2014 9TH IEEE CONFERENCE ON INDUSTRIAL ELECTRONICS AND APPLICATIONS (ICIEA), 2014, : 2015 - +
  • [8] Distributed semi-supervised support vector machines
    Scardapane, Simone
    Fierimonte, Roberto
    Di Lorenzo, Paolo
    Panella, Massimo
    Uncini, Aurelio
    [J]. NEURAL NETWORKS, 2016, 80 : 43 - 52
  • [9] Laplacian Embedded Infinite Kernel Model for Semi-Supervised Classification
    Yang, Tao
    Fu, Dongmei
    Wu, Chunhong
    [J]. INTERNATIONAL JOURNAL OF PATTERN RECOGNITION AND ARTIFICIAL INTELLIGENCE, 2016, 30 (10)
  • [10] Semi-supervised classification with Laplacian multiple kernel learning
    Yang, Tao
    Fu, Dongmei
    [J]. NEUROCOMPUTING, 2014, 140 : 19 - 26