Optimised one-class classification performance

Cited by: 0
Authors
Lenz, Oliver Urs [1 ]
Peralta, Daniel [2 ]
Cornelis, Chris [1 ]
Affiliations
[1] Univ Ghent, Dept Appl Math Comp Sci & Stat, Ghent, Belgium
[2] Univ Ghent, Dept Informat Technol, IDLab, IMEC, Ghent, Belgium
Keywords
Data descriptors; Hyperparameter optimisation; Novelty detection; One-class classification; Semi-supervised outlier detection; SUPPORT; CLASSIFIERS;
DOI
10.1007/s10994-022-06147-2
CLC classification
TP18 (Artificial intelligence theory)
Discipline codes
081104; 0812; 0835; 1405
Abstract
We provide a thorough treatment of one-class classification with hyperparameter optimisation for five data descriptors: Support Vector Machine (SVM), Nearest Neighbour Distance (NND), Localised Nearest Neighbour Distance (LNND), Local Outlier Factor (LOF) and Average Localised Proximity (ALP). The hyperparameters of SVM and LOF have to be optimised through cross-validation, while NND, LNND and ALP allow an efficient form of leave-one-out validation and the reuse of a single nearest-neighbour query. We experimentally evaluate the effect of hyperparameter optimisation with 246 classification problems drawn from 50 datasets. Among a selection of optimisation algorithms, the recent Malherbe-Powell proposal optimises the hyperparameters of all data descriptors most efficiently. We calculate the increase in test AUROC and the amount of overfitting as a function of the number of hyperparameter evaluations. After 50 evaluations, ALP and SVM significantly outperform LOF, NND and LNND, and LOF and NND outperform LNND. The performance of ALP and SVM is comparable, but ALP can be optimised more efficiently, so it constitutes a good default choice. Alternatively, using validation AUROC as a selection criterion between ALP and SVM gives the best overall result, and NND is the least computationally demanding option. We thus end up with a clear trade-off between three choices, allowing practitioners to make an informed decision.
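The leave-one-out shortcut the abstract mentions for the nearest-neighbour-based descriptors can be illustrated for NND: the outlier score of a point is its distance to its k-th nearest training point, and validation scores for the training points themselves fall out of a single (k+1)-neighbour query by discarding each point's self-match. The sketch below is a minimal numpy illustration under these assumptions, not the authors' implementation; `nnd_scores` and `nnd_loo_scores` are hypothetical helper names.

```python
import numpy as np

def nnd_scores(train, test, k=3):
    """Nearest Neighbour Distance (NND): score of a test point is its
    distance to the k-th nearest training point (larger = more anomalous)."""
    d = np.linalg.norm(test[:, None, :] - train[None, :, :], axis=-1)
    return np.sort(d, axis=1)[:, k - 1]

def nnd_loo_scores(train, k=3):
    """Leave-one-out scores on the training set itself: sort distances to
    all training points and drop the self-match (distance 0), so one
    nearest-neighbour query serves every left-out point at once."""
    d = np.linalg.norm(train[:, None, :] - train[None, :, :], axis=-1)
    s = np.sort(d, axis=1)   # column 0 is each point's self-distance (0)
    return s[:, k]           # k-th neighbour after discarding the self-match

rng = np.random.default_rng(0)
target = rng.normal(0.0, 1.0, size=(100, 2))    # inliers (target class)
outliers = rng.normal(5.0, 1.0, size=(10, 2))   # far-away novelties
print(nnd_scores(target, outliers).mean())      # markedly higher than ...
print(nnd_loo_scores(target).mean())            # ... the leave-one-out inlier scores
```

Because the leave-one-out scores require no refitting, evaluating a new value of k only means re-indexing the already-sorted distance matrix, which is what makes hyperparameter optimisation for NND (and similarly LNND and ALP) cheap relative to cross-validating SVM or LOF.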
Pages: 2863-2883 (21 pages)