Optimised one-class classification performance

Cited: 0
Authors
Lenz, Oliver Urs [1 ]
Peralta, Daniel [2 ]
Cornelis, Chris [1 ]
Affiliations
[1] Univ Ghent, Dept Appl Math Comp Sci & Stat, Ghent, Belgium
[2] Univ Ghent, Dept Informat Technol, IDLab, IMEC, Ghent, Belgium
Keywords
Data descriptors; Hyperparameter optimisation; Novelty detection; One-class classification; Semi-supervised outlier detection; SUPPORT; CLASSIFIERS;
DOI
10.1007/s10994-022-06147-2
Chinese Library Classification (CLC)
TP18 [Artificial intelligence theory];
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
We provide a thorough treatment of one-class classification with hyperparameter optimisation for five data descriptors: Support Vector Machine (SVM), Nearest Neighbour Distance (NND), Localised Nearest Neighbour Distance (LNND), Local Outlier Factor (LOF) and Average Localised Proximity (ALP). The hyperparameters of SVM and LOF have to be optimised through cross-validation, while NND, LNND and ALP allow an efficient form of leave-one-out validation and the reuse of a single nearest-neighbour query. We experimentally evaluate the effect of hyperparameter optimisation with 246 classification problems drawn from 50 datasets. From a selection of optimisation algorithms, the recent Malherbe-Powell proposal optimises the hyperparameters of all data descriptors most efficiently. We calculate the increase in test AUROC and the amount of overfitting as a function of the number of hyperparameter evaluations. After 50 evaluations, ALP and SVM significantly outperform LOF, NND and LNND, and LOF and NND outperform LNND. The performance of ALP and SVM is comparable, but ALP can be optimised more efficiently, so it constitutes a good default choice. Alternatively, using validation AUROC as a selection criterion between ALP and SVM gives the best overall result, and NND is the least computationally demanding option. We thus end up with a clear trade-off between three choices, allowing practitioners to make an informed decision.
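
As an illustration of the kind of evaluation described in the abstract, the sketch below (not the authors' code; the synthetic data, the fixed hyperparameter values and the use of scikit-learn are assumptions made here for illustration) fits three of the data descriptors, SVM, LOF and NND, on target-class training data only, scores a mixed test set, and compares them by test AUROC. In the paper, fixed hyperparameter values such as these would instead be optimised against validation AUROC.

```python
# Minimal illustrative sketch (not the authors' implementation):
# fit one-class data descriptors on target-class training data only,
# score a mixed test set, and compare them by test AUROC.
import numpy as np
from sklearn.datasets import make_blobs
from sklearn.metrics import roc_auc_score
from sklearn.neighbors import LocalOutlierFactor, NearestNeighbors
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)

# Toy data: one "target" blob for training, plus uniform outliers in the test set.
X_target, _ = make_blobs(n_samples=600, centers=[[0, 0]], cluster_std=1.0, random_state=0)
X_train, X_test_target = X_target[:400], X_target[400:]
X_outliers = rng.uniform(low=-6, high=6, size=(100, 2))
X_test = np.vstack([X_test_target, X_outliers])
y_test = np.concatenate([np.ones(len(X_test_target)), np.zeros(len(X_outliers))])  # 1 = target class

# SVM data descriptor (nu and gamma are placeholder values; normally optimised).
svm = OneClassSVM(nu=0.1, gamma=0.5).fit(X_train)
auroc_svm = roc_auc_score(y_test, svm.decision_function(X_test))

# LOF data descriptor (novelty=True enables scoring of unseen points).
lof = LocalOutlierFactor(n_neighbors=20, novelty=True).fit(X_train)
auroc_lof = roc_auc_score(y_test, lof.decision_function(X_test))

# NND data descriptor: score = minus the distance to the nearest training point.
nnd = NearestNeighbors(n_neighbors=1).fit(X_train)
dist, _ = nnd.kneighbors(X_test)
auroc_nnd = roc_auc_score(y_test, -dist[:, 0])

print(f"AUROC  SVM: {auroc_svm:.3f}  LOF: {auroc_lof:.3f}  NND: {auroc_nnd:.3f}")
```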
Pages: 2863-2883
Number of pages: 21