Density weighted support vector data description

Cited by: 94
Authors
Cha, Myungraee [1 ]
Kim, Jun Seok [1 ]
Baek, Jun-Geol [1 ]
Affiliations
[1] Korea Univ, Sch Ind Management Engn, Seoul 136701, South Korea
Funding
National Research Foundation of Singapore;
Keywords
One-class classification (OCC); Support vector data description (SVDD); Density weighted SVDD (DW-SVDD); k-Nearest neighbor approach; OUTLIER DETECTION; CLASSIFICATION;
DOI
10.1016/j.eswa.2013.11.025
Chinese Library Classification (CLC)
TP18 [Artificial intelligence theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
One-class classification (OCC) has received considerable attention because of its usefulness when statistically representative non-target data are unavailable. In this situation, the objective of OCC is to find the optimal description of the target data in order to better identify outlier or non-target data. As an example of OCC, support vector data description (SVDD) is widely used because it produces flexible description boundaries without requiring assumptions about the data distribution. By mapping the target dataset into a high-dimensional space, SVDD finds a spherical description boundary for the target data. In this process, SVDD considers only the kernel-based distance between each data point and the spherical description, not the density distribution of the data. Consequently, data points in high-density regions may be excluded from the description, which degrades classification performance. To solve this problem, we propose a new SVDD that introduces the notion of a density weight: the relative density of each data point, estimated from the density distribution of the target data using the k-nearest neighbor (k-NN) approach. By incorporating this weight into the search for an optimal description with SVDD, the new method prioritizes data points in high-density regions, and the optimal description eventually shifts toward these regions. We demonstrate the improved performance of the new SVDD on various datasets from the UCI repository. (C) 2013 Elsevier Ltd. All rights reserved.
Pages: 3343-3350
Number of pages: 8
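The abstract describes two ingredients: a density weight for each target point, computed from its k nearest neighbors, and a weighted SVDD optimization in which high-density points are given priority. The Python sketch below is an illustration of that idea only: it assumes a simple density weight (inverse mean k-NN distance, normalized by its average), substitutes scikit-learn's OneClassSVM with an RBF kernel for the SVDD solver (the two formulations are closely related, but this is not the authors' implementation), and runs on toy data. The values k=5, gamma=0.5, and nu=0.1 are illustrative choices, not parameters from the paper.

import numpy as np
from sklearn.neighbors import NearestNeighbors
from sklearn.svm import OneClassSVM

def density_weights(X, k=5):
    # Relative density from k-NN distances: points with a small mean distance
    # to their k nearest neighbors (dense regions) get weights above 1,
    # points in sparse regions get weights below 1. This is one plausible
    # reading of the paper's density weight, not its exact formula.
    nn = NearestNeighbors(n_neighbors=k + 1).fit(X)   # k + 1: each query point is its own nearest neighbor
    dist, _ = nn.kneighbors(X)
    mean_knn_dist = dist[:, 1:].mean(axis=1)          # drop the zero self-distance
    density = 1.0 / (mean_knn_dist + 1e-12)           # simple local density estimate
    return density / density.mean()                   # relative density, mean weight = 1

# Toy target (one-class) data: a dense cluster plus a sparse cluster.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.3, size=(200, 2)),
               rng.normal(3.0, 1.0, size=(50, 2))])

w = density_weights(X, k=5)

# One-class SVM with an RBF kernel stands in for the SVDD solver here;
# per-sample weights rescale each point's slack penalty, so leaving a
# high-density point outside the description boundary costs more.
model = OneClassSVM(kernel="rbf", gamma=0.5, nu=0.1)
model.fit(X, sample_weight=w)

pred = model.predict(X)            # +1 = inside the description, -1 = outside
print("fraction of target data accepted:", (pred == 1).mean())

In this sketch the per-sample weight mimics the effect the abstract describes: because excluding dense points becomes more expensive, the learned description boundary is pulled toward high-density regions of the target data.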