An Anticorrelation Kernel for Subsystem Training in Multiple Classifier Systems

Cited by: 0
Authors
Ferrer, Luciana [1 ]
Sonmez, Kemal [2 ]
Shriberg, Elizabeth [1 ]
Affiliations
[1] SRI International, Speech Technology & Research Laboratory, Menlo Park, CA 94025 USA
[2] Oregon Health & Science University, Division of Biomedical Computer Science, School of Medicine, Portland, OR 97239 USA
Keywords
system combination; ensemble diversity; multiple classifier systems; support vector machines; speaker recognition; kernel methods
DOI
Not available
Chinese Library Classification
TP [Automation technology, computer technology]
Discipline Classification Code
0812 (Computer Science and Technology)
Abstract
We present a method for training support vector machine (SVM)-based classification systems for combination with other classification systems designed for the same task. Ideally, a new system should be designed such that, when combined with existing systems, the resulting performance is optimized. We present a simple model for this problem and use the understanding gained from this analysis to propose a method to achieve better combination performance when training SVM systems. We include a regularization term in the SVM objective function that aims to reduce the average class-conditional covariance between the resulting scores and the scores produced by the existing systems, introducing a trade-off between this covariance and the system's individual performance. That is, the new system "takes one for the team", falling somewhat short of its best possible performance in order to increase the diversity of the ensemble. We report results on the NIST 2005 and 2006 speaker recognition evaluations (SREs) for a variety of subsystems. We show a 19% gain in equal error rate (EER) for a combination of four systems when the proposed method is applied, relative to the performance obtained when the four systems are trained independently of each other.
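To make the idea concrete, the sketch below shows one way a class-conditional anticorrelation penalty of the kind described in the abstract could be attached to an SVM-style objective. This is not the authors' formulation: the paper expresses the idea through a modified (anticorrelation) kernel, whereas this is a simplified primal, linear approximation in Python/NumPy, and the function name, the hinge-loss weight C, the penalty weight gamma, and the choice of plain subgradient descent are all assumptions made purely for illustration.

import numpy as np

def train_anticorrelated_svm(X, y, s_old, C=1.0, gamma=10.0, lr=0.01, epochs=500):
    # X: (n, d) feature matrix; y: labels in {-1, +1};
    # s_old: (n,) scores of the existing system on the same training data.
    # C weights the hinge loss and gamma weights the anticorrelation penalty;
    # both values here are illustrative assumptions, not tuned settings.
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        scores = X @ w + b                       # new system's scores s_new
        # Subgradient of 0.5 * ||w||^2 + C * mean(hinge loss)
        active = (y * scores) < 1
        grad_w = w - C * (y[active][:, None] * X[active]).sum(axis=0) / n
        grad_b = -C * y[active].sum() / n
        # Penalty: gamma * sum over classes of cov(s_new, s_old | class)^2,
        # pushing the new scores to decorrelate from the existing ones within each class.
        for cls in (-1, 1):
            m = y == cls
            sn = scores[m] - scores[m].mean()    # centered new scores for this class
            so = s_old[m] - s_old[m].mean()      # centered existing scores for this class
            cov = (sn * so).mean()               # class-conditional covariance
            Xc = X[m] - X[m].mean(axis=0)
            grad_w += gamma * 2.0 * cov * (so[:, None] * Xc).mean(axis=0)
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

A larger gamma drives the class-conditional covariance between the two systems' scores toward zero at the cost of a weaker individual classifier, mirroring the trade-off described in the abstract; the paper itself folds the same objective into a kernel, as its title indicates.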
Pages: 2079 - 2114
Page count: 36
Related Papers
50 items in total
  • [1] An anticorrelation kernel for subsystem training in multiple classifier systems
    Ferrer, Luciana
    Sönmez, Kemal
    Shriberg, Elizabeth
    JOURNAL OF MACHINE LEARNING RESEARCH, 2009, 10 : 2079 - 2114
  • [2] Training multilayer perceptron with multiple classifier systems
    Zhu, H
    Liu, JF
    Tang, XL
    Huang, JH
    ADVANCES IN NEURAL NETWORKS - ISNN 2004, PT 1, 2004, 3173 : 894 - 899
  • [3] On the Effectiveness of Diversity When Training Multiple Classifier Systems
    Gacquer, David
    Delcroix, Veronique
    Delmotte, Francois
    Piechowiak, Sylvain
    SYMBOLIC AND QUANTITATIVE APPROACHES TO REASONING WITH UNCERTAINTY, PROCEEDINGS, 2009, 5590 : 493 - +
  • [4] Filter-Based Data Partitioning for Training Multiple Classifier Systems
    Dara, Rozita A.
    Makrehchi, Masoud
    Kamel, Mohamed S.
    IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 2010, 22 (04) : 508 - 522
  • [5] Joint Learning of Distance Metric and Kernel Classifier via Multiple Kernel Learning
    Zhang, Weiqi
    Yan, Zifei
    Zhang, Hongzhi
    Zuo, Wangmeng
    PATTERN RECOGNITION (CCPR 2016), PT I, 2016, 662 : 586 - 600
  • [6] Multiple classifier systems for classification of multifrequency PolSAR images with limited training samples
    Khosravi, Iman
    Safari, Abdolreza
    Homayouni, Saeid
    INTERNATIONAL JOURNAL OF REMOTE SENSING, 2018, 39 (21) : 7547 - 7567
  • [7] Training set selection and swarm intelligence for enhanced integration in multiple classifier systems
    Mohammed, Amgad M.
    Onieva, Enrique
    Wozniak, Michal
    APPLIED SOFT COMPUTING, 2020, 95
  • [8] Using co-training and self-training in semi-supervised multiple classifier systems
    Didaci, Luca
    Roli, Fabio
    STRUCTURAL, SYNTACTIC, AND STATISTICAL PATTERN RECOGNITION, PROCEEDINGS, 2006, 4109 : 522 - 530
  • [9] New measure of classifier dependency in multiple classifier systems
    Ruta, D
    Gabrys, B
    MULTIPLE CLASSIFIER SYSTEMS, 2002, 2364 : 127 - 136
  • [10] Multiple Feature Kernel Sparse Representation Classifier for Hyperspectral Imagery
    Gan, Le
    Xia, Junshi
    Du, Peijun
    Chanussot, Jocelyn
    IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, 2018, 56 (09) : 5343 - 5356