Feature selection for fusion of speaker verification via Maximum Kullback-Leibler Distance

Cited: 0
Authors
Liu, Di [1 ]
Sun, Dong-Mei [1 ]
Qiu, Zheng-Ding [1 ]
Affiliations
[1] Beijing Jiaotong Univ, Inst Informat Sci, Beijing, Peoples R China
Keywords
Maximum Kullback-Leibler distance; feature selection; speaker verification; PHASE; MFCC;
DOI
Not available
Chinese Library Classification (CLC)
TM [Electrical Engineering]; TN [Electronics and Communication Technology];
Discipline Codes
0808; 0809;
Abstract
This paper proposes an optimal feature-selection scheme for the fusion technique in speaker verification based on the Maximum Kullback-Leibler distance. By investigating bi-feature fusion schemes built from six acoustic features, the information content of each fusion scheme is computed in turn via the Maximum Kullback-Leibler distance. The advantage of this distance is that it overcomes the asymmetry of the conventional Kullback-Leibler distance, which keeps the computation of the information content stable and correct. In the experimental section, the NIST 2001 corpus is used for evaluation. The computations across the various fusion schemes show that the fusion of MFCC and residual phase holds the most information content, indicating that this scheme should yield excellent performance. To verify this, EER evaluations are conducted; the results show that the EER of the fusion of MFCC and residual phase outperforms the other fusion schemes. Therefore, the Maximum Kullback-Leibler distance can be considered an effective metric for feature selection in the fusion of speaker verification.
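The abstract does not give the formula, but the Maximum Kullback-Leibler distance is commonly defined as the larger of the two directed KL divergences, which makes it symmetric by construction. The sketch below illustrates this for univariate Gaussians (a simplifying assumption for clarity; the paper applies the distance to distributions of acoustic features, and the Gaussian closed form here is standard, not taken from the paper):

```python
import math

def kl_gaussian(mu1, var1, mu2, var2):
    """Directed KL(N(mu1, var1) || N(mu2, var2)) for univariate Gaussians,
    using the standard closed form."""
    return 0.5 * (math.log(var2 / var1) + (var1 + (mu1 - mu2) ** 2) / var2 - 1.0)

def max_kl(mu1, var1, mu2, var2):
    """Maximum Kullback-Leibler distance: the larger of the two directed
    divergences. Unlike plain KL, this is symmetric in its arguments."""
    return max(kl_gaussian(mu1, var1, mu2, var2),
               kl_gaussian(mu2, var2, mu1, var1))

# Plain KL is asymmetric: the two directions generally disagree.
d_pq = kl_gaussian(0.0, 1.0, 1.0, 2.0)   # ~0.3466
d_qp = kl_gaussian(1.0, 2.0, 0.0, 1.0)   # ~0.6534
print(d_pq != d_qp)                       # True

# The maximum KL distance removes that asymmetry.
print(max_kl(0.0, 1.0, 1.0, 2.0) == max_kl(1.0, 2.0, 0.0, 1.0))  # True
```

In a feature-selection setting, this distance would be computed between the score (or feature) distributions of target and impostor trials for each candidate fusion scheme, and the scheme with the largest distance is preferred.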
Pages: 565-568
Page count: 4