Generalized Correntropy Induced Loss Function for Deep Learning

Cited by: 0
Authors
Chen, Liangjun [1 ]
Qu, Hua [1 ]
Zhao, Jihong [1 ]
Institutions
[1] Xi An Jiao Tong Univ, Sch Elect & Informat Engn, Xian 710049, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
ALGORITHM;
DOI
Not available
CLC classification
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Through multiple levels of abstraction, deep learning exploits multilayer models to discover complicated structure in data and learn high-level representations of it. In recent years, deep learning has made great progress in object detection, speech recognition, and many other domains. However, the robustness of learning systems with deep architectures is rarely studied and needs further investigation. In particular, the mean square error (MSE), which is commonly used as the optimization cost function in deep learning, is sensitive to outliers (or impulsive noises). To combat the harmful influence of outliers, which are pervasive in many real-world data, it is indispensable to improve the robustness of deep learning. In this paper, a robust deep learning method based on generalized correntropy is proposed, named the generalized correntropy induced loss function (GC-loss) based SAE (GC-SAE). Generalized correntropy, as a nonlinear measure of similarity, is robust to outliers and can approximate different norms (from l(0) to l(2)) of the data. By using the generalized Gaussian density (GGD) function as its kernel, generalized correntropy achieves a more flexible shape and shows better robustness to non-Gaussian noise than the original correntropy with a Gaussian kernel. The good robustness of the proposed method is confirmed by experiments on the MNIST benchmark dataset.
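The abstract contrasts the GC-loss with MSE: because the generalized Gaussian kernel saturates, large errors contribute a bounded amount to the loss instead of a quadratic one. A minimal sketch of such a correntropy-induced loss, assuming a kernel of the form exp(-|e/beta|^alpha); the function name `gc_loss` and the default parameter values are illustrative, not taken from the paper:

```python
import numpy as np

def gc_loss(y_true, y_pred, alpha=2.0, beta=1.0):
    """Sketch of a generalized-correntropy-induced loss.

    alpha shapes the generalized Gaussian kernel (alpha=2 recovers the
    ordinary Gaussian/correntropy case); beta is the kernel bandwidth.
    """
    e = np.asarray(y_true) - np.asarray(y_pred)
    # Generalized Gaussian kernel: 1 at zero error, -> 0 for large errors
    kernel = np.exp(-np.abs(e / beta) ** alpha)
    # Induced loss: 0 for a perfect fit, saturating at 1 for outliers,
    # so a single gross outlier cannot dominate the objective as in MSE
    return np.mean(1.0 - kernel)
```

Unlike MSE, where an error of 100 contributes 10000 to the mean, here it contributes at most 1, which is the mechanism behind the robustness claim; in the paper this loss replaces MSE in the stacked autoencoder's reconstruction objective.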
Pages: 1428-1433
Page count: 6
Related papers
50 records in total
  • [21] DHA: Supervised Deep Learning to Hash with an Adaptive Loss Function
    Xu, Jiehao
    Guo, Chengyu
    Liu, Qingjie
    Qin, Jie
    Wang, Yunhong
    Liu, Li
    [J]. 2019 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION WORKSHOPS (ICCVW), 2019, : 3054 - 3062
  • [22] Learning deep discriminative features based on cosine loss function
    Wang, Jiabao
    Li, Yang
    Miao, Zhuang
    Xu, Yulong
    Tao, Gang
    [J]. ELECTRONICS LETTERS, 2017, 53 (14) : 918 - 919
  • [23] Generalized multikernel correntropy based broad learning system for robust regression
    Zheng, Yunfei
    Wang, Shiyuan
    Chen, Badong
    [J]. INFORMATION SCIENCES, 2024, 678
  • [24] Learning with the Maximum Correntropy Criterion Induced Losses for Regression
    Feng, Yunlong
    Huang, Xiaolin
    Shi, Lei
    Yang, Yuning
    Suykens, Johan A. K.
    [J]. JOURNAL OF MACHINE LEARNING RESEARCH, 2015, 16 : 993 - 1034
  • [25] Marine Animal Classification With Correntropy-Loss-Based Multiview Learning
    Cao, Zheng
    Yu, Shujian
    Ouyang, Bing
    Dalgleish, Fraser
    Vuorenkoski, Anni
    Alsenas, Gabriel
    Principe, Jose C.
    [J]. IEEE JOURNAL OF OCEANIC ENGINEERING, 2019, 44 (04) : 1116 - 1129
  • [26] A Novel Objective Function Based on a Generalized Kelly Criterion for Deep Learning
    Fallah, Faezeh
    Tsanev, Doychin Mariyanov
    Yang, Bin
    Walter, Sven
    Bamberg, Fabian
    [J]. 2017 SIGNAL PROCESSING: ALGORITHMS, ARCHITECTURES, ARRANGEMENTS, AND APPLICATIONS (SPA 2017), 2017, : 84 - 89
  • [27] Semantic and Generalized Entropy Loss Functions for Semi-Supervised Deep Learning
    Gajowniczek, Krzysztof
    Liang, Yitao
    Friedman, Tal
    Zabkowski, Tomasz
    van den Broeck, Guy
    [J]. ENTROPY, 2020, 22 (03)
  • [28] CTSVM: A robust twin support vector machine with correntropy-induced loss function for binary classification problems
    Zheng, Xiaohan
    Zhang, Li
    Yan, Leilei
    [J]. INFORMATION SCIENCES, 2021, 559 : 22 - 45
  • [29] A robust projection twin support vector machine with a generalized correntropy-based loss
    Ren, Qiangqiang
    Yang, Liming
    [J]. APPLIED INTELLIGENCE, 2022, 52 (02) : 2154 - 2170