DOMAIN ADAPTATION BY ITERATIVE IMPROVEMENT OF SOFT-LABELING AND MAXIMIZATION OF NON-PARAMETRIC MUTUAL INFORMATION

Cited by: 0
Authors
Khan, M. N. A. [1 ]
Heisterkamp, Douglas R. [1 ]
Affiliations
[1] Oklahoma State Univ, Dept Comp Sci, Stillwater, OK 74078 USA
Keywords
Mutual information; soft-labeling; sub-space; transfer learning
DOI
Not available
Chinese Library Classification (CLC)
TM [Electrical Engineering]; TN [Electronic Technology, Communication Technology]
Discipline Codes
0808; 0809
Abstract
Domain adaptation (DA) algorithms address the problem of distribution shift between training and testing data. Recent approaches transform data into a shared subspace by minimizing the shift between their marginal distributions. We propose a method that learns a common subspace by leveraging the class-conditional distributions of the training samples while also reducing the marginal distribution shift. To learn the subspace, we employ a supervised technique based on non-parametric mutual information, inducing soft label assignments for the unlabeled test data. The approach applies an iterative linear transformation for subspace learning, repeatedly updating the test-data predictions via soft-labeling and, in turn, improving the subspace by maximizing mutual information. A set of comprehensive experiments on benchmark datasets demonstrates the efficacy of the proposed framework over state-of-the-art approaches.
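The abstract describes an alternating scheme: project source and target data into a subspace, soft-label the unlabeled target samples there, and use those labels to re-learn the subspace. The minimal Python sketch below illustrates only that loop, under stated assumptions: it is not the authors' implementation; scikit-learn's LinearDiscriminantAnalysis stands in for the paper's non-parametric mutual-information criterion, a LogisticRegression posterior supplies the soft labels, and Xs, ys, Xt, and tau are hypothetical placeholders.

import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.linear_model import LogisticRegression

def iterative_soft_label_subspace(Xs, ys, Xt, n_iters=10, tau=0.8):
    """Alternate between soft-labeling the target data and re-learning a
    linear subspace from source labels plus confident target pseudo-labels.
    Sketch only: LDA replaces the non-parametric MI objective of the paper."""
    # Initial subspace and classifier from the labeled source data only.
    lda = LinearDiscriminantAnalysis().fit(Xs, ys)
    clf = LogisticRegression(max_iter=1000).fit(lda.transform(Xs), ys)
    for _ in range(n_iters):
        # (a) Soft-label the unlabeled target data: class posteriors from the
        #     classifier trained on the projected source samples.
        soft = clf.predict_proba(lda.transform(Xt))   # (n_target, n_classes) soft labels
        yt_hat = clf.classes_[soft.argmax(axis=1)]    # hard pseudo-labels
        conf = soft.max(axis=1)                       # per-sample confidence
        keep = conf >= tau                            # keep only confident target samples
        # (b) Re-learn the subspace on source data plus confidently
        #     pseudo-labeled target data, then refresh the classifier.
        X = np.vstack([Xs, Xt[keep]])
        y = np.concatenate([ys, yt_hat[keep]])
        lda = LinearDiscriminantAnalysis().fit(X, y)
        clf = LogisticRegression(max_iter=1000).fit(lda.transform(Xs), ys)
    return lda, clf

# Usage with hypothetical arrays Xs (labeled source), ys, Xt (unlabeled target):
#   lda, clf = iterative_soft_label_subspace(Xs, ys, Xt)
#   yt_pred = clf.predict(lda.transform(Xt))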
Pages: 4458 - 4462
Page count: 5
Related Papers
16 records in total
  • [1] Dimensionality reduction based on non-parametric mutual information
    Faivishevsky, Lev
    Goldberger, Jacob
    NEUROCOMPUTING, 2012, 80 : 31 - 37
  • [2] Non-parametric estimation of copula based mutual information
    Krishnankutty, Baby Alpettiyil
    Ganapathy, Rajesh
    Sankaran, Paduthol Godan
    COMMUNICATIONS IN STATISTICS-THEORY AND METHODS, 2020, 49 (06) : 1513 - 1527
  • [3] INVESTIGATING BIAS IN NON-PARAMETRIC MUTUAL INFORMATION ESTIMATION
    Zhu, Jie
    Bellanger, Jean-Jacques
    Shu, Huazhong
    Jeannes, Regine Le Bouquin
    2015 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH, AND SIGNAL PROCESSING (ICASSP), 2015, : 3971 - 3975
  • [4] DOMAIN ADAPTATION VIA MUTUAL INFORMATION MAXIMIZATION FOR HANDWRITING RECOGNITION
    Tang, Pei
    Peng, Liangrui
    Yan, Ruijie
    Shi, Haodong
    Yao, Gang
    Liu, Changsong
    Li, Jie
    Zhang, Yuqi
    2022 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2022, : 2300 - 2304
  • [5] Feature Selection with Non-Parametric Mutual Information for Adaboost Learning
    Baro, Xavier
    Vitria, Jordi
    ARTIFICIAL INTELLIGENCE RESEARCH AND DEVELOPMENT, 2005, 131 : 131 - 138
  • [6] UNSUPERVISED FEATURE SELECTION BASED ON NON-PARAMETRIC MUTUAL INFORMATION
    Faivishevsky, Lev
    Goldberger, Jacob
    2012 IEEE INTERNATIONAL WORKSHOP ON MACHINE LEARNING FOR SIGNAL PROCESSING (MLSP), 2012
  • [7] Non-Parametric Estimation of Mutual Information through the Entropy of the Linkage
    Giraudo, Maria Teresa
    Sacerdote, Laura
    Sirovich, Roberta
    ENTROPY, 2013, 15 (12) : 5154 - 5177
  • [8] Non-Parametric Unsupervised Domain Adaptation for Neural Machine Translation
    Zheng, Xin
    Zhang, Zhirui
    Huang, Shujian
    Chen, Boxing
    Xie, Jun
    Luo, Weihua
    Chen, Jiajun
    FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, EMNLP 2021, 2021, : 4234 - 4241
  • [9] Non-parametric Estimation of Mutual Information with Application to Nonlinear Optical Fibers
    Catuogno, Tommaso
    Camara, Menelaos Ralli
    Secondini, Marco
    2018 IEEE INTERNATIONAL SYMPOSIUM ON INFORMATION THEORY (ISIT), 2018, : 736 - 740
  • [10] Mutual information based non-parametric transform stereo matching algorithm
    Lai X.-B.
    Zhu S.-Q.
    Zhejiang Daxue Xuebao (Gongxue Ban)/Journal of Zhejiang University (Engineering Science), 2011, 45 (09): 1636 - 1642