Functional sufficient dimension reduction through information maximization with application to classification

Cited: 0
Authors
Li, Xinyu [1 ]
Xu, Jianjun [2 ,4 ]
Cheng, Haoyang [3 ]
Affiliations
[1] Univ Sci & Technol China, Int Inst Finance, Sch Management, Hefei, Anhui, Peoples R China
[2] Hefei Univ Technol, Sch Math, Hefei, Anhui, Peoples R China
[3] Quzhou Univ, Coll Elect & Informat Engn, Quzhou, Zhejiang, Peoples R China
[4] Hefei Univ Technol, Sch Math, Hefei 230601, Anhui, Peoples R China
Funding
China Postdoctoral Science Foundation;
Keywords
Functional data classification; functional sufficient dimension reduction; mutual information; square loss mutual information; density ratio; REGRESSION; RECOGNITION; DIVERGENCE; PREDICTION; MORTALITY; COHORT; RATES;
DOI
10.1080/02664763.2024.2335570
Chinese Library Classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics];
Discipline codes
020208 ; 070103 ; 0714 ;
Abstract
Considering the case where the response is a categorical variable and the predictor is a random function, two novel functional sufficient dimension reduction (FSDR) methods are proposed based on mutual information and square loss mutual information. Compared with classical FSDR methods, such as functional sliced inverse regression and functional sliced average variance estimation, the proposed methods are appealing because they can estimate multiple effective dimension reduction directions even when the number of categories is relatively small, especially for a binary response. Moreover, the proposed methods require neither the restrictive linear conditional mean assumption nor the constant covariance assumption, and they avoid inverting the covariance operator, a problem often encountered in functional sufficient dimension reduction. Truncated functional principal component analysis is used as a regularization mechanism. Under some mild conditions, the statistical consistency of the proposed methods is established. Simulation studies and real data analyses are used to evaluate the finite-sample properties of the methods.
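The truncation step mentioned in the abstract can be illustrated with a minimal sketch: project densely observed curves onto their leading functional principal components and keep only the first K score vectors, which regularizes any subsequent dimension reduction by working in a finite-dimensional score space. The simulated data, grid, and choice of K below are illustrative assumptions, not the paper's settings or method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate n curves observed on a common grid of t_points time points
# (hypothetical data: smooth sine components plus observation noise).
n, t_points = 200, 50
grid = np.linspace(0.0, 1.0, t_points)
scores_true = rng.normal(size=(n, 3))
basis = np.stack([np.sin(np.pi * grid),
                  np.sin(2 * np.pi * grid),
                  np.sin(3 * np.pi * grid)])
X = scores_true @ basis + 0.1 * rng.normal(size=(n, t_points))

# Truncated FPCA: SVD of the centered data matrix; the right singular
# vectors approximate the eigenfunctions of the covariance operator.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
K = 3                                  # truncation level (assumed)
fpc_scores = Xc @ Vt[:K].T             # n x K matrix of FPC scores
explained = (s[:K] ** 2).sum() / (s ** 2).sum()
print(f"variance explained by {K} FPCs: {explained:.3f}")
```

Any FSDR direction search then operates on `fpc_scores` rather than on the raw curves, which is what lets the methods sidestep inverting the (non-invertible) covariance operator.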
Pages: 3059 - 3101
Page count: 43
Related papers
50 in total
  • [31] Principal weighted logistic regression for sufficient dimension reduction in binary classification
    Kim, Boyoung
    Shin, Seung Jun
    JOURNAL OF THE KOREAN STATISTICAL SOCIETY, 2019, 48 (02) : 194 - 206
  • [32] DIMENSIONALITY REDUCTION FOR IMAGE CLASSIFICATION VIA MUTUAL INFORMATION MAXIMIZATION
    Huang, Shuai
    Tran, Trac D.
    2016 IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING (ICIP), 2016, : 509 - 513
  • [33] Tensor sufficient dimension reduction
    Zhong, Wenxuan
    Xing, Xin
    Suslick, Kenneth
    WILEY INTERDISCIPLINARY REVIEWS-COMPUTATIONAL STATISTICS, 2015, 7 (03): : 178 - 184
  • [34] Transformed sufficient dimension reduction
    Wang, T.
    Guo, X.
    Zhu, L.
    Xu, P.
    BIOMETRIKA, 2014, 101 (04) : 815 - 829
  • [35] A note on sufficient dimension reduction
    Wen, Xuerong Meggie
    STATISTICS & PROBABILITY LETTERS, 2007, 77 (08) : 817 - 821
  • [36] Advance of the sufficient dimension reduction
    Hang, Weiqiang
    Xia, Yingcun
    WILEY INTERDISCIPLINARY REVIEWS-COMPUTATIONAL STATISTICS, 2021, 13 (03):
  • [37] Sparse sufficient dimension reduction
    Li, Lexin
    BIOMETRIKA, 2007, 94 (03) : 603 - 613
  • [38] Integrating discriminant and descriptive information for dimension reduction and classification
    Yu, Jie
    Tian, Qi
    Rui, Ting
    Huang, Thomas S.
    IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS FOR VIDEO TECHNOLOGY, 2007, 17 (03) : 372 - 377
  • [39] High-dimensional sufficient dimension reduction through principal projections
    Pircalabelu, Eugen
    Artemiou, Andreas
    ELECTRONIC JOURNAL OF STATISTICS, 2022, 16 (01): : 1804 - 1830
  • [40] Principal weighted support vector machines for sufficient dimension reduction in binary classification
    Shin, Seung Jun
    Wu, Yichao
    Zhang, Hao Helen
    Liu, Yufeng
    BIOMETRIKA, 2017, 104 (01) : 67 - 81