Functional sufficient dimension reduction through information maximization with application to classification

Cited by: 0
Authors
Li, Xinyu [1]
Xu, Jianjun [2]
Cheng, Haoyang [3]
Affiliations
[1] Univ Sci & Technol China, Int Inst Finance, Sch Management, Hefei, Anhui, Peoples R China
[2] Hefei Univ Technol, Sch Math, Hefei 230601, Anhui, Peoples R China
[3] Quzhou Univ, Coll Elect & Informat Engn, Quzhou, Zhejiang, Peoples R China
Funding
China Postdoctoral Science Foundation;
Keywords
Functional data classification; functional sufficient dimension reduction; mutual information; square loss mutual information; density ratio; REGRESSION; RECOGNITION; DIVERGENCE; PREDICTION; MORTALITY; COHORT; RATES;
DOI
10.1080/02664763.2024.2335570
Chinese Library Classification (CLC)
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics];
Discipline Classification Codes
020208 ; 070103 ; 0714 ;
Abstract
Considering the case where the response variable is categorical and the predictor is a random function, two novel functional sufficient dimension reduction (FSDR) methods are proposed based on mutual information and square loss mutual information. Compared to classical FSDR methods, such as functional sliced inverse regression and functional sliced average variance estimation, the proposed methods are appealing because they can estimate multiple effective dimension reduction directions even when the number of categories is small, especially for a binary response. Moreover, the proposed methods require neither the restrictive linear conditional mean assumption nor the constant covariance assumption, and they avoid the inverse problem of the covariance operator that is often encountered in functional sufficient dimension reduction. Functional principal component analysis with truncation is used as a regularization mechanism. Under some mild conditions, the statistical consistency of the proposed methods is established. Simulation studies and real data analyses are used to evaluate the finite-sample properties of the methods.
Pages: 3059-3101
Number of pages: 43
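As a complementary illustration of the objective described in the abstract, the following minimal Python sketch (not the authors' implementation) searches for one effective dimension reduction direction in a truncated principal component score space by maximizing a plug-in estimate of the square loss mutual information between the projected score and a categorical response. Ordinary PCA on discretized curves stands in for FPCA with truncation, a kernel-density plug-in replaces the density-ratio estimation developed in the paper, and the direction search is a crude random search; the function names (smi_plugin, fsdr_smi_direction) and the toy data are hypothetical.

```python
# Minimal sketch, assuming discretized functional predictors and a discrete label.
# Not the paper's estimator: PCA approximates FPCA with truncation, and the
# square loss mutual information (SMI) is estimated by a simple KDE plug-in.
import numpy as np
from numpy.random import default_rng
from scipy.stats import gaussian_kde
from sklearn.decomposition import PCA


def smi_plugin(u, y):
    """Plug-in estimate of SMI(U, Y) = 0.5 * E[p(u, y) / (p(u) p(y))] - 0.5
    for a 1-D projection u and discrete labels y, using Gaussian KDEs."""
    u = np.asarray(u, dtype=float)
    p_marginal = gaussian_kde(u)(u)              # estimate of p(u_i)
    ratio = np.empty_like(u)
    for label in np.unique(y):
        mask = (y == label)
        p_cond = gaussian_kde(u[mask])(u[mask])  # estimate of p(u_i | y_i)
        ratio[mask] = p_cond / p_marginal[mask]  # density ratio p(u,y)/(p(u)p(y))
    return 0.5 * ratio.mean() - 0.5


def fsdr_smi_direction(curves, y, n_components=5, n_candidates=2000, seed=0):
    """Return one dimension reduction direction in the truncated score space.
    `curves` is (n_samples, n_gridpoints); PCA on the grid stands in for FPCA.
    The paper estimates several directions by proper optimization; here a
    random search over unit vectors keeps the sketch short."""
    scores = PCA(n_components=n_components).fit_transform(curves)
    rng = default_rng(seed)
    candidates = rng.standard_normal((n_candidates, n_components))
    candidates /= np.linalg.norm(candidates, axis=1, keepdims=True)
    smi_values = [smi_plugin(scores @ b, y) for b in candidates]
    return candidates[int(np.argmax(smi_values))], max(smi_values)


if __name__ == "__main__":
    # Toy binary-response example: class-1 curves carry an extra cosine bump.
    rng = default_rng(1)
    t = np.linspace(0.0, 1.0, 50)
    y = rng.integers(0, 2, size=200)
    curves = (rng.standard_normal((200, 1)) * np.sin(np.pi * t)
              + y[:, None] * np.cos(2 * np.pi * t)
              + 0.1 * rng.standard_normal((200, 50)))
    b_hat, smi_hat = fsdr_smi_direction(curves, y)
    print("estimated direction:", np.round(b_hat, 3), "SMI:", round(smi_hat, 3))
```

A faithful implementation would instead estimate the density ratio directly (as the paper's square loss mutual information criterion does) and optimize all directions jointly rather than by random search.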