High-dimensional sparse single-index regression via Hilbert-Schmidt independence criterion

Cited by: 0
|
Authors
Chen, Xin [1 ]
Deng, Chang [2 ]
He, Shuaida [1 ]
Wu, Runxiong [3 ]
Zhang, Jia [4 ]
Affiliations
[1] Southern Univ Sci & Technol, Dept Stat & Data Sci, Shenzhen, Peoples R China
[2] Univ Chicago, Booth Sch Business, Chicago, IL USA
[3] Univ Calif Davis, Coll Engn, Davis, CA USA
[4] Southwestern Univ Finance & Econ, Joint Lab Data Sci & Business Intelligence, Chengdu, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Hilbert-Schmidt independence criterion; Single-index models; Large p small n; Majorization-minimization; Sufficient dimension reduction; Variable selection; SLICED INVERSE REGRESSION; ALTERNATING DIRECTION METHOD; SUFFICIENT DIMENSION; ADAPTIVE ESTIMATION; CENTRAL SUBSPACE; REDUCTION; MULTIPLIERS; RATES;
DOI
10.1007/s11222-024-10399-4
CLC Number
TP301 [Theory and Methods];
Discipline Code
081202;
Abstract
The Hilbert-Schmidt Independence Criterion (HSIC) has recently been introduced to the field of single-index models to estimate the index directions. Compared with other well-established methods, the HSIC-based method requires relatively weak conditions. However, its performance has not yet been studied in the prevalent high-dimensional scenarios, where the number of covariates can be much larger than the sample size. In this article, based on HSIC, we propose to estimate the possibly sparse directions in high-dimensional single-index models through a parameter reformulation. Our approach estimates the subspace spanned by the direction directly and performs variable selection simultaneously. Due to the non-convexity of the objective function and the complexity of the constraints, a majorize-minimize algorithm, combined with the linearized alternating direction method of multipliers, is developed to solve the optimization problem. Since it does not involve the inverse of the covariance matrix, the algorithm naturally handles large-p-small-n scenarios. Through extensive simulation studies and a real data analysis, we show that our proposal is efficient and effective in high-dimensional settings. The Matlab codes for this method are available online.
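As a rough illustration of the dependence measure the abstract builds on (this is not the authors' Matlab implementation; the Gaussian kernel, the bandwidth, and the function names are illustrative assumptions), the standard biased empirical HSIC estimate between a projected covariate and the response can be sketched as:

```python
import numpy as np

def gaussian_gram(z, sigma=1.0):
    """Gaussian-kernel Gram matrix for an (n, d) data matrix z."""
    sq = np.sum(z ** 2, axis=1, keepdims=True)
    d2 = sq + sq.T - 2.0 * z @ z.T  # pairwise squared distances
    return np.exp(-d2 / (2.0 * sigma ** 2))

def hsic(x, y, sigma=1.0):
    """Biased empirical HSIC: (1/n^2) tr(K H L H), H = I - (1/n) 11'."""
    n = x.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n  # centering matrix
    K = gaussian_gram(x, sigma)
    L = gaussian_gram(y, sigma)
    return np.trace(K @ H @ L @ H) / n ** 2
```

In a single-index model y = f(X beta) + error, a candidate direction beta can be scored by `hsic(X @ beta, y)`: larger values indicate stronger dependence between the projection and the response, which is the quantity the paper's sparse estimator maximizes subject to its constraints.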
Pages: 13
Related Papers
(50 in total)
  • [1] High-dimensional sparse single-index regression via Hilbert-Schmidt independence criterion
    Xin Chen
    Chang Deng
    Shuaida He
    Runxiong Wu
    Jia Zhang
    Statistics and Computing, 2024, 34
  • [2] DIRECTION ESTIMATION IN SINGLE-INDEX REGRESSIONS VIA HILBERT-SCHMIDT INDEPENDENCE CRITERION
    Zhang, Nan
    Yin, Xiangrong
    STATISTICA SINICA, 2015, 25 (02) : 743 - 758
  • [3] Sparse Hilbert-Schmidt Independence Criterion Regression
    Poignard, Benjamin
    Yamada, Makoto
    INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 108, 2020, 108 : 538 - 547
  • [4] A Regression Perspective on Generalized Distance Covariance and the Hilbert-Schmidt Independence Criterion
    Edelmann, Dominic
    Goeman, Jelle
    STATISTICAL SCIENCE, 2022, 37 (04) : 562 - 579
  • [5] Robust Learning with the Hilbert-Schmidt Independence Criterion
    Greenfeld, Daniel
    Shalit, Uri
    25TH AMERICAS CONFERENCE ON INFORMATION SYSTEMS (AMCIS 2019), 2019,
  • [6] Sequence Alignment with the Hilbert-Schmidt Independence Criterion
    Campbell, Jordan
    Lewis, J. P.
    Seol, Yeongho
    PROCEEDINGS CVMP 2018: THE 15TH ACM SIGGRAPH EUROPEAN CONFERENCE ON VISUAL MEDIA PRODUCTION, 2018,
  • [7] Test of conditional independence in factor models via Hilbert-Schmidt independence criterion
    Xu, Kai
    Cheng, Qing
    JOURNAL OF MULTIVARIATE ANALYSIS, 2024, 199
  • [8] Sensitivity maps of the Hilbert-Schmidt independence criterion
    Perez-Suay, Adrian
    Camps-Valls, Gustau
    APPLIED SOFT COMPUTING, 2018, 70 : 1054 - 1063
  • [9] Nyström M-Hilbert-Schmidt Independence Criterion
    Kalinke, Florian
    Szabo, Zoltan
    UNCERTAINTY IN ARTIFICIAL INTELLIGENCE, 2023, 216 : 1005 - 1015
  • [10] Kernel Learning with Hilbert-Schmidt Independence Criterion
    Wang, Tinghua
    Li, Wei
    He, Xianwen
    PATTERN RECOGNITION (CCPR 2016), PT I, 2016, 662 : 720 - 730