High-dimensional sparse single-index regression via Hilbert-Schmidt independence criterion

Cited: 0
Authors
Chen, Xin [1 ]
Deng, Chang [2 ]
He, Shuaida [1 ]
Wu, Runxiong [3 ]
Zhang, Jia [4 ]
Affiliations
[1] Southern Univ Sci & Technol, Dept Stat & Data Sci, Shenzhen, Peoples R China
[2] Univ Chicago, Booth Sch Business, Chicago, IL USA
[3] Univ Calif Davis, Coll Engn, Davis, CA USA
[4] Southwestern Univ Finance & Econ, Joint Lab Data Sci & Business Intelligence, Chengdu, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Hilbert-Schmidt independence criterion; Single-index models; Large p small n; Majorization-minimization; Sufficient dimension reduction; Variable selection; SLICED INVERSE REGRESSION; ALTERNATING DIRECTION METHOD; SUFFICIENT DIMENSION; ADAPTIVE ESTIMATION; CENTRAL SUBSPACE; REDUCTION; MULTIPLIERS; RATES;
DOI
10.1007/s11222-024-10399-4
CLC number
TP301 [Theory, Methods];
Subject classification code
081202;
Abstract
The Hilbert-Schmidt Independence Criterion (HSIC) has recently been introduced to the field of single-index models to estimate the directions. Compared with other well-established methods, the HSIC-based method requires relatively weak conditions. However, its performance has not yet been studied in the prevalent high-dimensional scenarios, where the number of covariates can be much larger than the sample size. In this article, building on HSIC, we propose to estimate the possibly sparse directions in high-dimensional single-index models through a parameter reformulation. Our approach estimates the subspace of the direction directly and performs variable selection simultaneously. Because of the non-convexity of the objective function and the complexity of the constraints, a majorization-minimization algorithm combined with the linearized alternating direction method of multipliers is developed to solve the optimization problem. Since the algorithm does not involve the inverse of the covariance matrix, it can naturally handle large-p-small-n scenarios. Through extensive simulation studies and a real data analysis, we show that our proposal is efficient and effective in high-dimensional settings. The Matlab codes for this method are available online.
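The empirical HSIC statistic that underlies this line of work can be sketched in a few lines. The following is a minimal illustration only, not the authors' implementation: the Gaussian kernel, the bandwidth, and the toy single-index model are all assumptions made for the example.

```python
import numpy as np

def gaussian_kernel(z, bandwidth=1.0):
    # Pairwise squared distances of a 1-D sample, then the Gaussian kernel matrix
    d2 = (z[:, None] - z[None, :]) ** 2
    return np.exp(-d2 / (2 * bandwidth ** 2))

def empirical_hsic(u, y, bandwidth=1.0):
    # Biased empirical HSIC estimate: trace(K H L H) / n^2,
    # where H = I - (1/n) 11^T is the centering matrix
    n = len(u)
    K = gaussian_kernel(u, bandwidth)
    L = gaussian_kernel(y, bandwidth)
    H = np.eye(n) - np.ones((n, n)) / n
    return np.trace(K @ H @ L @ H) / n ** 2

# Toy single-index model: y depends on X only through the sparse direction beta
rng = np.random.default_rng(0)
n, p = 200, 10
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[0] = 1.0
y = np.sin(X @ beta) + 0.1 * rng.standard_normal(n)

# HSIC along the true projection vs. after permuting y (dependence destroyed)
hsic_true = empirical_hsic(X @ beta, y)
hsic_perm = empirical_hsic(X @ beta, rng.permutation(y))
print(hsic_true, hsic_perm)
```

Maximizing such a statistic over sparse directions is non-convex, which is why the paper resorts to majorization-minimization with a linearized ADMM rather than direct optimization.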
Pages: 13