Sparse Regression in Cancer Genomics: Comparing Variable Selection and Predictions in Real World Data

Cited: 2
Authors
O'Shea, Robert J. [1 ]
Tsoka, Sophia [2 ]
Cook, Gary J. R. [1 ,3 ,4 ]
Goh, Vicky [1 ,5 ]
Affiliations
[1] Kings Coll London, Sch Biomed Engn & Imaging Sci, Dept Canc Imaging, 5th Floor,Becket House,1 Lambeth Palace Rd, London SE1 7EU, England
[2] Kings Coll London, Sch Nat & Math Sci, Dept Informat, London, England
[3] Kings Coll London, London, England
[4] St Thomas Hosp, Guys & St Thomas PET Ctr, London, England
[5] Guys & St Thomas NHS Fdn Trust, Dept Radiol, London, England
Funding
UK Engineering and Physical Sciences Research Council;
Keywords
Artificial intelligence; gene regulatory networks; models; statistical; computational biology; genomics; GENE-EXPRESSION OMNIBUS; MODEL SELECTION; LASSO; SUBSET; REGULARIZATION; OPTIMIZATION;
DOI
10.1177/11769351211056298
CLC Classification
R73 [Oncology];
Subject Classification
100214;
Abstract
BACKGROUND: Evaluation of gene interaction models in cancer genomics is challenging, as the true distribution is uncertain. Previous analyses have benchmarked models using synthetic data or databases of experimentally verified interactions - approaches which are susceptible to misrepresentation and incompleteness, respectively. The objectives of this analysis are to (1) provide a real-world data-driven approach for comparing the performance of genomic model inference algorithms, (2) compare the performance of LASSO, elastic net, best-subset selection, L0L1 penalisation and L0L2 penalisation in real genomic data and (3) compare algorithmic preselection according to performance in our benchmark datasets to algorithmic selection by internal cross-validation. METHODS: Five large (n ≈ 4000) genomic datasets were extracted from Gene Expression Omnibus. 'Gold-standard' regression models were trained on subspaces of these datasets (n ≈ 4000, p = 500). Penalised regression models were trained on small samples from these subspaces (n ∈ {25, 75, 150}, p = 500) and validated against the gold-standard models. Variable selection performance and out-of-sample prediction were assessed. Penalty 'preselection' according to test performance in the other 4 datasets was compared to selection by internal cross-validation error minimisation. RESULTS: L1L2 penalisation achieved the highest cosine similarity between estimated coefficients and those of gold-standard models. L0L2-penalised models explained the greatest proportion of variance in test responses, though performance was unreliable in low signal:noise conditions. L0L2 also attained the highest overall median variable selection F1 score. Penalty preselection significantly outperformed selection by internal cross-validation on each of 3 examined metrics. CONCLUSIONS: This analysis explores a novel approach for comparing model selection approaches in real genomic data from 5 cancers.
Our benchmarking datasets have been made publicly available for use in future research. Our findings support the use of L0L2 penalisation for structural selection and L1L2 penalisation for coefficient recovery in genomic data. Evaluation of learning algorithms according to observed test performance in external genomic datasets yields valuable insights into actual test performance, providing a data-driven complement to internal cross-validation in genomic regression tasks.
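The abstract's evaluation loop - fit a penalised model on a small subsample, then score it on coefficient recovery (cosine similarity) and variable selection (F1 on the estimated support) - can be sketched as below. This is an illustrative sketch only: synthetic data stands in for the GEO-derived subspaces, and scikit-learn's `ElasticNetCV` stands in for the paper's penalty family (the L0-type penalties evaluated in the study are not available in scikit-learn and would require a dedicated package).

```python
import numpy as np
from numpy.random import default_rng
from sklearn.linear_model import ElasticNetCV
from sklearn.metrics import f1_score

rng = default_rng(0)

# Synthetic stand-in for one benchmark subspace (the paper uses p = 500
# and n ≈ 4000 for the gold-standard fit; smaller here for brevity)
n_gold, n_small, p = 2000, 75, 200
beta = np.zeros(p)
beta[:10] = rng.normal(0.0, 2.0, 10)        # 10 truly active predictors
X = rng.normal(size=(n_gold, p))
y = X @ beta + rng.normal(0.0, 1.0, n_gold)

# Fit a penalised model on a small subsample, mirroring the study design
idx = rng.choice(n_gold, n_small, replace=False)
model = ElasticNetCV(cv=5).fit(X[idx], y[idx])

# Coefficient recovery: cosine similarity against the generating coefficients
cos = beta @ model.coef_ / (np.linalg.norm(beta) * np.linalg.norm(model.coef_))

# Variable selection: F1 of the estimated support vs the true support
f1 = f1_score(beta != 0, model.coef_ != 0)
print(f"cosine similarity: {cos:.3f}, selection F1: {f1:.3f}")
```

In the study itself, `beta` is replaced by the coefficients of a gold-standard model trained on the full subspace rather than known generating values, which is what makes the benchmark a real-data complement to synthetic simulations.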
Pages: 15