Group analysis of fMRI data using L1 and L2 regularization

Cited: 0
Authors
Overholser, Rosanna
Xu, Ronghui
Affiliation
[1] 9500 Gilman Drive, MC 0112, La Jolla, CA 92093-0112
Keywords
Correlated curves; Correlated errors; Functional linear model; Penalized splines; Semiparametric mixed-effects model; Voxel level analysis; MIXED-EFFECTS MODELS; NONCONCAVE PENALIZED LIKELIHOOD; VARIABLE SELECTION; KNOT SELECTION; REGRESSION SPLINES; LINEAR-MODELS; ASYMPTOTICS; SHRINKAGE
DOI
10.4310/SII.2015.v8.n3.a11
Chinese Library Classification
Q [Biological Sciences]
Subject classification codes
07; 0710; 09
Abstract
In clinical studies using functional magnetic resonance imaging (fMRI), it is of interest to compare multiple subjects from different groups. We investigate the analysis of such data using random effects and nonparametric estimation of mean activation curves. The random effects modeling replaces the existing approach in the fMRI literature, where each curve is 'normalized' by a percent change. For the mean curves we consider smoothing via splines using L1 or L2 regularization. Our general framework allows analysis of fMRI curves that are correlated, and with correlated within-curve errors. We describe a unified algorithm that uses existing software to carry out the estimation. The different regularization approaches are compared using simulation. We apply the method to an fMRI study of the effects of caffeine on the motor cortex of the brain, and discuss the limitations of currently available computing resources for carrying out such analysis on very large data sets.
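The spline smoothing contrast described in the abstract can be sketched in a few lines. The following is a minimal illustration only, not the paper's semiparametric mixed-effects estimator: it fits a single simulated activation curve with a truncated-power spline basis, using Lasso (L1) versus Ridge (L2) penalties so that regularization, rather than manual knot selection, controls smoothness. The simulated signal, knot grid, and penalty strengths are all hypothetical choices for the sketch.

```python
import numpy as np
from sklearn.linear_model import Lasso, Ridge

rng = np.random.default_rng(0)

# Simulated activation curve: smooth signal plus noise,
# standing in for one voxel's fMRI time course.
t = np.linspace(0.0, 1.0, 200)
y = np.sin(2 * np.pi * t) * np.exp(-t) + rng.normal(0.0, 0.1, t.size)

# Truncated-power cubic spline basis with many interior knots;
# the penalty does the effective knot selection.
knots = np.linspace(0.0, 1.0, 30)[1:-1]
X = np.column_stack(
    [t, t**2, t**3] + [np.clip(t - k, 0.0, None) ** 3 for k in knots]
)

# L1 penalty: sparse spline coefficients (adaptive knots).
fit_l1 = Lasso(alpha=1e-4, max_iter=50_000).fit(X, y)
# L2 penalty: shrinkage of all coefficients (classic penalized spline).
fit_l2 = Ridge(alpha=1e-2).fit(X, y)

rmse_l1 = np.sqrt(np.mean((fit_l1.predict(X) - y) ** 2))
rmse_l2 = np.sqrt(np.mean((fit_l2.predict(X) - y) ** 2))
print(f"L1 fit RMSE: {rmse_l1:.3f}, nonzero coefs: {np.sum(fit_l1.coef_ != 0)}")
print(f"L2 fit RMSE: {rmse_l2:.3f}")
```

The qualitative behavior mirrors the comparison in the paper: the L1 fit drops many basis coefficients to exactly zero, while the L2 fit keeps all of them but shrinks their magnitudes.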
Pages: 379-390
Page count: 12