Distributed optimization for penalized regression in massive compositional data

Times Cited: 0
Authors
Chao, Yue [1 ,2 ]
Huang, Lei [3 ]
Ma, Xuejun [1 ]
Affiliations
[1] Soochow Univ, Sch Math Sci, Dept Stat, Suzhou, Peoples R China
[2] Xiamen Univ, MOE Key Lab Econometr, WISE, Xiamen, Peoples R China
[3] Southwest Jiaotong Univ, Sch Math, Dept Stat, Chengdu, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Massive compositional data; Distributed optimization; Augmented Lagrangian; Coordinate-wise descent; Variable selection; Medical insurance; QUANTILE REGRESSION; VARIABLE SELECTION; LIKELIHOOD; ALGORITHMS;
DOI
10.1016/j.apm.2025.115950
Chinese Library Classification
T [Industrial Technology]
Subject Classification Code
08
Abstract
Compositional data have been widely used in various fields to analyze parts of a whole, providing insights into proportional relationships. With the increasing availability of extraordinarily large compositional datasets, addressing the challenges of distributed statistical methodologies and computations has become essential in the era of big data. This paper focuses on the optimization methodology and practical application of the distributed sparse penalized linear log-contrast model for massive compositional data, specifically in the context of medical insurance reimbursement ratio prediction. We propose two distributed optimization techniques tailored for centralized and decentralized topologies to effectively tackle the constrained convex optimization problems that arise in this application. Our algorithms are rooted in the frameworks of the alternating direction method of multipliers and the coordinate descent method of multipliers, making them well suited to distributed data scenarios. Notably, in the decentralized topology, we introduce a distributed coordinate-wise descent algorithm that employs a group alternating direction method of multipliers to achieve efficient distributed regularized estimation. We rigorously present a convergence analysis for our decentralized algorithm, ensuring its reliability for practical applications. Through numerical experiments on both simulated datasets and a real-world medical insurance dataset, we evaluate the performance of our proposed algorithms.
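The abstract concerns a sparse penalized linear log-contrast regression, in which the coefficients of the log-transformed compositional covariates must satisfy a zero-sum constraint. As a rough illustration of the kind of constrained, L1-penalized problem involved, the sketch below solves such a model on a single machine with a plain ADMM splitting. It is only an assumed toy formulation (the function name sparse_log_contrast_admm and all parameter choices are hypothetical), not the paper's centralized or decentralized distributed algorithms.

```python
import numpy as np

def soft_threshold(v, t):
    # Elementwise soft-thresholding: the proximal operator of the L1 norm.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def sparse_log_contrast_admm(X, y, lam, rho=1.0, n_iter=200):
    """Toy single-machine ADMM for the sparse linear log-contrast model:
        min_beta (1/2n)||y - Z beta||^2 + lam * ||beta||_1   s.t.  1' beta = 0,
    where Z = log(X) and the rows of X are compositional (sum to one).
    Illustrative sketch only, not the paper's distributed estimators.
    """
    Z = np.log(X)
    n, p = Z.shape
    ones = np.ones(p)
    beta = np.zeros(p)    # primal variable carrying the zero-sum constraint
    gamma = np.zeros(p)   # copy of beta carrying the L1 penalty
    u = np.zeros(p)       # scaled dual variable for the consensus beta = gamma
    # Ridge-like system matrix for the beta-update, inverted once.
    A_inv = np.linalg.inv(Z.T @ Z / n + rho * np.eye(p))
    for _ in range(n_iter):
        # beta-update: quadratic subproblem with the linear constraint
        # 1' beta = 0, solved in closed form via a KKT correction.
        b = Z.T @ y / n + rho * (gamma - u)
        beta_unc = A_inv @ b
        beta = beta_unc - A_inv @ ones * (ones @ beta_unc) / (ones @ A_inv @ ones)
        # gamma-update: soft-thresholding enforces sparsity.
        gamma = soft_threshold(beta + u, lam / rho)
        # dual ascent step on the consensus constraint.
        u = u + beta - gamma
    # gamma is exactly sparse; beta satisfies the zero-sum constraint exactly.
    return gamma
```

In the paper's setting the data would additionally be partitioned across machines, with the coefficient and dual updates coordinated through consensus (group ADMM) steps over a centralized or decentralized network rather than solved on a single node.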
Pages: 23
Related Papers
50 records in total
  • [1] Distributed Penalized Modal Regression for Massive Data
    Jin, Jun
    Liu, Shuangzhe
    Ma, Tiefeng
    JOURNAL OF SYSTEMS SCIENCE & COMPLEXITY, 2023, 36 (02) : 798 - 821
  • [2] Distributed quantile regression for massive heterogeneous data
    Hu, Aijun
    Jiao, Yuling
    Liu, Yanyan
    Shi, Yueyong
    Wu, Yuanshan
    NEUROCOMPUTING, 2021, 448 : 249 - 262
  • [3] Robust distributed modal regression for massive data
    Wang, Kangning
    Li, Shaomin
    COMPUTATIONAL STATISTICS & DATA ANALYSIS, 2021, 160
  • [4] Communication-Efficient Modeling with Penalized Quantile Regression for Distributed Data
    Hu, Aijun
    Li, Chujin
    Wu, Jing
    COMPLEXITY, 2021, 2021
  • [5] Distributed optimization and statistical learning for large-scale penalized expectile regression
    Pan, Yingli
    JOURNAL OF THE KOREAN STATISTICAL SOCIETY, 2021, 50 (01) : 290 - 314
  • [6] Adaptive distributed support vector regression of massive data
    Liang, Shu-na
    Sun, Fei
    Zhang, Qi
    COMMUNICATIONS IN STATISTICS-THEORY AND METHODS, 2024, 53 (09) : 3365 - 3382