Estimates on learning rates for multi-penalty distribution regression

Cited by: 0
Authors
Yu, Zhan [1 ]
Ho, Daniel W.C. [2 ]
Affiliations
[1] Department of Mathematics, Hong Kong Baptist University, 224 Waterloo Road, Kowloon Tong, Hong Kong
[2] Department of Mathematics, City University of Hong Kong, 83 Tat Chee Avenue, Kowloon Tong, Hong Kong
Keywords
Regression analysis;
DOI: not available
Abstract
This paper is concerned with functional learning via two-stage sampled distribution regression. We study a multi-penalty regularization algorithm for distribution regression in the framework of learning theory. The algorithm regresses to real-valued outputs from probability measures. The theoretical analysis of distribution regression is far from mature and quite challenging, since only second-stage samples are observable in practical settings. In our algorithm, to extract information from the distribution samples, we embed the distributions into a reproducing kernel Hilbert space H_K associated with a Mercer kernel K via the mean embedding technique. One of the primary contributions of this work is the introduction of a novel multi-penalty regularization algorithm, which is able to capture more potential features of distribution regression. Optimal learning rates for the algorithm are obtained under mild conditions. The work also derives learning rates for distribution regression in the hard learning scenario f_ρ ∉ H_K, which has not been explored in the existing literature. Moreover, we propose a new distribution-regression-based distributed learning algorithm to meet the large-scale data and information challenges arising from distribution data, and derive optimal learning rates for it as well. By providing new algorithms and establishing their learning rates, the work improves on the existing literature in several respects. © 2023 Elsevier Inc.
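As context for the two-stage setup described in the abstract, the sketch below shows the standard single-penalty baseline: each distribution is observed only through a bag of second-stage samples, the inner product of empirical mean embeddings is estimated by averaging a Gaussian kernel over sample pairs, and a kernel ridge regressor is fit on the resulting Gram matrix. This is an illustrative toy, not the paper's multi-penalty or distributed estimator; all names, the bandwidth, and the synthetic data are assumptions.

```python
import numpy as np

def rbf(x, y, gamma=1.0):
    # Gaussian kernel matrix between two point sets of shape (n, d) and (m, d).
    d2 = ((x[:, None, :] - y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def mean_embedding_gram(bags_a, bags_b, gamma=1.0):
    # Entry (i, j) estimates the RKHS inner product <mu_{D_i}, mu_{D_j}>
    # by averaging k(x, y) over all pairs of second-stage samples.
    G = np.empty((len(bags_a), len(bags_b)))
    for i, a in enumerate(bags_a):
        for j, b in enumerate(bags_b):
            G[i, j] = rbf(a, b, gamma).mean()
    return G

# Toy first-stage data: each "distribution" is a Gaussian, seen only
# through a bag of 30 samples; the target is its (noisy) mean.
rng = np.random.default_rng(0)
means = rng.uniform(-2, 2, size=20)
bags = [rng.normal(m, 0.5, size=(30, 1)) for m in means]
y = means + 0.05 * rng.standard_normal(20)

# Single-penalty kernel ridge regression on the embedding Gram matrix.
G = mean_embedding_gram(bags, bags)
lam = 1e-3
alpha = np.linalg.solve(G + lam * len(bags) * np.eye(len(bags)), y)

# Predict for two new, unseen distributions.
test_means = np.array([-1.0, 0.5])
test_bags = [rng.normal(m, 0.5, size=(30, 1)) for m in test_means]
pred = mean_embedding_gram(test_bags, bags) @ alpha
```

A multi-penalty variant in the spirit of the paper would add further regularization terms (e.g. penalizing a derivative or a second norm of the estimator) to the single `lam * n * I` term above.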