Rates of convergence for Laplacian semi-supervised learning with low labeling rates

Cited by: 7
Authors
Calder, Jeff [1 ]
Slepcev, Dejan [2 ]
Thorpe, Matthew [3 ,4 ]
Affiliations
[1] Univ Minnesota, Sch Math, Minneapolis, MN 55455 USA
[2] Carnegie Mellon Univ, Dept Math Sci, Pittsburgh, PA USA
[3] Univ Manchester, Dept Math, Manchester, England
[4] Alan Turing Inst, London NW1 2DB, England
Funding
US National Science Foundation; European Research Council;
Keywords
Semi-supervised learning; Regression; Asymptotic consistency; Gamma-convergence; PDEs on graphs; Non-local variational problems; Random walks on graphs; CONSISTENCY; GRAPH; REGULARIZATION;
DOI
10.1007/s40687-022-00371-x
Chinese Library Classification
O1 [Mathematics];
Discipline Classification Code
0701; 070101;
Abstract
We investigate graph-based Laplacian semi-supervised learning at low labeling rates (ratios of labeled to total number of data points) and establish a threshold for the learning to be well posed. Laplacian learning uses harmonic extension on a graph to propagate labels. It is known that when the number of labeled data points is finite while the number of unlabeled data points tends to infinity, Laplacian learning becomes degenerate and the solutions become roughly constant with a spike at each labeled data point. In this work, we allow the number of labeled data points to grow to infinity as the total number of data points grows. We show that for a random geometric graph with length scale ε > 0, if the labeling rate β ≪ ε², then the solution becomes degenerate and spikes form. On the other hand, if β ≫ ε², then Laplacian learning is well posed and consistent with a continuum Laplace equation. Furthermore, in the well-posed setting we prove quantitative error estimates of O(εβ^{-1/2}), up to logarithmic factors, for the difference between the solutions of the discrete problem and the continuum PDE. We also study p-Laplacian regularization and show the same degeneracy result when β ≪ ε^p. The proofs of our well-posedness results use the random walk interpretation of Laplacian learning and PDE arguments, while the proofs of the ill-posedness results use Γ-convergence tools from the calculus of variations. We also present numerical results on synthetic and real data to illustrate our results.
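The Laplacian learning method described in the abstract (harmonic extension of the labels over the graph) can be sketched in a few lines. The following is a minimal illustration, not the authors' reference implementation: it assumes a Gaussian edge weight with length scale eps, a small two-cluster synthetic dataset, and a handful of revealed labels, then solves the graph Laplace equation on the unlabeled nodes with the labeled values held fixed.

```python
# Minimal sketch of Laplacian semi-supervised learning via harmonic
# extension on a weighted geometric graph. The dataset, weight kernel,
# and labeled-index choices below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: two Gaussian clusters in the plane with labels 0 and 1.
n = 200
X = np.vstack([rng.normal(-1.0, 0.4, (n // 2, 2)),
               rng.normal(+1.0, 0.4, (n // 2, 2))])
y = np.array([0.0] * (n // 2) + [1.0] * (n // 2))

# Graph with length scale eps: Gaussian weights on pairwise distances.
# (Keeping all edges, however small, guarantees a connected graph so the
# linear system below is nonsingular.)
eps = 0.6
D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
W = np.exp(-(D / eps) ** 2)
np.fill_diagonal(W, 0.0)
L = np.diag(W.sum(axis=1)) - W          # unnormalized graph Laplacian

# Low labeling rate: reveal only three labels per class.
labeled = np.array([0, 1, 2, n - 3, n - 2, n - 1])
unlabeled = np.setdiff1d(np.arange(n), labeled)

# Harmonic extension: solve L_uu f_u = -L_ul f_l for the unlabeled
# values f_u, with the labeled values f_l fixed as boundary conditions.
f = np.zeros(n)
f[labeled] = y[labeled]
A = L[np.ix_(unlabeled, unlabeled)]
b = -L[np.ix_(unlabeled, labeled)] @ y[labeled]
f[unlabeled] = np.linalg.solve(A, b)

# Threshold the harmonic function to classify the unlabeled points.
pred = (f > 0.5).astype(float)
acc = (pred[unlabeled] == y[unlabeled]).mean()
print(f"accuracy on unlabeled points: {acc:.3f}")
```

The dense solve is only viable for small n; at realistic scales one would use a sparse ε-truncated graph and an iterative solver. The degeneracy the paper analyzes appears in this setting when the labeled set stays finite while n grows: f then flattens toward a constant with spikes at the labeled points.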
Pages: 42
Related papers
(50 items total)
  • [21] Overcoming the curse of dimensionality with Laplacian regularization in semi-supervised learning
    Cabannes, Vivien
    Pillaud-Vivien, Loucas
    Bach, Francis
    Rudi, Alessandro
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34
  • [22] Properly-Weighted Graph Laplacian for Semi-supervised Learning
    Jeff Calder
    Dejan Slepčev
    Applied Mathematics & Optimization, 2020, 82 : 1111 - 1159
  • [23] Laplacian Welsch Regularization for Robust Semi-supervised Dictionary Learning
    Ke, Jingchen
    Gong, Chen
    Zhao, Lin
    INTELLIGENCE SCIENCE AND BIG DATA ENGINEERING: BIG DATA AND MACHINE LEARNING, PT II, 2019, 11936 : 40 - 52
  • [24] A semi-supervised learning algorithm via adaptive Laplacian graph
    Yuan, Yuan
    Li, Xin
    Wang, Qi
    Nie, Feiping
    NEUROCOMPUTING, 2021, 426 : 162 - 173
  • [25] Semi-supervised learning through adaptive Laplacian graph trimming
    Yue, Zongsheng
    Meng, Deyu
    He, Juan
    Zhang, Gemeng
    IMAGE AND VISION COMPUTING, 2017, 60 : 38 - 47
  • [26] Semi-supervised learning with Deep Laplacian Support Vector Machine
    Chen, Hangyu
    Xie, Xijiong
    Li, Di
    PATTERN ANALYSIS AND APPLICATIONS, 2025, 28 (01)
  • [27] An improved Laplacian semi-supervised regression
    Kraus, Vivien
    Benkabou, Seif-Eddine
    Benabdeslem, Khalid
    Cherqui, Frederic
    2018 IEEE 30TH INTERNATIONAL CONFERENCE ON TOOLS WITH ARTIFICIAL INTELLIGENCE (ICTAI), 2018, : 564 - 570
  • [28] A Semi-Supervised Intelligent Fault Diagnosis Method for Bearings Under Low Labeled Rates
    Ye, Tianyi
    Yuan, Xianfeng
    Yang, Xilin
    Song, Yong
    Zhang, Zhihang
    Zhou, Fengyu
    IEEE TRANSACTIONS ON INSTRUMENTATION AND MEASUREMENT, 2024, 73
  • [29] Instance labeling in semi-supervised learning with meaning values of words
    Altinel, Berna
    Ganiz, Murat Can
    Diri, Banu
    ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE, 2017, 62 : 152 - 163
  • [30] FlexMatch: Boosting Semi-Supervised Learning with Curriculum Pseudo Labeling
    Zhang, Bowen
    Wang, Yidong
    Hou, Wenxin
    Wu, Hao
    Wang, Jindong
    Okumura, Manabu
    Shinozaki, Takahiro
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34