Uniform convergence rates for Lipschitz learning on graphs

Cited by: 8
Authors
Bungert, Leon [1 ]
Calder, Jeff [2 ]
Roith, Tim [3 ]
Affiliations
[1] Univ Bonn, Hausdorff Ctr Math, Endenicher Allee 62, D-53115 Bonn, Germany
[2] Univ Minnesota, Sch Math, 127 Vincent Hall,206 Church St SE, Minneapolis, MN 55455 USA
[3] Univ Erlangen Nurnberg, Dept Math, Cauerstr 11, D-91058 Erlangen, Germany
Funding
US National Science Foundation (NSF);
Keywords
Lipschitz learning; graph-based semisupervised learning; continuum limit; absolutely minimizing Lipschitz extensions; infinity Laplacian; TUG-OF-WAR; P-LAPLACIAN; INFINITY-LAPLACIAN; CONSISTENCY; EXTENSIONS;
DOI
10.1093/imanum/drac048
Chinese Library Classification
O29 [Applied Mathematics];
Subject Classification Code
070104;
Abstract
Lipschitz learning is a graph-based semisupervised learning method in which one extends labels from a labeled to an unlabeled data set by solving the infinity Laplace equation on a weighted graph. In this work we prove uniform convergence rates for solutions of the graph infinity Laplace equation as the number of vertices grows to infinity. Their continuum limits are absolutely minimizing Lipschitz extensions (AMLEs) with respect to the geodesic metric of the domain from which the graph vertices are sampled. We work under very general assumptions on the graph weights, the set of labeled vertices and the continuum domain. Our main contribution is that we obtain quantitative convergence rates even for very sparsely connected graphs, as typically arise in applications such as semisupervised learning. In particular, our framework allows for graph bandwidths down to the connectivity radius. To prove this, we first show a quantitative convergence statement for graph distance functions to geodesic distance functions in the continuum. Using the 'comparison with distance functions' principle, we can pass these convergence statements to infinity harmonic functions and AMLEs.
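The label-extension step described in the abstract can be sketched in a few lines. The following is a minimal illustration, not the authors' code: on an unweighted graph, the discrete infinity-harmonic extension satisfies the midpoint property u(x) = (min neighbor value + max neighbor value) / 2 at every unlabeled vertex, and a simple fixed-point (Gauss-Seidel) sweep converges to it. The adjacency list, labels and iteration count here are made-up toy choices.

```python
def lipschitz_learn(adj, labels, iters=500):
    """Graph infinity-harmonic label extension (unweighted graph).

    adj    : dict mapping each vertex to a list of its neighbors
    labels : dict mapping labeled vertices to fixed real values
    iters  : number of in-place (Gauss-Seidel) sweeps
    """
    # Initialize: labeled vertices keep their value, others start at 0.
    u = {v: labels.get(v, 0.0) for v in adj}
    for _ in range(iters):
        for v in adj:
            if v not in labels:
                vals = [u[w] for w in adj[v]]
                # Discrete infinity-Laplace fixed point:
                # midpoint of the extreme neighbor values.
                u[v] = 0.5 * (min(vals) + max(vals))
    return u

# Path graph 0-1-2-3-4 labeled at the endpoints: the infinity-harmonic
# extension is the linear interpolant along the graph geodesic.
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
u = lipschitz_learn(adj, {0: 0.0, 4: 1.0})
```

On this toy path graph the result interpolates linearly between the two labels, which matches the paper's continuum picture: the limit AMLE is governed by geodesic distances to the labeled set.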
Pages: 2445-2495 (51 pages)
Related Papers (50 total)
  • [1] Uniform Convergence with Square-Root Lipschitz Loss
    Zhou, Lijia
    Dai, Zhen
    Koehler, Frederic
    Srebro, Nathan
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023,
  • [2] Continuum Limit of Lipschitz Learning on Graphs
    Roith, Tim
    Bungert, Leon
    [J]. FOUNDATIONS OF COMPUTATIONAL MATHEMATICS, 2023, 23 (02) : 393 - 431
  • [4] RATES OF UNIFORM CONVERGENCE FOR RIEMANN INTEGRALS
    Alewine, J. Alan
    [J]. MISSOURI JOURNAL OF MATHEMATICAL SCIENCES, 2014, 26 (01) : 48 - 56
  • [5] Uniform convergence rates for halfspace depth
    Burr, Michael A.
    Fabrizio, Robert J.
    [J]. STATISTICS & PROBABILITY LETTERS, 2017, 124 : 33 - 40
  • [6] Boundedness and convergence for singular integrals of measures separated by Lipschitz graphs
    Chousionis, Vasilis
    Mattila, Pertti
    [J]. BULLETIN OF THE LONDON MATHEMATICAL SOCIETY, 2010, 42 : 109 - 118
  • [7] A note on "Convergence rates and asymptotic normality for series estimators": uniform convergence rates
    de Jong, RM
    [J]. JOURNAL OF ECONOMETRICS, 2002, 111 (01) : 1 - 9
  • [8] Lipschitz Adaptivity with Multiple Learning Rates in Online Learning
    Mhammedi, Zakaria
    Koolen, Wouter M.
    van Erven, Tim
    [J]. CONFERENCE ON LEARNING THEORY, VOL 99, 2019, 99
  • [9] LipGene: Lipschitz Continuity Guided Adaptive Learning Rates for Fast Convergence on Microarray Expression Data Sets
    Prashanth, Tejas
    Saha, Snehanshu
    Basarkod, Sumedh
    Aralihalli, Suraj
    Dhavala, Soma S.
    Saha, Sriparna
    Aduri, Raviprasad
    [J]. IEEE-ACM TRANSACTIONS ON COMPUTATIONAL BIOLOGY AND BIOINFORMATICS, 2022, 19 (06) : 3553 - 3563
  • [10] Function spaces in Lipschitz domains and optimal rates of convergence for sampling
    Novak, E
    Triebel, H
    [J]. CONSTRUCTIVE APPROXIMATION, 2006, 23 (03) : 325 - 350