Optimal Bounds on Approximation of Submodular and XOS Functions by Juntas

Cited by: 0
Authors
Feldman, Vitaly [1 ]
Vondrak, Jan [1 ]
Affiliations
[1] IBM Res Almaden, San Jose, CA 95120 USA
Keywords
submodular; fractionally-subadditive; approximation; junta; PAC learning; testing; INEQUALITY; TREES; DNF;
DOI
Not available
CLC classification
TP301 [Theory, Methods];
Subject classification
081202 ;
Abstract
We investigate the approximability of several classes of real-valued functions by functions of a small number of variables (juntas). Our main results are tight bounds on the number of variables required to approximate a function f : {0,1}^n -> [0,1] within l_2-error epsilon over the uniform distribution:

If f is submodular, then it is epsilon-close to a function of O((1/epsilon^2) log(1/epsilon)) variables. This is an exponential improvement over previously known results [1]. We note that Omega(1/epsilon^2) variables are necessary even for linear functions.

If f is fractionally subadditive (XOS), it is epsilon-close to a function of 2^{O(1/epsilon^2)} variables. This result holds for all functions with low total l_1-influence and is a real-valued analogue of Friedgut's theorem for Boolean functions. We show that 2^{Omega(1/epsilon)} variables are necessary even for XOS functions.

As applications of these results, we provide learning algorithms over the uniform distribution. For XOS functions, we give a PAC learning algorithm that runs in time 2^{1/poly(epsilon)} poly(n). For submodular functions, we give an algorithm in the more demanding PMAC learning model [2], which requires a multiplicative (1 + gamma)-factor approximation with probability at least 1 - epsilon over the target distribution. Our uniform-distribution algorithm runs in time 2^{1/poly(gamma epsilon)} poly(n). This is the first algorithm in the PMAC model that can achieve a constant approximation factor arbitrarily close to 1 for all submodular functions (even over the uniform distribution); it relies crucially on our approximation-by-junta result. As follows from the lower bounds in [1], both of these algorithms are close to optimal. We also give applications to proper learning, testing, and agnostic learning with value queries for these classes.
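To make the approximation-by-junta statement concrete, here is a minimal, self-contained sketch (not the paper's algorithm): for a toy submodular function we compute each coordinate's exact l_2-influence under the uniform distribution, keep the k most influential coordinates, and approximate f by the standard averaging junta (average f over the remaining coordinates). The coverage instance COVER below is an arbitrary illustrative choice; coverage functions are monotone submodular, normalized here into [0, 1].

```python
from itertools import product

# Toy instance (assumed, for illustration only): element sets covered by each
# of N = 6 "items". Coverage functions are monotone submodular.
COVER = [{0, 1, 2}, {2, 3}, {3, 4, 5}, {0, 5}, {1, 4}, {2, 5}]
UNIVERSE = set().union(*COVER)
N = len(COVER)

def f(x):
    """Normalized coverage of the subset indicated by the bit vector x."""
    covered = set()
    for i in range(N):
        if x[i]:
            covered |= COVER[i]
    return len(covered) / len(UNIVERSE)

CUBE = list(product((0, 1), repeat=N))  # small enough to enumerate exactly

def influence(i):
    """Exact l2-influence of coordinate i under the uniform distribution:
    Inf_i = E_x[(f(x) - f(x with bit i flipped))^2] / 4."""
    total = 0.0
    for x in CUBE:
        y = list(x)
        y[i] ^= 1
        total += (f(x) - f(tuple(y))) ** 2
    return total / (4 * len(CUBE))

def junta_l2_error(J):
    """l2-distance from f to the averaging junta on coordinate set J,
    i.e. g(x) = E[f | the coordinates in J agree with x]."""
    J = sorted(J)
    buckets = {}
    for x in CUBE:
        buckets.setdefault(tuple(x[i] for i in J), []).append(f(x))
    g = {key: sum(vals) / len(vals) for key, vals in buckets.items()}
    mse = sum((f(x) - g[tuple(x[i] for i in J)]) ** 2 for x in CUBE) / len(CUBE)
    return mse ** 0.5

# Keep the k most influential coordinates; the l2-error shrinks as k grows.
order = sorted(range(N), key=influence, reverse=True)
errors = [junta_l2_error(order[:k]) for k in range(N + 1)]
```

The averaging junta is exactly the conditional expectation of f given the kept coordinates, so its squared error never exceeds the total influence of the dropped coordinates (a Parseval-type bound); the paper's contribution is that for submodular f this error drops below epsilon already at k = O((1/epsilon^2) log(1/epsilon)).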
Pages: 386 - 395
Page count: 10