Biased Halfspaces, Noise Sensitivity, and Local Chernoff Inequalities

Cited: 0
Authors: Keller, Nathan [1]; Klein, Ohad [1]
Affiliations: [1] Bar Ilan Univ, Dept Math, Ramat Gan, Israel
Funding: Israel Science Foundation
DOI: 10.19086/da.10234
Chinese Library Classification: O1 [Mathematics]
Discipline Codes: 0701; 070101
Abstract
A halfspace is a function $f\colon \{-1,1\}^n \to \{0,1\}$ of the form $f(x) = \mathbb{1}(a \cdot x > t)$, where $\sum_i a_i^2 = 1$. We show that if $f$ is a halfspace with $\mathbb{E}[f] = \epsilon$ and $a' = \max_i |a_i|$, then the degree-1 Fourier weight of $f$ is $W^1(f) = \Theta(\epsilon^2 \log(1/\epsilon))$, and the maximal influence of $f$ is $I_{\max}(f) = \Theta(\epsilon \min(1, a' \sqrt{\log(1/\epsilon)}))$. These results, which determine the exact asymptotic order of $W^1(f)$ and $I_{\max}(f)$, provide sharp generalizations of theorems proved by Matulef, O'Donnell, Rubinfeld, and Servedio, and settle a conjecture posed by Kalai, Keller and Mossel. In addition, we present a refinement of the definition of noise sensitivity which takes into consideration the bias of the function, and show that (as in the unbiased case) halfspaces are noise resistant, and, in the other direction, any noise-resistant function is well correlated with a halfspace. Our main tools are 'local' forms of the classical Chernoff inequality, like the following one proved by Devroye and Lugosi (2008): Let $\{x_i\}$ be independent random variables uniformly distributed in $\{-1,1\}$, and let $a_i \in \mathbb{R}_{\geq 0}$ be such that $\sum_i a_i^2 = 1$. If for some $t \geq 0$ we have $\Pr[\sum_i a_i x_i > t] = \epsilon$, then $\Pr[\sum_i a_i x_i > t + \delta] \leq \epsilon/2$ holds for $\delta \leq c/\sqrt{\log(1/\epsilon)}$, where $c$ is a universal constant.
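The local Chernoff statement quoted above can be illustrated numerically (this sketch is not from the paper): take uniform weights $a_i = 1/\sqrt{n}$, so the tail $\Pr[\sum_i a_i x_i > t]$ reduces to an exact binomial tail, and check that shifting the threshold by $\delta = c/\sqrt{\log(1/\epsilon)}$ at least halves the tail. The parameter values $n = 100$, $t = 1.5$, and the constant $c = 1$ are chosen here purely for illustration.

```python
import math

def tail_prob(n, t):
    """Exact Pr[sum_i a_i x_i > t] for uniform weights a_i = 1/sqrt(n),
    with x_i independent and uniform on {-1, 1}.  If k of the x_i equal +1,
    then sum_i x_i = 2k - n, so the event a.x > t is: 2k - n > t*sqrt(n)."""
    total = sum(math.comb(n, k) for k in range(n + 1)
                if 2 * k - n > t * math.sqrt(n))
    return total / 2 ** n

n, t, c = 100, 1.5, 1.0                       # c stands in for the universal constant
eps = tail_prob(n, t)                          # Pr[S > t] = eps
delta = c / math.sqrt(math.log(1 / eps))       # delta = c / sqrt(log(1/eps))
shifted = tail_prob(n, t + delta)              # Pr[S > t + delta]
print(f"eps = {eps:.4f}, Pr[S > t + delta] = {shifted:.4f}")
# The local Chernoff bound predicts Pr[S > t + delta] <= eps / 2.
```

For these parameters the tail probability drops from roughly 0.067 to below half that value after the shift, consistent with the inequality; the exact binomial computation avoids Monte Carlo noise.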
Pages: 50