On learning monotone Boolean functions under the uniform distribution
Cited by: 6
Authors:
Amano, K [1]
Maruoka, A [1]
Affiliations:
[1] Tohoku Univ, Grad Sch Informat Sci, Sendai, Miyagi 9808579, Japan
Keywords:
PAC learning;
monotone Boolean functions;
harmonic analysis;
majority function;
DOI:
10.1016/j.tcs.2005.10.012
CLC classification:
TP301 [Theory and Methods];
Subject classification code:
081202 ;
Abstract:
In this paper, we prove two general theorems on monotone Boolean functions which are useful for constructing a learning algorithm for monotone Boolean functions under the uniform distribution. A monotone Boolean function is called fair if it takes the value 1 on exactly half of its inputs. The first result proved in this paper is that a single variable function f(x) = x_i has the minimum correlation with the majority function among all fair monotone functions. This proves a conjecture by Blum et al. (1998, Proc. 39th FOCS, pp. 408-415) and improves the performance guarantee of the best known learning algorithm for monotone Boolean functions under the uniform distribution, which they proposed. Our second result concerns the relationship between the influences and the average sensitivity of a monotone Boolean function. The influence of variable x_i on f is defined as the probability that f(x) differs from f(x ⊕ e_i), where x is chosen uniformly from {0,1}^n and x ⊕ e_i denotes x with its ith bit flipped. The average sensitivity of f is defined as the sum of the influences over all variables x_i. We prove a somewhat unintuitive result: if the influence of every variable on a monotone Boolean function is small, i.e., O(1/n^c) for some constant c > 0, then the average sensitivity of the function must be large, i.e., Ω(log n). We also discuss how to apply this result to the construction of a new learning algorithm for monotone Boolean functions. © 2005 Elsevier B.V. All rights reserved.
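For concreteness, the influence and average sensitivity defined in the abstract can be computed by brute-force enumeration of {0,1}^n for small n. The sketch below is not taken from the paper; the function names and the use of the majority function are illustrative assumptions only.

# Illustrative sketch (not from the paper): brute-force influences and
# average sensitivity of a Boolean function on n variables.
from itertools import product

def majority(x):
    # Majority function: 1 iff more than half of the bits are 1.
    return 1 if 2 * sum(x) > len(x) else 0

def influence(f, n, i):
    # Inf_i(f) = Pr[f(x) != f(x XOR e_i)], x uniform over {0,1}^n.
    flips = 0
    for x in product((0, 1), repeat=n):
        y = list(x)
        y[i] ^= 1  # flip the ith bit
        if f(x) != f(tuple(y)):
            flips += 1
    return flips / 2 ** n

def average_sensitivity(f, n):
    # Sum of the influences over all n variables.
    return sum(influence(f, n, i) for i in range(n))

if __name__ == "__main__":
    n = 5
    print([influence(majority, n, i) for i in range(n)])  # all equal by symmetry
    print(average_sensitivity(majority, n))

For the 5-variable majority function this prints an influence of 0.375 per variable (the probability that the remaining four bits split 2-2) and an average sensitivity of 1.875, matching the definitions given above.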
Pages: 3-12
Page count: 10