Noise sensitivity

cosmos 4th July 2018 at 2:57pm
Sensitivity

Probability that the output of a Boolean function changes when, starting from a uniformly random input, each bit is flipped independently with probability $\epsilon$ (called the noise rate). This is analogous to the mutation model in the Wright-Fisher model. See this video
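Formally, for $f:\{-1,1\}^n \to \{-1,1\}$:

$$\mathrm{NS}_\epsilon(f) = \Pr_{x,y}\left[f(x) \neq f(y)\right],$$

where $x$ is uniform on $\{-1,1\}^n$ and $y$ is obtained from $x$ by flipping each coordinate independently with probability $\epsilon$. A minimal Monte Carlo sketch of this definition (the helper `noise_sensitivity` and the majority example are illustrative, not from the source):

```python
import numpy as np

def noise_sensitivity(f, n, eps, trials=100_000, seed=0):
    """Monte Carlo estimate of NS_eps(f) for f: {-1,1}^n -> {-1,1}."""
    rng = np.random.default_rng(seed)
    x = rng.choice([-1, 1], size=(trials, n))    # uniform inputs
    flips = rng.random((trials, n)) < eps        # each bit flips w.p. eps
    y = np.where(flips, -x, x)                   # noisy copies of x
    return np.mean(f(x) != f(y))

# Example: majority on 101 bits; NS_eps(Maj) = Theta(sqrt(eps))
maj = lambda x: np.sign(x.sum(axis=1))           # odd n, so the sum is never zero
print(noise_sensitivity(maj, 101, eps=0.01))
```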

Upper bounds on noise sensitivity give upper bounds on average sensitivity. The converse is true for degree-$d$ PTFs (polynomial threshold functions).
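The first claim follows from the Fourier formulas (a standard calculation, sketched here rather than quoted from the linked source): with $\mathrm{NS}_\delta(f) = \frac{1}{2}\sum_S \left(1-(1-2\delta)^{|S|}\right)\hat f(S)^2$ and $\mathrm{AS}(f) = \sum_S |S|\,\hat f(S)^2$, taking $\delta = 1/n$ and using $1-(1-2/n)^{|S|} \geq (1-e^{-2})\,|S|/n$ for $|S| \leq n$ gives

$$\mathrm{AS}(f) \;\leq\; \frac{2n}{1-e^{-2}}\,\mathrm{NS}_{1/n}(f),$$

so any upper bound on noise sensitivity at noise rate $1/n$ transfers to average sensitivity.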

This has applications to learning: an upper bound on noise sensitivity implies Fourier concentration, a kind of smoothness, which is exactly what low-degree learning algorithms need. Their results imply that the class of degree-$d$ PTFs is agnostically learnable in polynomial time.
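Concretely, sets with $|S| \geq 1/\delta$ contribute at least a constant fraction of their weight to $\mathrm{NS}_\delta(f)$, since $1-(1-2\delta)^{|S|} \geq 1-e^{-2}$ there; so (again a standard sketch under the formulas above)

$$\sum_{|S| \geq 1/\delta} \hat f(S)^2 \;\leq\; \frac{2}{1-e^{-2}}\,\mathrm{NS}_\delta(f).$$

Once the Fourier weight is concentrated on degree $\leq 1/\delta$, the low-degree algorithm (and its agnostic variant, low-degree polynomial regression) can learn $f$ by estimating just those coefficients, which is the route to the agnostic learning result.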

Regularity lemma and pseudorandom generators for PTFs (PRG). A version of the central limit theorem still holds for weakly independent random variables. They show that these weakly dependent distributions (generated from PRGs) fool the class of linear threshold functions, meaning that the expectation of a linear threshold function when its inputs follow the pseudorandom distribution is epsilon-close to its expectation when the inputs are truly uniformly distributed (see here). They then use this to show that the CLT also holds for these weakly dependent pseudorandom distributions. Results of this kind are called derandomization.
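As a toy illustration of "fooling" (not the construction from the paper): $k$-wise independent distributions are known to fool halfspaces, and one can check this empirically by comparing $\mathbb{E}[f(x)]$ under a $k$-wise independent distribution against the truly uniform expectation for a random linear threshold function. The code below uses the standard random-polynomial-over-$\mathrm{GF}(P)$ construction; all names are mine, for the example.

```python
import numpy as np

# k-wise independent bits from a random degree-(k-1) polynomial over GF(P).
# The seed is just the k coefficients, so the distribution's support has at
# most P**k points -- tiny compared with the 2**n points of the uniform cube.
P = 10007  # prime, must exceed n

def kwise_bits(n, k, rng):
    """One sample of n nearly unbiased, k-wise independent +-1 bits."""
    coeffs = rng.integers(0, P, size=k)
    pts = np.arange(n, dtype=np.int64)
    vals = np.zeros(n, dtype=np.int64)
    for c in coeffs:                  # Horner evaluation of the polynomial mod P
        vals = (vals * pts + int(c)) % P
    return np.where(vals < P // 2, 1, -1)   # per-bit bias is O(1/P)

rng = np.random.default_rng(0)
n, k, trials = 100, 4, 20_000
w = rng.standard_normal(n)
theta = 0.5 * np.linalg.norm(w)
f = lambda x: np.sign(x @ w - theta)  # a (biased) linear threshold function

uniform = np.mean([f(rng.choice([-1, 1], size=n)) for _ in range(trials)])
pseudo  = np.mean([f(kwise_bits(n, k, rng)) for _ in range(trials)])
print(uniform, pseudo)  # close, up to the fooling error plus Monte Carlo noise
```

Larger $k$ shrinks the fooling error; the point is that a distribution sampled from a short seed already approximates the uniform expectation.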

These pseudorandom distributions have small support: since seeds are much shorter than the output, the distribution is supported on far fewer than $2^n$ points.

A related quantity is the noise stability (def).

Fourier formula for noise stability
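In standard notation, for $f:\{-1,1\}^n \to \{-1,1\}$ and $y$ a $\rho$-correlated copy of $x$ (each coordinate $y_i = x_i$ with probability $\frac{1+\rho}{2}$, independently):

$$\mathrm{Stab}_\rho(f) \;=\; \mathbb{E}[f(x)f(y)] \;=\; \sum_{S \subseteq [n]} \rho^{|S|}\, \hat f(S)^2,$$

which connects back to the definition above via $\mathrm{NS}_\epsilon(f) = \frac{1}{2}\left(1 - \mathrm{Stab}_{1-2\epsilon}(f)\right)$.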