In the study of Boolean functions, the hypercontractive inequality bounds the norm of $Tf$ from above by some norm of $f$, where $T$ is the noise operator with a given noise parameter. In the special case $q = 2$, it can be written in terms of the Fourier transform of $f$ as

$$\sum_{S} (p-1)^{|S|} \widehat{f}(S)^2 \le \|f\|_p^2, \qquad 1 \le p \le 2.$$

However, this involves all Fourier levels of $f$. In the application I have in mind, I am interested only in bounding the weight of $f$ at the first level, i.e., in restricting the sum on the left to $|S| = 1$. Is it possible to give any inequality of this kind, possibly with a different right-hand side (but still involving some information about a norm of $f$) and some additional assumptions on $f$? If it is impossible for trivial reasons, please let me know anyway.
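For concreteness, the Fourier form of the inequality can be sanity-checked by brute force on a small hypercube. The sketch below assumes the standard $(2,p)$-statement $\sum_S (p-1)^{|S|} \widehat{f}(S)^2 \le \|f\|_p^2$ for $1 \le p \le 2$; all names in the code are illustrative.

```python
import itertools
import math
import random

# Brute-force check of sum_S (p-1)^{|S|} fhat(S)^2 <= ||f||_p^2
# on {-1,1}^n for a small n and an arbitrary real-valued f.
n = 4
points = list(itertools.product([-1, 1], repeat=n))

def fourier_coeff(f, S):
    # fhat(S) = E_x[ f(x) * prod_{i in S} x_i ]
    return sum(f[x] * math.prod(x[i] for i in S) for x in points) / len(points)

def p_norm(f, p):
    # ||f||_p = (E_x |f(x)|^p)^(1/p)
    return (sum(abs(f[x]) ** p for x in points) / len(points)) ** (1 / p)

random.seed(0)
f = {x: random.gauss(0, 1) for x in points}   # arbitrary real-valued function

p = 1.5
subsets = [S for r in range(n + 1) for S in itertools.combinations(range(n), r)]
lhs = sum((p - 1) ** len(S) * fourier_coeff(f, S) ** 2 for S in subsets)
rhs = p_norm(f, p) ** 2
print(lhs <= rhs + 1e-12)  # hypercontractivity holds for any real-valued f
```

Restricting `subsets` to the singletons gives exactly the level-1 quantity the question asks about.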

Here I'd be mostly interested in matrix-valued Boolean functions (see for example http://arxiv.org/abs/0705.3806), although any answer would be appreciated.

1 Answer

If $f$ is Boolean-valued and all of its degree-1 Fourier coefficients are small, then the weight at level 1 is not more than $2/\pi$.

If $f$ is Boolean-valued and has very small variance $\alpha$, then the weight at level 1 is at most $2 \alpha \log_2(1/\alpha)$. ["Chang's Lemma", or "Talagrand's Lemma"]
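Both facts can be illustrated by brute force on small $n$. In the sketch below, the bounds follow the statements above, but the choice of test functions (Majority, whose degree-1 coefficients shrink with $n$, and the AND-type function, which has small variance) is my own.

```python
import itertools
import math

def fourier_level1_weight(f, n):
    # W^1[f] = sum_i fhat({i})^2, computed by brute force over {-1,1}^n.
    points = list(itertools.product([-1, 1], repeat=n))
    w1 = 0.0
    for i in range(n):
        coeff = sum(f(x) * x[i] for x in points) / len(points)
        w1 += coeff ** 2
    return w1

# 1) Majority: as n grows its degree-1 coefficients get small and its
#    level-1 weight decreases toward 2/pi ~ 0.6366.
maj = lambda x: 1 if sum(x) > 0 else -1
for n in (3, 5, 7, 9):
    print(n, fourier_level1_weight(maj, n))

# 2) A low-variance +/-1-valued function (AND of all bits) versus the
#    claimed bound W^1 <= 2 * alpha * log2(1/alpha), alpha = Var[f].
n = 4
points = list(itertools.product([-1, 1], repeat=n))
andf = lambda x: 1 if all(b == 1 for b in x) else -1
mean = sum(andf(x) for x in points) / len(points)
alpha = 1 - mean ** 2                  # variance of a +/-1-valued function
w1 = fourier_level1_weight(andf, n)
print(w1 <= 2 * alpha * math.log2(1 / alpha))  # True for this example
```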

For functions that are not Boolean-valued, I don't have a lot to say. The main thing I can suggest is taking $p$ in the hypercontractive inequality, as you stated it, very close to $1$. If $p = 1 + \epsilon$, say, then the LHS will be $\widehat{f}(\emptyset)^2$ (which you usually have information about), plus $\epsilon$ times the weight at level 1, plus at most $\epsilon^2$ times the squared $2$-norm (negligible if $\epsilon$ is small enough). So this may allow you to "isolate" the level-1 weight after subtracting $\widehat{f}(\emptyset)^2$ and dividing by $\epsilon$.
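Spelled out, the expansion behind this suggestion (assuming the Fourier form $\sum_S (p-1)^{|S|} \widehat{f}(S)^2 \le \|f\|_p^2$ with $p = 1 + \epsilon$) is:

```latex
% Split the LHS by Fourier level at p = 1 + eps:
\widehat{f}(\emptyset)^2
  + \epsilon \sum_{|S| = 1} \widehat{f}(S)^2
  + \sum_{|S| \ge 2} \epsilon^{|S|} \widehat{f}(S)^2
  \le \|f\|_{1+\epsilon}^2 .
% All terms on the left are nonnegative, and the last sum is at most
% eps^2 * ||f||_2^2 (since eps^{|S|} <= eps^2 for |S| >= 2), so
\sum_{|S| = 1} \widehat{f}(S)^2
  \le \frac{\|f\|_{1+\epsilon}^2 - \widehat{f}(\emptyset)^2}{\epsilon},
% with the bound tight up to an additive eps * ||f||_2^2.
```

The upper bound on the level-1 weight only uses nonnegativity of the higher levels; the $\epsilon^2 \|f\|_2^2$ term quantifies how much is lost in the isolation step.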