The proof of this tight inequality depends on the so-called '(q, p)-norm' of the Fourier transformation. (Establishing this norm is the most difficult part of the proof.) From this norm, one is able to establish a lower bound on the sum of the (differential) Rényi entropies, H_\alpha(|f|^2)+H_\beta(|g|^2), where \frac1\alpha+\frac1\beta=2, which generalize the Shannon entropies. For simplicity, we consider this inequality only in one dimension; the extension to multiple dimensions is straightforward and can be found in the literature cited.
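Since the Rényi entropies play the central role in what follows, the short numerical sketch below (an illustration added here, not part of the cited proofs) computes the differential Rényi entropy H_\alpha(P)=\frac{1}{1-\alpha}\log\int P(x)^\alpha\,dx of a standard normal density by a Riemann sum, compares it with the closed form \frac12\log(2\pi)+\frac{\log\alpha}{2(\alpha-1)}, and shows that as \alpha\to1 it recovers the Shannon entropy \frac12\log(2\pi e).

```python
import numpy as np

# Grid-based Riemann sum for the differential Renyi entropy of N(0, 1).
xs = np.linspace(-20.0, 20.0, 400001)
dx = xs[1] - xs[0]
p = np.exp(-xs**2 / 2) / np.sqrt(2 * np.pi)   # standard normal density

def renyi_entropy(alpha):
    # H_alpha(P) = log(int P^alpha dx) / (1 - alpha), for alpha != 1
    return np.log(np.sum(p**alpha) * dx) / (1.0 - alpha)

# Closed form for N(0, 1): (1/2) log(2 pi) + log(alpha) / (2 (alpha - 1))
for alpha in (0.5, 0.8, 1.5, 2.0):
    closed_form = 0.5 * np.log(2 * np.pi) + np.log(alpha) / (2 * (alpha - 1))
    print(alpha, renyi_entropy(alpha), closed_form)

# As alpha -> 1, the Renyi entropy approaches the Shannon entropy.
print(renyi_entropy(1.001), 0.5 * np.log(2 * np.pi * np.e))
```

(Entropies here are in nats, since the natural logarithm is used.)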
===Babenko–Beckner inequality===
The '(q, p)-norm' of the Fourier transform is defined to be
:\|\mathcal F\|_{q,p} = \sup_{f\in L^p(\mathbb R)} \frac{\|\mathcal Ff\|_q}{\|f\|_p}, \quad\text{where}\quad 1 < p \le 2 \quad\text{and}\quad \frac 1 p + \frac 1 q = 1.
In 1961, Babenko found this norm for even integer values of q. Finally, in 1975, using Hermite functions as eigenfunctions of the Fourier transform, Beckner proved that the value of this norm (in one dimension) is
:\|\mathcal F\|_{q,p} = \sqrt{p^{1/p}/q^{1/q}},
so that
:\|\mathcal Ff\|_q \le \left(p^{1/p}/q^{1/q}\right)^{1/2} \|f\|_p.

===Rényi entropy bound===
Letting g=\mathcal Ff, \, 2\alpha=p, and 2\beta=q, so that \frac1\alpha+\frac1\beta=2 and \frac12\le\alpha\le1\le\beta, the Babenko–Beckner inequality becomes
:\left(\int_{\mathbb R} |g(y)|^{2\beta}\,dy\right)^{1/2\beta} \le \frac{(2\alpha)^{1/4\alpha}}{(2\beta)^{1/4\beta}} \left(\int_{\mathbb R} |f(x)|^{2\alpha}\,dx\right)^{1/2\alpha}.
Squaring both sides and taking the logarithm, we get
:\frac 1\beta \log\left(\int_{\mathbb R} |g(y)|^{2\beta}\,dy\right) \le \frac 1 2 \log\frac{(2\alpha)^{1/\alpha}}{(2\beta)^{1/\beta}} + \frac 1\alpha \log \left(\int_{\mathbb R} |f(x)|^{2\alpha}\,dx\right).
We can rewrite the condition on \alpha and \beta as
:\alpha(1-\beta)+\beta(1-\alpha)=0.
Assuming \alpha,\beta\ne1, we multiply both sides by the negative quantity
:\frac{\beta}{1-\beta}=-\frac{\alpha}{1-\alpha}
(which reverses the inequality) to get
:\frac {1}{1-\beta} \log\left(\int_{\mathbb R} |g(y)|^{2\beta}\,dy\right) \ge \frac\alpha{2(\alpha-1)}\log\frac{(2\alpha)^{1/\alpha}}{(2\beta)^{1/\beta}} - \frac{1}{1-\alpha} \log \left(\int_{\mathbb R} |f(x)|^{2\alpha}\,dx\right) ~.
Rearranging terms yields an inequality in terms of the sum of the Rényi entropies,
:\frac{1}{1-\alpha} \log \left(\int_{\mathbb R} |f(x)|^{2\alpha}\,dx\right) + \frac {1}{1-\beta} \log\left(\int_{\mathbb R} |g(y)|^{2\beta}\,dy\right) \ge \frac\alpha{2(\alpha-1)}\log\frac{(2\alpha)^{1/\alpha}}{(2\beta)^{1/\beta}};
:H_\alpha(|f|^2) + H_\beta(|g|^2) \ge \frac 1 2 \left(\frac{\log\alpha}{\alpha-1}+\frac{\log\beta}{\beta-1}\right) - \log 2,
where H_\alpha(|f|^2)=\frac{1}{1-\alpha}\log\left(\int_{\mathbb R}|f(x)|^{2\alpha}\,dx\right) denotes the differential Rényi entropy of the probability density |f|^2, and similarly for H_\beta(|g|^2).
===Right-hand side===
To simplify the right-hand side, use the identity \frac{\alpha}{\alpha-1}+\frac{\beta}{\beta-1}=0, which is equivalent to the condition \frac1\alpha+\frac1\beta=2:
:\frac\alpha{2(\alpha-1)}\log\frac{(2\alpha)^{1/\alpha}}{(2\beta)^{1/\beta}}
:=\frac12\left[\frac{\alpha}{\alpha-1}\log(2\alpha)^{1/\alpha} + \frac{\beta}{\beta-1}\log(2\beta)^{1/\beta}\right]
:=\frac12\left[\frac{\log2\alpha}{\alpha-1} + \frac{\log2\beta}{\beta-1}\right]
:=\frac12\left[\frac{\log\alpha}{\alpha-1} + \frac{\log\beta}{\beta-1}\right] + \frac12\log2\left[\frac{1}{\alpha-1} + \frac{1}{\beta-1}\right]
:=\frac12\left[\frac{\log\alpha}{\alpha-1} + \frac{\log\beta}{\beta-1}\right] + \frac12\log2\left[\frac{1}{\alpha-1} + \frac{1}{\beta-1} - \frac{\alpha}{\alpha-1} - \frac{\beta}{\beta-1}\right]
:=\frac12\left[\frac{\log\alpha}{\alpha-1} + \frac{\log\beta}{\beta-1}\right] + \frac12\log2\left[-2\right]
:=\frac12\left[\frac{\log\alpha}{\alpha-1} + \frac{\log\beta}{\beta-1}\right] - \log2
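This chain of equalities is easy to spot-check numerically for a few conjugate pairs with \frac1\alpha+\frac1\beta=2 (an added sanity check, not part of the derivation):

```python
import numpy as np

# Check that the raw right-hand side equals its simplified form for
# conjugate pairs satisfying 1/alpha + 1/beta = 2.
for alpha in (0.55, 0.6, 0.75, 0.9, 0.99):
    beta = alpha / (2 * alpha - 1)
    raw = alpha / (2 * (alpha - 1)) * np.log(
        (2 * alpha) ** (1 / alpha) / (2 * beta) ** (1 / beta))
    simplified = (0.5 * (np.log(alpha) / (alpha - 1)
                         + np.log(beta) / (beta - 1)) - np.log(2))
    print(alpha, raw, simplified)   # the two columns agree
```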
===Shannon entropy bound===
Taking the limit of this last inequality as \alpha, \beta \to 1 (with the substitutions \Alpha=\alpha-1 and \Beta=\beta-1, and using \lim_{\alpha\to1}\frac{\log\alpha}{\alpha-1}=\log e) yields the less general Shannon entropy inequality,
:H(|f|^2) + H(|g|^2) \ge \log\frac e 2,\quad\textrm{where}\quad g(y) \approx \int_{\mathbb R} e^{-2\pi ixy}f(x)\,dx~,
valid for any base of logarithm, as long as we choose an appropriate unit of information: bit, nat, etc.

The constant will be different, though, for a different normalization of the Fourier transform (such as is usually used in physics, with normalizations chosen so that
ħ = 1), i.e.,
:H(|f|^2) + H(|g|^2) \ge \log(\pi e)\quad\textrm{for}\quad g(y) \approx \frac 1{\sqrt{2\pi}}\int_{\mathbb R} e^{-ixy}f(x)\,dx~.
In this case, the dilation of the Fourier transform absolute squared by a factor of 2\pi simply adds \log(2\pi) to its entropy.

==Entropy versus variance bounds==