The information entropy of the wrapped normal distribution is defined as:

: H = -\int_\Gamma f_\text{WN}(\theta;\mu,\sigma)\,\ln(f_\text{WN}(\theta;\mu,\sigma))\,d\theta

where \Gamma is any interval of length 2\pi. Defining z=e^{i(\theta-\mu)} and q=e^{-\sigma^2}, the Jacobi triple product representation for the wrapped normal is:

: f_\text{WN}(\theta;\mu,\sigma) = \frac{\phi(q)}{2\pi}\prod_{m=1}^\infty (1+q^{m-1/2}z)(1+q^{m-1/2}z^{-1})

where \phi(q) is the Euler function. The logarithm of the density of the wrapped normal distribution may then be written:

: \ln(f_\text{WN}(\theta;\mu,\sigma))= \ln\left(\frac{\phi(q)}{2\pi}\right)+\sum_{m=1}^\infty\ln(1+q^{m-1/2}z)+\sum_{m=1}^\infty\ln(1+q^{m-1/2}z^{-1})

Using the series expansion of the logarithm

: \ln(1+x)=-\sum_{k=1}^\infty \frac{(-1)^k}{k}\,x^k

the logarithmic sums may be written as:

: \sum_{m=1}^\infty\ln(1+q^{m-1/2}z^{\pm 1})=-\sum_{m=1}^\infty \sum_{k=1}^\infty \frac{(-1)^k}{k}\,q^{mk-k/2}z^{\pm k} = -\sum_{k=1}^\infty \frac{(-1)^k}{k}\,\frac{q^{k/2}}{1-q^k}\,z^{\pm k}

where the inner geometric sum over m has been evaluated in closed form. The logarithm of the density of the wrapped normal distribution is therefore:

: \ln(f_\text{WN}(\theta;\mu,\sigma))=\ln\left(\frac{\phi(q)}{2\pi}\right)-\sum_{k=1}^\infty \frac{(-1)^k}{k}\, \frac{q^{k/2}}{1-q^k}\,(z^k+z^{-k})

which is essentially a Fourier series in \theta. Substituting the characteristic function representation of the wrapped normal distribution

: f_\text{WN}(\theta;\mu,\sigma) =\frac{1}{2\pi}\sum_{n=-\infty}^\infty q^{n^2/2}\,z^n

for the density appearing in the entropy integral, and noting that the first term of \ln(f_\text{WN}) integrates against the unit-normalized density to a constant, the entropy may be written:

: H = -\ln\left(\frac{\phi(q)}{2\pi}\right)+\frac{1}{2\pi}\int_\Gamma \left( \sum_{n=-\infty}^\infty\sum_{k=1}^\infty \frac{(-1)^k}{k}\, \frac{q^{(n^2+k)/2}}{1-q^k}\left(z^{n+k}+z^{n-k}\right) \right)\,d\theta

Since \int_\Gamma z^j\,d\theta = 2\pi\delta_{j0}, only the terms with n=-k and n=k survive the integration, which yields:

: H = -\ln\left(\frac{\phi(q)}{2\pi}\right)+2\sum_{k=1}^\infty \frac{(-1)^k}{k}\, \frac{q^{(k^2+k)/2}}{1-q^k}

== See also ==
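The derivation above can be checked numerically. The following minimal Python sketch (function names are illustrative, not from any library) compares the wrapped normal density computed from its Fourier series against the Jacobi triple product form, and compares the closed-form entropy against direct numerical integration of -\int f\ln f over one period:

```python
import math

def wn_pdf(theta, mu, sigma, terms=50):
    # Density via the characteristic-function (Fourier) series:
    # f = (1/2pi) * sum_n q^(n^2/2) z^n, with q = exp(-sigma^2), z = exp(i(theta-mu)).
    q = math.exp(-sigma**2)
    s = sum(q**(n*n/2) * math.cos(n*(theta - mu)) for n in range(1, terms + 1))
    return (1 + 2*s) / (2*math.pi)

def wn_pdf_triple_product(theta, mu, sigma, terms=50):
    # Same density via the Jacobi triple product representation. On the unit
    # circle, (1 + a*z)(1 + a/z) = 1 + 2*a*cos(theta - mu) + a^2.
    q = math.exp(-sigma**2)
    c = math.cos(theta - mu)
    euler_phi = math.prod(1 - q**m for m in range(1, terms + 1))  # Euler function phi(q)
    prod = 1.0
    for m in range(1, terms + 1):
        a = q**(m - 0.5)
        prod *= 1 + 2*a*c + a*a
    return euler_phi * prod / (2*math.pi)

def wn_entropy_closed_form(sigma, terms=100):
    # H = -ln(phi(q)/2pi) + 2 * sum_k (-1)^k/k * q^((k^2+k)/2) / (1 - q^k)
    q = math.exp(-sigma**2)
    euler_phi = math.prod(1 - q**m for m in range(1, terms + 1))
    tail = sum((-1)**k / k * q**((k*k + k)/2) / (1 - q**k)
               for k in range(1, terms + 1))
    return -math.log(euler_phi / (2*math.pi)) + 2*tail

def wn_entropy_numeric(mu, sigma, n=4096):
    # Direct Riemann sum for H = -integral of f*ln(f) over one period;
    # equally spaced nodes converge rapidly for smooth periodic integrands.
    h = 2*math.pi / n
    total = 0.0
    for i in range(n):
        f = wn_pdf(i*h, mu, sigma)
        total -= f * math.log(f) * h
    return total

sigma = 0.7
h_closed = wn_entropy_closed_form(sigma)
h_direct = wn_entropy_numeric(0.0, sigma)
# h_closed and h_direct agree to within numerical tolerance.
```

The truncation depths (`terms`, `n`) are ad hoc; since q < 1 for any sigma > 0, all of the series converge geometrically and modest truncations suffice.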