If we start from the simple
Gaussian function p(x) = e^{-x^2/2}, \quad x\in(-\infty,\infty), we have the corresponding
Gaussian integral \int_{-\infty}^\infty p(x) \, dx = \int_{-\infty}^\infty e^{-x^2/2} \, dx = \sqrt{2\pi\,}. Now if we use the latter's
reciprocal as a normalizing constant for the former, defining a function \varphi(x) as \varphi(x) = \frac{1}{\sqrt{2\pi\,}} p(x) = \frac{1}{\sqrt{2\pi\,}} e^{-x^2/2}, so that its
integral is unity, \int_{-\infty}^\infty \varphi(x) \, dx = \int_{-\infty}^\infty \frac{1}{\sqrt{2\pi\,}} e^{-x^2/2} \, dx = 1, then the function \varphi(x) is a probability density function. This is the density of the standard
normal distribution. (
Standard, in this case, means the
expected value is 0 and the
variance is 1.) The constant \frac{1}{\sqrt{2\pi}} is the
normalizing constant of the function p(x). Similarly, \sum_{n=0}^\infty \frac{\lambda^n}{n!} = e^{\lambda}, and consequently f(n) = \frac{\lambda^n e^{-\lambda}}{n!} is a probability mass function on the set of all nonnegative integers. This is the probability mass function of the
Poisson distribution with expected value \lambda. Note that if the probability density function is a function of various parameters, so too will be its normalizing constant. The parametrized normalizing constant for the
Boltzmann distribution plays a central role in
statistical mechanics. In that context, the normalizing constant is called the
partition function.

==Bayes' theorem==
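The normalizing constant reappears in Bayes' theorem: the denominator P(D) = \sum_i P(D \mid H_i)\,P(H_i) rescales the products of prior and likelihood so that the posterior probabilities sum to 1. A minimal numerical sketch for a discrete set of hypotheses (the prior and likelihood values below are made-up, purely illustrative numbers):

```python
# Posterior is proportional to prior times likelihood; the normalizing
# constant P(D) is the sum of these products over all hypotheses.
priors = [0.5, 0.3, 0.2]       # P(H_i) -- hypothetical illustrative values
likelihoods = [0.9, 0.5, 0.1]  # P(D | H_i) -- hypothetical illustrative values

unnormalized = [p * l for p, l in zip(priors, likelihoods)]
normalizing_constant = sum(unnormalized)  # P(D)
posterior = [u / normalizing_constant for u in unnormalized]

print(round(normalizing_constant, 3))  # 0.62
print(round(sum(posterior), 10))       # 1.0
```

Dividing by the normalizing constant is what turns the unnormalized products into a genuine probability distribution, just as \frac{1}{\sqrt{2\pi}} did for the Gaussian above.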