Let X be a
random variable with
CDF F_X. The moment generating function (mgf) of X (or F_X), denoted by M_X(t), is M_X(t) = \operatorname E \left[e^{tX}\right] provided this
expectation exists for t in some open
neighborhood of 0. That is, there is an h > 0 such that for all t satisfying -h < t < h, \operatorname E \left[e^{tX}\right] exists. If the expectation does not exist in an open neighborhood of 0, we say that the moment generating function does not exist. In other words, the moment generating function of X is the
expectation of the random variable e^{tX}. More generally, when \mathbf X = ( X_1, \ldots, X_n)^{\mathrm{T}}, an n-dimensional
random vector, and \mathbf t is a fixed vector, one uses \mathbf t \cdot \mathbf X = \mathbf t^\mathrm T\mathbf X instead of tX: M_{\mathbf X}(\mathbf t) := \operatorname E \left[e^{\mathbf t^\mathrm T\mathbf X}\right].

M_X(0) always exists and is equal to 1. However, a key problem with moment generating functions is that moments and the moment generating function may not exist, as the integrals need not converge absolutely. By contrast, the
characteristic function or Fourier transform always exists (because it is the integral of a bounded function on a space of finite
measure), and for some purposes may be used instead.

The moment generating function is so named because it can be used to find the moments of the distribution. The series expansion of e^{tX} is e^{t X} = 1 + t X + \frac{t^2 X^2}{2!} + \frac{t^3 X^3}{3!} + \cdots + \frac{t^n X^n}{n!} + \cdots. Hence, \begin{align} M_X(t) &= \operatorname E [e^{t X}] \\[1ex] &= 1 + t \operatorname E[X] + \frac{t^2 \operatorname E[X^2]}{2!} + \frac{t^3 \operatorname E[X^3]}{3!} + \cdots + \frac{t^n\operatorname E [X^n]}{n!}+\cdots \\[1ex] & = 1 + t m_1 + \frac{t^2 m_2}{2!} + \frac{t^3 m_3}{3!} + \cdots + \frac{t^n m_n}{n!} + \cdots, \end{align} where m_n is the
n-th moment. Differentiating M_X(t) i times with respect to t and setting t = 0, we obtain the i-th moment about the origin, m_i.

If X is a continuous random variable, the following relation between its moment generating function M_X(t) and the
two-sided Laplace transform of its probability density function f_X(x) holds: M_X(t) = \mathcal{L}\{f_X\}(-t), since the PDF's two-sided Laplace transform is given as \mathcal{L}\{f_X\}(s) = \int_{-\infty}^\infty e^{-sx} f_X(x)\, dx, and the moment generating function's definition expands (by the
law of the unconscious statistician) to M_X(t) = \operatorname E \left[e^{tX}\right] = \int_{-\infty}^\infty e^{tx} f_X(x)\, dx. This is consistent with the characteristic function of X being a
Wick rotation of M_X(t) when the moment generating function exists, as the characteristic function of a continuous random variable X is the
Fourier transform of its probability density function f_X(x), and in general when a function f(x) is of
exponential order, the Fourier transform of f is a Wick rotation of its two-sided Laplace transform in the region of convergence. See
the relation of the Fourier and Laplace transforms for further information.

==Examples==
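The definition and the moment-generating property described above can be checked numerically. The sketch below (an illustration, not one of the standard worked examples) uses the exponential distribution with rate λ = 2, whose MGF has the well-known closed form M(t) = λ/(λ − t) for t < λ: the defining expectation \operatorname E[e^{tX}] is approximated by Monte Carlo averaging, and the first two moments are recovered by numerically differentiating M at t = 0.

```python
import math
import random

# Exponential distribution with rate lam: pdf f(x) = lam * exp(-lam * x) for x >= 0.
# Its MGF is M(t) = lam / (lam - t), valid only for t < lam (the expectation
# diverges otherwise), illustrating that an MGF need not exist for all t.
lam = 2.0

def mgf(t):
    if t >= lam:
        raise ValueError("MGF of the exponential distribution exists only for t < lam")
    return lam / (lam - t)

# 1) The definition M(t) = E[exp(t*X)], approximated by Monte Carlo averaging.
random.seed(0)
t = 0.5
n = 200_000
estimate = sum(math.exp(t * random.expovariate(lam)) for _ in range(n)) / n
print(estimate, mgf(t))   # both close to 2 / 1.5 ≈ 1.3333

# 2) Moments as derivatives of M at t = 0, via central finite differences:
#    m1 = M'(0)  = E[X]   = 1/lam   = 0.5
#    m2 = M''(0) = E[X^2] = 2/lam^2 = 0.5
h = 1e-4
m1 = (mgf(h) - mgf(-h)) / (2 * h)
m2 = (mgf(h) - 2 * mgf(0.0) + mgf(-h)) / h**2
print(m1, m2)             # both ≈ 0.5
```

The finite-difference step mirrors the statement above that differentiating M_X(t) i times and setting t = 0 yields the i-th moment about the origin.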