The n-th moment about the mean (or n-th central moment) of a real-valued random variable X is the quantity μ_n = E[(X − E[X])^n], where E is the expectation operator. For a continuous univariate probability distribution with probability density function f(x), the n-th moment about the mean μ is

\mu_n = \operatorname{E} \left[ {\left( X - \operatorname{E}[X] \right)}^n \right] = \int_{-\infty}^{+\infty} (x - \mu)^n f(x)\,\mathrm{d} x.

For random variables that have no mean, such as the
Cauchy distribution, central moments are not defined. The first few central moments have intuitive interpretations:

• The "zeroth" central moment μ_0 is 1.
• The first central moment μ_1 is 0 (not to be confused with the first raw moment or the expected value μ).
• The second central moment μ_2 is called the variance, and is usually denoted σ², where σ represents the standard deviation.
• The third and fourth central moments are used to define the standardized moments, which are used to define skewness and kurtosis, respectively.
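These interpretations can be confirmed numerically. The following is a minimal Python sketch (the normal sample and its parameters are arbitrary choices) using scipy.stats.moment, which computes sample central moments:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
x = rng.normal(loc=5.0, scale=2.0, size=100_000)  # arbitrary example sample

print(stats.moment(x, moment=0))  # zeroth central moment: 1
print(stats.moment(x, moment=1))  # first central moment: 0 (up to rounding)
# second central moment equals the (population) variance
print(stats.moment(x, moment=2), np.var(x))
```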
Properties

The n-th central moment is translation-invariant, i.e. for any random variable X and any constant c, μ_n(X + c) = μ_n(X). For all n, the n-th central moment is homogeneous of degree n:

\mu_n(cX) = c^n \mu_n(X).
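Both properties are easy to check on a sample. A small sketch (the exponential sample and the constant c are arbitrary choices):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
x = rng.exponential(scale=1.0, size=200_000)  # skewed sample, so mu_3 != 0
c = 2.5

for n in (2, 3, 4):
    mu_n = stats.moment(x, moment=n)
    # translation-invariance: mu_n(X + c) = mu_n(X)
    print(n, np.isclose(stats.moment(x + c, moment=n), mu_n))
    # homogeneity of degree n: mu_n(cX) = c**n * mu_n(X)
    print(n, np.isclose(stats.moment(c * x, moment=n), c**n * mu_n))
```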
Only for n = 1, 2, or 3 do we have an additivity property for random variables X and Y that are independent:

\mu_n(X+Y) = \mu_n(X) + \mu_n(Y), \quad n \in \{1, 2, 3\}.

A related functional that shares the translation-invariance and homogeneity properties with the n-th central moment, but continues to have this additivity property even when n ≥ 4, is the n-th cumulant κ_n(X). For n = 1, the n-th cumulant is just the expected value; for n = 2 or 3, the n-th cumulant is just the n-th central moment; for n ≥ 4, the n-th cumulant is an n-th-degree monic polynomial in the first n moments (about zero), and is also a (simpler) n-th-degree polynomial in the first n central moments.
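The following sketch illustrates this: additivity holds for n = 2 and 3 but fails for n = 4, where the fourth cumulant is additive instead. It uses the standard identity κ_4 = μ_4 − 3μ_2² (not derived above); the choice of distributions for X and Y is arbitrary, and the equalities are only approximate because they are estimated from samples.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
x = rng.exponential(size=1_000_000)
y = rng.uniform(size=1_000_000)
s = x + y  # elementwise sum of independent draws, i.e. a sample of X + Y

mu = lambda z, n: stats.moment(z, moment=n)

print(mu(s, 2), mu(x, 2) + mu(y, 2))  # additive (approximately equal)
print(mu(s, 3), mu(x, 3) + mu(y, 3))  # additive (approximately equal)
print(mu(s, 4), mu(x, 4) + mu(y, 4))  # NOT additive: these differ

k4 = lambda z: mu(z, 4) - 3 * mu(z, 2) ** 2  # fourth cumulant from central moments
print(k4(s), k4(x) + k4(y))           # additive again (approximately equal)
```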
Relation to moments about the origin

Sometimes it is convenient to convert moments about the origin to moments about the mean. The general equation for converting the n-th-order moment about the origin to the moment about the mean is

\mu_n = \operatorname{E}\left[\left(X - \operatorname{E}[X]\right)^n\right] = \sum_{j=0}^n \binom{n}{j} {\left(-1\right)}^{n-j} \mu'_j \mu^{n-j},

where μ is the mean of the distribution, and the moment about the origin is given by

\mu'_m = \int_{-\infty}^{+\infty} x^m f(x)\,dx = \operatorname{E}[X^m] = \sum_{j=0}^m \binom{m}{j} \mu_j \mu^{m-j}.

For the cases n = 2, 3, 4 — which are of most interest because of the relations to variance, skewness, and kurtosis, respectively — this formula becomes (noting that \mu = \mu'_1 and \mu'_0 = 1):

\mu_2 = \mu'_2 - \mu^2,

which is commonly referred to as

\operatorname{Var}(X) = \operatorname{E}[X^2] - \left(\operatorname{E}[X]\right)^2,

\begin{align} \mu_3 &= \mu'_3 - 3 \mu \mu'_2 + 2 \mu^3 \\ \mu_4 &= \mu'_4 - 4 \mu \mu'_3 + 6 \mu^2 \mu'_2 - 3 \mu^4. \end{align}

... and so on, following Pascal's triangle, i.e.

\mu_5 = \mu'_5 - 5 \mu \mu'_4 + 10 \mu^2 \mu'_3 - 10 \mu^3 \mu'_2 + 4 \mu^5,

because the two lowest-order terms combine: 5 \mu^4 \mu'_1 - \mu^5 \mu'_0 = 5 \mu^5 - \mu^5 = 4 \mu^5.
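To make the conversion concrete, here is a minimal Python sketch (the gamma sample and its parameters are arbitrary choices) that applies the binomial formula above to sample raw moments and checks the result against directly computed central moments; the two agree up to rounding, since the identity holds exactly for any sample.

```python
import numpy as np
from math import comb

rng = np.random.default_rng(3)
x = rng.gamma(shape=2.0, scale=1.5, size=500_000)

raw = [np.mean(x**m) for m in range(6)]  # mu'_0 .. mu'_5 (mu'_0 = 1, mu'_1 = mean)
mean = raw[1]

def central_from_raw(n):
    # mu_n = sum_j C(n, j) * (-1)**(n - j) * mu'_j * mean**(n - j)
    return sum(comb(n, j) * (-1) ** (n - j) * raw[j] * mean ** (n - j)
               for j in range(n + 1))

for n in range(2, 6):
    direct = np.mean((x - mean) ** n)
    print(n, central_from_raw(n), direct)  # the two columns agree
```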
The following sum is a stochastic variable having a compound distribution:

W = \sum_{i=1}^M Y_i,

where the Y_i are mutually independent random variables sharing the same common distribution and M is a random integer variable independent of the Y_k with its own distribution. The moments of W are obtained as

\operatorname{E}[W^n] = \sum_{i=0}^n \operatorname{E}\left[\binom{M}{i}\right] \sum_{j=0}^i \binom{i}{j} {\left(-1\right)}^{i-j} \operatorname{E} \left[ {\left(\sum_{k=1}^j Y_k\right)}^n \right],

where \operatorname{E} \left[ {\left(\sum_{k=1}^j Y_k\right)}^n \right] is defined as zero for j = 0.
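A sketch of this formula for one concrete case, checked by Monte Carlo. The choices M ~ Poisson(λ) and Y_i ~ Exponential(1) are illustrative assumptions, made because each ingredient then has a closed form: E[C(M, i)] = λ^i / i! (Poisson factorial moments) and E[(Y_1 + ... + Y_j)^n] = (j + n − 1)! / (j − 1)! (raw moments of a Gamma(j, 1) sum).

```python
import numpy as np
from math import comb, factorial

lam = 3.0  # assumed Poisson rate for M

def E_binom_M(i):
    # E[C(M, i)] for M ~ Poisson(lam)
    return lam**i / factorial(i)

def E_sum_pow(j, n):
    # E[(Y_1 + ... + Y_j)^n] for iid Y ~ Exponential(1); zero for j = 0
    return 0.0 if j == 0 else factorial(j + n - 1) / factorial(j - 1)

def E_W_pow(n):
    # the compound-distribution moment formula above
    return sum(E_binom_M(i)
               * sum(comb(i, j) * (-1) ** (i - j) * E_sum_pow(j, n)
                     for j in range(i + 1))
               for i in range(n + 1))

# Monte Carlo check: simulate W = sum_{i=1}^M Y_i directly.
rng = np.random.default_rng(4)
m = rng.poisson(lam, size=200_000)
w = np.array([rng.exponential(size=k).sum() for k in m])

for n in (1, 2, 3):
    print(n, E_W_pow(n), np.mean(w**n))  # formula vs. simulation
```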
Symmetric distributions

In distributions that are symmetric about their means (unaffected by being reflected about the mean), all odd central moments equal zero whenever they exist, because in the formula for the n-th moment, each term involving a value of X less than the mean by a certain amount exactly cancels out the term involving a value of X greater than the mean by the same amount.

Multivariate moments