The joint cumulants of several random variables are defined as the coefficients in the Maclaurin series of the multivariate cumulant-generating function

G(t_1,\dots,t_n)=\log \mathrm{E}(\mathrm{e}^{\sum_{j=1}^n t_j X_j}) =\sum_{k_1,\ldots,k_n} \kappa_{k_1,\ldots,k_n} \frac{t_1^{k_1} \cdots t_n^{k_n}}{k_1! \cdots k_n!} \,.

Note that

\kappa_{k_1,\dots,k_n} = \left.\left(\frac{\mathrm{d}}{\mathrm{d} t_1}\right)^{k_1} \cdots \left(\frac{\mathrm{d}}{\mathrm{d} t_n}\right)^{k_n} G(t_1,\dots,t_n) \right|_{t_1 = \dots = t_n = 0}\,,

and, in particular,

\kappa(X_1,\ldots,X_n) = \left. \frac{\mathrm{d}^n}{\mathrm{d} t_1 \cdots \mathrm{d} t_n} G(t_1,\dots,t_n) \right|_{t_1 = \dots = t_n = 0}\,.

As with a single variable, the generating function and cumulants can instead be defined via

H(t_1,\dots,t_n) =\log \mathrm{E}(\mathrm{e}^{\sum_{j=1}^n i t_j X_j}) =\sum_{k_1,\ldots,k_n} \kappa_{k_1,\ldots,k_n} i^{k_1+\cdots+k_n} \frac{t_1^{k_1} \cdots t_n^{k_n}}{k_1! \cdots k_n!}\,,

in which case

\kappa_{k_1,\dots,k_n} = (-i)^{k_1+\cdots+k_n} \left.\left(\frac{\mathrm{d}}{\mathrm{d} t_1}\right)^{k_1} \cdots \left(\frac{\mathrm{d}}{\mathrm{d} t_n}\right)^{k_n} H(t_1,\dots,t_n) \right|_{t_1 = \dots = t_n = 0}\,,

and

\kappa(X_1,\ldots,X_n) = \left. (-i)^{n} \frac{\mathrm{d}^n}{\mathrm{d} t_1 \cdots \mathrm{d} t_n} H(t_1,\dots,t_n) \right|_{t_1 = \dots = t_n = 0}\,.
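As a minimal numerical sketch of this definition (the joint pmf below is an illustrative assumption, not from the source), the mixed partial derivative of G at the origin can be approximated by finite differences and compared with the covariance, which is the joint cumulant κ(X, Y) for n = 2:

```python
import math

# Toy joint distribution of (X, Y) on {0,1}^2 -- illustrative values only.
pmf = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

def G(t1, t2):
    """Cumulant-generating function G(t1, t2) = log E[exp(t1*X + t2*Y)]."""
    return math.log(sum(p * math.exp(t1 * x + t2 * y)
                        for (x, y), p in pmf.items()))

# kappa_{1,1} = d^2 G / dt1 dt2 at the origin, via a central finite difference.
h = 1e-4
kappa_11 = (G(h, h) - G(h, -h) - G(-h, h) + G(-h, -h)) / (4 * h * h)

# For n = 2 the joint cumulant is the covariance E[XY] - E[X]E[Y].
EX  = sum(p * x for (x, y), p in pmf.items())
EY  = sum(p * y for (x, y), p in pmf.items())
EXY = sum(p * x * y for (x, y), p in pmf.items())
cov = EXY - EX * EY

print(kappa_11, cov)  # both are approximately 0.15 for this pmf
```

The finite-difference value agrees with the covariance up to the O(h²) truncation error of the difference scheme.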
=== Repeated random variables and relation between the coefficients κ_{k_1, ..., k_n} ===

Observe that \kappa_{k_1,\dots,k_n} (X_1,\ldots,X_n) can also be written as

\kappa_{k_1,\dots,k_n} = \left. \frac{\mathrm{d}^{k_1}}{\mathrm{d} t_{1,1} \cdots \mathrm{d} t_{1,k_1}} \cdots \frac{\mathrm{d}^{k_n}}{\mathrm{d} t_{n,1} \cdots \mathrm{d} t_{n,k_n}} G\left(\sum_{j=1}^{k_1}t_{1,j},\dots,\sum_{j=1}^{k_n}t_{n,j}\right) \right|_{t_{i,j}=0},

from which we conclude that

\kappa_{k_1,\dots,k_n} (X_1,\ldots,X_n) = \kappa_{1,\ldots,1} ( \underbrace{X_1,\dots,X_1}_{k_1}, \ldots , \underbrace{X_n,\dots,X_n}_{k_n} ) .

For example,

\kappa_{2,0,1}(X,Y,Z) = \kappa(X,X,Z)

and

\kappa_{0,0,n,0}(X,Y,Z,T) = \kappa_{n}(Z) = \kappa(\underbrace{Z,\dots,Z}_{n}) .

In particular, the last equality shows that the cumulants of a single random variable are the joint cumulants of multiple copies of that random variable.
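The identity κ_{2,0,1}(X, Y, Z) = κ(X, X, Z) can be checked numerically. In this sketch the joint pmf is an illustrative assumption; the left side is a finite-difference mixed derivative of G (second order in t₁, first order in t₃), and the right side uses the standard third-order joint-cumulant moment formula κ(A,B,C) = E[ABC] − E[AB]E[C] − E[AC]E[B] − E[BC]E[A] + 2E[A]E[B]E[C] with A = B = X, C = Z:

```python
import math

# Toy joint pmf for (X, Y, Z) on {0,1}^3 -- illustrative values only.
pmf = {(0, 0, 0): 0.2, (0, 1, 0): 0.1, (0, 0, 1): 0.05,
       (1, 0, 1): 0.25, (1, 1, 0): 0.25, (1, 1, 1): 0.15}

def G(t1, t2, t3):
    """Cumulant-generating function log E[exp(t1*X + t2*Y + t3*Z)]."""
    return math.log(sum(p * math.exp(t1 * x + t2 * y + t3 * z)
                        for (x, y, z), p in pmf.items()))

def E(f):
    """Exact expectation of f(x, y, z) under the pmf."""
    return sum(p * f(*xyz) for xyz, p in pmf.items())

# kappa_{2,0,1}: d^3 G / dt1^2 dt3 at the origin, via finite differences.
h = 1e-2
d2_t1 = lambda k: (G(h, 0, k) - 2 * G(0, 0, k) + G(-h, 0, k)) / h**2
kappa_201 = (d2_t1(h) - d2_t1(-h)) / (2 * h)

# kappa(X, X, Z) from the third-order moment formula with A = B = X, C = Z.
kappa_xxz = (E(lambda x, y, z: x * x * z)
             - E(lambda x, y, z: x * x) * E(lambda x, y, z: z)
             - 2 * E(lambda x, y, z: x * z) * E(lambda x, y, z: x)
             + 2 * E(lambda x, y, z: x) ** 2 * E(lambda x, y, z: z))

print(kappa_201, kappa_xxz)  # agree to finite-difference accuracy
```

The two values coincide up to the truncation error of the difference scheme, illustrating that setting k₁ = 2 in the coefficient is the same as repeating X twice in the joint cumulant.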
=== Relation with mixed moments ===

The joint cumulant of random variables can be expressed as an alternating sum of products of their mixed moments:

\kappa(X_1,\dots,X_n)=\sum_\pi (-1)^{|\pi|-1}\,(|\pi|-1)!\, \prod_{B\in\pi} \mathrm{E}\left(\prod_{i\in B} X_i\right),

where
• the sum is over all partitions π of the set of indices {1, ..., n}, and |π| denotes the number of "blocks" of π;
• B runs through the blocks of the partition π, and \mathrm{E}(\prod_{i\in B} X_i) is the mixed moment of the random variables whose indices are in that block.

For example, for n = 2 the formula gives \kappa(X_1,X_2)=\mathrm{E}(X_1 X_2)-\mathrm{E}(X_1)\,\mathrm{E}(X_2), the covariance.
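The partition formula above can be implemented directly for a small discrete distribution; in this sketch the helper names `set_partitions` and `joint_cumulant` and the toy pmf are illustrative assumptions:

```python
import math

def set_partitions(items):
    """Yield every partition of the list `items` as a list of blocks."""
    if not items:
        yield []
        return
    first, rest = items[0], items[1:]
    for smaller in set_partitions(rest):
        # put `first` into each existing block in turn...
        for i in range(len(smaller)):
            yield smaller[:i] + [[first] + smaller[i]] + smaller[i + 1:]
        # ...or into a new block of its own
        yield [[first]] + smaller

def joint_cumulant(pmf, *idx):
    """kappa of the coordinates listed in `idx` (repeats allowed),
    via the alternating sum over set partitions."""
    def mixed_moment(block):
        return sum(p * math.prod(x[i] for i in block) for x, p in pmf.items())
    return sum(
        (-1) ** (len(part) - 1) * math.factorial(len(part) - 1)
        * math.prod(mixed_moment(B) for B in part)
        for part in set_partitions(list(idx))
    )

# Toy joint pmf for (X, Y) on {0,1}^2 -- illustrative values only.
pmf = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
print(joint_cumulant(pmf, 0, 1))  # kappa(X,Y) = E[XY] - E[X]E[Y] = 0.15
print(joint_cumulant(pmf, 0, 0))  # kappa(X,X) = Var(X) = 0.25
```

Because repeated indices are allowed, the same function also illustrates the repeated-variables identity: for this pmf, X is Bernoulli(1/2), whose third cumulant is zero, and `joint_cumulant(pmf, 0, 0, 0)` returns 0 accordingly.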
=== Conditional cumulants and the conditional expectation ===

In certain settings, a derivative identity can be established between the conditional cumulant and the conditional expectation. For example, suppose that Y = X + Z, where Z is standard normal and independent of X; then for any n it holds that

\kappa_{n+1}(X\mid Y=y) = \frac{ \mathrm{d}^n}{ \mathrm{d} y^n}\operatorname E(X\mid Y = y), \, n \in \mathbb{N}, \, y \in \mathbb{R}.

For n = 1 this reads \kappa_{2}(X\mid Y=y) = \operatorname{Var}(X\mid Y=y) = \frac{\mathrm{d}}{\mathrm{d} y}\operatorname E(X\mid Y = y). The results can also be extended to the exponential family.

== Relation to statistical physics ==