Elementary properties

Let X and Y be n \times n complex matrices and let a and b be arbitrary complex numbers. We denote the n \times n identity matrix by I and the zero matrix by 0. The matrix exponential satisfies the following properties.

We begin with the properties that are immediate consequences of the definition as a power series:
• e^0 = I
• e^{X^T} = (e^X)^T, where X^T denotes the transpose of X.
• e^{X^*} = (e^X)^*, where X^* denotes the conjugate transpose of X.
• If Y is invertible then e^{YXY^{-1}} = Ye^XY^{-1}.

The next key result is this one:
• If XY = YX then e^Xe^Y = e^{X+Y}.

The proof of this identity is the same as the standard power-series argument for the corresponding identity for the exponential of real numbers. That is to say,
as long as X and Y commute, it makes no difference to the argument whether X and Y are numbers or matrices. This identity typically does not hold if X and Y do not commute (see the
Golden–Thompson inequality below).

Consequences of the preceding identity are the following:
• e^{aX}e^{bX} = e^{(a+b)X}
• e^{X}e^{-X} = I

Using the above results, we can easily verify the following claims:
• If X is symmetric then e^X is also symmetric.
• If X is skew-symmetric then e^X is orthogonal.
• If X is Hermitian then e^X is also Hermitian.
• If X is skew-Hermitian then e^X is unitary.

Finally, a Laplace transform of matrix exponentials amounts to the resolvent, \int_0^\infty e^{-ts}e^{tX}\,dt = (sI - X)^{-1}, for all sufficiently large positive values of s.
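These properties are easy to verify numerically. A minimal sketch with NumPy and SciPy (the matrices X, Y, A, B, S below are arbitrary illustrative choices, not from the text):

```python
import numpy as np
from scipy.linalg import expm

# Commuting case: polynomials in the same matrix always commute,
# so e^X e^Y = e^{X+Y} holds here.
M = np.array([[1.0, 2.0], [0.0, 3.0]])
X, Y = M, M @ M
assert np.allclose(expm(X) @ expm(Y), expm(X + Y))

# Non-commuting case: the identity typically fails.
A = np.array([[0.0, 1.0], [0.0, 0.0]])
B = np.array([[0.0, 0.0], [1.0, 0.0]])
assert not np.allclose(expm(A) @ expm(B), expm(A + B))

# Skew-symmetric X gives an orthogonal e^X: Q^T Q = I.
S = np.array([[0.0, -1.0], [1.0, 0.0]])
Q = expm(S)
assert np.allclose(Q.T @ Q, np.eye(2))
```

Note that e^X e^{-X} = I is the special case of the commuting identity with Y = -X, which is what makes the matrix exponential always invertible.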
Linear differential equation systems One of the reasons for the importance of the matrix exponential is that it can be used to solve systems of linear
ordinary differential equations. The solution of \frac{d}{dt} y(t) = Ay(t), \quad y(0) = y_0, where A is a constant matrix and y is a column vector, is given by y(t) = e^{At} y_0. The matrix exponential can also be used to solve the inhomogeneous equation \frac{d}{dt} y(t) = Ay(t) + z(t), \quad y(0) = y_0. See the section on applications below for examples. There is no closed-form solution for differential equations of the form \frac{d}{dt} y(t) = A(t) \, y(t), \quad y(0) = y_0, where A(t) is not constant, but the Magnus series gives the solution as an infinite sum.
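For a concrete instance of the constant-coefficient case, the closed-form solution y(t) = e^{At} y_0 can be checked against direct numerical integration (the matrix A and initial condition y_0 here are illustrative choices):

```python
import numpy as np
from scipy.linalg import expm
from scipy.integrate import solve_ivp

A = np.array([[0.0, 1.0], [-2.0, -3.0]])  # constant coefficient matrix
y0 = np.array([1.0, 0.0])
t_final = 1.5

# Closed-form solution via the matrix exponential.
y_exact = expm(A * t_final) @ y0

# Reference: integrate dy/dt = A y numerically on [0, t_final].
sol = solve_ivp(lambda t, y: A @ y, (0.0, t_final), y0,
                rtol=1e-10, atol=1e-12)
y_num = sol.y[:, -1]

assert np.allclose(y_exact, y_num, atol=1e-6)
```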
The determinant of the matrix exponential By
Jacobi's formula, for any complex square matrix X the following trace identity holds: \det\left(e^{X}\right) = e^{\operatorname{tr}(X)}. In addition to providing a computational tool, this formula demonstrates that a matrix exponential is always an invertible matrix. This follows from the fact that the right-hand side of the above equation is always non-zero, and so \det\left(e^{X}\right) \neq 0, which implies that e^{X} must be invertible. In the real-valued case, the formula also exhibits the map \exp \colon M_n(\R) \to \mathrm{GL}(n, \R) to not be surjective, in contrast to the complex case mentioned earlier. This follows from the fact that, for real-valued matrices, the right-hand side of the formula is always positive, while there exist invertible matrices with a negative determinant.
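The trace identity is straightforward to verify numerically; a small sketch (the random matrix X is an arbitrary illustrative choice):

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(0)
X = rng.standard_normal((4, 4))  # arbitrary real square matrix

# Jacobi's formula: det(e^X) = e^{tr(X)}, which is never zero.
assert np.allclose(np.linalg.det(expm(X)), np.exp(np.trace(X)))

# For real X the right-hand side e^{tr(X)} is positive, so an
# invertible matrix with negative determinant, such as diag(-1, 1),
# cannot be e^X for any real X.
assert np.linalg.det(np.diag([-1.0, 1.0])) < 0
```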
Real symmetric matrices

The matrix exponential of a real symmetric matrix is positive definite. Let S be an n \times n real symmetric matrix and x \in \R^n a column vector. Using the elementary properties of the matrix exponential and of symmetric matrices, we have: x^Te^Sx=x^Te^{S/2}e^{S/2}x=x^T(e^{S/2})^Te^{S/2}x =(e^{S/2}x)^Te^{S/2}x=\lVert e^{S/2}x\rVert^2\geq 0. Since e^{S/2} is invertible, equality holds only for x=0, and we have x^Te^Sx > 0 for all non-zero x. Hence e^S is positive definite.
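This can be illustrated numerically by checking that e^S is symmetric with strictly positive eigenvalues (the random symmetric matrix S below is an arbitrary illustrative choice):

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(1)
M = rng.standard_normal((3, 3))
S = (M + M.T) / 2  # symmetrize to get a real symmetric matrix

E = expm(S)
# e^S is symmetric and has strictly positive eigenvalues,
# i.e. it is positive definite.
assert np.allclose(E, E.T)
assert np.all(np.linalg.eigvalsh(E) > 0)
```

Equivalently, if S = Q diag(λ_i) Q^T is an eigendecomposition, then e^S = Q diag(e^{λ_i}) Q^T, and e^{λ_i} > 0 for every real λ_i.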
Tensor product of exponentials

The exponential of the Kronecker sum \overline{\oplus} of two square matrices A, B (which must not be confused with the direct sum) takes a simple form. The Kronecker sum is defined by A \,{\overline {\oplus }}\, B = A \otimes \mathrm{I}_{m} + \mathrm{I}_{n} \otimes B. In this case the exponential is simply the tensor product \otimes of the exponentials of the matrices: \exp{\left(A \,{\overline {\oplus }}\, B \right)} = \exp{A} \,\otimes\, \exp{B}. Here we assume A and B to be of order n and m respectively, and \mathrm{I}_{k} is the identity matrix of order k. This follows from the fact that the two summands of the Kronecker sum commute, together with the properties discussed above. This result is tied to the direct product of Lie groups and its associated Lie algebra, for which [A \,{\overline {\oplus }}\, B, C \,{\overline {\oplus }}\, D] = [A, C] \,{\overline {\oplus }}\, [B, D] as a representation of the direct sum of Lie algebras. Another application of this formula is the physics of non-interacting systems. The reverse formula \log{A} \,{\overline {\oplus }}\, \log{B} = \log{\left(A \otimes B \right)} leads to the additivity of the von Neumann entropy for independent systems, provided these logarithmic expressions exist.

== The exponential of sums ==