==Via matrix computation==

Theorem. (Jacobi's formula) For any differentiable map A from the real numbers to n × n matrices,

:d \det (A) = \operatorname{tr} (\operatorname{adj}(A) \, dA).
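For example, in the 2 × 2 case, where \det A = A_{11}A_{22} - A_{12}A_{21} and \operatorname{adj}(A) = \begin{pmatrix} A_{22} & -A_{12} \\ -A_{21} & A_{11} \end{pmatrix}, the formula reads

:d\det(A) = A_{22}\,dA_{11} - A_{21}\,dA_{12} - A_{12}\,dA_{21} + A_{11}\,dA_{22} = \operatorname{tr}(\operatorname{adj}(A)\,dA),

which can be checked by expanding the differential of A_{11}A_{22} - A_{12}A_{21} directly.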
Proof. Laplace's formula for the determinant of a matrix A can be stated as

:\det(A) = \sum_j A_{ij} \operatorname{adj}^{\rm T} (A)_{ij}.

Notice that the summation is performed over some arbitrary row i of the matrix.

The determinant of A can be considered to be a function of the elements of A:

:\det(A) = F\,(A_{11}, A_{12}, \ldots , A_{21}, A_{22}, \ldots , A_{nn}),

so that, by the chain rule, its differential is

:d \det(A) = \sum_i \sum_j {\partial F \over \partial A_{ij}} \,dA_{ij}.

This summation is performed over all n × n elements of the matrix.

To find ∂F/∂A_{ij}, consider that on the right-hand side of Laplace's formula the index i can be chosen at will. (Any choice eventually yields the same result, but some choices make the calculation much harder.) In particular, i can be chosen to match the first index of ∂/∂A_{ij}:

:{\partial \det(A) \over \partial A_{ij}} = {\partial \sum_k A_{ik} \operatorname{adj}^{\rm T}(A)_{ik} \over \partial A_{ij}} = \sum_k {\partial (A_{ik} \operatorname{adj}^{\rm T}(A)_{ik}) \over \partial A_{ij}}.

Thus, by the product rule,

:{\partial \det(A) \over \partial A_{ij}} = \sum_k {\partial A_{ik} \over \partial A_{ij}} \operatorname{adj}^{\rm T}(A)_{ik} + \sum_k A_{ik} {\partial \operatorname{adj}^{\rm T}(A)_{ik} \over \partial A_{ij}}.

Now, if an element A_{ij} of the matrix and the cofactor \operatorname{adj}^{\rm T}(A)_{ik} of element A_{ik} lie in the same row (or column), then the cofactor is not a function of A_{ij}, because the cofactor of A_{ik} is expressed in terms of elements not in its own row (nor column). Thus,

:{\partial \operatorname{adj}^{\rm T}(A)_{ik} \over \partial A_{ij}} = 0,

so

:{\partial \det(A) \over \partial A_{ij}} = \sum_k \operatorname{adj}^{\rm T}(A)_{ik} {\partial A_{ik} \over \partial A_{ij}}.

All the elements of A are independent of each other, i.e.

:{\partial A_{ik} \over \partial A_{ij}} = \delta_{jk},

where δ is the Kronecker delta, so

:{\partial \det(A) \over \partial A_{ij}} = \sum_k \operatorname{adj}^{\rm T}(A)_{ik} \delta_{jk} = \operatorname{adj}^{\rm T}(A)_{ij}.

Therefore,

:d(\det(A)) = \sum_i \sum_j \operatorname{adj}^{\rm T}(A)_{ij} \,d A_{ij} = \sum_j \sum_i \operatorname{adj}(A)_{ji} \,d A_{ij} = \sum_j (\operatorname{adj}(A) \,d A)_{jj} = \operatorname{tr}(\operatorname{adj}(A) \,dA).\ \square
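The identity can also be sanity-checked numerically. The following is a minimal sketch using NumPy: the random matrix A and direction dA are arbitrary choices, and the adjugate is computed as \det(A)\,A^{-1}, which is valid since a random A is almost surely invertible.

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n))    # generic (almost surely invertible) matrix
dA = rng.standard_normal((n, n))   # an arbitrary perturbation direction
eps = 1e-6

# adjugate via adj(A) = det(A) * A^{-1}, valid for invertible A
adjA = np.linalg.det(A) * np.linalg.inv(A)

# finite-difference approximation of the directional derivative of det at A
lhs = (np.linalg.det(A + eps * dA) - np.linalg.det(A)) / eps
rhs = np.trace(adjA @ dA)

print(lhs, rhs)  # the two values agree up to O(eps)
</syntaxhighlight>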
==Via chain rule==

Lemma 1. \det'(I)=\mathrm{tr}, where \det' is the differential of \det.

This equation means that the differential of \det, evaluated at the identity matrix, is equal to the trace. The differential \det'(I) is a linear operator that maps an n × n matrix to a real number.
Proof. Using the definition of a directional derivative together with one of its basic properties for differentiable functions, we have

:\det'(I)(T)=\nabla_T \det(I)=\lim_{\varepsilon\to0}\frac{\det(I+\varepsilon T)-\det I}{\varepsilon}.

\det(I+\varepsilon T) is a polynomial in \varepsilon of degree n. It is closely related to the characteristic polynomial of T. Its constant term (its value at \varepsilon = 0) is 1, while the coefficient of the linear term in \varepsilon is \mathrm{tr}\ T. Therefore the limit equals \mathrm{tr}\ T, which is the claim.
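Lemma 1 lends itself to a direct numerical illustration: as \varepsilon shrinks, the difference quotient \bigl(\det(I+\varepsilon T)-1\bigr)/\varepsilon approaches \mathrm{tr}\ T. A minimal sketch using NumPy (the random T is an arbitrary choice):

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(1)
n = 5
T = rng.standard_normal((n, n))

# det(I + eps*T) = 1 + eps*tr(T) + O(eps^2), so the quotient tends to tr(T)
for eps in (1e-2, 1e-4, 1e-6):
    quotient = (np.linalg.det(np.eye(n) + eps * T) - 1.0) / eps
    print(eps, quotient, np.trace(T))
</syntaxhighlight>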
Lemma 2. For an invertible matrix A, we have:

:\det'(A)(T)=\det A \; \mathrm{tr}(A^{-1}T).
Proof. Consider the following function of X:

:\det X = \det (A A^{-1} X) = \det (A) \ \det(A^{-1} X).

We calculate the differential of \det X and evaluate it at X = A using Lemma 1, the equation above, and the chain rule:

:\det'(A)(T) = \det A \ \det'(I) (A^{-1} T) = \det A \ \mathrm{tr}(A^{-1} T).
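Lemma 2 can be checked in the same way as Lemma 1, now at an arbitrary invertible base point A rather than at the identity. A minimal sketch (NumPy again; A and T are arbitrary random choices):

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(2)
n = 4
A = rng.standard_normal((n, n))  # almost surely invertible
T = rng.standard_normal((n, n))  # direction of differentiation
eps = 1e-6

# finite-difference approximation of det'(A)(T)
lhs = (np.linalg.det(A + eps * T) - np.linalg.det(A)) / eps
# Lemma 2: det'(A)(T) = det(A) * tr(A^{-1} T)
rhs = np.linalg.det(A) * np.trace(np.linalg.inv(A) @ T)

print(lhs, rhs)  # agree up to O(eps)
</syntaxhighlight>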
Theorem. (Jacobi's formula)

:\frac{d}{dt} \det A = \mathrm{tr}\left(\mathrm{adj}\ A\,\frac{dA}{dt}\right).
Proof. If A is invertible, then by Lemma 2, with T = dA/dt,

:\frac{d}{dt} \det A = \det A \; \mathrm{tr} \left(A^{-1} \frac{dA}{dt}\right) = \mathrm{tr} \left( \mathrm{adj}\ A \; \frac{dA}{dt} \right),

using the equation relating the adjugate of A to A^{-1}. The formula then holds for all matrices, since both sides are continuous (indeed polynomial) functions of the entries of A, and the set of invertible matrices is dense in the space of all matrices.
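The time-dependent form of the theorem can likewise be verified on a concrete curve of matrices. In the sketch below (NumPy; the particular family A(t) is a hypothetical example chosen only for illustration), the trace expression built from the analytic entrywise derivative dA/dt is compared against a finite difference of \det A(t):

<syntaxhighlight lang="python">
import numpy as np

def A(t):
    # a hypothetical smooth one-parameter family of 2x2 matrices
    return np.array([[np.cos(t), t],
                     [t**2,      np.exp(t)]])

def dA(t):
    # its entrywise derivative dA/dt
    return np.array([[-np.sin(t), 1.0],
                     [2.0 * t,    np.exp(t)]])

t, eps = 0.7, 1e-6
adjA = np.linalg.det(A(t)) * np.linalg.inv(A(t))  # adjugate of A(t), invertible here

lhs = (np.linalg.det(A(t + eps)) - np.linalg.det(A(t))) / eps  # d/dt det A
rhs = np.trace(adjA @ dA(t))                                   # tr(adj A * dA/dt)
print(lhs, rhs)  # agree up to O(eps)
</syntaxhighlight>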
==Via diagonalization==

Both sides of the Jacobi formula are polynomials in the matrix coefficients of A and A'. It is therefore sufficient to verify the polynomial identity on the dense subset where the eigenvalues of A are distinct and nonzero.

If A factors differentiably as A=BC, then

:\mathrm{tr}(A^{-1}A')= \mathrm{tr}((BC)^{-1}(BC)')= \mathrm{tr}(B^{-1}B')+ \mathrm{tr}(C^{-1}C').

In particular, if L is invertible, then I=L^{-1}L and

:0=\mathrm{tr}(I^{-1}I')= \mathrm{tr}(L(L^{-1})')+ \mathrm{tr}(L^{-1}L').

Since A has distinct eigenvalues, there exists a differentiable complex invertible matrix L such that A = L^{-1}DL and D is diagonal. Then

:\mathrm{tr}(A^{-1}A')= \mathrm{tr}(L(L^{-1})')+ \mathrm{tr}(D^{-1}D')+ \mathrm{tr}(L^{-1}L')= \mathrm{tr}(D^{-1}D').

Let \lambda_i, i=1,\ldots,n, be the eigenvalues of A (the diagonal entries of D). Then

:\left(\ln\det A\right)' = \left(\sum_{i=1}^{n}\ln \lambda_i \right)' = \sum_{i=1}^n \lambda_i'/\lambda_i = \mathrm{tr}(D^{-1}D')= \mathrm{tr}(A^{-1}A'),

which is the Jacobi formula for matrices with distinct nonzero eigenvalues.

==Corollary==