==Direct sums and block diagonal matrices==

===Direct sum===
For arbitrary matrices A (of size m × n) and B (of size p × q), the direct sum of A and B, denoted by A ⊕ B, is the (m + p) × (n + q) matrix

:A \oplus B = \begin{bmatrix} A & 0 \\ 0 & B \end{bmatrix}.

===Block diagonal matrices===
A block diagonal matrix is a square matrix whose main-diagonal blocks are square matrices and whose off-diagonal blocks are zero matrices. That is, a block diagonal matrix A has the form

:A = \begin{bmatrix} A_1 & 0 & \cdots & 0 \\ 0 & A_2 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & A_n \end{bmatrix},

where A_k is a square matrix for all k = 1, \ldots, n. In other words, matrix A is the direct sum of A_1, \ldots, A_n (the same formalism used for a diagonal matrix). Any square matrix can trivially be considered a block diagonal matrix with only one block.

For the determinant and trace, the following properties hold:

:\begin{align} \det A &= \det A_1 \times \cdots \times \det A_n, \end{align}

and

:\begin{align} \operatorname{tr} A &= \operatorname{tr} A_1 + \cdots + \operatorname{tr} A_n. \end{align}

A block diagonal matrix is invertible if and only if each of its main-diagonal blocks is invertible, and in that case its inverse is the block diagonal matrix of the blockwise inverses:

:\begin{bmatrix} A_1 & 0 & \cdots & 0 \\ 0 & A_2 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & A_n \end{bmatrix}^{-1} = \begin{bmatrix} A_1^{-1} & 0 & \cdots & 0 \\ 0 & A_2^{-1} & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & A_n^{-1} \end{bmatrix}.

The eigenvalues and eigenvectors of A are simply those of the blocks A_1, \ldots, A_n combined.

==Block tridiagonal matrices==
A block tridiagonal matrix is a block matrix whose nonzero blocks lie only on the main block diagonal and on the block diagonals directly above and below it. Block tridiagonal matrices are often encountered in numerical solutions of engineering problems (e.g., computational fluid dynamics). Optimized numerical methods for LU factorization are available, and hence there are efficient solution algorithms for equation systems whose coefficient matrix is block tridiagonal. The Thomas algorithm, used for the efficient solution of equation systems involving a tridiagonal matrix, can also be applied, using matrix operations, to block tridiagonal matrices (see also Block LU decomposition).
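To illustrate how the scalar Thomas recurrences carry over to blocks, here is a minimal numpy sketch of a block Thomas solve for uniform square blocks. The function name `block_thomas` and the test system are illustrative assumptions, not a reference implementation; each scalar division of the scalar algorithm is replaced by a linear solve with the corresponding block.

```python
import numpy as np

def block_thomas(a, b, c, d):
    """Solve a block tridiagonal system (illustrative sketch).

    a: sub-diagonal blocks   a[1..N-1]  (a[0] unused)
    b: diagonal blocks       b[0..N-1]
    c: super-diagonal blocks c[0..N-2]
    d: right-hand-side vectors d[0..N-1]
    """
    N = len(b)
    cp = [None] * (N - 1)   # modified super-diagonal blocks
    dp = [None] * N         # modified right-hand sides
    cp[0] = np.linalg.solve(b[0], c[0])
    dp[0] = np.linalg.solve(b[0], d[0])
    for i in range(1, N):
        denom = b[i] - a[i] @ cp[i - 1]
        if i < N - 1:
            cp[i] = np.linalg.solve(denom, c[i])
        dp[i] = np.linalg.solve(denom, d[i] - a[i] @ dp[i - 1])
    x = [None] * N
    x[N - 1] = dp[N - 1]
    for i in range(N - 2, -1, -1):   # back substitution
        x[i] = dp[i] - cp[i] @ x[i + 1]
    return np.concatenate(x)

# Check against a dense solve on a small example: 3 block rows of 2x2 blocks.
rng = np.random.default_rng(0)
m, N = 2, 3
b = [np.eye(m) * 4 + rng.random((m, m)) for _ in range(N)]  # diagonally dominant
a = [None] + [rng.random((m, m)) for _ in range(N - 1)]
c = [rng.random((m, m)) for _ in range(N - 1)]
d = [rng.random(m) for _ in range(N)]

Z = np.zeros((m, m))
A = np.block([[b[0], c[0], Z],
              [a[1], b[1], c[1]],
              [Z,    a[2], b[2]]])
x = block_thomas(a, b, c, d)
assert np.allclose(A @ x, np.concatenate(d))
```

As in the scalar case, the forward sweep costs one factorization per block row, so the work grows linearly in the number of block rows rather than cubically in the full matrix size.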
==Block triangular matrices==
An n \times n matrix A is upper block triangular (or block upper triangular) if there are positive integers n_1, \ldots, n_k such that n = n_1 + n_2 + \cdots + n_k and

:A = \begin{bmatrix} A_{11} & A_{12} & \cdots & A_{1k} \\ 0 & A_{22} & \cdots & A_{2k} \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & A_{kk} \end{bmatrix},

where the matrix A_{ij} is n_i \times n_j for all i, j = 1, \ldots, k. Similarly, A is lower block triangular if

:A = \begin{bmatrix} A_{11} & 0 & \cdots & 0 \\ A_{21} & A_{22} & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ A_{k1} & A_{k2} & \cdots & A_{kk} \end{bmatrix},

where A_{ij} is n_i \times n_j for all i, j = 1, \ldots, k.
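The determinant, trace, and inverse identities for block diagonal matrices stated earlier, together with the block triangular form, can be checked numerically with plain numpy. The blocks below are arbitrary examples; the final assertion uses the standard fact (not derived above) that the determinant of a block triangular matrix with square diagonal blocks also factors through its diagonal blocks.

```python
import numpy as np

# Two square diagonal blocks (arbitrary examples).
A1 = np.array([[1.0, 2.0],
               [3.0, 5.0]])
A2 = np.array([[2.0, 0.0, 1.0],
               [0.0, 3.0, 0.0],
               [1.0, 0.0, 4.0]])

# Direct sum A1 ⊕ A2: a block diagonal matrix.
A = np.block([[A1, np.zeros((2, 3))],
              [np.zeros((3, 2)), A2]])

# det(A) = det(A1) * det(A2)
assert np.isclose(np.linalg.det(A), np.linalg.det(A1) * np.linalg.det(A2))

# tr(A) = tr(A1) + tr(A2)
assert np.isclose(np.trace(A), np.trace(A1) + np.trace(A2))

# A^{-1} is the direct sum of the blockwise inverses.
Ainv_blockwise = np.block([[np.linalg.inv(A1), np.zeros((2, 3))],
                           [np.zeros((3, 2)), np.linalg.inv(A2)]])
assert np.allclose(np.linalg.inv(A), Ainv_blockwise)

# An upper block triangular matrix built from the same blocks; its
# determinant also factors through the diagonal blocks.
U = np.block([[A1, np.ones((2, 3))],
              [np.zeros((3, 2)), A2]])
assert np.isclose(np.linalg.det(U), np.linalg.det(A1) * np.linalg.det(A2))
```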
==Block Toeplitz matrices==
A block Toeplitz matrix is another special block matrix, whose blocks are repeated down the diagonals, just as a Toeplitz matrix has elements repeated down its diagonals. A matrix A is block Toeplitz if A_{(i,j)} = A_{(k,l)} whenever k - i = l - j, that is,

:A = \begin{bmatrix} A_1 & A_2 & A_3 & \cdots \\ A_4 & A_1 & A_2 & \cdots \\ A_5 & A_4 & A_1 & \cdots \\ \vdots & \vdots & \vdots & \ddots \end{bmatrix},

where A_i \in \mathbb{F}^{n_i \times m_i}.
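The defining condition can be verified directly on a small instance of the layout shown above. This sketch assumes uniform 2×2 blocks (which keeps the block slicing simple); the block values and the helper `blk` are arbitrary illustrative choices.

```python
import numpy as np

# Five 2x2 blocks with arbitrary integer entries.
rng = np.random.default_rng(1)
A1, A2, A3, A4, A5 = (rng.integers(0, 9, (2, 2)) for _ in range(5))

# The 3x3-block Toeplitz layout from the text: block (i, j) depends
# only on j - i, so blocks repeat down each block diagonal.
A = np.block([[A1, A2, A3],
              [A4, A1, A2],
              [A5, A4, A1]])

def blk(M, i, j, m=2):
    """Extract block (i, j) of a matrix with uniform m x m blocks."""
    return M[i * m:(i + 1) * m, j * m:(j + 1) * m]

# Defining property: A_(i,j) == A_(k,l) whenever k - i == l - j.
for i in range(3):
    for j in range(3):
        for k in range(3):
            for l in range(3):
                if k - i == l - j:
                    assert np.array_equal(blk(A, i, j), blk(A, k, l))
```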
==Block Hankel matrices==
A matrix A is block Hankel if A_{(i,j)} = A_{(k,l)} whenever i + j = k + l, that is,

:A = \begin{bmatrix} A_1 & A_2 & A_3 & \cdots \\ A_2 & A_3 & A_4 & \cdots \\ A_3 & A_4 & A_5 & \cdots \\ \vdots & \vdots & \vdots & \ddots \end{bmatrix},

where A_i \in \mathbb{F}^{n_i \times m_i}.

==See also==