Truncated singular value decomposition (SVD) in numerical linear algebra can also use the
Rayleigh–Ritz method to find approximations to the left and right singular vectors of a matrix M \in \mathbb{C}^{M \times N} in given subspaces by turning the singular value problem into an eigenvalue problem.
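The reduction to an eigenvalue problem is the standard one: substituting the defining relations of a singular triplet (\sigma, u, v) into each other shows that \sigma^2 is an eigenvalue of either normal matrix. A brief sketch:

```latex
% Defining relations of a singular triplet (\sigma, u, v) of M:
%   M v = \sigma u  and  M^* u = \sigma v.
% Substituting one relation into the other eliminates u (or v):
M^* M v = M^* (\sigma u) = \sigma \, (M^* u) = \sigma^2 v ,
\qquad
M M^* u = M (\sigma v) = \sigma \, (M v) = \sigma^2 u ,
% so \sigma^2 is an eigenvalue of both M^* M and M M^*.
```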
=== Using the normal matrix ===

The singular value \sigma and the corresponding left and right singular vectors are defined by M v = \sigma u and M^* u = \sigma v. Having found one set (left or right) of approximate singular vectors and singular values by naively applying the Rayleigh–Ritz method to the
Hermitian normal matrix M^* M \in \mathbb{C}^{N \times N} or M M^* \in \mathbb{C}^{M \times M}, whichever is of smaller size, one could determine the other set of left or right singular vectors simply by dividing by the singular values, i.e., u = Mv / \sigma and v = M^* u / \sigma. However, the division is unstable for small singular values and fails entirely for zero ones. An alternative approach, e.g., defining the normal matrix as A = M^* M \in \mathbb{C}^{N \times N} of size N \times N, takes advantage of the fact that for a given N \times m matrix W \in \mathbb{C}^{N \times m} with
orthonormal columns the eigenvalue problem of the Rayleigh–Ritz method for the m \times m matrix W^* A W = W^* M^* M W = (M W)^* M W can be interpreted as a singular value problem for the M \times m matrix M W. This interpretation allows simple simultaneous calculation of both left and right approximate singular vectors as follows.

• Compute the M \times m matrix M W.
• Compute its thin, or economy-sized, SVD M W = \mathbf{U} \Sigma \mathbf{V}_h, with M \times m matrix \mathbf{U}, m \times m diagonal matrix \Sigma, and m \times m matrix \mathbf{V}_h.
• Compute the matrices of the Ritz left singular vectors U = \mathbf{U} and right singular vectors V_h = \mathbf{V}_h W^*.
• Output the approximations U, \Sigma, V_h, called the Ritz singular triplets, to selected singular values and the corresponding left and right singular vectors of the original matrix M, representing an approximate truncated singular value decomposition (SVD) with right singular vectors restricted to the column-space of the matrix W.

The algorithm can be used as a post-processing step where the matrix W is an output of an eigenvalue solver, such as LOBPCG, numerically approximating selected eigenvectors of the normal matrix A = M^* M.
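The three-step procedure above can be sketched with NumPy. This is a minimal illustration under the notation of this section, not a production implementation; the matrix M and the subspace W are taken from the example that follows:

```python
import numpy as np

# Matrix from the example in this section: singular values 1, 2, 3, 4.
M = np.array([[1., 0., 0., 0.],
              [0., 2., 0., 0.],
              [0., 0., 3., 0.],
              [0., 0., 0., 4.],
              [0., 0., 0., 0.]])

# W: an N-by-m matrix with orthonormal columns; here its column-space
# is spanned by two exact right singular vectors of M.
s = 1.0 / np.sqrt(2.0)
W = np.array([[s,  s],
              [s, -s],
              [0., 0.],
              [0., 0.]])

# Step 1: form the M-by-m product M W.
MW = M @ W

# Step 2: thin (economy-sized) SVD of M W.
U, Sigma, Vh_small = np.linalg.svd(MW, full_matrices=False)

# Step 3: Ritz right singular vectors, V_h = \mathbf{V}_h W^*.
Vh = Vh_small @ W.conj().T

# (U, Sigma, Vh) are the Ritz singular triplets; by construction they
# satisfy M Vh^* = U Sigma, i.e., M v = sigma u for every triplet.
print(Sigma)  # singular values 2 and 1, in descending order
```

Because `np.linalg.svd` already orders the singular values, the triplets come out sorted; signs of the columns of `U` and rows of `Vh` may flip together, which does not affect the triplet relations.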
=== Example ===

The matrix M = \begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & 2 & 0 & 0 \\ 0 & 0 & 3 & 0 \\ 0 & 0 & 0 & 4 \\ 0 & 0 & 0 & 0 \end{bmatrix} has the normal matrix A = M^* M = \begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & 4 & 0 & 0 \\ 0 & 0 & 9 & 0 \\ 0 & 0 & 0 & 16 \end{bmatrix}, singular values 1, 2, 3, 4, and the corresponding
thin SVD M = \begin{bmatrix} 0 & 0 & 0 & 1 \\ 0 & 0 & 1 & 0 \\ 0 & 1 & 0 & 0 \\ 1 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 \end{bmatrix} \begin{bmatrix} 4 & 0 & 0 & 0 \\ 0 & 3 & 0 & 0 \\ 0 & 0 & 2 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} 0 & 0 & 0 & 1 \\ 0 & 0 & 1 & 0 \\ 0 & 1 & 0 & 0 \\ 1 & 0 & 0 & 0 \end{bmatrix}, where the columns of the first multiplier are from the complete set of the left singular vectors of the matrix M, the diagonal entries of the middle term are the singular values, and the columns of the last multiplier transposed (although the transposition does not change it) \begin{bmatrix} 0 & 0 & 0 & 1 \\ 0 & 0 & 1 & 0 \\ 0 & 1 & 0 & 0 \\ 1 & 0 & 0 & 0 \end{bmatrix}^* \quad = \quad \begin{bmatrix} 0 & 0 & 0 & 1 \\ 0 & 0 & 1 & 0 \\ 0 & 1 & 0 & 0 \\ 1 & 0 & 0 & 0 \end{bmatrix} are the corresponding right singular vectors. Let us take W = \begin{bmatrix} 1 / \sqrt{2} & 1 / \sqrt{2} \\ 1 / \sqrt{2} & -1 / \sqrt{2} \\ 0 & 0 \\ 0 & 0 \end{bmatrix}, whose column-space is spanned by the two exact right singular vectors \begin{bmatrix} 0 & 1 \\ 1 & 0 \\ 0 & 0 \\ 0 & 0 \end{bmatrix} corresponding to the singular values 1 and 2. Following step 1 of the algorithm, we compute MW = \begin{bmatrix} 1 / \sqrt{2} & 1 / \sqrt{2} \\ \sqrt{2} & -\sqrt{2} \\ 0 & 0 \\ 0 & 0 \\ 0 & 0 \end{bmatrix}, and on step 2 its
thin SVD M W = \mathbf{U} \Sigma \mathbf{V}_h with \mathbf{U} = \begin{bmatrix} 0 & 1 \\ 1 & 0 \\ 0 & 0 \\ 0 & 0 \\ 0 & 0 \end{bmatrix}, \quad \Sigma = \begin{bmatrix} 2 & 0 \\ 0 & 1 \end{bmatrix}, \quad \mathbf{V}_h = \begin{bmatrix} 1 / \sqrt{2} & -1 / \sqrt{2} \\ 1 / \sqrt{2} & 1 / \sqrt{2} \end{bmatrix}. Thus we already obtain the singular values 2 and 1 from \Sigma and, from \mathbf{U}, the corresponding two left singular vectors u as [0, 1, 0, 0, 0]^* and [1, 0, 0, 0, 0]^*. Finally, step 3 computes the matrix V_h = \mathbf{V}_h W^* = \begin{bmatrix} 1 / \sqrt{2} & -1 / \sqrt{2} \\ 1 / \sqrt{2} & 1 / \sqrt{2} \end{bmatrix} \, \begin{bmatrix} 1 / \sqrt{2} & 1 / \sqrt{2} & 0 & 0 \\ 1 / \sqrt{2} & -1 / \sqrt{2} & 0 & 0 \end{bmatrix} = \begin{bmatrix} 0 & 1 & 0 & 0 \\ 1 & 0 & 0 & 0 \end{bmatrix}, recovering from its rows the two right singular vectors v as [0, 1, 0, 0]^* and [1, 0, 0, 0]^*. These right singular vectors span the column-space of the matrix W, explaining why the approximations are exact for the given W. We validate the first triplet: M v = \sigma u, \begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & 2 & 0 & 0 \\ 0 & 0 & 3 & 0 \\ 0 & 0 & 0 & 4 \\ 0 & 0 & 0 & 0 \end{bmatrix} \, \begin{bmatrix} 0 \\ 1 \\ 0 \\ 0 \end{bmatrix} = \, 2 \, \begin{bmatrix} 0 \\ 1 \\ 0 \\ 0 \\ 0 \end{bmatrix}, and M^* u = \sigma v, \begin{bmatrix} 1 & 0 & 0 & 0 & 0 \\ 0 & 2 & 0 & 0 & 0 \\ 0 & 0 & 3 & 0 & 0 \\ 0 & 0 & 0 & 4 & 0 \end{bmatrix} \, \begin{bmatrix} 0 \\ 1 \\ 0 \\ 0 \\ 0 \end{bmatrix} = \, 2 \, \begin{bmatrix} 0 \\ 1 \\ 0 \\ 0 \end{bmatrix}. Thus, for the given matrix W, whose column-space is spanned by two exact right singular vectors, we determine these right singular vectors, as well as the corresponding left singular vectors and the singular values, all exactly. For an arbitrary matrix W, we obtain approximate singular triplets that are optimal, given W, in the sense of optimality of the Rayleigh–Ritz method.

== Applications and examples ==