===Definition===
Let K be a field of scalars. Let A be an m × n matrix, with column vectors \mathbf{v}_1, \mathbf{v}_2, \ldots, \mathbf{v}_n. A linear combination of these vectors is any vector of the form
:c_1 \mathbf{v}_1 + c_2 \mathbf{v}_2 + \cdots + c_n \mathbf{v}_n,
where c_1, c_2, \ldots, c_n are scalars. The set of all possible linear combinations of \mathbf{v}_1, \ldots, \mathbf{v}_n is called the column space of A. That is, the column space of A is the span of the vectors \mathbf{v}_1, \ldots, \mathbf{v}_n.

Any linear combination of the column vectors of a matrix A can be written as the product of A with a column vector:
:\begin{array} {rcl} A \begin{bmatrix} c_1 \\ \vdots \\ c_n \end{bmatrix} & = & \begin{bmatrix} a_{11} & \cdots & a_{1n} \\ \vdots & \ddots & \vdots \\ a_{m1} & \cdots & a_{mn} \end{bmatrix} \begin{bmatrix} c_1 \\ \vdots \\ c_n \end{bmatrix} = \begin{bmatrix} c_1 a_{11} + \cdots + c_{n} a_{1n} \\ \vdots \\ c_{1} a_{m1} + \cdots + c_{n} a_{mn} \end{bmatrix} = c_1 \begin{bmatrix} a_{11} \\ \vdots \\ a_{m1} \end{bmatrix} + \cdots + c_n \begin{bmatrix} a_{1n} \\ \vdots \\ a_{mn} \end{bmatrix} \\ & = & c_1 \mathbf{v}_1 + \cdots + c_n \mathbf{v}_n \end{array}

Therefore, the column space of A consists of all possible products A\mathbf{x}, for \mathbf{x} \in K^n. This is the same as the image (or range) of the corresponding matrix transformation.
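The identity above — that A\mathbf{x} is the combination of A's columns weighted by the entries of \mathbf{x} — can be checked numerically. The following plain-Python sketch (illustrative helper names, no external libraries) computes the product both ways:

```python
# Sketch: a matrix-vector product A @ c equals the linear combination
# c_1 v_1 + ... + c_n v_n of the columns v_j of A.

def mat_vec(A, c):
    """Multiply an m x n matrix (given as a list of rows) by a length-n vector."""
    return [sum(row[j] * c[j] for j in range(len(c))) for row in A]

def column_combination(A, c):
    """Form c_1 v_1 + ... + c_n v_n from the columns v_j of A."""
    m, n = len(A), len(A[0])
    return [sum(c[j] * A[i][j] for j in range(n)) for i in range(m)]

A = [[1, 0],
     [0, 1],
     [2, 0]]
c = [3, -1]

print(mat_vec(A, c))             # [3, -1, 6]
print(column_combination(A, c))  # [3, -1, 6] -- same vector, built column by column
```

Both computations agree for every choice of coefficients, which is exactly the statement that the column space is the set of all products A\mathbf{x}.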
===Example===
If A = \begin{bmatrix} 1 & 0 \\ 0 & 1 \\ 2 & 0 \end{bmatrix}, then the column vectors are \mathbf{v}_1 = (1, 0, 2)^\mathsf{T} and \mathbf{v}_2 = (0, 1, 0)^\mathsf{T}. A linear combination of \mathbf{v}_1 and \mathbf{v}_2 is any vector of the form
:c_1 \begin{bmatrix} 1 \\ 0 \\ 2 \end{bmatrix} + c_2 \begin{bmatrix} 0 \\ 1 \\ 0 \end{bmatrix} = \begin{bmatrix} c_1 \\ c_2 \\ 2c_1 \end{bmatrix}
The set of all such vectors is the column space of A. In this case, the column space is precisely the set of vectors (x, y, z) \in \R^3 satisfying the equation z = 2x (using Cartesian coordinates, this set is a plane through the origin in three-dimensional space).
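The plane condition z = 2x can be spot-checked in code. A minimal sketch (plain Python, names chosen for illustration): every combination c_1\mathbf{v}_1 + c_2\mathbf{v}_2 has the form (c_1, c_2, 2c_1), so it satisfies z = 2x; conversely any (x, y, z) with z = 2x is reached by taking c_1 = x, c_2 = y.

```python
# Sketch: combinations of the two columns of A always land on the plane z = 2x.

v1 = [1, 0, 2]
v2 = [0, 1, 0]

def combine(c1, c2):
    """Return c1*v1 + c2*v2 componentwise."""
    return [c1 * a + c2 * b for a, b in zip(v1, v2)]

# Spot-check a grid of coefficients: the third coordinate is twice the first.
on_plane = all(combine(c1, c2)[2] == 2 * combine(c1, c2)[0]
               for c1 in range(-3, 4) for c2 in range(-3, 4))
print(on_plane)  # True
```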
===Basis===
The columns of A span the column space, but they may not form a basis if the column vectors are not linearly independent. Fortunately, elementary row operations do not affect the dependence relations between the column vectors. This makes it possible to use row reduction to find a basis for the column space.

For example, consider the matrix
:A = \begin{bmatrix} 1 & 3 & 1 & 4 \\ 2 & 7 & 3 & 9 \\ 1 & 5 & 3 & 1 \\ 1 & 2 & 0 & 8 \end{bmatrix}.
The columns of this matrix span the column space, but they may not be linearly independent, in which case some subset of them will form a basis. To find this basis, we reduce A to reduced row echelon form:
:\begin{bmatrix} 1 & 3 & 1 & 4 \\ 2 & 7 & 3 & 9 \\ 1 & 5 & 3 & 1 \\ 1 & 2 & 0 & 8 \end{bmatrix} \sim \begin{bmatrix} 1 & 3 & 1 & 4 \\ 0 & 1 & 1 & 1 \\ 0 & 2 & 2 & -3 \\ 0 & -1 & -1 & 4 \end{bmatrix} \sim \begin{bmatrix} 1 & 0 & -2 & 1 \\ 0 & 1 & 1 & 1 \\ 0 & 0 & 0 & -5 \\ 0 & 0 & 0 & 5 \end{bmatrix} \sim \begin{bmatrix} 1 & 0 & -2 & 0 \\ 0 & 1 & 1 & 0 \\ 0 & 0 & 0 & 1 \\ 0 & 0 & 0 & 0 \end{bmatrix}.
At this point, it is clear that the first, second, and fourth columns are linearly independent, while the third column is a linear combination of the first two. (Specifically, \mathbf{v}_3 = -2\mathbf{v}_1 + \mathbf{v}_2.) Therefore, the first, second, and fourth columns of the original matrix are a basis for the column space:
:\begin{bmatrix} 1 \\ 2 \\ 1 \\ 1\end{bmatrix},\;\; \begin{bmatrix} 3 \\ 7 \\ 5 \\ 2\end{bmatrix},\;\; \begin{bmatrix} 4 \\ 9 \\ 1 \\ 8\end{bmatrix}.
Note that the independent columns of the reduced row echelon form are precisely the columns with pivots. This makes it possible to determine which columns are linearly independent by reducing only to echelon form.

The above algorithm can be used in general to find the dependence relations between any set of vectors, and to pick out a basis from any spanning set. Also, finding a basis for the column space of A is equivalent to finding a basis for the row space of the transpose matrix A^\mathsf{T}.

To find the basis in a practical setting (e.g., for large matrices), the singular-value decomposition is typically used.
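The row-reduction procedure above can be sketched in plain Python. This is an illustrative Gauss–Jordan elimination over the rationals (function names are ad hoc, not from any library); exact `Fraction` arithmetic avoids the floating-point pitfalls that motivate the SVD for large matrices:

```python
# Sketch: Gauss-Jordan elimination to RREF; the pivot columns of the RREF
# identify which columns of the ORIGINAL matrix form a basis of the column space.
from fractions import Fraction

def rref(A):
    """Return (reduced row echelon form, list of pivot column indices)."""
    M = [[Fraction(x) for x in row] for row in A]
    m, n = len(M), len(M[0])
    pivots, r = [], 0
    for j in range(n):
        # Find a row at or below r with a nonzero entry in column j.
        k = next((i for i in range(r, m) if M[i][j] != 0), None)
        if k is None:
            continue
        M[r], M[k] = M[k], M[r]
        p = M[r][j]
        M[r] = [x / p for x in M[r]]          # scale pivot row to make pivot 1
        for i in range(m):                     # clear column j in all other rows
            if i != r and M[i][j] != 0:
                M[i] = [a - M[i][j] * b for a, b in zip(M[i], M[r])]
        pivots.append(j)
        r += 1
    return M, pivots

A = [[1, 3, 1, 4],
     [2, 7, 3, 9],
     [1, 5, 3, 1],
     [1, 2, 0, 8]]
R, pivots = rref(A)
print(pivots)  # [0, 1, 3] -- first, second, and fourth columns carry pivots

# The corresponding columns of the original A are a basis for its column space.
basis = [[row[j] for row in A] for j in pivots]
print(basis)   # [[1, 2, 1, 1], [3, 7, 5, 2], [4, 9, 1, 8]]
```

The pivot indices reproduce the conclusion of the worked example: columns one, two, and four of the original matrix form a basis.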
===Dimension===
The dimension of the column space is called the rank of the matrix. The rank is equal to the number of pivots in the reduced row echelon form, and is the maximum number of linearly independent columns that can be chosen from the matrix. For example, the 4 × 4 matrix in the example above has rank three.

Because the column space is the image of the corresponding matrix transformation, the rank of a matrix is the same as the dimension of the image. For example, the transformation \R^4 \to \R^4 described by the matrix above maps all of \R^4 to some three-dimensional subspace.

The nullity of a matrix is the dimension of the null space, and is equal to the number of columns in the reduced row echelon form that do not have pivots. The rank and nullity of a matrix A with n columns are related by the equation
:\operatorname{rank}(A) + \operatorname{nullity}(A) = n.
This is known as the rank–nullity theorem.
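Since the rank is the number of pivots, it can be computed by forward elimination alone. A minimal sketch (plain Python over the rationals, illustrative names) that checks the rank–nullity relation on the 4 × 4 example:

```python
# Sketch: rank = number of pivots found by forward elimination;
# nullity then follows from rank(A) + nullity(A) = n.
from fractions import Fraction

def rank(A):
    """Count pivots via forward elimination with exact rational arithmetic."""
    M = [[Fraction(x) for x in row] for row in A]
    m, n = len(M), len(M[0])
    r = 0
    for j in range(n):
        # Find a row at or below r with a nonzero entry in column j.
        k = next((i for i in range(r, m) if M[i][j] != 0), None)
        if k is None:
            continue
        M[r], M[k] = M[k], M[r]
        for i in range(r + 1, m):      # zero out entries below the pivot
            f = M[i][j] / M[r][j]
            M[i] = [a - f * b for a, b in zip(M[i], M[r])]
        r += 1
    return r

A = [[1, 3, 1, 4],
     [2, 7, 3, 9],
     [1, 5, 3, 1],
     [1, 2, 0, 8]]
n = len(A[0])
print(rank(A))      # 3
print(n - rank(A))  # 1  (nullity: one pivot-free column)
```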
===Relation to the left null space===
The left null space of A is the set of all vectors \mathbf{x} such that \mathbf{x}^\mathsf{T} A = \mathbf{0}^\mathsf{T}. It is the same as the null space of the transpose of A. The product of the matrix A^\mathsf{T} and the vector \mathbf{x} can be written in terms of the dot product of vectors:
:A^\mathsf{T}\mathbf{x} = \begin{bmatrix} \mathbf{v}_1 \cdot \mathbf{x} \\ \mathbf{v}_2 \cdot \mathbf{x} \\ \vdots \\ \mathbf{v}_n \cdot \mathbf{x} \end{bmatrix},
because the row vectors of A^\mathsf{T} are transposes of the column vectors \mathbf{v}_k of A. Thus A^\mathsf{T}\mathbf{x} = \mathbf{0} if and only if \mathbf{x} is orthogonal (perpendicular) to each of the column vectors of A.

It follows that the left null space (the null space of A^\mathsf{T}) is the orthogonal complement to the column space of A.

For a matrix A, the column space, row space, null space, and left null space are sometimes referred to as the four fundamental subspaces.
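The orthogonality claim can be illustrated concretely. For the 3 × 2 matrix from the earlier example, solving A^\mathsf{T}\mathbf{x} = \mathbf{0} gives x_1 + 2x_3 = 0 and x_2 = 0, so the left null space is spanned by (-2, 0, 1)^\mathsf{T}. A plain-Python sketch (illustrative, no libraries) verifying that this vector is orthogonal to every column:

```python
# Sketch: x = (-2, 0, 1) satisfies A^T x = 0, so it is orthogonal to each
# column of A -- i.e. the left null space is the orthogonal complement of
# the column space.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

A = [[1, 0],
     [0, 1],
     [2, 0]]
x = [-2, 0, 1]

columns = [[row[j] for row in A] for j in range(len(A[0]))]
print([dot(col, x) for col in columns])  # [0, 0] -> x lies in the left null space
```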
===For matrices over a ring===
Similarly, the column space (sometimes disambiguated as right column space) can be defined for matrices over a ring K as
:\sum\limits_{k=1}^n \mathbf{v}_k c_k
for any c_1, \ldots, c_n, with replacement of the vector space with "right free module", which changes the order of scalar multiplication of the vector \mathbf{v}_k to the scalar c_k such that it is written in the unusual order vector–scalar.

==Row space==