=== Equivalence of definitions ===
The set of all linear combinations of a subset S of a vector space V over a field K is the smallest linear subspace of V containing S.
Proof. We first prove that \operatorname{span} S is a subspace of V. Since S is a subset of V, we only need to prove the existence of a zero vector \mathbf 0 in \operatorname{span} S, that \operatorname{span} S is closed under addition, and that \operatorname{span} S is closed under scalar multiplication. Letting S = \{ \mathbf v_1, \mathbf v_2, \ldots, \mathbf v_n \}, it is trivial that the zero vector of V exists in \operatorname{span} S, since \mathbf 0 = 0 \mathbf v_1 + 0 \mathbf v_2 + \cdots + 0 \mathbf v_n. Adding together two linear combinations of S also produces a linear combination of S: (\lambda_1 \mathbf v_1 + \cdots + \lambda_n \mathbf v_n) + (\mu_1 \mathbf v_1 + \cdots + \mu_n \mathbf v_n) = (\lambda_1 + \mu_1) \mathbf v_1 + \cdots + (\lambda_n + \mu_n) \mathbf v_n, where all \lambda_i, \mu_i \in K, and multiplying a linear combination of S by a scalar c \in K produces another linear combination of S: c(\lambda_1 \mathbf v_1 + \cdots + \lambda_n \mathbf v_n) = c\lambda_1 \mathbf v_1 + \cdots + c\lambda_n \mathbf v_n. Thus \operatorname{span} S is a subspace of V.

It follows that S \subseteq \operatorname{span} S, since every \mathbf v_i is (trivially) a linear combination of S. Suppose that W is a linear subspace of V containing S. Since W is closed under addition and scalar multiplication, every linear combination \lambda_1 \mathbf v_1 + \cdots + \lambda_n \mathbf v_n must be contained in W. Thus \operatorname{span} S is contained in every subspace of V containing S, and the intersection of all such subspaces, that is, the smallest such subspace, equals the set of all linear combinations of S.
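Numerically, membership in the span of a finite set can be tested with least squares, which gives a quick way to check the closure properties used in the proof. The sketch below is illustrative only; the function name in_span and the tolerance are assumptions, and NumPy's least-squares solver stands in for exact symbolic arithmetic:

```python
import numpy as np

def in_span(vectors, target, tol=1e-9):
    """Check whether `target` is (numerically) a linear combination of
    `vectors` by solving the least-squares problem A x = target."""
    A = np.column_stack(vectors)
    coeffs, *_ = np.linalg.lstsq(A, target, rcond=None)
    return np.allclose(A @ coeffs, target, atol=tol)

v1 = np.array([1.0, 0.0, 2.0])
v2 = np.array([0.0, 1.0, 1.0])
S = [v1, v2]

a = 3 * v1 - 2 * v2      # a linear combination of S
b = 0.5 * v1 + 4 * v2    # another linear combination of S

print(in_span(S, a + b))  # True: sums of combinations stay in the span
print(in_span(S, 7 * a))  # True: scalar multiples stay in the span
print(in_span(S, np.array([0.0, 0.0, 1.0])))  # False: not in this plane
```

The last check fails because every vector in span S has the form (x, y, 2x + y), and (0, 0, 1) does not.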
=== Size of spanning set is at least size of linearly independent set ===
Every spanning set S of a vector space V must contain at least as many elements as any linearly independent set of vectors from V.
Proof. Let S = \{ \mathbf v_1, \ldots, \mathbf v_m \} be a spanning set and W = \{ \mathbf w_1, \ldots, \mathbf w_n \} be a linearly independent set of vectors from V. We want to show that m \geq n.

Since S spans V, then S \cup \{ \mathbf w_1 \} must also span V, and \mathbf w_1 must be a linear combination of S. Thus S \cup \{ \mathbf w_1 \} is linearly dependent, and we can remove one vector from S that is a linear combination of the other elements. This vector cannot be any of the \mathbf w_i, since W is linearly independent. The resulting set is \{ \mathbf w_1, \mathbf v_1, \ldots, \mathbf v_{i-1}, \mathbf v_{i+1}, \ldots, \mathbf v_m \}, which is a spanning set of V. We repeat this step n times, where the resulting set after the pth step is the union of \{ \mathbf w_1, \ldots, \mathbf w_p \} and m - p vectors of S.

It is ensured until the nth step that there will always be some \mathbf v_i to remove from S for each \mathbf w_p adjoined, and thus there are at least as many \mathbf v_i's as there are \mathbf w_i's, i.e. m \geq n. To verify this, we assume by way of contradiction that m < n. Then, at the mth step, we have the set \{ \mathbf w_1, \ldots, \mathbf w_m \} and we can adjoin another vector \mathbf w_{m+1}. But, since \{ \mathbf w_1, \ldots, \mathbf w_m \} is a spanning set of V, \mathbf w_{m+1} is a linear combination of \{ \mathbf w_1, \ldots, \mathbf w_m \}. This is a contradiction, since W is linearly independent.
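The exchange step in this proof can be sketched computationally: adjoin each \mathbf w_p to the current set, then drop one original \mathbf v_i whose removal keeps the set spanning. The helper names rank and exchange below are illustrative assumptions, and NumPy's matrix rank stands in for the linear-(in)dependence checks of the proof:

```python
import numpy as np

def rank(vectors):
    """Dimension of the span of a list of vectors (0 for the empty list)."""
    return np.linalg.matrix_rank(np.column_stack(vectors)) if vectors else 0

def exchange(S, W):
    """Steinitz-exchange sketch: for each w in W, adjoin w to the current
    set and remove one vector from S whose removal keeps the set spanning.
    Assumes W is linearly independent and S spans the space."""
    current = list(S)
    target_rank = rank(S)        # dimension of the spanned space
    for w in W:
        current = [w] + current  # adjoin w; the set is now dependent
        for i, v in enumerate(current):
            # never remove one of the adjoined w's
            if any(np.array_equal(v, u) for u in W):
                continue
            trial = current[:i] + current[i + 1:]
            if rank(trial) == target_rank:  # still spans: drop v
                current = trial
                break
    return current

e1, e2, e3 = np.eye(3)
S = [e1, e2, e3]                      # spans R^3, m = 3
W = [np.array([1.0, 1.0, 0.0]),
     np.array([0.0, 1.0, 1.0])]       # linearly independent, n = 2
result = exchange(S, W)
print(len(result))                    # 3: each step swaps one v for one w
```

Each pass trades one \mathbf v_i for one \mathbf w_p, so the set keeps m elements and keeps spanning; the proof guarantees a removable \mathbf v_i exists at every step while p \leq n.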
=== Spanning set can be reduced to a basis ===
Let V be a finite-dimensional vector space. Any set of vectors that spans V can be reduced to a basis for V by discarding vectors if necessary (i.e. if there are linearly dependent vectors in the set). If the axiom of choice holds, this is true without the assumption that V has finite dimension. This also indicates that a basis is a minimal spanning set when V is finite-dimensional.

== Generalizations ==