There are many different kinds of products in linear algebra. Some of these have confusingly similar names (outer product, exterior product) with very different meanings, while others have very different names (outer product, tensor product, Kronecker product) and yet convey essentially the same idea. A brief overview of these is given in the following sections.
==Scalar multiplication==
By the very definition of a vector space, one can form the product of any scalar with any vector, giving a map \R \times V \rightarrow V.
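As a minimal sketch of this map, using plain Python lists to stand in for vectors (the helper name `scale` is illustrative, not standard):

```python
# Scalar multiplication in R^n: multiply each component by the scalar.
def scale(c, v):
    return [c * x for x in v]

print(scale(2.0, [1.0, -3.0, 0.5]))  # [2.0, -6.0, 1.0]
```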
==Scalar product==
A scalar product is a symmetric bilinear map
:\cdot : V \times V \rightarrow \R
that is positive definite, meaning that v \cdot v > 0 for all 0 \not= v \in V. From the scalar product, one can define a norm by letting
:\|v\| := \sqrt{v \cdot v}.
The scalar product also allows one to define an angle between two vectors:
:\cos\angle(v, w) = \frac{v \cdot w}{\|v\| \cdot \|w\|}
In n-dimensional Euclidean space, the standard scalar product (called the dot product) is given by:
:\biggl(\sum_{i=1}^n \alpha_i e_i\biggr) \cdot \biggl(\sum_{i=1}^n \beta_i e_i\biggr) = \sum_{i=1}^n \alpha_i\,\beta_i
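The dot product, the induced norm, and the angle formula can be sketched in a few lines of plain Python (the helper `dot` is an illustrative stand-in for the formula above):

```python
import math

# Standard dot product on R^n: sum of componentwise products.
def dot(v, w):
    return sum(a * b for a, b in zip(v, w))

v = [3.0, 4.0]
w = [4.0, 3.0]

norm_v = math.sqrt(dot(v, v))              # ||v|| = 5.0
norm_w = math.sqrt(dot(w, w))              # ||w|| = 5.0
cos_angle = dot(v, w) / (norm_v * norm_w)  # 24 / 25 = 0.96
angle = math.acos(cos_angle)               # angle between v and w, in radians
```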
==Cross product in 3-dimensional space==
The cross product of two vectors in 3 dimensions is a vector perpendicular to the two factors, with length equal to the area of the parallelogram spanned by the two factors. The cross product can also be expressed as the formal determinant:
:\mathbf{u \times v} = \begin{vmatrix} \mathbf{i} & \mathbf{j} & \mathbf{k} \\ u_1 & u_2 & u_3 \\ v_1 & v_2 & v_3 \\ \end{vmatrix}
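Expanding this formal determinant along its first row gives an explicit formula, sketched here in plain Python (tuples stand in for vectors):

```python
# Cross product via cofactor expansion of the formal determinant:
# i*(u2*v3 - u3*v2) - j*(u1*v3 - u3*v1) + k*(u1*v2 - u2*v1)
def cross(u, v):
    u1, u2, u3 = u
    v1, v2, v3 = v
    return (u2 * v3 - u3 * v2,
            u3 * v1 - u1 * v3,
            u1 * v2 - u2 * v1)

# e1 x e2 = e3, and the result is perpendicular to both factors.
print(cross((1, 0, 0), (0, 1, 0)))  # (0, 0, 1)
```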
==Composition of linear mappings==
A linear mapping can be defined as a function f between two vector spaces V and W with underlying field \mathbb{F}, satisfying
:f(t_1 x_1 + t_2 x_2) = t_1 f(x_1) + t_2 f(x_2), \forall x_1, x_2 \in V, \forall t_1, t_2 \in \mathbb{F}.
If one only considers finite-dimensional vector spaces, then
:f(\mathbf{v}) = f\left(v_i \mathbf{b_V}^i\right) = v_i f\left(\mathbf{b_V}^i\right) = {f^i}_j v_i \mathbf{b_W}^j,
in which \mathbf{b_V} and \mathbf{b_W} denote the bases of V and W, v_i denotes the component of \mathbf{v} on \mathbf{b_V}^i, and the Einstein summation convention is applied.

Now consider the composition of two linear mappings between finite-dimensional vector spaces. Let the linear mapping f map V to W, and let the linear mapping g map W to U. Then one gets
:g \circ f(\mathbf{v}) = g\left({f^i}_j v_i \mathbf{b_W}^j\right) = {g^j}_k {f^i}_j v_i \mathbf{b_U}^k.
Or in matrix form:
:g \circ f(\mathbf{v}) = \mathbf{G} \mathbf{F} \mathbf{v},
in which the element in row i, column j of \mathbf{F}, denoted by F_{ij}, is {f^j}_i, and G_{ij} = {g^j}_i. The composition of more than two linear mappings can similarly be represented by a chain of matrix multiplications.
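The matrix form of the composition can be checked numerically; here is a minimal sketch with small assumed matrices (the particular entries are illustrative):

```python
import numpy as np

F = np.array([[1.0, 2.0],
              [0.0, 1.0],
              [3.0, 0.0]])       # a map f : R^2 -> R^3
G = np.array([[1.0, 0.0, 1.0],
              [0.0, 2.0, 0.0]])  # a map g : R^3 -> R^2

v = np.array([1.0, 1.0])

# Applying f and then g ...
step = G @ (F @ v)
# ... agrees with applying the single matrix G F.
combined = (G @ F) @ v
assert np.allclose(step, combined)
print(step)
```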
==Product of two matrices==
Given two matrices with real-valued entries, A \in \R^{s\times r} and B \in \R^{r\times t} (the number of columns of A must match the number of rows of B), their product is the matrix C = AB \in \R^{s\times t} whose entries are given by a sum of pairwise products of the entries in the corresponding row of A and column of B:
:c_{ij} = \sum_{k=1}^r a_{ik} b_{kj} = a_{i1} b_{1j} + a_{i2} b_{2j} + \cdots + a_{ir} b_{rj}
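The entrywise formula translates directly into code; here is a minimal sketch using nested lists as matrices (the helper name `matmul` is illustrative):

```python
# Matrix product following c_ij = sum_k a_ik * b_kj,
# for an s x r matrix A and an r x t matrix B.
def matmul(A, B):
    s, r = len(A), len(A[0])
    r2, t = len(B), len(B[0])
    assert r == r2, "columns of A must match rows of B"
    return [[sum(A[i][k] * B[k][j] for k in range(r))
             for j in range(t)]
            for i in range(s)]

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
print(matmul(A, B))  # [[19, 22], [43, 50]]
```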
==Composition of linear functions as matrix product==
There is a relationship between the composition of linear functions and the product of two matrices. To see this, let r = dim(U), s = dim(V) and t = dim(W) be the (finite) dimensions of vector spaces U, V and W. Let \mathcal U = \{u_1, \ldots, u_r\} be a basis of U, \mathcal V = \{v_1, \ldots, v_s\} be a basis of V and \mathcal W = \{w_1, \ldots, w_t\} be a basis of W. In terms of these bases, let A = M^{\mathcal U}_{\mathcal V}(f) \in \R^{s\times r} be the matrix representing f : U → V and B = M^{\mathcal V}_{\mathcal W}(g) \in \R^{t\times s} be the matrix representing g : V → W. Then
:B\cdot A = M^{\mathcal U}_{\mathcal W} (g \circ f) \in \R^{t\times r}
is the matrix representing g \circ f : U \rightarrow W. In other words: the matrix product is the description in coordinates of the composition of linear functions.
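A quick numerical sketch of this correspondence, with assumed random matrices and the convention that the matrix of a map into an n-dimensional space has n rows:

```python
import numpy as np

rng = np.random.default_rng(0)
r, s, t = 4, 3, 2                  # dim U, dim V, dim W

A = rng.standard_normal((s, r))    # represents f : U -> V
B = rng.standard_normal((t, s))    # represents g : V -> W

u = rng.standard_normal(r)         # coordinates of a vector in U

# Applying the maps one after the other agrees with
# applying the single product matrix B A.
assert np.allclose(B @ (A @ u), (B @ A) @ u)
print((B @ A).shape)               # (2, 4), i.e. t x r
```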
==Tensor product of vector spaces==
Given two finite-dimensional vector spaces V and W, their tensor product can be defined as a (2,0)-tensor satisfying:
:V \otimes W(v, w) = V(v)\, W(w), \forall v \in V^*, \forall w \in W^*,
where V^* and W^* denote the dual spaces of V and W.

For infinite-dimensional vector spaces, one also has the:
• Tensor product of Hilbert spaces
• Topological tensor product

The tensor product, outer product and Kronecker product all convey the same general idea. The differences between them are that the Kronecker product is just a tensor product of matrices, with respect to a previously fixed basis, whereas the tensor product is usually given in its intrinsic definition. The outer product is simply the Kronecker product, limited to vectors (instead of matrices).
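The relationship between the outer product and the Kronecker product for vectors can be seen concretely; a minimal sketch with assumed example vectors:

```python
import numpy as np

v = np.array([1, 2])
w = np.array([10, 20, 30])

# Outer product: a matrix whose (i, j) entry is v_i * w_j.
outer = np.outer(v, w)

# For vectors, the Kronecker product contains the same numbers,
# flattened into a single 1-D array.
kron = np.kron(v, w)
assert np.array_equal(kron, outer.reshape(-1))
print(outer)
```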
==The class of all objects with a tensor product==
In general, whenever one has two mathematical objects that can be combined in a way that behaves like a linear-algebra tensor product, this can be most generally understood as the internal product of a monoidal category. That is, the monoidal category captures precisely the meaning of a tensor product: exactly why tensor products behave the way they do. More precisely, a monoidal category is the class of all things (of a given type) that have a tensor product.
==Other products in linear algebra==
Other kinds of products in linear algebra include:
• Hadamard product
• Kronecker product
• The product of tensors:
 • Wedge product or exterior product
 • Interior product
 • Outer product
 • Tensor product

==Cartesian product==