=== Complex vectors ===

For vectors with
complex entries, using the given definition of the dot product would lead to quite different properties. For instance, the dot product of a vector with itself could be zero without the vector being the zero vector (e.g. this would happen with the vector \mathbf{a} = [1\ i], since 1^2 + i^2 = 0). This in turn would have consequences for notions like length and angle. Properties such as the positive-definite norm can be salvaged at the cost of giving up the symmetric and bilinear properties of the dot product, through the alternative definition \mathbf{a} \cdot \mathbf{b} = \sum_i a_i \, \overline{b_i} , where \overline{b_i} is the
complex conjugate of b_i. When vectors are represented by
column vectors, the dot product can be expressed as a
matrix product involving a
conjugate transpose, denoted with the superscript H: \mathbf{a} \cdot \mathbf{b} = \mathbf{b}^\mathsf{H} \mathbf{a} . In the case of vectors with real components, this definition is the same as in the real case. The dot product of any vector with itself is a non-negative real number, and it is nonzero except for the zero vector. However, the complex dot product is
sesquilinear rather than bilinear, as it is
conjugate linear and not linear in \mathbf{a}. The dot product is not symmetric, since \mathbf{a} \cdot \mathbf{b} = \overline{\mathbf{b} \cdot \mathbf{a}} . The angle between two complex vectors is then given by \cos \theta = \frac{\operatorname{Re} ( \mathbf{a} \cdot \mathbf{b} )}{ \left\| \mathbf{a} \right\| \left\| \mathbf{b} \right\| } . The complex dot product leads to the notions of
Hermitian forms and general
inner product spaces, which are widely used in mathematics and
physics. The self dot product of a complex vector \mathbf{a} \cdot \mathbf{a} = \mathbf{a}^\mathsf{H} \mathbf{a} , involving the conjugate transpose of a row vector, is also known as the
norm squared, \mathbf{a} \cdot \mathbf{a} = \|\mathbf{a}\|^2, after the
Euclidean norm; it is a vector generalization of the
absolute square of a complex scalar (see also:
Squared Euclidean distance).
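The properties above can be checked numerically. The following sketch (assuming NumPy, which the text does not mention) computes \mathbf{a} \cdot \mathbf{b} = \sum_i a_i \overline{b_i} both directly and as the matrix product \mathbf{b}^\mathsf{H} \mathbf{a}, and verifies that the unconjugated product of \mathbf{a} = [1\ i] with itself vanishes while the conjugated self product gives the squared norm:

```python
import numpy as np

a = np.array([1, 1j])
b = np.array([2, 1 + 1j])

# a . b = sum_i a_i * conj(b_i), per the alternative definition
dot = np.sum(a * np.conj(b))

# The same quantity as the matrix product b^H a
same = b.conj().T @ a

# Unconjugated self product of [1, i] vanishes: 1^2 + i^2 = 0
naive = np.sum(a * a)

# Conjugated self product is the squared Euclidean norm: |1|^2 + |i|^2 = 2
norm_sq = np.sum(a * np.conj(a)).real

# Conjugate symmetry: a . b = conj(b . a)
sym = np.conj(np.sum(b * np.conj(a)))

print(dot, same, naive, norm_sq, sym)
```

The vectors here are arbitrary illustrative values; the point is that `dot`, `same`, and `sym` all agree, while `naive` shows why the unconjugated definition fails as a norm.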
=== Inner product ===

The inner product generalizes the dot product to
abstract vector spaces over a
field of
scalars, being either the field of
real numbers \R or the field of
complex numbers \Complex . It is usually denoted using
angle brackets by \left\langle \mathbf{a} \, , \mathbf{b} \right\rangle . The inner product of two vectors over the field of complex numbers is, in general, a complex number, and is
sesquilinear instead of bilinear. An inner product space is a
normed vector space, and the inner product of a vector with itself is real and positive-definite.
=== Functions ===

The dot product is defined for vectors that have a finite number of
entries. Thus these vectors can be regarded as
discrete functions: a length-n vector u is, then, a function with
domain \{k\in\mathbb{N} : 1\leq k \leq n\}, and u_i denotes the value of this function at the index i. This notion can be generalized to
square-integrable functions: just as the inner product on vectors uses a sum over corresponding components, the inner product on functions is defined as an integral over some
measure space (X, \mathcal{A}, \mu): \left\langle u , v \right\rangle = \int_X u v \, \text{d} \mu. For example, if f and g are
continuous functions over a
compact subset K of \mathbb{R}^n with the standard
Lebesgue measure, the above definition becomes: \left\langle f , g \right\rangle = \int_K f(\mathbf{x}) g(\mathbf{x}) \, \operatorname{d}^n \mathbf{x} . Generalizing further to complex continuous functions \psi and \chi, by analogy with the complex inner product above, gives \left\langle \psi, \chi \right\rangle = \int_K \psi(z) \overline{\chi(z)} \, \text{d} z.
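A numerical sketch of the function inner product (the interval K = [0, 1], the specific functions, and the trapezoidal rule are illustrative choices, not taken from the text):

```python
import numpy as np

def inner(u, v, x):
    r"""Trapezoidal approximation of \int u(x) conj(v(x)) dx on the grid x."""
    w = u * np.conj(v)
    return np.sum((w[:-1] + w[1:]) / 2 * np.diff(x))

x = np.linspace(0.0, 1.0, 100_001)

# Real case: <f, g> = \int_0^1 sin(pi x) * x dx, exactly 1/pi
f = np.sin(np.pi * x)
g = x
print(inner(f, g, x))

# Complex case: <psi, chi> = \int_0^1 e^{i pi x} e^{-2 i pi x} dx, exactly -2i/pi
psi = np.exp(1j * np.pi * x)
chi = np.exp(2j * np.pi * x)
print(inner(psi, chi, x))
```

The sum over components in the finite-dimensional case becomes a Riemann sum here, which is exactly why the integral is the natural continuous analogue.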
=== Weight function ===

Inner products can have a
weight function (i.e., a function which weights each term of the inner product with a value). Explicitly, the inner product of functions u(x) and v(x) with respect to the weight function r(x)>0 is \left\langle u , v \right\rangle_r = \int_a^b r(x) u(x) v(x) \, d x.
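As a concrete instance (all choices here are illustrative, not from the text), taking [a, b] = [0, 1], weight r(x) = 1 + x, u(x) = 1, and v(x) = x gives \int_0^1 (1 + x) x \, dx = 5/6, which a short NumPy sketch reproduces:

```python
import numpy as np

x = np.linspace(0.0, 1.0, 100_001)
r = 1.0 + x          # weight function r(x) > 0
u = np.ones_like(x)  # u(x) = 1
v = x                # v(x) = x

# <u, v>_r = \int_0^1 r(x) u(x) v(x) dx, via the trapezoidal rule
w = r * u * v
inner_r = np.sum((w[:-1] + w[1:]) / 2 * np.diff(x))
print(inner_r)  # close to 5/6
```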
=== Dyadics and matrices ===

A double-dot product for
matrices is the
Frobenius inner product, which is analogous to the dot product on vectors. It is defined as the sum of the products of the corresponding components of two matrices \mathbf{A} and \mathbf{B} of the same size: \mathbf{A} : \mathbf{B} = \sum_i \sum_j A_{ij} \overline{B_{ij}} = \operatorname{tr} ( \mathbf{B}^\mathsf{H} \mathbf{A} ) = \operatorname{tr} ( \mathbf{A} \mathbf{B}^\mathsf{H} ) . And for real matrices, \mathbf{A} : \mathbf{B} = \sum_i \sum_j A_{ij} B_{ij} = \operatorname{tr} ( \mathbf{B}^\mathsf{T} \mathbf{A} ) = \operatorname{tr} ( \mathbf{A} \mathbf{B}^\mathsf{T} ) = \operatorname{tr} ( \mathbf{A}^\mathsf{T} \mathbf{B} ) = \operatorname{tr} ( \mathbf{B} \mathbf{A}^\mathsf{T} ) . Writing a matrix as a
dyadic, we can define a different double-dot product; however, it is not an inner product.
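A numerical check of the Frobenius identities above (NumPy assumed; the matrices are arbitrary illustrative values):

```python
import numpy as np

A = np.array([[1, 2j], [3, 4]])
B = np.array([[1j, 1], [0, 2]])

# A : B = sum_ij A_ij conj(B_ij)
frob = np.sum(A * np.conj(B))

# Equivalent trace forms: tr(B^H A) and tr(A B^H)
t1 = np.trace(B.conj().T @ A)
t2 = np.trace(A @ B.conj().T)

print(frob, t1, t2)
```

All three expressions agree, mirroring the identity \mathbf{A} : \mathbf{B} = \operatorname{tr}(\mathbf{B}^\mathsf{H}\mathbf{A}) = \operatorname{tr}(\mathbf{A}\mathbf{B}^\mathsf{H}).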
=== Tensors ===

The inner product between a
tensor of order n and a tensor of order m is a tensor of order n + m - 2; see
Tensor contraction for details.

== Computation ==