
Polarization identity

In linear algebra, the polarization identity is any one of a family of formulas that express the inner product of two vectors in terms of the norm of a normed vector space. If a norm arises from an inner product then the polarization identity can be used to express this inner product entirely in terms of the norm. The polarization identity shows that a norm can arise from at most one inner product; however, there exist norms that do not arise from any inner product.

Polarization identities
Any inner product on a vector space induces a norm by the equation
\|x\| = \sqrt{\langle x, x \rangle}.
The polarization identities reverse this relationship, recovering the inner product from the norm. Every inner product satisfies:
\|x + y\|^2 = \|x\|^2 + \|y\|^2 + 2\operatorname{Re}\langle x, y \rangle \qquad \text{ for all vectors } x, y.
Solving for \operatorname{Re}\langle x, y \rangle gives the formula
\operatorname{Re}\langle x, y \rangle = \frac{1}{2} \left(\|x+y\|^2 - \|x\|^2 - \|y\|^2\right).
If the inner product is real then \operatorname{Re}\langle x, y \rangle = \langle x, y \rangle and this formula becomes a polarization identity for real inner products.

Real vector spaces

If the vector space is over the real numbers then the polarization identities are:
\begin{alignat}{4} \langle x, y \rangle &= \frac{1}{4} \left(\|x+y\|^2 - \|x-y\|^2\right) \\[3pt] &= \frac{1}{2} \left(\|x+y\|^2 - \|x\|^2 - \|y\|^2\right) \\[3pt] &= \frac{1}{2} \left(\|x\|^2 + \|y\|^2 - \|x-y\|^2\right). \end{alignat}
These various forms are all equivalent by the parallelogram law.

Complex vector spaces

For a complex inner product that is antilinear in the second argument, the inner product is recovered from the norm by the single formula
\langle x, y \rangle = \frac{1}{4} \sum_{k=0}^3 i^k \left\|x + i^k y\right\|^2.

Summary of both cases

Thus if R(x, y) + i I(x, y) denotes the real and imaginary parts of some inner product's value at the point (x, y) \in H \times H of its domain, then its imaginary part will be:
I(x, y) ~=~ \begin{cases} ~R({\color{red}i} x, y) & \qquad \text{ if antilinear in the } {\color{red}1} \text{st argument} \\ ~R(x, {\color{blue}i} y) & \qquad \text{ if antilinear in the } {\color{blue}2} \text{nd argument} \\ \end{cases}
where the scalar i is always located in the same argument that the inner product is antilinear in.
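As an illustrative numerical sanity check (the vectors and helper names below are arbitrary choices, not part of the theory), the real identity and the four-term complex sum formula can be verified directly in floating point:

```python
import math

def norm(v):
    """Norm induced by the (possibly complex) inner product."""
    return math.sqrt(sum(abs(a) ** 2 for a in v))

def add(u, v, c=1):
    """Componentwise u + c*v for a scalar c."""
    return [a + c * b for a, b in zip(u, v)]

# Real case: <x, y> = (1/4)(||x+y||^2 - ||x-y||^2)
x, y = [1.0, 2.0, -3.0], [4.0, 0.5, 1.0]
real_inner = sum(a * b for a, b in zip(x, y))
real_polar = 0.25 * (norm(add(x, y)) ** 2 - norm(add(x, y, -1)) ** 2)

# Complex case (inner product antilinear in the second argument):
# <x, y> = (1/4) sum_{k=0}^{3} i^k ||x + i^k y||^2
xc, yc = [1 + 2j, -1j, 3.0], [2 - 1j, 4.0, 1 + 1j]
complex_inner = sum(a * b.conjugate() for a, b in zip(xc, yc))
complex_polar = 0.25 * sum(
    (1j ** k) * norm(add(xc, yc, 1j ** k)) ** 2 for k in range(4)
)

print(abs(real_inner - real_polar) < 1e-9)      # True
print(abs(complex_inner - complex_polar) < 1e-9)  # True
```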
Using R(ix, y) = -R(x, iy), the above formula for the imaginary part becomes:
I(x, y) ~=~ \begin{cases} -R(x, i y) & \qquad \text{ if antilinear in the } 1 \text{st argument} \\ -R(i x, y) & \qquad \text{ if antilinear in the } 2 \text{nd argument} \\ \end{cases}
where the scalar i is now located in the argument that the inner product is linear in.
Reconstructing the inner product
In a normed space (H, \|\cdot\|), if the parallelogram law
\|x+y\|^2 ~+~ \|x-y\|^2 ~=~ 2\|x\|^2 + 2\|y\|^2
holds, then there exists a unique inner product \langle \cdot,\ \cdot\rangle on H such that \|x\|^2 = \langle x,\ x\rangle for all x \in H.

{{math proof|proof=
We give only the real case here; the proof for complex vector spaces is analogous.

By the above formulas, if the norm is induced by an inner product (as we hope), then that inner product must satisfy
\langle x, \ y \rangle = \frac{1}{4} \left(\|x+y\|^2 - \|x-y\|^2\right) \quad \text{ for all } x, y \in H,
which may serve as a definition of the unique candidate \langle \cdot, \cdot \rangle for the role of a suitable inner product. Thus the uniqueness is guaranteed. It remains to prove that this formula indeed defines an inner product and that this inner product induces the norm \|\cdot\|. Explicitly, the following will be shown:
• (1) \langle x, x \rangle = \|x\|^2 for all x \in H
• (2) \langle x, y \rangle = \langle y, x \rangle for all x, y \in H
• (3) \langle x+z, y\rangle = \langle x, y\rangle + \langle z, y\rangle for all x, y, z \in H
• (4) \langle \alpha x, y \rangle = \alpha\langle x, y \rangle for all x, y \in H and all \alpha \in \R
(This axiomatization omits positivity, which is implied by (1) and the fact that \|\cdot\| is a norm.)

For properties (1) and (2), substitute: \langle x, x \rangle = \frac{1}{4} \left(\|x+x\|^2 - \|x-x\|^2\right) = \|x\|^2, and \|x-y\|^2 = \|y-x\|^2.

For property (3), it is convenient to work in reverse. It remains to show that
\|x+z+y\|^2 - \|x+z-y\|^2 \overset{?}{=} \|x+y\|^2 - \|x-y\|^2 + \|z+y\|^2 - \|z-y\|^2
or equivalently,
2\left(\|x+z+y\|^2 + \|x-y\|^2\right) - 2\left(\|x+z-y\|^2 + \|x+y\|^2\right) \overset{?}{=} 2\|z+y\|^2 - 2\|z-y\|^2.
Now apply the parallelogram identity:
2\|x+z+y\|^2 + 2\|x-y\|^2 = \|2x+z\|^2 + \|2y+z\|^2
2\|x+z-y\|^2 + 2\|x+y\|^2 = \|2x+z\|^2 + \|z-2y\|^2
The terms \|2x+z\|^2 cancel, so it remains to verify:
\|2y+z\|^2 - \|z-2y\|^2 \overset{?}{=} 2\|z+y\|^2 - 2\|z-y\|^2
But the latter claim can be verified by subtracting the following two further applications of the parallelogram identity:
\|2y+z\|^2 + \|z\|^2 = 2\|z+y\|^2 + 2\|y\|^2
\|z-2y\|^2 + \|z\|^2 = 2\|z-y\|^2 + 2\|y\|^2
Thus (3) holds.

It can be verified by induction that (3) implies (4), as long as \alpha \in \Z. But "(4) when \alpha \in \Z" implies "(4) when \alpha \in \Q". And any positive-definite, real-valued, \Q-bilinear form satisfies the Cauchy–Schwarz inequality, so that \langle \cdot, \cdot \rangle is continuous. Thus \langle \cdot, \cdot \rangle must be \R-linear as well.
}}

Another necessary and sufficient condition for there to exist an inner product that induces a given norm \|\cdot\| is for the norm to satisfy Ptolemy's inequality, which is:
\|x - y\| \, \|z\| ~+~ \|y - z\| \, \|x\| ~\geq~ \|x - z\| \, \|y\| \qquad \text{ for all vectors } x, y, z.
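To illustrate the theorem, a small script can test the parallelogram law and evaluate the polarization candidate \frac{1}{4}\left(\|x+y\|^2 - \|x-y\|^2\right) for two norms: the Euclidean norm passes and the candidate recovers the dot product, while the \ell^1 norm fails the law, so it is induced by no inner product. The helper names and vectors below are illustrative only:

```python
import math

def candidate_inner(norm, x, y):
    """Polarization candidate <x, y> = (1/4)(||x+y||^2 - ||x-y||^2)."""
    s = [a + b for a, b in zip(x, y)]
    d = [a - b for a, b in zip(x, y)]
    return 0.25 * (norm(s) ** 2 - norm(d) ** 2)

def parallelogram_holds(norm, x, y, tol=1e-12):
    """Check ||x+y||^2 + ||x-y||^2 == 2||x||^2 + 2||y||^2."""
    s = [a + b for a, b in zip(x, y)]
    d = [a - b for a, b in zip(x, y)]
    return abs(norm(s) ** 2 + norm(d) ** 2
               - 2 * norm(x) ** 2 - 2 * norm(y) ** 2) < tol

euclid = lambda v: math.sqrt(sum(a * a for a in v))
l1 = lambda v: sum(abs(a) for a in v)

x, y = [1.0, 2.0], [3.0, 1.0]

print(parallelogram_holds(euclid, x, y))  # True
print(candidate_inner(euclid, x, y))      # ~5.0, the dot product x . y
print(parallelogram_holds(l1, x, y))      # False: no inner product induces l1
```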
Applications and consequences
If H is a complex Hilbert space then \langle x \mid y \rangle is real if and only if its imaginary part is {{tmath|1= 0 = R(x, iy) = \frac{1}{4} \left(\Vert x+iy \Vert^2 - \Vert x-iy \Vert^2\right) }}, which happens if and only if {{tmath|1= \Vert x+iy \Vert = \Vert x-iy \Vert }}. Similarly, \langle x \mid y \rangle is (purely) imaginary if and only if {{tmath|1= \Vert x+y \Vert = \Vert x-y \Vert }}. For example, from \|x+ix\| = |1+i| \|x\| = \sqrt{2} \|x\| = |1-i| \|x\| = \|x-ix\| it can be concluded that \langle x | x \rangle is real and that \langle x | ix \rangle is purely imaginary.

Isometries

If A : H \to Z is a linear isometry between two Hilbert spaces (so \|A h\| = \|h\| for all h \in H) then \langle A h, A k \rangle_Z = \langle h, k \rangle_H \quad \text{ for all } h, k \in H; that is, linear isometries preserve inner products. If A : H \to Z is instead an antilinear isometry then \langle A h, A k \rangle_Z = \overline{\langle h, k \rangle_H} = \langle k, h \rangle_H \quad \text{ for all } h, k \in H.

Relation to the law of cosines

The second form of the polarization identity can be written as \|\textbf{u}-\textbf{v}\|^2 = \|\textbf{u}\|^2 + \|\textbf{v}\|^2 - 2(\textbf{u} \cdot \textbf{v}). This is essentially a vector form of the law of cosines for the triangle formed by the vectors {{tmath|1= \textbf{u} }}, {{tmath|1= \textbf{v} }}, and {{tmath|1= \textbf{u}-\textbf{v} }}. In particular, \textbf{u}\cdot\textbf{v} = \|\textbf{u}\|\,\|\textbf{v}\| \cos\theta, where \theta is the angle between the vectors \textbf{u} and {{tmath|1= \textbf{v} }}. The equation is numerically unstable when \textbf{u} and \textbf{v} are nearly equal, because of catastrophic cancellation, and should be avoided for numeric computation.

Derivation

The basic relation between the norm and the dot product is given by the equation \|\textbf{v}\|^2 = \textbf{v} \cdot \textbf{v}.
Then \begin{align} \|\textbf{u} + \textbf{v}\|^2 &= (\textbf{u} + \textbf{v}) \cdot (\textbf{u} + \textbf{v}) \\[3pt] &= (\textbf{u} \cdot \textbf{u}) + (\textbf{u} \cdot \textbf{v}) + (\textbf{v} \cdot \textbf{u}) + (\textbf{v} \cdot \textbf{v}) \\[3pt] &= \|\textbf{u}\|^2 + \|\textbf{v}\|^2 + 2(\textbf{u} \cdot \textbf{v}), \end{align} and similarly \|\textbf{u} - \textbf{v}\|^2 = \|\textbf{u}\|^2 + \|\textbf{v}\|^2 - 2(\textbf{u} \cdot \textbf{v}). Forms (1) and (2) of the polarization identity now follow by solving these equations for {{tmath|1= \textbf{u} \cdot \textbf{v} }}, while form (3) follows from subtracting these two equations. (Adding these two equations together gives the parallelogram law.)
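The catastrophic cancellation mentioned above can be demonstrated with a short sketch: computing \|\textbf{u}-\textbf{v}\|^2 directly is accurate even for nearly equal vectors, while computing it from the expanded form \|\textbf{u}\|^2 + \|\textbf{v}\|^2 - 2(\textbf{u} \cdot \textbf{v}) loses essentially all significant digits (the values below are illustrative):

```python
u = [1.0, 1.0]
eps = 1e-9
v = [1.0, 1.0 + eps]  # v is extremely close to u

norm2 = lambda w: sum(a * a for a in w)             # squared Euclidean norm
dot = lambda a, b: sum(p * q for p, q in zip(a, b))

# Direct evaluation: subtract first, then square. Accurate (~eps**2).
direct = norm2([a - b for a, b in zip(u, v)])

# Expanded form ||u||^2 + ||v||^2 - 2 u.v: mathematically identical, but
# it subtracts nearly equal numbers of size ~4, so almost every
# significant digit of the tiny true value ~1e-18 is lost to rounding.
expanded = norm2(u) + norm2(v) - 2 * dot(u, v)

print(direct)    # close to 1e-18
print(expanded)  # swamped by rounding error at the ~1e-16 scale
```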
Generalizations
Jordan–von Neumann theorems

The standard Jordan–von Neumann theorem, as stated previously, is that if a norm satisfies the parallelogram law, then it is induced by an inner product, defined by the polarization identity. There are variants of the theorem. Define various senses of orthogonality:
• isosceles: \|x+y\| = \|x-y\|
• Roberts’: \left\|x+ty\right\| = \left\|x-ty\right\| for all scalars t
• Pythagorean: \left\|x+y\right\|^2 = \|x\|^2 + \left\|y\right\|^2
• Birkhoff–James: \|x\| \leq \|x + ty\| for all scalars t

Let V be a vector space over the real or complex numbers, and let \|\cdot\| be a norm on V. We consider conditions under which the norm is induced by an inner product. In the following statements, whenever a scalar appears, the scalar may be restricted to be merely real, even when V is over the complex numbers.
• (von Neumann–Jordan condition) The norm satisfies the parallelogram identity.
• (weakened von Neumann–Jordan condition) \|x + y\|^2 + \|x - y\|^2 = 4 for all unit vectors x, y. That is, the norm satisfies the parallelogram identity for unit vectors.
• For any x, y \in V, the set of points equidistant from x and y is flat, that is, an affine subspace.
• Orthogonality in either the isosceles or the Roberts’ sense is either additive or homogeneous in one variable.
• For every two-dimensional subspace W \subset V and every x \in W, there exists y \in W that is Roberts’ orthogonal to x.
• Isosceles orthogonality implies Pythagorean orthogonality.
• Pythagorean orthogonality implies isosceles orthogonality.
• If x, y are Pythagorean orthogonal, then so are x, -y.
• Birkhoff–James orthogonality is symmetric.
• If \|x\| = \|y\| and t, s are real, then \|t x + s y\| = \|s x + t y\|.

For real vector spaces, there is also the condition:
• Any two-dimensional slice of the unit sphere is an ellipse, that is, parameterizable as \{x \cos\theta + y \sin\theta : \theta \in [0, 2\pi]\} for some unit vectors x, y.
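One of the listed conditions, that isosceles orthogonality implies Pythagorean orthogonality, can be seen to fail for a norm not induced by an inner product. A small sketch comparing the Euclidean norm with the sup norm (helper names and vectors are illustrative):

```python
import math

def isosceles(norm, x, y, tol=1e-12):
    """Isosceles orthogonality: ||x + y|| = ||x - y||."""
    s = [a + b for a, b in zip(x, y)]
    d = [a - b for a, b in zip(x, y)]
    return abs(norm(s) - norm(d)) < tol

def pythagorean(norm, x, y, tol=1e-12):
    """Pythagorean orthogonality: ||x + y||^2 = ||x||^2 + ||y||^2."""
    s = [a + b for a, b in zip(x, y)]
    return abs(norm(s) ** 2 - norm(x) ** 2 - norm(y) ** 2) < tol

euclidean = lambda v: math.sqrt(sum(a * a for a in v))
sup_norm = lambda v: max(abs(a) for a in v)

x, y = [1.0, 0.0], [0.0, 1.0]

# Euclidean norm (induced by an inner product): the two notions agree.
print(isosceles(euclidean, x, y), pythagorean(euclidean, x, y))  # True True

# Sup norm (not induced by any inner product): isosceles holds but
# Pythagorean fails, so the implication breaks for this norm.
print(isosceles(sup_norm, x, y), pythagorean(sup_norm, x, y))    # True False
```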
The Banach–Mazur rotation problem: given a separable Banach space V such that for any two unit vectors x, y there exists a surjective linear isometry T with T(x) = y, is V isometrically isomorphic to a Hilbert space? The general case of the problem is open. When the space is finite-dimensional, the answer is yes. In other words, given a finite-dimensional normed vector space over the real or complex numbers, if any point on the unit sphere can be mapped (rotated) to any other point by a linear isometry, then the norm is induced by an inner product.

Symmetric bilinear forms

The polarization identities are not restricted to inner products. If B is any symmetric bilinear form on a vector space, and Q is the quadratic form defined by Q(v) = B(v, v), then
\begin{align} 2 B(u, v) &= Q(u + v) - Q(u) - Q(v), \\ 2 B(u, v) &= Q(u) + Q(v) - Q(u - v), \\ 4 B(u, v) &= Q(u + v) - Q(u - v). \end{align}
The so-called symmetrization map generalizes the latter formula, replacing Q by a homogeneous polynomial of degree k defined by Q(v) = B(v, \ldots, v), where B is a symmetric k-linear map.

The formulas above even apply in the case where the field of scalars has characteristic two, though the left-hand sides are all zero in this case. Consequently, in characteristic two there is no formula for a symmetric bilinear form in terms of a quadratic form, and they are in fact distinct notions, a fact which has important consequences in L-theory; for brevity, in this context "symmetric bilinear forms" are often referred to simply as "symmetric forms". These formulas also apply to bilinear forms on modules over a commutative ring, though again one can only solve for B(u, v) if 2 is invertible in the ring; otherwise these are distinct notions. For example, over the integers, one distinguishes integral quadratic forms from integral symmetric forms, which are a narrower notion.
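A quick numerical illustration of the identity 4B(u, v) = Q(u+v) - Q(u-v) for a symmetric bilinear form given by a symmetric matrix (the matrix and vectors below are arbitrary examples):

```python
def B(u, v, M):
    """Symmetric bilinear form B(u, v) = u^T M v for a symmetric matrix M."""
    return sum(u[i] * M[i][j] * v[j]
               for i in range(len(u)) for j in range(len(v)))

def Q(v, M):
    """Associated quadratic form Q(v) = B(v, v)."""
    return B(v, v, M)

M = [[2.0, 1.0],
     [1.0, 3.0]]          # an arbitrary symmetric matrix
u, v = [1.0, -2.0], [0.5, 4.0]

lhs = 4 * B(u, v, M)
rhs = Q([a + b for a, b in zip(u, v)], M) - Q([a - b for a, b in zip(u, v)], M)
print(lhs, rhs)  # both -80.0
```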
More generally, in the presence of a ring involution or where 2 is not invertible, one distinguishes \varepsilon-quadratic forms and \varepsilon-symmetric forms; a symmetric form defines a quadratic form, and the polarization identity (without a factor of 2) from a quadratic form to a symmetric form is called the "symmetrization map", and is not in general an isomorphism. This has historically been a subtle distinction: over the integers it was not until the 1950s that the relation between "twos out" and "twos in" was understood – see discussion at integral quadratic form; and in the algebraization of surgery theory, Mishchenko originally used symmetric L-groups, rather than the correct quadratic L-groups (as in the work of Wall and Ranicki) – see discussion at L-theory.

Homogeneous polynomials of higher degree

Finally, in any of these contexts these identities may be extended to homogeneous polynomials (that is, algebraic forms) of arbitrary degree, where they are known as the polarization formula, reviewed in greater detail in the article on the polarization of an algebraic form.
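The polarization formula for higher degree can be illustrated for k = 3: for a symmetric trilinear map B with associated cubic form Q(v) = B(v, v, v), inclusion–exclusion over nonempty subsets gives 3!\,B(u, v, w) = \sum_{\varnothing \neq S} (-1)^{3-|S|} Q\left(\sum_{i \in S} v_i\right). A sketch with the illustrative choice Q(v) = (a \cdot v)^3, whose polar form is B(u, v, w) = (a \cdot u)(a \cdot v)(a \cdot w):

```python
from itertools import combinations

a = [1.0, 2.0, -1.0]                      # fixed covector defining the form
dot = lambda u, v: sum(p * q for p, q in zip(u, v))
Q = lambda v: dot(a, v) ** 3              # cubic form Q(v) = B(v, v, v)
B = lambda u, v, w: dot(a, u) * dot(a, v) * dot(a, w)  # its symmetric polar form

u, v, w = [1.0, 0.0, 2.0], [0.0, 1.0, 1.0], [3.0, -1.0, 0.0]
vecs = [u, v, w]

# Inclusion-exclusion polarization formula:
# 3! B(u, v, w) = sum over nonempty S of (-1)^(3 - |S|) Q(sum_{i in S} v_i)
total = 0.0
for r in range(1, 4):
    for S in combinations(vecs, r):
        s = [sum(col) for col in zip(*S)]
        total += (-1) ** (3 - r) * Q(s)

print(total, 6 * B(u, v, w))  # both -6.0
```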