Hankel matrix

In linear algebra, a Hankel matrix, named after Hermann Hankel, is a rectangular matrix in which each ascending skew-diagonal from left to right is constant; equivalently, the entry in position (i, j) depends only on i + j. For example, \begin{bmatrix} a & b & c & d & e \\ b & c & d & e & f \\ c & d & e & f & g \\ d & e & f & g & h \\ e & f & g & h & i \end{bmatrix} is a Hankel matrix.
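As a quick concrete illustration, a Hankel matrix can be built directly from a sequence; the sketch below uses SciPy's scipy.linalg.hankel, and the sequence 1, …, 7 is an arbitrary example chosen here, not taken from the text.

```python
import numpy as np
from scipy.linalg import hankel

a = np.arange(1, 8)          # 1, 2, ..., 7 (arbitrary illustration data)
H = hankel(a[:4], a[3:])     # 4 x 4 Hankel matrix: first column a[:4], last row a[3:]
print(H)
# [[1 2 3 4]
#  [2 3 4 5]
#  [3 4 5 6]
#  [4 5 6 7]]

# Each entry depends only on i + j: every ascending anti-diagonal is constant.
assert all(H[i, j] == H[i - 1, j + 1] for i in range(1, 4) for j in range(3))
```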

Properties
• Any square Hankel matrix is symmetric.
• Let J_n be the n \times n exchange matrix. If H is an m \times n Hankel matrix, then H = T J_n, where T is an m \times n Toeplitz matrix.
• If T is real symmetric, then H = T J_n will have the same eigenvalues as T up to sign.
• The Hilbert matrix is an example of a Hankel matrix.
• The determinant of a Hankel matrix is called a catalecticant.
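A small numerical illustration of the Toeplitz relation above (a sketch, not a proof; the particular Toeplitz matrix is an arbitrary example and NumPy/SciPy are assumed to be available):

```python
import numpy as np
from scipy.linalg import toeplitz

n = 4
T = toeplitz([1, 2, 3, 4])             # real symmetric Toeplitz matrix (example data)
J = np.fliplr(np.eye(n, dtype=int))    # exchange matrix J_n
H = T @ J                              # reverses the columns of T

# H is constant along ascending anti-diagonals, i.e. H is a Hankel matrix.
assert all(H[i, j] == H[i - 1, j + 1] for i in range(1, n) for j in range(n - 1))

# Since T is real symmetric, H = T J_n has the same eigenvalues as T up to sign.
print(np.sort(np.abs(np.linalg.eigvals(T))))
print(np.sort(np.abs(np.linalg.eigvals(H))))
```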
Hankel operator
Given a formal Laurent series f(z) = \sum_{n=-\infty}^N a_n z^n, the corresponding Hankel operator is defined as H_f : \mathbf{C}[z] \to z^{-1} \mathbf{C}[[z^{-1}]]. This takes a polynomial g \in \mathbf{C}[z] and sends it to the product fg, but discards all powers of z with a non-negative exponent, so as to give an element of z^{-1} \mathbf{C}[[z^{-1}]], the formal power series with strictly negative exponents. The map H_f is in a natural way \mathbf{C}[z]-linear, and its matrix with respect to the elements 1, z, z^2, \dots \in \mathbf{C}[z] and z^{-1}, z^{-2}, \dots \in z^{-1}\mathbf{C}[[z^{-1}]] is the Hankel matrix \begin{bmatrix} a_{-1} & a_{-2} & \ldots \\ a_{-2} & a_{-3} & \ldots \\ a_{-3} & a_{-4} & \ldots \\ \vdots & \vdots & \ddots \end{bmatrix}. Any Hankel matrix arises in this way. A theorem due to Kronecker says that the rank of this matrix is finite precisely if f is a rational function, that is, a fraction of two polynomials f(z) = \frac{p(z)}{q(z)}.
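As a finite-dimensional illustration of Kronecker's theorem, the sketch below builds a truncated section of the Hankel matrix for the rational function f(z) = 1/(z - a) (an example chosen here, not from the text) and checks that its numerical rank is 1:

```python
import numpy as np

# For |z| > |a|:  1/(z - a) = sum_{k >= 1} a^(k-1) z^(-k),  so  a_{-k} = a^(k-1).
a = 0.5
coeffs = [a ** (k - 1) for k in range(1, 41)]    # a_{-1}, a_{-2}, ..., a_{-40}

# Finite section of the Hankel matrix [a_{-(i+j+1)}]_{i,j >= 0}
N = 20
H = np.array([[coeffs[i + j] for j in range(N)] for i in range(N)])

# f is rational, so the Hankel matrix has finite rank -- here rank 1,
# since H is the outer product of the vector (1, a, a^2, ...) with itself.
print(np.linalg.matrix_rank(H))   # 1
```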
Approximations
We are often interested in approximating a Hankel operator, possibly by a low-order operator. To measure the error of such an approximation we can use the spectral norm (operator 2-norm), which suggests the singular value decomposition as a technique for approximating the action of the operator. Note that the Hankel matrix does not have to be finite; if it is infinite, traditional methods of computing individual singular vectors will not work directly. We also require that the approximation itself be a Hankel matrix; AAK theory shows that an optimal low-rank approximation of this form can be found.
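A minimal finite-dimensional sketch of this idea, assuming a finite Hankel matrix built from arbitrary data: the truncated SVD gives the best rank-k approximation in the spectral norm (Eckart–Young), with error equal to the (k+1)-th singular value. Note that the truncated SVD is generally not itself a Hankel matrix; preserving Hankel structure is what the AAK result addresses in the operator setting.

```python
import numpy as np
from scipy.linalg import hankel

rng = np.random.default_rng(0)
seq = rng.standard_normal(19)
A = hankel(seq[:10], seq[9:])            # 10 x 10 Hankel matrix from example data

k = 3
U, s, Vt = np.linalg.svd(A)
A_k = (U[:, :k] * s[:k]) @ Vt[:k, :]     # best rank-k approximation in the 2-norm

# Spectral-norm error of the truncation equals the next singular value.
print(np.linalg.norm(A - A_k, 2), s[k])
```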
Hankel matrix transform
The Hankel matrix transform, or simply Hankel transform, of a sequence b_k is the sequence of the determinants of the Hankel matrices formed from b_k. Given an integer n > 0, define the corresponding (n \times n)-dimensional Hankel matrix B_n as having the matrix elements [B_n]_{i,j} = b_{i+j}. Then the sequence h_n given by h_n = \det B_n is the Hankel transform of the sequence b_k. The Hankel transform is invariant under the binomial transform of a sequence. That is, if one writes c_n = \sum_{k=0}^n {n \choose k} b_k as the binomial transform of the sequence b_n, then one has \det B_n = \det C_n.
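A quick numerical check of the binomial-transform invariance stated above, using the Catalan numbers as an example sequence (any sequence would do; this choice is an assumption made for illustration):

```python
import numpy as np
from math import comb

def hankel_transform(b, n_max):
    """h_n = det B_n with [B_n]_{i,j} = b_{i+j}, 0 <= i, j < n."""
    return [np.linalg.det(np.array([[b[i + j] for j in range(n)] for i in range(n)],
                                   dtype=float))
            for n in range(1, n_max + 1)]

b = [1, 1, 2, 5, 14, 42, 132, 429, 1430]                                   # Catalan numbers
c = [sum(comb(n, k) * b[k] for k in range(n + 1)) for n in range(len(b))]  # binomial transform

print(np.round(hankel_transform(b, 4), 6))   # determinants of B_1, ..., B_4
print(np.round(hankel_transform(c, 4), 6))   # identical: the Hankel transform is invariant
```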
Applications of Hankel matrices
Hankel matrices are formed when, given a sequence of output data, a realization of an underlying state-space or hidden Markov model is desired. The singular value decomposition of the Hankel matrix provides a means of computing the A, B, and C matrices which define the state-space realization (a minimal realization sketch is given at the end of this section). The Hankel matrix formed from the signal has been found useful for the decomposition of non-stationary signals and for time-frequency representation.

Method of moments for polynomial distributions

The method of moments applied to polynomial distributions results in a Hankel matrix that needs to be inverted in order to obtain the weight parameters of the polynomial distribution approximation.

Positive Hankel matrices and the Hamburger moment problems
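Returning to the state-space realization application above: the following is a hedged sketch of a Ho–Kalman / ERA-style realization from a noise-free scalar impulse response. The example system, the signal length, and the helper name realize_state_space are assumptions made for illustration, not a method prescribed by the text.

```python
import numpy as np

def realize_state_space(h, order):
    """Recover (A, B, C) of the given order from Markov parameters h[k] = C A^k B."""
    m = len(h) // 2
    H  = np.array([[h[i + j] for j in range(m)] for i in range(m)])      # Hankel matrix
    Hs = np.array([[h[i + j + 1] for j in range(m)] for i in range(m)])  # shifted Hankel matrix
    U, s, Vt = np.linalg.svd(H)
    U, s, Vt = U[:, :order], s[:order], Vt[:order, :]
    obs  = U * np.sqrt(s)              # observability factor:   H = obs @ ctrl
    ctrl = (Vt.T * np.sqrt(s)).T       # controllability factor
    A = np.diag(1 / np.sqrt(s)) @ U.T @ Hs @ Vt.T @ np.diag(1 / np.sqrt(s))
    B = ctrl[:, :1]                    # first block column of the controllability factor
    C = obs[:1, :]                     # first block row of the observability factor
    return A, B, C

# Identify a known 2-state system from its impulse response, then verify the fit.
A0 = np.array([[0.9, 0.2], [0.0, 0.5]]); B0 = np.array([[1.0], [1.0]]); C0 = np.array([[1.0, 0.0]])
h = [(C0 @ np.linalg.matrix_power(A0, k) @ B0).item() for k in range(20)]
A, B, C = realize_state_space(h, order=2)
h_hat = [(C @ np.linalg.matrix_power(A, k) @ B).item() for k in range(20)]
print(np.allclose(h, h_hat))   # True: same Markov parameters, up to a change of basis
```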