
Donsker's theorem

In probability theory, Donsker's theorem, named after Monroe D. Donsker, is a functional extension of the central limit theorem for empirical distribution functions. Specifically, the theorem states that an appropriately centered and scaled version of the empirical distribution function converges to a Gaussian process.

Formal statement
Let F_n be the empirical distribution function of the sequence of i.i.d. random variables X_1, X_2, X_3, \ldots with distribution function F. Define the centered and scaled version of F_n by

: G_n(x) = \sqrt{n} \, ( F_n(x) - F(x) ),

indexed by x ∈ R. By the classical central limit theorem, for fixed x, the random variable G_n(x) converges in distribution to a Gaussian (normal) random variable G(x) with zero mean and variance F(x)(1 − F(x)) as the sample size n grows.

Theorem (Donsker, Skorokhod, Kolmogorov). The sequence of G_n, as random elements of the Skorokhod space \mathcal{D}(-\infty,\infty), converges in distribution to a Gaussian process G with zero mean and covariance given by

: \operatorname{cov}[G(s), G(t)] = E[G(s) G(t)] = \min\{F(s), F(t)\} - F(s)F(t).

The process G(x) can be written as B(F(x)), where B is a standard Brownian bridge on the unit interval.

Proof sketch

For continuous probability distributions, the problem reduces to the case where the distribution is uniform on [0, 1] by the inverse transform. Given any finite sequence of times 0 < t_1 < t_2 < \cdots < t_n < 1, the random variable N F_N(t_1) follows a binomial distribution with mean N t_1 and variance N t_1 (1 - t_1). Similarly, the joint distribution of F_N(t_1), F_N(t_2), \dots, F_N(t_n) is governed by a multinomial distribution. The central limit approximation for multinomial distributions then shows that \lim_N \sqrt{N} (F_N(t_i) - t_i) converges in distribution to a Gaussian vector with covariance matrix entries \min(t_i, t_j) - t_i t_j, which is precisely the covariance structure of the Brownian bridge.
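The covariance structure above can be checked numerically. The following is a minimal NumPy sketch (not part of the original article): it draws Uniform[0, 1] samples, so that F(x) = x, forms G_n at two fixed points s and t, and compares the Monte Carlo covariance against min{F(s), F(t)} − F(s)F(t). The sample size, replication count, and evaluation points are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000         # sample size per replication (illustrative choice)
reps = 5000      # Monte Carlo replications
s, t = 0.3, 0.7  # evaluation points; F(x) = x for Uniform[0, 1]

# G_n(x) = sqrt(n) * (F_n(x) - F(x)), with F_n the empirical CDF
X = rng.uniform(size=(reps, n))
Gs = np.sqrt(n) * ((X <= s).mean(axis=1) - s)
Gt = np.sqrt(n) * ((X <= t).mean(axis=1) - t)

# Both coordinates have mean zero, so the covariance is E[G(s) G(t)]
emp_cov = np.mean(Gs * Gt)
theory = min(s, t) - s * t  # min{F(s), F(t)} - F(s)F(t) = 0.3 - 0.21 = 0.09

print(emp_cov, theory)
```

With these parameters the empirical covariance typically lands within a couple of hundredths of the theoretical value 0.09.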
History and related results
Kolmogorov (1933) showed that when F is continuous, the supremum \scriptstyle\sup_t G_n(t) and the supremum of the absolute value, \scriptstyle\sup_t |G_n(t)|, converge in distribution to the laws of the same functionals of the Brownian bridge B(t); see the Kolmogorov–Smirnov test. In 1949 Doob asked whether the convergence in distribution held for more general functionals, thus formulating a problem of weak convergence of random functions in a suitable function space. In his original 1952 paper, Donsker proved, as a general extension of the Doob–Kolmogorov heuristic approach, that the convergence in law of G_n to the Brownian bridge holds for Uniform[0,1] distributions with respect to uniform convergence in t over the interval [0,1].{{cite journal |first=M. D. |last=Donsker |author-link=Monroe D. Donsker |title=Justification and extension of Doob's heuristic approach to the Kolmogorov–Smirnov theorems |journal=Annals of Mathematical Statistics |volume=23 |issue=2 |pages=277–281 |year=1952 |doi=10.1214/aoms/1177729445 |mr=47288 |zbl=0046.35103}}

However, Donsker's formulation was not quite correct because of the problem of measurability of functionals of discontinuous processes. In 1956 Skorokhod and Kolmogorov defined a separable metric d, called the Skorokhod metric, on the space of càdlàg functions on [0,1], such that convergence for d to a continuous function is equivalent to convergence for the sup norm, and showed that G_n converges in law in \mathcal{D}[0,1] to the Brownian bridge.

Later, Dudley reformulated Donsker's result to avoid the problem of measurability and the need for the Skorokhod metric. One can prove that there exist X_i, i.i.d. uniform in [0,1], and a sequence of sample-continuous Brownian bridges B_n, such that

: \|G_n - B_n\|_\infty

is measurable and converges in probability to 0. An improved version of this result, providing more detail on the rate of convergence, is the Komlós–Major–Tusnády approximation.
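Kolmogorov's 1933 result above can also be illustrated by simulation. The following NumPy sketch (an assumption-laden illustration, not from the article) compares the scaled Kolmogorov–Smirnov statistic \sqrt{n}\,\sup_t |F_n(t) - t| for Uniform[0,1] samples with \sup_t |B(t)| for a Brownian bridge simulated on a discrete grid; both distributions should be close for large n, with common mean \sqrt{\pi/2}\,\ln 2 \approx 0.8687. The sample size, grid resolution, and replication count are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(1)
n, m, reps = 1000, 1000, 3000  # sample size, bridge grid points, replications

# sqrt(n) * sup_t |F_n(t) - t| via the standard order-statistics formula
U = np.sort(rng.uniform(size=(reps, n)), axis=1)
k = np.arange(1, n + 1)
d_plus = (k / n - U).max(axis=1)          # sup of F_n - t
d_minus = (U - (k - 1) / n).max(axis=1)   # sup of t - F_n
ks = np.sqrt(n) * np.maximum(d_plus, d_minus)

# sup_t |B(t)| for a Brownian bridge: B(t) = W(t) - t * W(1) on a grid
W = np.cumsum(rng.normal(scale=np.sqrt(1 / m), size=(reps, m)), axis=1)
B = W - (np.arange(1, m + 1) / m) * W[:, -1:]
sup_b = np.abs(B).max(axis=1)

print(ks.mean(), sup_b.mean())  # both near sqrt(pi/2) * ln 2 ≈ 0.8687
```

Agreement of the two empirical means (and, more generally, of the two histograms) reflects the convergence of \sup_t |G_n(t)| to the law of \sup_t |B(t)| underlying the Kolmogorov–Smirnov test.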