== Definition ==

A weaker form of stationarity commonly employed in signal processing is known as weak-sense stationarity, wide-sense stationarity (WSS), or covariance stationarity. WSS random processes only require that the 1st moment (i.e. the mean) and autocovariance do not vary with respect to time and that the 2nd moment is finite for all times. Any strictly stationary process that has a finite mean and covariance is also WSS. So, a
continuous-time random process \left\{X_t\right\} that is WSS has the following restrictions on its mean function m_X(t) \triangleq \operatorname E[X_t] and autocovariance function K_{XX}(t_1, t_2) \triangleq \operatorname E[(X_{t_1}-m_X(t_1))(X_{t_2}-m_X(t_2))]:

: \begin{align} & m_X(t) = m_X(t + \tau) & & \text{for all } \tau,t \in \mathbb{R} \\ & K_{XX}(t_1, t_2) = K_{XX}(t_1 - t_2, 0) & & \text{for all } t_1,t_2 \in \mathbb{R} \\ & \operatorname E[|X_t|^2] < \infty & & \text{for all } t \in \mathbb{R} \end{align}

The first property implies that the mean function m_X(t) must be constant. The second property implies that the autocovariance function depends only on the difference between t_1 and t_2 and only needs to be indexed by one variable rather than two. Thus, instead of writing

: K_{XX}(t_1 - t_2, 0)

the notation is often abbreviated by the substitution \tau = t_1 - t_2:

: K_{XX}(\tau) \triangleq K_{XX}(t_1 - t_2, 0)

This also implies that the autocorrelation depends only on \tau = t_1 - t_2, that is

: R_X(t_1,t_2) = R_X(t_1-t_2,0) \triangleq R_X(\tau).

The third property says that the second moments must be finite for any time t.
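A standard concrete example of a WSS process is a random-phase cosine. The following sketch (illustrative code, not from the text; the frequency, sample times, and realization count are arbitrary choices) checks the two defining properties numerically: the mean function is constant, and the autocovariance depends only on the lag \tau = t_1 - t_2.

```python
import numpy as np

# Random-phase cosine X_t = cos(w*t + phi) with phi ~ Uniform(0, 2*pi).
# Theory: E[X_t] = 0 and K_XX(t1, t2) = 0.5 * cos(w * (t1 - t2)),
# i.e. the autocovariance depends only on the lag t1 - t2.
rng = np.random.default_rng(0)
w = 2.0                                   # arbitrary angular frequency
n_real = 200_000                          # Monte Carlo realizations
phi = rng.uniform(0, 2 * np.pi, n_real)

t = np.array([0.0, 0.7, 1.3, 2.0])        # arbitrary sample times
X = np.cos(w * t[None, :] + phi[:, None])  # shape (n_real, len(t))

mean = X.mean(axis=0)                     # should be ~0 at every time
K = (X.T @ X) / n_real                    # empirical E[X_{t1} X_{t2}]

# Theoretical autocovariance, a function of the lag alone:
K_theory = 0.5 * np.cos(w * (t[:, None] - t[None, :]))
print(np.abs(mean).max())                 # near 0
print(np.abs(K - K_theory).max())         # near 0
```

The empirical covariance matrix matches a function of t_1 - t_2 alone, which is exactly the second WSS restriction above.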
== Motivation ==

The main advantage of wide-sense stationarity is that it places the time series in the context of Hilbert spaces. Let H be the Hilbert space generated by \left\{x(t)\right\} (that is, the closure of the set of all linear combinations of these random variables in the Hilbert space of all square-integrable random variables on the given probability space). By the positive definiteness of the autocovariance function, it follows from Bochner's theorem that there exists a positive measure \mu on the real line such that H is isomorphic to the Hilbert subspace of L^2(\mu) generated by \left\{e^{-2\pi i \xi \cdot t}\right\}. This then gives the following Fourier-type decomposition for a continuous-time stationary stochastic process: there exists a stochastic process \omega_\xi with orthogonal increments such that, for all t,

: X_t = \int e^{- 2 \pi i \lambda \cdot t} \, d \omega_\lambda,

where the integral on the right-hand side is interpreted in a suitable (Riemann) sense. The same result holds for a discrete-time stationary process, with the spectral measure now defined on the unit circle.

When processing WSS random signals with
linear,
time-invariant (
LTI)
filters, it is helpful to think of the correlation function as a
linear operator. Since it is a
circulant operator (depends only on the difference between the two arguments), its eigenfunctions are the
Fourier complex exponentials. Additionally, since the
eigenfunctions of LTI operators are also
complex exponentials, LTI processing of WSS random signals is highly tractable—all computations can be performed in the
frequency domain. Thus, the WSS assumption is widely employed in signal processing
algorithms.
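The frequency-domain tractability can be sketched numerically. In the following illustration (the FIR impulse response and noise variance are arbitrary choices, not from the text), WSS white noise is passed through an LTI filter; the output autocovariance predicted in the frequency domain, via the inverse transform of |H(f)|^2 times the input power, matches the empirical time-domain estimate.

```python
import numpy as np

# WSS white noise through an FIR LTI filter h: the output is again WSS,
# with power spectral density |H(f)|^2 * sigma2, so its autocovariance is
# the inverse FFT of |H(f)|^2 * sigma2 (a pure frequency-domain computation).
rng = np.random.default_rng(2)
h = np.array([0.5, 1.0, 0.25])            # example FIR impulse response
sigma2 = 1.0                              # input noise variance
x = rng.standard_normal(1_000_000) * np.sqrt(sigma2)  # white WSS input
y = np.convolve(x, h, mode="valid")       # LTI filtering

# Empirical output autocovariance at lags 0..2 (output mean is ~0)
r_emp = np.array([np.mean(y[: len(y) - k] * y[k:]) for k in range(3)])

# Frequency-domain prediction: inverse FFT of |H(f)|^2 * sigma2
nfft = 64                                 # padded so circular = linear here
H = np.fft.fft(h, nfft)
r_theory = np.fft.ifft(np.abs(H) ** 2 * sigma2).real[:3]
print(r_emp)
print(r_theory)
```

Both estimates agree, reflecting that the complex exponentials diagonalize the LTI filtering so second-order statistics of the output can be read off entirely in the frequency domain.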
== Definition for complex stochastic process ==

In the case where \left\{X_t\right\} is a complex stochastic process the autocovariance function is defined as

: K_{XX}(t_1, t_2) = \operatorname E[(X_{t_1}-m_X(t_1))\overline{(X_{t_2}-m_X(t_2))}]

and, in addition to the requirements for real stochastic processes, it is required that the pseudo-autocovariance function

: J_{XX}(t_1, t_2) = \operatorname E[(X_{t_1}-m_X(t_1))(X_{t_2}-m_X(t_2))]

depends only on the time lag. In formulas, \left\{X_t\right\} is WSS if

: \begin{align} & m_X(t) = m_X(t + \tau) & & \text{for all } \tau,t \in \mathbb{R} \\ & K_{XX}(t_1, t_2) = K_{XX}(t_1 - t_2, 0) & & \text{for all } t_1,t_2 \in \mathbb{R} \\ & J_{XX}(t_1, t_2) = J_{XX}(t_1 - t_2, 0) & & \text{for all } t_1,t_2 \in \mathbb{R} \\ & \operatorname E[|X_t|^2] < \infty & & \text{for all } t \in \mathbb{R} \end{align}

== Joint stationarity ==