## Ergodic stationary process

Stationarity is a central concept in the analysis of stochastic processes, but the notion itself is not intuitive, and several online lectures and papers define it incorrectly or describe it unclearly. This article offers some tips that will hopefully be useful to readers.

Let’s set aside every definition you have ever heard about stationarity or ergodicity and just focus on the following relationships. You don’t need to fully understand them yet; we will discuss them with illustrative examples:

**Strict stationarity does not necessarily imply weak stationarity.
Of course, weak stationarity does not necessarily imply strict stationarity.
Ergodic processes are stationary.
A stationary process is not necessarily ergodic.**

First, picture a stochastic process with `n` realizations `{X_1(t), X_2(t), …, X_n(t)}`, plotted together as a family of curves. Such a process can be characterized in the time domain and the amplitude domain, corresponding to the horizontal and vertical axes, respectively.

**Definition 1.** *The time series {X_t} is said to be strictly stationary if, for any finite sequence of integers t_1, …, t_k and any shift h, the distributions of (X_{t_1}, …, X_{t_k}) and (X_{t_1+h}, …, X_{t_k+h}) are the same.*

To understand this, take two time points, `t=t_0` and `t=t_0+h`. The joint distribution `f(X_{t_0}, X_{t_0+h})` can be obtained from the amplitude observations `(X_1(t_0), X_2(t_0), …, X_n(t_0))` and `(X_1(t_0+h), X_2(t_0+h), …, X_n(t_0+h))`. Strict stationarity requires this joint distribution to be the same regardless of the value of `t_0`. In practice, this rather strong assumption is often hard to verify from the given data.

Instead of strict stationarity, we often work with a weaker stationarity assumption.

**Definition 2.** *The time series {X_t} is said to be weakly stationary (or second-order stationary) if*

- `E[X_t]=\mu` *does not depend on* `t`;
- `var[X_t]=\sigma^2` *does not depend on* `t`;
- `cov(X_t,X_{t+h})` *exists, is finite, and depends only on* `h` *but not on* `t`, *for all possible* `h`.

The most obvious difference between strict and weak stationarity is that the former requires the whole joint distribution to be the same for all `t`, while the latter only asks that the first and second moments remain unchanged.
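Under weak stationarity, the three quantities in Definition 2 can be estimated directly from a single series. Below is a minimal sketch; the white-noise series and the lag `h = 5` are illustrative choices, not part of the text above:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=1000)  # a weakly stationary series (white noise)

# First and second moments required by Definition 2
mean_hat = x.mean()                  # estimate of E[X_t]
var_hat = x.var()                    # estimate of var[X_t]

# Sample autocovariance at lag h: (1/n) * sum of (x_t - mean)(x_{t+h} - mean)
def sample_autocov(x, h):
    n = len(x)
    xc = x - x.mean()
    return np.dot(xc[:n - h], xc[h:]) / n

print(mean_hat, var_hat, sample_autocov(x, 5))
```

For white noise, the estimates should be close to mean 0, variance 1, and autocovariance 0 at any nonzero lag.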

(1) A simple example of a process that is stationary in every sense (independent white noise):

```
import numpy as np
import matplotlib.pyplot as plt

# Independent Gaussian white noise: mean 0, standard deviation 1
sampleMean = 0
sampleStd = 1
num_samples = 100
normSamples = np.random.normal(sampleMean, sampleStd, size=num_samples)
plt.plot(normSamples)
plt.show()
```

(2) An example of a weakly stationary but not strictly stationary process:

Assume

`X_t=\cos(t\omega)`, `t=1,2,…`,

where `\omega` is a random variable drawn once, uniformly on `(0, 2\pi)`. Since `E[X_t]=0`, `Var[X_t]=1/2` and `Cov(X_{t_1}, X_{t_2})=0` for `t_1 \neq t_2`, `X_t` is weakly stationary. However, the process is not strictly stationary: values in the sequence are clearly dependent, and the joint distribution `f(X_{t_1}, X_{t_2})` depends not only on the relative but also on the absolute positions of `X_{t_1}` and `X_{t_2}`.
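This example can be checked numerically. The sketch below assumes `\omega` is drawn uniformly on `(0, 2\pi)` (the standard construction under which the stated moments hold) and verifies the ensemble mean and variance at each time point:

```python
import numpy as np

rng = np.random.default_rng(0)
n_realizations = 10000
t = np.arange(1, 51)  # time points t = 1, ..., 50

# One omega per realization, drawn uniformly on (0, 2*pi)
omega = rng.uniform(0.0, 2.0 * np.pi, size=n_realizations)
X = np.cos(np.outer(omega, t))  # X[i, j] = cos(t_j * omega_i)

# Ensemble moments at each time point: mean ~ 0 and variance ~ 1/2 for all t,
# consistent with weak stationarity
print(X.mean(axis=0).round(2))
print(X.var(axis=0).round(2))
```

Note that within a single realization the whole path is a deterministic function of one draw of `\omega`, which is why the joint distributions still depend on absolute time.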

(3) An example showing that strict stationarity does not necessarily imply weak stationarity:

An i.i.d. process with a standard Cauchy distribution is strictly stationary but not weakly stationary, because the second moment of the process is not finite.

```
import numpy as np
import matplotlib.pyplot as plt

# i.i.d. standard Cauchy samples: strictly stationary, but no finite variance
num_samples = 100
cauchySamples = np.random.standard_cauchy(size=num_samples)
plt.plot(cauchySamples)
plt.show()
```

(4) If a process is strictly stationary and the second moment **is finite**, then it is weakly stationary.

(5) **In the case of Gaussian stochastic process, the two definitions of stationarity are equivalent.**

### Ergodicity

The full definition of ergodicity is quite complex and is rarely discussed in stochastic analysis (see Billingsley, 1994, pp. 312–314). However, one consequence of ergodicity, the ergodic theorem, is useful in stochastic process analysis. It states that if `{X_t}` is an ergodic stationary process, then

`\frac{1}{n}\sum_{t=1}^{n}g[X(t)]\xrightarrow{a.s.}E[g[X(t_0)]]`

and for any shifts `\tau_1,…,\tau_k`, we have

`\frac{1}{n}\sum_{t=1}^{n}g[X(t),X(t+\tau_1),…,X(t+\tau_k)]\xrightarrow{a.s.}E[g[X(t_0),X(t_0+\tau_1),…,X(t_0+\tau_k)]]`

An intuitive reading of the formulas above is that, from a single realization of finite length, we can estimate the population moments by sample moments. Moreover, given a sufficiently long record, the time-domain average over one realization converges to the amplitude-domain average over all realizations at a single time point (the horizontal average converges to the vertical average in the first figure).

Now consider a situation that is stationary with respect to `X(t)`. This means there is no preferred origin in time, so over a large time interval of one realization, each section should constitute a good representative ensemble of the statistical behavior of `X(t)`.

### Ergodic stationary cases

(1) As discussed, independent white noise is stationary in every sense; it is also an ergodic stationary process:

```
# Ensemble of independent white-noise realizations, plotted together
ensemble = []
num_realizations = 100
for i in range(num_realizations):
    realization = np.random.normal(sampleMean, sampleStd, size=num_samples)
    ensemble.append(realization)
for j in range(len(ensemble)):
    plt.plot(ensemble[j])
plt.show()
```
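A quick numerical check of the ergodic theorem for this white-noise process (a self-contained sketch; the sample sizes are arbitrary choices): the time average over one long realization and the ensemble average at a single fixed time point should both converge to the same mean.

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma = 0.0, 1.0
n_time, n_realizations = 10000, 10000

# Time average over one long realization (horizontal average)
one_realization = rng.normal(mu, sigma, size=n_time)
time_avg = one_realization.mean()

# Ensemble average at one fixed time point (vertical average)
ensemble_at_t0 = rng.normal(mu, sigma, size=n_realizations)
ensemble_avg = ensemble_at_t0.mean()

print(time_avg, ensemble_avg)  # both close to mu = 0
```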

(2) A case of a stationary process that is not ergodic. Suppose `\{\varepsilon_t\}` are i.i.d. random variables and `Z` is a Bernoulli random variable with outcomes `\{0, 1\}`, each occurring with probability one half. Suppose that `Z` stays the same for all `t`. Define

`X_t= \begin{cases} \mu_1+\varepsilon_t, & Z=0 \\ \mu_2+\varepsilon_t, & Z=1 \end{cases}`

It is clear that `E(X_t)=\frac{1}{2}(\mu_1+\mu_2)` (taking `E(\varepsilon_t)=0`), and this is a stationary sequence. However, within each realization, the time average `\frac{1}{T}\sum_{t=1}^{T}X(t)` converges to only one of the two means, rather than to `\frac{1}{2}(\mu_1+\mu_2)`.

Hence this is a stationary but not ergodic process.

```
from scipy.stats import bernoulli
# Draw one Bernoulli outcome per realization; Z is then fixed for all t
p = 0.5
r = bernoulli.rvs(p, size=num_realizations)
# Means of the two regimes
mu_1 = 5
mu_2 = -5
ensemble = []
for i in range(len(r)):
    noise = np.random.normal(sampleMean, sampleStd, size=num_samples)
    if r[i] == 0:
        realization = noise + mu_1
    else:
        realization = noise + mu_2
    ensemble.append(realization)
for j in range(len(ensemble)):
    plt.plot(ensemble[j])
plt.show()
```
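The failure of ergodicity can also be confirmed numerically. In the self-contained sketch below (sample sizes and seed are arbitrary choices), the time average of each realization settles near `\mu_1` or `\mu_2`, never near the ensemble mean `\frac{1}{2}(\mu_1+\mu_2)=0`:

```python
import numpy as np

rng = np.random.default_rng(0)
mu_1, mu_2 = 5, -5
n_time, n_realizations = 5000, 20

# Z is drawn once per realization and then held fixed for all t
z = rng.integers(0, 2, size=n_realizations)
means = np.where(z == 0, mu_1, mu_2)

time_avgs = []
for m in means:
    realization = m + rng.normal(0.0, 1.0, size=n_time)
    time_avgs.append(realization.mean())

# Each time average sits near +5 or -5, not near 0
print(np.round(time_avgs, 1))
```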

### References

- Subba Rao, S. (2017). *A Course in Time Series Analysis*. Lecture notes.
- Medina-Cetina, Z. (2017). *Stochastic Mechanics*. Lecture notes.
- Brockwell, P. and Davis, R. (1998). *Time Series: Theory and Methods*. Springer, New York.

Content on this site is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License
