Let \(\{X_t, t\in T\}\) be a stochastic process / time series. To describe the dependence between the random variables \(X_t\), we introduce a few covariance-based concepts.
When \(\mathrm{Var}(X_t)<\infty\) for all \(t\in T\), we can define the auto-covariance function
\[\gamma_X(r,s) := \mathrm{Cov}(X_r, X_s) = \mathbb{E}[(X_r - \mathbb{E}X_r)(X_s - \mathbb{E}X_s)], \quad r,s\in T.\]
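For example (a standard special case added here for illustration): if the \(X_t\) are i.i.d. with finite variance \(\sigma^2\), then
\[\gamma_X(r,s) = \begin{cases}\sigma^2 & r = s,\\ 0 & r\neq s,\end{cases}\]
since distinct observations are independent and hence uncorrelated.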
\(X_t\) is called stationary (or weakly stationary / covariance stationary / second-order stationary) if
The covariance is time-homogeneous, i.e. \(\gamma_X(r,s) = \gamma_X(r+h, s+h)\) whenever \(r, s, r+h, s+h\in T\).
The expectation is also time-homogeneous, i.e. \(\mathbb{E}X_t = \mu\) for all \(t\in T\).
Additionally, all \(X_t\) are in \(L^2\), i.e. \(\mathbb{E}|X_t|^2<\infty\).
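For a contrast (an added illustration, not part of the definition above): the random walk \(S_t = Z_1 + \cdots + Z_t\) with i.i.d. steps \(Z_i\) of variance \(\sigma^2\) is not stationary, since \(\mathrm{Var}(S_t) = t\sigma^2\) grows with \(t\) and \(\gamma_S(r,s) = \sigma^2\min(r,s)\) is not a function of \(r-s\) alone.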
In the stationary setting, the covariance \(\gamma_X(r,s)\) depends only on the difference \(r-s\), so it reduces to a function of a single argument:
\[\gamma_X(h) = \mathrm{Cov}(X_{t+h}, X_t).\]
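As a concrete illustration (the moving-average model here is an added example): for \(X_t = Z_t + \theta Z_{t-1}\) with \(Z_t\) white noise of variance \(\sigma^2\),
\[\gamma_X(h) = \begin{cases}(1+\theta^2)\sigma^2 & h = 0,\\ \theta\sigma^2 & |h| = 1,\\ 0 & |h|\ge 2,\end{cases}\]
which indeed depends only on the lag \(h\).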
The auto-correlation function is then
\[\begin{align*} \rho_X(h) &:= \frac{\mathrm{Cov}(X_{t+h}, X_t)}{\sqrt{\mathrm{Var}(X_{t+h})\mathrm{Var}(X_t)}} \\ &= \frac{\gamma_X(h)}{\sqrt{\gamma_X(0)\gamma_X(0)}} \\ &= \frac{\gamma_X(h)}{\gamma_X(0)}. \end{align*}\]
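To make these quantities concrete, here is a minimal sketch of how the sample versions \(\hat\gamma_X(h)\) and \(\hat\rho_X(h)\) could be computed from one observed series; the function names sample_acvf / sample_acf and the use of numpy are illustrative choices, not something fixed by the discussion above.

```python
import numpy as np

def sample_acvf(x, h):
    """Sample autocovariance at lag h (divides by n, the usual biased estimator)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    h = abs(int(h))
    xbar = x.mean()
    return np.sum((x[h:] - xbar) * (x[: n - h] - xbar)) / n

def sample_acf(x, h):
    """Sample autocorrelation: gamma_hat(h) / gamma_hat(0)."""
    return sample_acvf(x, h) / sample_acvf(x, 0)

# Quick check on simulated white noise: the ACF is 1 at lag 0 and should be
# close to 0 at nonzero lags for a long enough series.
rng = np.random.default_rng(0)
z = rng.standard_normal(1000)
print([round(sample_acf(z, h), 3) for h in range(4)])
```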
\(X_t\) is said to be strictly stationary if the joint distribution of \((X_{t_1}, \ldots, X_{t_k})\) is the same as that of \((X_{t_1+h}, \ldots, X_{t_k+h})\) for every \(k\) and all \(t_1, \ldots, t_k, h\) with \(t_i, t_i+h\in T\).