Author: Eiko

Tags: probability theory, time series, stochastic process, stationary, stationarity

Time: 2025-01-08 10:00:09 - 2025-01-29 01:08:40 (UTC)

Basic Concepts

Let $\{X_t, t \in T\}$ be a stochastic process / time series. We introduce some covariance-based quantities to understand the dependence between its values.

  • When $\mathrm{Var}(X_t) < \infty$, we can define the auto-covariance function

    $$\gamma_X(r,s) := \mathrm{Cov}(X_r, X_s) = E\big[(X_r - E X_r)(X_s - E X_s)\big], \quad r, s \in T.$$

  • $X_t$ is called stationary (also: weakly stationary, covariance stationary, or second-order stationary) if

    • The covariance is time-homogeneous, i.e. $\gamma_X(r,s) = \gamma_X(r+h, s+h)$ for all $r, s, h$ with $r, s, r+h, s+h \in T$.

    • The expectation is also time-homogeneous, i.e. $E X_t = \mu$ for all $t \in T$.

    • Additionally, all $X_t$ are in $L^2$.

  • When we are in the stationary setting, the covariance $\gamma_X(r,s)$ depends only on the difference $r - s$, so it reduces to a single-argument function $\gamma_X(r-s)$, with

    $$\gamma_X(h) = \mathrm{Cov}(X_{t+h}, X_t).$$

    And the auto-correlation is then

    $$\rho_X(h) := \frac{\mathrm{Cov}(X_{t+h}, X_t)}{\sqrt{\mathrm{Var}(X_{t+h})\,\mathrm{Var}(X_t)}} = \frac{\gamma_X(h)}{\sqrt{\gamma_X(0)\,\gamma_X(0)}} = \frac{\gamma_X(h)}{\gamma_X(0)}.$$

  • $X_t$ is said to be strictly stationary if the joint distribution of $(X_{t_1}, \ldots, X_{t_k})$ is the same as that of $(X_{t_1+h}, \ldots, X_{t_k+h})$ for all $t_1, \ldots, t_k, h \in T$.
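To make the definitions concrete, here is a minimal sketch (NumPy assumed; the helper names `sample_acvf` / `sample_acf` and the AR(1) simulation are illustrative choices, not from the text) that estimates $\gamma_X(h)$ and $\rho_X(h)$ from one realization of a stationary process:

```python
import numpy as np

def sample_acvf(x, h):
    """Sample auto-covariance at lag h, estimating gamma_X(h).

    Uses the common convention of dividing by n (not n - h)."""
    n = len(x)
    m = x.mean()
    return float(np.sum((x[h:] - m) * (x[: n - h] - m)) / n)

def sample_acf(x, h):
    """Sample auto-correlation rho_X(h) = gamma_X(h) / gamma_X(0)."""
    return sample_acvf(x, h) / sample_acvf(x, 0)

# Simulate a stationary AR(1) process X_t = phi * X_{t-1} + eps_t with
# |phi| < 1; its theoretical auto-correlation is rho_X(h) = phi^|h|.
rng = np.random.default_rng(0)
phi, n = 0.7, 100_000
eps = rng.standard_normal(n)
x = np.empty(n)
x[0] = eps[0]
for t in range(1, n):
    x[t] = phi * x[t - 1] + eps[t]

print(sample_acf(x, 1))  # close to the theoretical value phi = 0.7
print(sample_acf(x, 2))  # close to phi^2 = 0.49
```

Note that $\rho_X(0) = \gamma_X(0)/\gamma_X(0) = 1$ always, matching the last formula above.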