ARMA stands for autoregressive moving average (AR + MA).
This is an important class of processes, described by finite difference equations, which admits an elegant theory of linear projections.
The driving noise here is not the classical i.i.d. white noise: in this context, a white noise is a stationary process \(\{Z_t\}_{t\in\mathbb{Z}}\) that is componentwise Gaussian and has covariance (note that the variables need not be independent under this definition)
\[\mathrm{Cov}(Z_s,Z_t) = \sigma^2\delta_{st}.\]
The noise is meant to generate all the randomness of the system and to provide a convenient framework for causality, in the spirit of the filtration \(\mathcal{F}_t = \sigma(\{Z_i:i\le t\})\).
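As a quick numerical sanity check (not part of the theory; the parameters and seed below are arbitrary), one can simulate a Gaussian white noise path and verify that its sample autocovariance is approximately \(\sigma^2\delta_{st}\):

```python
import numpy as np

# Minimal illustrative sketch: draw a Gaussian white noise path and
# check that the sample autocovariance is ~ sigma^2 at lag 0 and ~ 0
# at all other lags.
rng = np.random.default_rng(0)
sigma, n = 1.5, 100_000
Z = rng.normal(scale=sigma, size=n)

def sample_autocov(x, h):
    """Biased sample autocovariance at lag h for a mean-zero series."""
    return np.dot(x[h:], x[:len(x) - h]) / len(x)

for h in range(4):
    print(h, sample_autocov(Z, h))  # ~ sigma^2 = 2.25 at h = 0, ~ 0 otherwise
```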
\(\{X_t\}\) is ARMA\((p,q)\) if \(X\) is stationary and, for each \(t\),
\[X_t - \phi_1 X_{t-1} - \cdots -\phi_p X_{t-p} = Z_t + \theta_1 Z_{t-1} + \dots + \theta_q Z_{t-q}.\]
This difference equation can be written in operator form. Write \(B\) for the backward shift operator, \(BX_t = X_{t-1}\), and define the polynomials \(\phi(z) = 1 - \phi_1 z - \cdots - \phi_p z^p\) and \(\theta(z) = 1 + \theta_1 z + \cdots + \theta_q z^q\). Then the above equation becomes
\[\phi(B)X_t = \theta(B)Z_t.\]
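For instance, an ARMA\((1,1)\) process satisfies \((1-\phi_1 B)X_t = (1+\theta_1 B)Z_t\), which unpacks to \(X_t - \phi_1 X_{t-1} = Z_t + \theta_1 Z_{t-1}\).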
Example
The MA\((q)\) process is defined by \(\phi(z)=1\) and so
\[X_t = \theta(B) Z_t.\]
We can compute, for \(|h|\le q\),
\[\mathrm{Cov}(X_{t+h},X_t) = \sigma^2 \sum_{j=0}^{q-|h|} \theta_{j+|h|}\theta_j,\]
with \(\mathrm{Cov}(X_{t+h},X_t)=0\) for \(|h|>q\). Since this does not depend on \(t\), the process is stationary.
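A sketch verifying this formula numerically, under arbitrary illustrative assumptions (\(\sigma=1\) and \(\theta = (1, 0.5, -0.3)\), so \(q=2\)):

```python
import numpy as np

# Simulate an MA(q) path X_t = sum_j theta_j Z_{t-j} and compare the
# sample autocovariance at each lag h with the theoretical value
# sigma^2 * sum_{j=0}^{q-|h|} theta_{j+|h|} theta_j (here sigma = 1).
rng = np.random.default_rng(1)
theta = np.array([1.0, 0.5, -0.3])  # theta_0, theta_1, theta_2
q, n = len(theta) - 1, 200_000
Z = rng.normal(size=n + q)
# X[t] = sum_{j=0}^{q} theta[j] * Z[t + q - j]
X = sum(theta[j] * Z[q - j : q - j + n] for j in range(q + 1))

for h in range(q + 2):
    theory = theta[h:] @ theta[: q + 1 - h] if h <= q else 0.0
    sample = X[h:] @ X[: n - h] / n
    print(h, round(float(theory), 3), round(float(sample), 3))
```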
For the ARMA\((p,q)\) process, if \(\phi\) has no zeros on the unit circle, i.e. \(0\not\in \phi(\partial D(0,1))\), then it has a unique stationary solution given by \(X_t = \sum_{j=-\infty}^\infty \psi_j Z_{t-j}\), where the \(\psi_j\) are the Laurent coefficients of \(\psi(z) = \frac{\theta(z)}{\phi(z)}\), converging on some annulus \(r^{-1}<|z|<r\) with \(r>1\).
The solution is causal if \(\psi(z)\) involves no negative powers of \(z\), so that \(X_t\) depends only on the past noise \(Z_s\), \(s\le t\).
If, in addition, \(\phi\) has no zeros in the open disk \(D(0,1)\), so that \(\phi(z)\ne 0\) for all \(|z|\le 1\), then \(\psi\) is a power series converging on a disk of radius greater than \(1\), and the same formula gives a causal solution.
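For example, for an AR\((1)\) process with \(\phi(z) = 1-\phi_1 z\) and \(\theta(z)=1\), the only zero of \(\phi\) is \(1/\phi_1\), which lies outside the closed unit disk exactly when \(|\phi_1|<1\). In that case
\[\psi(z) = \frac{1}{1-\phi_1 z} = \sum_{j=0}^\infty \phi_1^j z^j, \qquad |z|<\frac{1}{|\phi_1|},\]
so the causal stationary solution is \(X_t = \sum_{j=0}^\infty \phi_1^j Z_{t-j}\).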