
%which require far more parameters to effectively model due to their complicated structure.
Instead, we look to other tools commonly used in time-series analysis, many of which rest on the assumption that the underlying process is stationary. Indeed, most astronomical time series obey the properties of a stationary process. At the most basic level, a stationary light curve is one that has the same mean and variance regardless of where the light curve is sampled. More formally, a process $X_t$, $t \in \Z$, is said to be stationary if (i) $X_t$ has finite variance for all $t$, (ii) the expectation value of $X_t$ is constant for all $t$, and (iii) the autocovariance function satisfies $\gamma_{X}(r,s) = \gamma_{X}(r+t, s+t)$ for all $r,s,t \in \Z$.
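As a concrete illustration (not in the original draft), a random walk built from independent steps fails condition (iii): its autocovariance depends on absolute time rather than only on the lag. A sketch of the argument:

```latex
% Illustrative counterexample: a random walk X_t = \sum_{i=1}^{t} \epsilon_i
% with iid zero-mean steps \epsilon_i of variance \sigma^2.
\begin{align*}
  \gamma_X(r,s) &= \mathrm{Cov}\!\left(\sum_{i=1}^{r}\epsilon_i,\ \sum_{j=1}^{s}\epsilon_j\right)
                 = \min(r,s)\,\sigma^{2}, \\
  \gamma_X(r+t,\,s+t) &= \bigl[\min(r,s)+t\bigr]\,\sigma^{2} \neq \gamma_X(r,s),
\end{align*}
% so the autocovariance changes under a time shift and condition (iii) fails.
```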

Property (iii) allows us to define the autocovariance function in a simpler way: setting $t = -s$ gives $\gamma_{X}(r,s) = \gamma_{X}(r-s, 0)$, so, substituting $h = r-s$, we can write the autocovariance function of a stationary process as $\gamma_{X}(h) = \mathrm{Cov}(X_{t}, X_{t+h})$, the autocovariance of $X_{t}$ at time lag $h$.

The simplest of these stationary processes is one whose individual observations are independent and identically distributed (IID) random variables. Such processes, $Z_{t}$, with zero mean and autocovariance function
$$\gamma_{Z}(h) =
\begin{cases}
\sigma^{2} & h = 0 \\
0 & h \neq 0 \\
\end{cases}$$
are known as {\em white noise processes} and are written as $Z_{t} \sim WN(0,\sigma^{2})$. White noise carries no temporal structure of its own, but it can serve as the forcing term in a set of linear difference equations that describe more complicated time series. One such class of difference equations used in the analysis of discrete time series is the class of autoregressive-moving average (ARMA) processes. These processes allow us to quantify the properties of a time series with a simple but thoroughly descriptive parametric structure. A stationary process $\{X_t\}$ can be modeled by an ARMA(p,q) process if at every time $t$