Time Series Analysis and Control Examples 
The subroutines TSMLOCAR, TSMLOMAR, and TSTVCAR are
used to analyze nonstationary time series models.
The AIC statistic is extensively used
to analyze the locally stationary model.
Locally Stationary AR Model
When the time series is nonstationary, the TSMLOCAR (univariate)
and TSMLOMAR (multivariate) subroutines can be employed.
The whole span of the series is divided into locally
stationary blocks of data, and then the TSMLOCAR and
TSMLOMAR subroutines estimate a stationary AR model by
using the least squares method on each stationary block.
The homogeneity of two different
blocks of data is tested using the AIC.
Given a set of data {y_{1}, ... ,y_{T}}, the data can be
divided into k blocks of sizes t_{1}, ... ,t_{k}, where
t_{1} + ... + t_{k} = T, and k and t_{i} are unknown.
The locally stationary model is fitted to the data

   y_{t} = \alpha_{0}^{i} + \sum_{j=1}^{p_{i}} \alpha_{j}^{i} y_{t-j} + \epsilon_{t}

where

   T_{i-1} < t \le T_{i},  T_{i} = \sum_{j=1}^{i} t_{j},  T_{0} = 0,  i = 1, ... ,k

and \epsilon_{t} is a Gaussian white noise with E(\epsilon_{t}) = 0 and
E(\epsilon_{t}^{2}) = \sigma_{i}^{2}. Therefore, the log likelihood function
of the locally stationary series is

   \ell = -\frac{1}{2} \sum_{i=1}^{k} \left[ t_{i} \log(2\pi\sigma_{i}^{2})
          + \frac{1}{\sigma_{i}^{2}} \sum_{t=T_{i-1}+1}^{T_{i}}
            \left( y_{t} - \alpha_{0}^{i} - \sum_{j=1}^{p_{i}} \alpha_{j}^{i} y_{t-j} \right)^{2} \right]
Given \alpha_{j}^{i}, j = 0, ... ,p_{i}, the maximum of the log likelihood
function is attained at

   \hat{\sigma}_{i}^{2} = \frac{1}{t_{i}} \sum_{t=T_{i-1}+1}^{T_{i}}
      \left( y_{t} - \alpha_{0}^{i} - \sum_{j=1}^{p_{i}} \alpha_{j}^{i} y_{t-j} \right)^{2}

The concentrated log likelihood function is given by

   \ell^{*} = -\frac{T}{2} [ \log(2\pi) + 1 ]
              - \frac{1}{2} \sum_{i=1}^{k} t_{i} \log(\hat{\sigma}_{i}^{2})

Therefore, the maximum likelihood estimates, \hat{\sigma}_{i}^{2} and
\hat{\alpha}_{j}^{i}, are obtained by minimizing the following local SSE:

   SSE = \sum_{t=T_{i-1}+1}^{T_{i}}
      \left( y_{t} - \hat{\alpha}_{0}^{i} - \sum_{j=1}^{p_{i}} \hat{\alpha}_{j}^{i} y_{t-j} \right)^{2}
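The block-level least squares fit can be sketched in pure Python. This is an illustrative sketch only, not the TSMLOCAR interface; `solve` and `fit_ar_block` are hypothetical helper names.

```python
def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(b)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for i in range(n):
        piv = max(range(i, n), key=lambda r: abs(M[r][i]))
        M[i], M[piv] = M[piv], M[i]
        for r in range(i + 1, n):
            f = M[r][i] / M[i][i]
            for c in range(i, n + 1):
                M[r][c] -= f * M[i][c]
    x = [0.0] * n
    for i in reversed(range(n)):
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

def fit_ar_block(y, p):
    """Fit y_t = a_0 + a_1 y_{t-1} + ... + a_p y_{t-p} + e_t on one block
    by least squares; return (coefficients, error variance estimate)."""
    # Regression rows: (1, y_{t-1}, ..., y_{t-p}) for each usable t
    rows = [[1.0] + [y[t - j] for j in range(1, p + 1)] for t in range(p, len(y))]
    targets = y[p:]
    n = p + 1
    # Normal equations X'X a = X'y
    A = [[sum(r[i] * r[j] for r in rows) for j in range(n)] for i in range(n)]
    b = [sum(r[i] * yt for r, yt in zip(rows, targets)) for i in range(n)]
    coef = solve(A, b)
    resid = [yt - sum(c * x for c, x in zip(coef, r)) for r, yt in zip(rows, targets)]
    sigma2 = sum(e * e for e in resid) / len(resid)  # hat{sigma}^2 for this block
    return coef, sigma2
```

A production implementation would use the Householder transformation described in the section the text references rather than forming the normal equations directly.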
The least squares estimation of the stationary
model is explained in the section "Least Squares and
Householder Transformation."
The AIC for the locally stationary model
over the pooled data is written as

   AIC = \sum_{i=1}^{k} t_{i} \log(\hat{\sigma}_{i}^{2})
         + 2 \sum_{i=1}^{k} ( p_{i} + intercept + 1 )

where intercept = 1 if the intercept term
is estimated; otherwise, intercept = 0.
The number of stationary blocks (k), the size
of each block (t_{i}), and the order of the
locally stationary model are determined by the AIC.
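This criterion can be sketched in pure Python; `local_aic` is a hypothetical helper name, with the intercept term assumed estimated (intercept = 1).

```python
import math

def local_aic(blocks):
    """AIC of a locally stationary AR model:
    sum_i t_i*log(sigma2_i) + 2*sum_i (p_i + intercept + 1),
    with blocks given as (t_i, sigma2_i, p_i) triples."""
    intercept = 1
    return (sum(t * math.log(s2) for t, s2, _ in blocks)
            + 2 * sum(p + intercept + 1 for _, _, p in blocks))
```

Candidate segmentations and orders would then be compared by choosing the one with the smallest value of this criterion.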
Consider the autoregressive model fitted over
the block of data, {y_{1}, ... ,y_{T}}, and
let this model M_{1} be an AR(p_{1}) process.
When additional data, {y_{T+1}, ... ,y_{T+T1}},
are available, a new model M_{2}, an AR(p_{2}) process,
is fitted over this new data set, assuming that
these data are independent of the previous data.
Then the AICs for models M_{1} and M_{2} are defined as

   AIC_{1} = T \log(\hat{\sigma}_{1}^{2}) + 2 ( p_{1} + intercept + 1 )
   AIC_{2} = T_{1} \log(\hat{\sigma}_{2}^{2}) + 2 ( p_{2} + intercept + 1 )
The joint model AIC for M_{1} and M_{2} is obtained by summation:

AIC_{J} = AIC_{1} + AIC_{2}
When the two data sets are pooled and a single model is estimated
over the pooled data set, {y_{1}, ... ,y_{T+T1}},
the AIC of the pooled model is

   AIC_{A} = ( T + T_{1} ) \log(\hat{\sigma}_{A}^{2}) + 2 ( p_{A} + intercept + 1 )

where \hat{\sigma}_{A}^{2} is the pooled error variance and
p_{A} is the order chosen to fit the pooled data set.
Decision
- If AIC_{J} < AIC_{A}, switch to the new model,
since there is a change in the structure of the time series.
- If AIC_{A} <= AIC_{J}, pool the two data sets,
since the two data sets are considered to be homogeneous.
If new observations are available, repeat the preceding
steps to determine the homogeneity of the data.
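This decision rule can be sketched in pure Python; the function names `aic` and `decide` are illustrative, not part of the SAS interface.

```python
import math

def aic(n, sigma2, p, intercept=1):
    """AIC of an AR(p) fit over n observations with error variance sigma2."""
    return n * math.log(sigma2) + 2 * (p + intercept + 1)

def decide(T, s1, p1, T1, s2, p2, sA, pA):
    """Return 'switch' if the joint model beats the pooled model, else 'pool'."""
    aic_joint = aic(T, s1, p1) + aic(T1, s2, p2)   # AIC_J = AIC_1 + AIC_2
    aic_pooled = aic(T + T1, sA, pA)               # AIC_A over the pooled data
    return "switch" if aic_joint < aic_pooled else "pool"
```

When the two fits are about as good as the pooled fit (equal error variances below), pooling wins because the joint model pays the parameter penalty twice; a worse pooled variance tips the decision toward switching.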
The basic idea of locally stationary AR modeling is
that, if the structure of the time series is not changed,
you should use the additional information to improve
the model fitting, but you need to follow the new
structure of the time series if there is any change.
Time-Varying AR Coefficient Model
Another approach to nonstationary time series,
especially those that are nonstationary in the
covariance, is time-varying AR coefficient modeling.
When the time series is nonstationary in the
covariance, the problem in modeling the series
is finding an efficient parameterization.
A Bayesian approach makes it possible to estimate a model
that has a large number of implicit parameters and a complex
structure by using a relatively small number of hyperparameters.
The TSTVCAR subroutine uses smoothness priors by imposing
stochastically perturbed difference equation constraints
on each AR coefficient and frequency response function.
The variance of each AR coefficient distribution constitutes
a hyperparameter included in the state space model.
The likelihood of these hyperparameters is
computed by the Kalman filter recursive algorithm.
The time-varying AR coefficient model is written

   y_{t} = \sum_{i=1}^{m} \alpha_{it} y_{t-i} + \epsilon_{t},
      \epsilon_{t} \sim N(0, \sigma^{2})

where the time-varying coefficients \alpha_{it}
are assumed to change gradually with time.
The following simple stochastic difference
equation constraint is imposed on each coefficient:

   \nabla^{k} \alpha_{it} = w_{it},  w_{it} \sim N(0, \tau^{2}),  i = 1, ... ,m
The frequency response function of the AR process is written

   A(f) = 1 - \sum_{j=1}^{m} \alpha_{jt} \exp(-2\pi i j f)

The smoothness of this function can be measured by
the kth derivative smoothness constraint,

   R_{k} = \int_{-1/2}^{1/2} \left| \frac{d^{k} A(f)}{df^{k}} \right|^{2} df
         = (2\pi)^{2k} \sum_{j=1}^{m} j^{2k} \alpha_{jt}^{2}

Then the TSTVCAR call imposes zero and
second derivative smoothness constraints.
The time-varying AR coefficients are the solution
of the following constrained least squares:

   \sum_{t=1}^{T} \left( y_{t} - \sum_{i=1}^{m} \alpha_{it} y_{t-i} \right)^{2}
   + \tau^{2} \sum_{t=1}^{T} \sum_{i=1}^{m} ( \nabla^{k} \alpha_{it} )^{2}
   + \lambda^{2} \sum_{t=1}^{T} \sum_{i=1}^{m} i^{2} \alpha_{it}^{2}
   + \nu^{2} \sum_{t=1}^{T} \sum_{i=1}^{m} \alpha_{it}^{2}

where \tau^{2}, \lambda^{2}, and \nu^{2} are
hyperparameters of the prior distribution.
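The criterion being minimized can be sketched as an objective-function evaluation in pure Python. This is an illustrative sketch with first differences (k = 1); `tvar_objective` is a hypothetical helper, and `tau2`, `lam2`, and `nu2` stand for the hyperparameters \tau^{2}, \lambda^{2}, and \nu^{2}.

```python
def tvar_objective(y, alpha, tau2, lam2, nu2):
    """Penalized SSE for a time-varying AR model.
    y: observations; alpha[t][i] holds the coefficient of lag i+1 at time t.
    Smoothness penalty uses first differences (k = 1)."""
    m = len(alpha[0])
    # Fit term: sum of squared one-step prediction errors
    sse = sum((y[t] - sum(alpha[t][i] * y[t - 1 - i] for i in range(m))) ** 2
              for t in range(m, len(y)))
    # Smoothness: squared first differences of each coefficient over time
    smooth = sum((alpha[t][i] - alpha[t - 1][i]) ** 2
                 for t in range(1, len(alpha)) for i in range(m))
    # Frequency-domain penalty: lag index squared times coefficient squared
    freq = sum((i + 1) ** 2 * alpha[t][i] ** 2
               for t in range(len(alpha)) for i in range(m))
    # Ridge-type size penalty on the coefficients themselves
    size = sum(alpha[t][i] ** 2 for t in range(len(alpha)) for i in range(m))
    return sse + tau2 * smooth + lam2 * freq + nu2 * size
```

Minimizing this quadratic objective over all alpha[t][i] is what the state space formulation below accomplishes efficiently.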
Using a state space representation, the model is

   x_{t} = F x_{t-1} + G w_{t}
   y_{t} = H_{t} x_{t} + \epsilon_{t}

where, for the second difference constraint (k = 2),

   x_{t} = ( \alpha_{1t}, ... ,\alpha_{mt}, \alpha_{1,t-1}, ... ,\alpha_{m,t-1} )'
   H_{t} = ( y_{t-1}, ... ,y_{t-m}, 0, ... ,0 )
   w_{t} = ( w_{1t}, ... ,w_{mt} )'
   F = \left[ \begin{array}{cc} 2I_{m} & -I_{m} \\ I_{m} & 0 \end{array} \right],
   G = \left[ \begin{array}{c} I_{m} \\ 0 \end{array} \right]
The computation of the likelihood function is straightforward.
See the section "State Space and Kalman Filter Method"
for the computation method.
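For the simplest case (m = 1 lag with a first difference constraint, k = 1), the Kalman filter likelihood evaluation can be sketched in pure Python. This is an illustrative sketch only; `tvar_loglik` is a hypothetical name, and the diffuse-style initialization is an assumption.

```python
import math

def tvar_loglik(y, tau2, sigma2):
    """Log likelihood of a time-varying AR(1) model via a scalar Kalman filter.
    State: alpha_t = alpha_{t-1} + w_t,      w_t ~ N(0, tau2)   (k = 1)
    Obs:   y_t = y_{t-1} * alpha_t + e_t,    e_t ~ N(0, sigma2)
    Start: alpha_0 ~ N(0, large variance) as a diffuse-style prior."""
    a, P = 0.0, 1e6              # state mean and variance
    ll = 0.0
    for t in range(1, len(y)):
        P = P + tau2             # predict: random-walk state transition
        H = y[t - 1]             # time-varying observation "matrix"
        F = H * H * P + sigma2   # innovation variance
        v = y[t] - H * a         # innovation (one-step prediction error)
        ll += -0.5 * (math.log(2 * math.pi * F) + v * v / F)
        K = P * H / F            # Kalman gain
        a = a + K * v            # measurement update of the state mean
        P = P - K * H * P        # measurement update of the state variance
    return ll
```

Maximizing this function over the hyperparameters (here `tau2` and `sigma2`) is how their likelihood is evaluated, as the text describes for the general state space form.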
Copyright © 1999 by SAS Institute Inc., Cary, NC, USA. All rights reserved.