
Forecasting Process Details

This section explains the goodness-of-fit statistics reported to measure how well different models fit the data. The statistics of fit for the various forecasting models can be viewed or stored in a data set using the Model Viewer window.

The various statistics of fit reported are as follows. In these formulas, *n* is the number of nonmissing observations and *k* is the number of fitted parameters in the model.
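As a quick illustration of these counting conventions, here is a minimal Python sketch; the series values and the use of NaN to encode missing observations are illustrative assumptions, not part of this documentation:

```python
import math

# Hypothetical series with missing actual values represented as NaN
# (the name `series` and the data are made up for illustration).
series = [112.0, float("nan"), 132.0, 129.0, float("nan"), 135.0]

n_obs = len(series)                              # total number of observations
n = sum(1 for v in series if not math.isnan(v))  # n: nonmissing observations
n_missing = n_obs - n                            # number of missing actuals
```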

*Number of Nonmissing Observations.*

The number of nonmissing observations used to fit the model.

*Number of Observations.*

The total number of observations used to fit the model, including both missing and nonmissing observations.

*Number of Missing Actuals.*

The number of missing actual values.

*Number of Missing Predicted Values.*

The number of missing predicted values.

*Number of Model Parameters.*

The number of parameters fit to the data. For a combined forecast, this is the number of forecast components.

*Total Sum of Squares (Uncorrected).*

The total sum of squares for the series, *SST*, uncorrected for the mean: *SST* = ∑_{t=1}^{n} *y*_{t}^{2}.

*Total Sum of Squares (Corrected).*

The total sum of squares for the series, *SST*, corrected for the mean: *SST* = ∑_{t=1}^{n} (*y*_{t} - ȳ)^{2}, where ȳ is the series mean.

*Sum of Square Errors.*

The sum of the squared prediction errors, *SSE* = ∑_{t=1}^{n} (*y*_{t} - ŷ_{t})^{2}, where ŷ_{t} is the one-step predicted value for *y*_{t}.

*Mean Square Error.*

The mean squared prediction error, MSE, calculated from the one-step-ahead forecasts: *MSE* = [1/*n*] *SSE*. This formula enables you to evaluate small holdout samples.

*Root Mean Square Error.*

The root mean square error, *RMSE* = √*MSE*.

*Mean Absolute Percent Error.*

The mean absolute percent prediction error, *MAPE* = [100/*n*] ∑_{t=1}^{n} |(*y*_{t} - ŷ_{t})/*y*_{t}|. The summation ignores observations where *y*_{t} = 0.

*Mean Absolute Error.*

The mean absolute prediction error, *MAE* = [1/*n*] ∑_{t=1}^{n} |*y*_{t} - ŷ_{t}|.

*R-Square.*

The *R*^{2} statistic, *R*^{2} = 1 - *SSE*/*SST*. If the model fits the series badly, the model error sum of squares, *SSE*, may be larger than *SST* and the *R*^{2} statistic will be negative.

*Adjusted R-Square.*

The adjusted *R*^{2} statistic, 1 - ([(*n*-1)/(*n*-*k*)]) (1 - *R*^{2}).

*Amemiya's Adjusted R-Square.*

Amemiya's adjusted *R*^{2}, 1 - ([(*n*+*k*)/(*n*-*k*)]) (1 - *R*^{2}).

*Random Walk R-Square.*

The random walk *R*^{2} statistic (Harvey's *R*^{2} statistic using the random walk model for comparison), 1 - ([(*n*-1)/*n*]) *SSE*/*RWSSE*, where *RWSSE* = ∑_{t=2}^{n} (*y*_{t} - *y*_{t-1} - μ)^{2} and μ = [1/(*n*-1)] ∑_{t=2}^{n} (*y*_{t} - *y*_{t-1}).

*Akaike's Information Criterion.*

Akaike's information criterion (AIC), *n* ln(*MSE*) + 2*k*.

*Schwarz Bayesian Information Criterion.*

Schwarz Bayesian information criterion (SBC or BIC), *n* ln(*MSE*) + *k* ln(*n*).

*Amemiya's Prediction Criterion.*

Amemiya's prediction criterion, [1/*n*] *SST* ([(*n*+*k*)/(*n*-*k*)]) (1 - *R*^{2}) = ([(*n*+*k*)/(*n*-*k*)]) [1/*n*] *SSE*.

*Maximum Error.*

The largest prediction error.

*Minimum Error.*

The smallest prediction error.

*Maximum Percent Error.*

The largest percent prediction error, 100 max_{t}((*y*_{t} - ŷ_{t})/*y*_{t}). The computation ignores observations where *y*_{t} = 0.

*Minimum Percent Error.*

The smallest percent prediction error, 100 min_{t}((*y*_{t} - ŷ_{t})/*y*_{t}). The computation ignores observations where *y*_{t} = 0.

*Mean Error.*

The mean prediction error, [1/*n*] ∑_{t=1}^{n} (*y*_{t} - ŷ_{t}).

*Mean Percent Error.*

The mean percent prediction error, [100/*n*] ∑_{t=1}^{n} (*y*_{t} - ŷ_{t})/*y*_{t}. The summation ignores observations where *y*_{t} = 0.
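The statistics above can be sketched in Python as follows. This is a minimal illustration, not part of the software being documented: the series `y`, its one-step predictions `yhat`, and the parameter count `k` are made-up values, and all variable names are assumptions.

```python
import math

# Hypothetical actual values and one-step-ahead predictions (illustrative data).
y    = [112.0, 118.0, 132.0, 129.0, 121.0, 135.0, 148.0, 148.0]
yhat = [110.0, 115.0, 130.0, 131.0, 123.0, 133.0, 145.0, 150.0]

n = len(y)   # number of nonmissing observations
k = 2        # number of fitted model parameters (assumed for illustration)

mean_y = sum(y) / n
sst_u  = sum(v * v for v in y)                       # total SS, uncorrected
sst    = sum((v - mean_y) ** 2 for v in y)           # total SS, corrected
sse    = sum((a - p) ** 2 for a, p in zip(y, yhat))  # sum of squared errors
mse    = sse / n                                     # mean square error
rmse   = math.sqrt(mse)                              # root mean square error
mae    = sum(abs(a - p) for a, p in zip(y, yhat)) / n
mape   = 100.0 / n * sum(abs((a - p) / a) for a, p in zip(y, yhat) if a != 0)

r2         = 1.0 - sse / sst                             # R-square
adj_r2     = 1.0 - ((n - 1) / (n - k)) * (1.0 - r2)      # adjusted R-square
amemiya_r2 = 1.0 - ((n + k) / (n - k)) * (1.0 - r2)      # Amemiya's adjusted

# Random walk R-square: compare against a random walk with drift mu.
diffs = [y[t] - y[t - 1] for t in range(1, n)]
mu    = sum(diffs) / (n - 1)
rwsse = sum((d - mu) ** 2 for d in diffs)
rw_r2 = 1.0 - ((n - 1) / n) * sse / rwsse

aic = n * math.log(mse) + 2 * k          # Akaike's information criterion
sbc = n * math.log(mse) + k * math.log(n)  # Schwarz Bayesian criterion
apc = (1.0 / n) * sst * ((n + k) / (n - k)) * (1.0 - r2)  # Amemiya's PC

errors = [a - p for a, p in zip(y, yhat)]
max_err, min_err = max(errors), min(errors)
mean_err = sum(errors) / n
mean_pct_err = 100.0 / n * sum((a - p) / a for a, p in zip(y, yhat) if a != 0)
```

Note that the two forms of Amemiya's prediction criterion agree because *SST* (1 - *R*^{2}) = *SSE*, so `apc` could equally be computed as `((n + k) / (n - k)) * sse / n`.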


Copyright © 1999 by SAS Institute Inc., Cary, NC, USA. All rights reserved.