Date of Degree
PhD (Doctor of Philosophy)
John F. Geweke
This work offers two strategies for raising the prediction accuracy of Vector Autoregressive (VAR) models. The first strategy is to improve the Minnesota prior, which is frequently used for Bayesian VAR (BVAR) models. The improvement is achieved in two ways. First, the variance-covariance matrix of the regression disturbances is treated as unknown and random, in order to incorporate parameter uncertainty. Second, the prior variance-covariance matrix of the regression coefficients is constructed as a function of the variance-covariance matrix of the disturbances, in order to account for dependencies across equations. Since different prior specifications unavoidably lead to different models, and the forecasting capability of any single model is often limited, the second strategy is to build an optimal prediction pool of models using the conventional log predictive score function. The effectiveness of the proposed strategies is examined for one-step-ahead, multi-4-step-ahead, and single-4-step-ahead predictions through two exercises. One exercise predicts national output, inflation, and interest rates in the United States; the other predicts state tax revenue and personal income in Iowa. The empirical results indicate that a properly selected prior can improve the prediction performance of a BVAR model, and that a real-time optimal prediction pool can outperform the single best constituent model alone.
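The optimal-pool step described above can be sketched in a few lines. The idea is to choose simplex weights over the constituent models that maximize the log predictive score, i.e. the sum over time of the log of the weighted average of the models' one-step-ahead predictive densities. The sketch below is illustrative only (the function name, the toy density matrix, and the multiplicative-update solver are assumptions, not the dissertation's code); it uses a simple EM-style fixed-point update, which is one standard way to solve this concave problem:

```python
import numpy as np

def optimal_pool_weights(dens, n_iter=500):
    """Hypothetical helper: dens is a (T, n) array whose (t, i) entry is
    model i's one-step-ahead predictive density evaluated at the realized
    outcome y_t. Returns simplex weights w maximizing the log predictive
    score sum_t log(sum_i w_i * dens[t, i])."""
    T, n = dens.shape
    w = np.full(n, 1.0 / n)                  # start from equal weights
    for _ in range(n_iter):
        pool = dens @ w                      # (T,) pooled predictive densities
        # EM-style multiplicative update: average posterior "responsibility"
        # of each model across the evaluation sample
        w = w * (dens / pool[:, None]).mean(axis=0)
        w /= w.sum()                         # renormalize onto the simplex
    return w

# Toy example with two hypothetical models' predictive densities
rng = np.random.default_rng(0)
dens = rng.uniform(0.05, 1.0, size=(200, 2))
w = optimal_pool_weights(dens)
log_score_pool = np.log(dens @ w).sum()
log_score_best = max(np.log(dens[:, i]).sum() for i in range(2))
```

Because each individual model corresponds to a corner of the weight simplex, the optimized pool's log score can never fall below that of the best single model on the same sample, which is the sense in which the pool weakly dominates its constituents in-sample.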
Bayesian, Minnesota Prior, Optimal Prediction Pool, Vector Autoregression
Copyright 2010 Weijie Mao