3. Compute $H^{(i)} = \operatorname{cov}(g^{(i)})$ (calculate the estimated covariance of the GMM coefficients).
4. Compute $H_p^{(i)} = [H_0^{-1} + (H^{(i)})^{-1}]^{-1}$ (combine the prior variance with the data variance to get the posterior variance).
5. Compute $g_p^{(i)} = H_p^{(i)}[H_0^{-1} g_0 + (H^{(i)})^{-1} g^{(i)}]$ (combine the prior mean of the coefficients with the moment conditions through the maxent principle to get the posterior mean).
6. Draw $\gamma^{(i)}$ from $\mathrm{MVN}(g_p^{(i)}, H_p^{(i)})$ (draw candidate parameters from a multivariate normal distribution).
7. If $\gamma^{(i)} \in R$, continue; otherwise return to step 6 (enforce the restrictions that impose economic theory).
8. Compute $S^{(i)}$ from the residuals.
9. Return to step 1, conditioning on the new values of all parameters.
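Steps 3 through 7 above can be sketched in a few lines of numpy. This is an illustrative sketch, not the paper's code: the function name `gibbs_draw`, the restriction test `in_region`, and the rejection cap `max_tries` are all hypothetical names introduced here, and the prior is assumed to enter through its mean $g_0$ and covariance $H_0$ as in the formulas.

```python
import numpy as np

rng = np.random.default_rng(0)

def gibbs_draw(g_i, H_i, g0, H0, in_region, max_tries=10_000):
    """One candidate draw of gamma (steps 3-7): combine the prior
    (g0, H0) with the data-based estimate (g_i, H_i) by precision
    weighting, then sample from the resulting normal, rejecting
    draws that fall outside the restriction region R."""
    H0_inv = np.linalg.inv(H0)
    Hi_inv = np.linalg.inv(H_i)
    # Step 4: posterior covariance Hp = [H0^-1 + (H^(i))^-1]^-1
    Hp = np.linalg.inv(H0_inv + Hi_inv)
    # Step 5: posterior mean gp = Hp [H0^-1 g0 + (H^(i))^-1 g^(i)]
    gp = Hp @ (H0_inv @ g0 + Hi_inv @ g_i)
    # Steps 6-7: draw from MVN(gp, Hp) until the draw satisfies R
    for _ in range(max_tries):
        gamma = rng.multivariate_normal(gp, Hp)
        if in_region(gamma):
            return gamma
    raise RuntimeError("no draw satisfied the restrictions")
```

With equal prior and data precisions the posterior mean sits halfway between the prior mean and the GMM estimate, which makes the precision-weighting in steps 4 and 5 easy to verify by hand.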
In the above, GMM represents an operator to compute a standard GMM estimator
with four arguments representing the y data, the X data, instruments, and fixed covariance
matrix, respectively. To begin this procedure, arbitrary initial values $\gamma^{(0)}$ and $S^{(0)}$ are
needed; we use GMM estimates for this purpose. Then steps 1 through 8 are repeated in
a loop, each step conditioned on the most recent values of all other parameters and values
in the process. Such a process converges to a random sample from the full joint posterior
distribution as in Chib (1995). For details on performing MCMC with these and other
distributions, see Tanner (1996).
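For concreteness, the GMM operator with those four arguments might look like the following sketch. This is an assumption, not the paper's implementation: it uses the standard linear-GMM formula with the inverse of the fixed covariance matrix $S$ as the weight matrix, and the function name `gmm` is introduced here.

```python
import numpy as np

def gmm(y, X, Z, S):
    """Linear GMM with instruments Z and a fixed moment covariance S.
    Minimizes the quadratic form in the sample moments Z'(y - X g)
    with weight matrix S^-1 (a standard convention; the paper's exact
    weighting may differ)."""
    W = np.linalg.inv(S)
    A = X.T @ Z @ W @ Z.T @ X
    b = X.T @ Z @ W @ Z.T @ y
    return np.linalg.solve(A, b)
```

In the exactly identified case with `Z = X` this reduces to ordinary least squares, which gives a quick sanity check on the formula.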
The first 500 draws were discarded to remove dependence on the initial conditions.
We then continued drawing 3,000 more parameter vectors for computation of the posterior
distribution. Computation of the posterior standard deviations proved this number of
draws to be sufficient. To test convergence the posterior means were compared to those
of other runs of the Gibbs sampler and to subsamples of the 3,000 draws from the run
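The burn-in and subsample comparison just described can be sketched as follows; the function name `split_means` and the choice of three equal subsamples are illustrative assumptions, not the authors' procedure.

```python
import numpy as np

def split_means(draws, burn_in=500, n_splits=3):
    """Discard the burn-in draws, then compute posterior means on
    consecutive subsamples of the retained chain. Close agreement
    across subsamples (and across independent runs) is the informal
    convergence check described in the text."""
    kept = np.asarray(draws)[burn_in:]
    return [chunk.mean(axis=0) for chunk in np.array_split(kept, n_splits)]
```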