problems. Patients come with a wide variety of treatment histories, different geno-
types, family histories, etc. Peptides are related to different biologic functions, have
different interactions, etc. One approach to address this problem is the use of more
flexible prior probability models. In particular, non-parametric Bayesian models have
been used to generalize parametric models. A technical definition of a non-parametric
Bayesian model p(η) is a probability model that allows η to be infinite dimensional,
like a random probability measure. Let N(x; m, s) denote a normal kernel with mo-
ments (m, s). For example,
\[
p(\theta) = \int N(\theta; m, s) \, dG(m)
\]
is a mixture of normals indexed by the mixing measure G (and a scale s). Here G
is a random probability measure. The model is completed with a hyperprior p(G)
on G. Since the probability measure G is infinite dimensional, this is formally a
non-parametric Bayesian model. Often the resulting model for θ is then also referred
to as “non-parametric”.
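When G is a discrete probability measure with atoms m_h and weights w_h, the integral reduces to the finite sum p(θ) = Σ_h w_h N(θ; m_h, s), which is straightforward to evaluate numerically. A minimal Python sketch of this evaluation (the specific atoms, weights, and the reading of the scale s as a standard deviation are illustrative assumptions, not from the text):

```python
import numpy as np
from scipy.stats import norm

def mixture_density(theta, atoms, weights, s):
    """p(theta) = sum_h w_h N(theta; m_h, s) for a discrete mixing measure G."""
    theta = np.atleast_1d(theta)[:, None]            # shape (n, 1) for broadcasting
    return (weights * norm.pdf(theta, loc=atoms, scale=s)).sum(axis=1)

# Example: G puts weights (0.3, 0.5, 0.2) on atoms (-2, 0, 3); common scale s = 1
atoms = np.array([-2.0, 0.0, 3.0])
weights = np.array([0.3, 0.5, 0.2])
grid = np.linspace(-5.0, 6.0, 111)
p = mixture_density(grid, atoms, weights, s=1.0)     # density values on the grid
```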
Perhaps the most popular non-parametric Bayesian models are based on the Dirichlet
process (DP) prior, p(G) = DP(α, G0). The DP is defined in Ferguson (1973) and
Antoniak (1974). Good recent reviews of such models appear in Quintana and Müller
(2004) and in Walker, Damien, Laud and Smith (1999).
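A constructive definition of the DP is Sethuraman's stick-breaking representation: a draw G ~ DP(α, G0) is almost surely discrete, with weights built from Beta(1, α) stick fractions and atoms drawn i.i.d. from G0. The sketch below (the truncation level, α, and G0 are illustrative choices, not from the text) simulates an approximate draw of G and then samples θ from the resulting mixture of normals:

```python
import numpy as np

def dp_stick_breaking(alpha, g0_sampler, n_atoms=200, rng=None):
    """Truncated stick-breaking approximation to a draw G ~ DP(alpha, G0).

    Returns atoms m_h ~ G0 and weights w_h = beta_h * prod_{l<h} (1 - beta_l),
    with beta_h ~ Beta(1, alpha).
    """
    rng = np.random.default_rng(rng)
    betas = rng.beta(1.0, alpha, size=n_atoms)
    remaining = np.concatenate([[1.0], np.cumprod(1.0 - betas[:-1])])
    weights = betas * remaining
    atoms = g0_sampler(rng, n_atoms)
    return atoms, weights

# Draw G ~ DP(alpha = 1, G0 = N(0, 1)), then sample theta from the implied
# mixture of normals p(theta) = sum_h w_h N(theta; m_h, s) with fixed scale s.
rng = np.random.default_rng(0)
atoms, weights = dp_stick_breaking(1.0, lambda r, n: r.normal(0.0, 1.0, n), rng=1)
s = 0.5
idx = rng.choice(len(atoms), size=1000, p=weights / weights.sum())
theta = rng.normal(atoms[idx], s)                    # 1000 draws from the mixture
```

Truncating at a few hundred atoms is a standard approximation, since the stick-breaking weights decay geometrically in expectation.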
The term non-parametric Bayesian inference could be considered a misnomer, since
the defining property is the exact opposite: an infinite dimensional parameter space.
But the terminology is traditional, and simply motivated by the fact that inference
closely resembles classical non-parametric inference, such as kernel density
estimation. From a data analysis perspective, the infinite dimensional nature of the
parameter space is not critical, and I refer to any similarly flexible probability model
as non-parametric. In particular, inference based on mixture model generalizations of
underlying parametric models is usually considered “non-parametric” inference.