Thursday, September 10, 2009

Why we need Gibbs sampling

Why do we need Gibbs sampling? This is a question I often ask myself when I study the technique of Gibbs sampling for mixture models. The fundamental problem that Gibbs sampling aims to solve is to approximate the posterior of the model parameters, $p(\Theta \mid X)$. This posterior is essential for understanding an unknown model parameterized by $\Theta$. Furthermore, in many situations the posterior distribution is needed to compute quantities that can be expressed as expectations over $p(\Theta \mid X)$, such as sufficient statistics, model selection/comparison, Bayesian prediction, etc.
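
For example, once samples $\Theta^{(1)}, \dots, \Theta^{(T)}$ from the posterior are available, any such expectation can be approximated by a simple sample average (a standard Monte Carlo estimate, added here only as an illustration):

$$\mathbb{E}_{p(\Theta \mid X)}\big[f(\Theta)\big] \;\approx\; \frac{1}{T} \sum_{t=1}^{T} f\big(\Theta^{(t)}\big).$$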


The goal of Gibbs sampling is to draw a set of samples $\{\Theta^{(1)}, \dots, \Theta^{(T)}\}$ from $p(\Theta \mid X)$. To avoid the difficulty of drawing samples directly from $p(\Theta \mid X)$, Gibbs sampling instead samples from the full conditional $p(\theta_i \mid \Theta_{-i}, X)$, where $\Theta_{-i}$ denotes all components of $\Theta$ except $\theta_i$, cycling through the components $i$ in turn; this procedure forms a Markov chain with $p(\Theta \mid X)$ as its equilibrium distribution. So the key problem to be solved in developing a Gibbs sampling algorithm is to derive $p(\theta_i \mid \Theta_{-i}, X)$ and to find a way to draw samples from this distribution.
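
To illustrate the alternating-conditional idea (a toy example in Python, not the mixture-model case the post refers to), here is a minimal sketch of a Gibbs sampler for a standard bivariate normal with correlation rho, where both full conditionals are univariate normals that are easy to sample from:

```python
# Minimal Gibbs sampler sketch: bivariate normal with correlation rho.
# Each full conditional p(x | y) and p(y | x) is a univariate normal.
import numpy as np

def gibbs_bivariate_normal(rho=0.8, n_samples=5000, seed=0):
    rng = np.random.default_rng(seed)
    x, y = 0.0, 0.0                       # arbitrary starting point
    samples = np.empty((n_samples, 2))
    for t in range(n_samples):
        # Sample x from its full conditional: p(x | y) = N(rho * y, 1 - rho^2)
        x = rng.normal(rho * y, np.sqrt(1.0 - rho**2))
        # Sample y from its full conditional: p(y | x) = N(rho * x, 1 - rho^2)
        y = rng.normal(rho * x, np.sqrt(1.0 - rho**2))
        samples[t] = (x, y)
    return samples

if __name__ == "__main__":
    draws = gibbs_bivariate_normal()
    print(draws[1000:].mean(axis=0))      # close to (0, 0) after discarding burn-in
    print(np.corrcoef(draws[1000:].T))    # off-diagonal entries close to rho
```

In the mixture-model setting the post has in mind, the same alternation would run over the component assignments and the component parameters instead of two scalar coordinates, but the mechanics of deriving and sampling each full conditional are the same.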

