Variational Inference for Bayesian Networks
John Winn, January 2004

4. Extending the Gaussian model to a Gaussian mixture model

Our aim is to create a Gaussian mixture model and so we must extend our simple Gaussian model to be a mixture with K Gaussian components. As there will now be K sets of the latent variables μ and γ, these are placed in a new plate, called K, whose size is set to 20. We modify the conditional distribution for the x node to be a mixture of dimension K, with each component being Gaussian. The display is then as shown below.
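The resulting conditional distribution for x can be sketched as a density that is a weighted sum of K Gaussian components, each with its own mean μ and precision γ. The following minimal sketch (the function and argument names are illustrative, not part of VIBES) shows a one-dimensional version of this density:

```python
import numpy as np

def mixture_density(x, means, precisions, weights):
    """Density of a 1-D mixture of K Gaussian components.

    Component k is N(means[k], 1/precisions[k]), i.e. gamma is a
    precision as in the tutorial's Gaussian model.  `weights` are the
    mixing proportions and must sum to 1.
    """
    comps = np.sqrt(precisions / (2 * np.pi)) * np.exp(
        -0.5 * precisions * (x - means) ** 2)
    return np.dot(weights, comps)
```

For example, an equal-weight mixture of two standard Gaussians at the same location recovers the single-Gaussian density.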

The model is currently incomplete as making x a mixture requires a new discrete Index parent to indicate which component distribution each data point was drawn from. We must therefore create a new node λ, sitting in the N plate, to represent this new discrete latent variable. We also create a node π with a Dirichlet distribution which provides a prior over λ. The completed mixture model is shown below.
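The generative process this completed network describes can be sketched as follows: π is drawn from a Dirichlet prior, each indicator λ selects a component according to π, and each data point x is drawn from the selected Gaussian. All concrete values below (K, N, the Dirichlet parameters, the means and precisions) are illustrative assumptions, not values from the tutorial:

```python
import numpy as np

rng = np.random.default_rng(0)
K, N = 3, 5  # illustrative plate sizes

# pi ~ Dirichlet: prior over the mixing proportions (alpha chosen arbitrarily)
pi = rng.dirichlet(np.ones(K))

# lambda_n: discrete indicator in the N plate selecting a component per point
lam = rng.choice(K, size=N, p=pi)

# mu_k, gamma_k: per-component mean and precision in the K plate
# (illustrative values)
mu = np.array([-2.0, 0.0, 2.0])
gamma = np.array([1.0, 4.0, 1.0])

# x_n ~ N(mu[lambda_n], 1/gamma[lambda_n])
x = rng.normal(mu[lam], 1.0 / np.sqrt(gamma[lam]))
```

Variational inference in VIBES then reverses this process, inferring posterior distributions over π, λ, μ and γ from the observed x.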

Note: If you want to skip constructing this network by hand, it is in the tutorial file called MixtureOfGaussians 2D.xml.