- Research
- Open Access

# Dependent Gaussian mixture models for source separation

- Alicia Quirós^{1} (corresponding author)
- Simon P Wilson^{2}

**2012**:239

https://doi.org/10.1186/1687-6180-2012-239

© Quirós and Wilson; licensee Springer. 2012

**Received:** 31 January 2012 · **Accepted:** 30 October 2012 · **Published:** 16 November 2012

## Abstract

Source separation is a common task in signal processing and is often analogous to factor analysis. In this study, we look at a factor analysis model for source separation of multi-spectral image data where prior information about the sources and their dependencies is quantified as a multivariate Gaussian mixture model with an unknown number of factors. Variational Bayes techniques for model parameter estimation are used. The development of this methodology is motivated by the need to bring an efficient solution to the separation of components in the microwave radiation maps that are being obtained by the satellite mission Planck which has the objective of uncovering cosmic microwave background radiation. The proposed algorithm successfully incorporates a rich variety of prior information available to us in this problem in contrast to many previous solutions that assume completely blind separation of the sources. Results on realistic simulations of Planck maps and on Wilkinson microwave anisotropy probe fifth year images are shown. The technique suggested is easily applicable to other source separation applications by modifying some of the priors.

## Keywords

- CMB
- Gaussian mixture priors
- Variational Bayes

## Introduction

The discovery of the cosmic microwave background (CMB) provides strong evidence for the Big Bang theory of the formation and development of the universe. According to the theory, the early universe was smaller and hotter but cooled as it expanded. Once the temperature dropped to about 3000 K, photons were free to propagate without being scattered off ionized matter; the CMB is an image of this event and is visible across the entire sky. Three satellites have been launched to measure the CMB: the cosmic background explorer, the Wilkinson microwave anisotropy probe (WMAP) and, most recently, the Planck surveyor. Planck provides the highest-resolution data to date, of the order of 10^{7} pixels across the sky measured at nine frequency channels.

To date, there have been several attempts to perform this separation in a Bayesian framework, using both (a) a Gaussian mixture model (GMM) prior [3], and (b) Markov random field (MRF) priors [4, 5]. Full-sky maps at low resolution obtained through MCMC, using masks to reduce the effect of the signal in the galactic plane, were described in [6]. Some of these are fully Bayesian source separation methods, developed to separate the underlying CMB from the mixed microwave signals observed at several frequencies.

A common assumption among works in the literature is the independence of the cosmological sources. Although it is well known that the CMB is independent of the rest of the sources, the galactic sources show significant statistical dependence among themselves, as stated in [1]. Recently, a small number of researchers have started addressing this problem [7, 8]. Various dependent component analysis approaches are compared in [9], demonstrating their superior performance with respect to classical ICA.

In this study, we present a dependent components model for source separation of multi-spectral image data, where prior information about the sources and between-source dependencies is quantified as a multivariate GMM, using variational Bayes techniques for model parameter estimation. This article can thus be considered an extension of [3], modeling between-source dependencies by generalizing the prior to a multivariate GMM.

The rest of the article is structured as follows. The next section gives the model for the mixing problem and describes the hierarchical Bayesian model that we use, including the prior we assume for the sources. Section “Implementing the source separation” describes the variational Bayes approach we use for the implementation of the separation. Section “Examples” provides results on both synthetic Planck and real WMAP images. Finally, we provide a discussion of the results in the last section.

## Model

The data consist of $n_f$ maps of the sky at frequencies $(\nu_1,\dots,\nu_{n_f})$, each map consisting of $J$ pixels. The data are denoted $\mathbf{d}_j \in \mathbb{R}^{n_f}$, $j = 1,\dots,J$. The source model consists of $n_s$ sources and is represented by the vectors $\mathbf{s}_j \in \mathbb{R}^{n_s}$, with each component representing the amplitude of a physical source of microwaves. We assume that the $\mathbf{d}_j$ can be represented as a linear combination of the $\mathbf{s}_j$:

$$\mathbf{d}_j = \mathbf{A}\,\mathbf{s}_j + \mathbf{e}_j,$$

where $\mathbf{A}$ is an $n_f \times n_s$ “mixing” matrix and $\mathbf{e}_j$ is a vector of $n_f$ independent Gaussian error terms with precisions (inverse variances) $\boldsymbol{\tau} = (\tau_1,\dots,\tau_{n_f})$. For convenience, define $\mathbf{D} = (\mathbf{d}_1,\dots,\mathbf{d}_J)$ and $\mathbf{S} = (\mathbf{s}_1,\dots,\mathbf{s}_J)$ to represent all data and sources.

We assume dependence between the sources, defined by a prior distribution *p*(**S**|*ψ*) with parameters *ψ*. The goal is to estimate **S** and the parameters *ψ* associated with the model for **S**, given observation of **D**. The noise precisions **τ** and the mixing matrix **A** are assumed known. A GMM is used to represent the non-Gaussian sources, in which case this is an example of a model known as a mixture of factor analyzers [10]. As in [10], we adopt a Bayesian approach to the data fitting, implemented by a variational Bayes approach.
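The linear mixing model above can be illustrated with a minimal generative sketch in Python (the article's own computations used Matlab); the dimensions, mixing matrix values, and noise precisions below are hypothetical placeholders, not the article's values:

```python
import random

# Hypothetical dimensions: n_f observation channels, n_s sources, J pixels
n_f, n_s, J = 5, 3, 4
random.seed(0)

# A hypothetical n_f x n_s mixing matrix with its first row constrained to
# ones, mirroring the identifiability constraint used in the article.
A = [[1.0] * n_s] + [[random.uniform(0.1, 2.0) for _ in range(n_s)]
                     for _ in range(n_f - 1)]

# Channel noise precisions (inverse variances), one per frequency
tau = [100.0] * n_f

# Draw source vectors s_j and form d_j = A s_j + e_j for each pixel j
sources = [[random.gauss(0.0, 1.0) for _ in range(n_s)] for _ in range(J)]
data = []
for s in sources:
    d = []
    for i in range(n_f):
        mean = sum(A[i][k] * s[k] for k in range(n_s))   # (A s_j)_i
        d.append(random.gauss(mean, tau[i] ** -0.5))     # noise with precision tau_i
    data.append(d)
```

Each `data[j]` is then one observed pixel vector $\mathbf{d}_j$, perturbed around $\mathbf{A}\mathbf{s}_j$ by channel noise of standard deviation $\tau_i^{-1/2}$.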

Each element of this distribution is defined next in turn.

### Noise structure

The noise, $\mathbf{e}_j$, is assumed independent within and between pixels $j$ and across frequencies, which gives

$$p(\mathbf{D} \mid \mathbf{S}, \mathbf{A}, \boldsymbol{\tau}) = \prod_{j=1}^{J}\prod_{i=1}^{n_f} \sqrt{\frac{\tau_i}{2\pi}}\, \exp\!\left(-\frac{\tau_i}{2}\,\bigl(d_{ij} - \mathbf{A}_{i\cdot}\,\mathbf{s}_j\bigr)^2\right),$$

where $\mathbf{A}_{i\cdot}$ is the $i$th row of $\mathbf{A}$.

### Mixing matrix structure

In this application, **A** is parameterized and denoted **A**(*θ*). Each column of **A**(*θ*) is the contribution to the observation of a source at different frequencies, which is written as a function of the frequencies and *θ*. These parameterizations are approximations that come from the current state of knowledge about how the sources are generated. Here, we merely state the parameterization that we are going to use, and refer to [11] for a more detailed exposition on the background to them. Some restrictions are usually placed on **A**(*θ*) in order to force a unique solution; this is achieved here by setting the first row of **A**(*θ*) to be ones.

The CMB corresponds to the first column of $\mathbf{A}(\theta)$. It is modeled as a black body at a fixed temperature, and its contribution is a known constant at each frequency. The parametrization of the mixing matrix is given, element-wise for row $i$, as

$$
\mathbf{A}_{i1}(\theta) = \frac{g(\nu_i)}{g(\nu_1)}, \qquad
\mathbf{A}_{i2}(\theta) = \left(\frac{\nu_i}{\nu_1}\right)^{\kappa_s}, \qquad
\mathbf{A}_{i3}(\theta) = \left(\frac{\nu_i}{\nu_1}\right)^{\kappa_d+1}\frac{e^{h\nu_1/k_B T_1}-1}{e^{h\nu_i/k_B T_1}-1}, \qquad
\mathbf{A}_{i4}(\theta) = \left(\frac{\nu_i}{\nu_1}\right)^{\kappa_f},
$$

where $g(\nu) = x^2 e^x/(e^x-1)^2$ with $x = h\nu/(k_B T_0)$, $T_0 = 2.725\,$K is the average CMB temperature, $h$ is the Planck constant, and $k_B$ is Boltzmann's constant. The ratio $g(\nu_i)/g(\nu_1)$ is designed to ensure that $\mathbf{A}_{11}(\theta) = 1$, as we constrain the first row of $\mathbf{A}(\theta)$ to be ones. Here $T_1 = 18.1\,$K is the assumed thermodynamic temperature of the dust grains; column 2 corresponds to synchrotron emission, column 3 to galactic dust, and column 4 to free–free emission. There are three unknown model parameters for $\mathbf{A}$: the spectral index for synchrotron, $\kappa_s \in \{\kappa_s : -3.0 \le \kappa_s \le -2.3\}$; for dust, $\kappa_d \in \{\kappa_d : 1 \le \kappa_d \le 2\}$; and for free–free emission, $\kappa_f \in \{\kappa_f : -2.3 \le \kappa_f \le -2.0\}$.
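The mixing matrix construction can be sketched in Python (the article's computations used Matlab); the power-law and modified black-body column forms used below are standard conventions assumed here rather than quoted from the article, and the channel frequency list is a placeholder:

```python
import math

h = 6.62607015e-34   # Planck constant, J s
kB = 1.380649e-23    # Boltzmann constant, J/K
T0 = 2.725           # average CMB temperature, K
T1 = 18.1            # assumed dust grain temperature, K

def g(nu):
    """CMB black-body factor: x^2 e^x / (e^x - 1)^2 with x = h nu / (kB T0)."""
    x = h * nu / (kB * T0)
    return x ** 2 * math.exp(x) / (math.exp(x) - 1.0) ** 2

def mixing_matrix(freqs, kappa_s, kappa_d, kappa_f):
    """Assumed A(theta); columns: CMB, synchrotron, galactic dust, free-free."""
    nu1 = freqs[0]
    A = []
    for nu in freqs:
        cmb = g(nu) / g(nu1)                 # guarantees A[0][0] == 1
        sync = (nu / nu1) ** kappa_s
        dust = ((nu / nu1) ** (kappa_d + 1)
                * math.expm1(h * nu1 / (kB * T1))
                / math.expm1(h * nu / (kB * T1)))
        ff = (nu / nu1) ** kappa_f
        A.append([cmb, sync, dust, ff])
    return A

# Placeholder channel frequencies in Hz (loosely Planck-like, 30-353 GHz)
freqs = [30e9, 44e9, 70e9, 100e9, 143e9, 217e9, 353e9]
A = mixing_matrix(freqs, kappa_s=-2.9, kappa_d=1.8, kappa_f=-2.15)
```

By construction the first row of `A` is all ones, matching the identifiability constraint, and the synchrotron and free–free columns decay with frequency because their spectral indices are negative.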

### The sources

Each $\mathbf{s}_j$ is modeled as a GMM with $m$ factors. The model proposed allows for between-source dependence; the vector of sources at a pixel is a mixture of multivariate Gaussians,

$$p(\mathbf{s}_j \mid \psi) = \sum_{a=1}^{m} w_a \, \mathrm{MVN}\!\left(\mathbf{s}_j ; \boldsymbol{\mu}_a, Q_a^{-1}\right),$$

for mixture component weights $w_a$, mean vectors $\boldsymbol{\mu}_a$, and precision matrices $Q_a$, so that $\psi$ consists of all the $w_a$, $\boldsymbol{\mu}_a$ and $Q_a$, with $a = 1,\dots,m$. Note that, in the standard inflationary cosmological model, the CMB is a single multivariate Gaussian ($m = 1$) while the galactic foregrounds might require $m > 1$ to be correctly modeled. To accommodate this, we set the CMB entries for components $2,\dots,m$ of the multivariate mixture to exactly zero in the implementation.
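A pixel's source vector can be drawn from such a mixture as follows (a hedged stdlib sketch; the weights, means, and precisions are hypothetical, diagonal precisions replace the article's full matrices $Q_a$, and the second component's CMB mean is pinned to zero as described above):

```python
import random

random.seed(1)

# Hypothetical 2-component mixture over (CMB, foreground) source pairs.
# The CMB entry of every component mean beyond the first is zero.
w = [0.7, 0.3]                     # mixture weights, summing to 1
mu = [[0.0, 0.0], [0.0, 2.0]]      # component means
prec = [[4.0, 1.0], [4.0, 9.0]]    # diagonal precisions (simplification)

def draw_source():
    a = random.choices(range(len(w)), weights=w)[0]   # pick component z_j = a
    return [random.gauss(mu[a][k], prec[a][k] ** -0.5) for k in range(2)]

samples = [draw_source() for _ in range(5000)]
```

The marginal mean of the second coordinate is $\sum_a w_a \mu_{a,2} = 0.3 \times 2 = 0.6$, which the sample average should approach.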

### Priors

The remaining term in Equation (2) is *p*(*ψ*). We use conjugate prior distributions [12], which facilitate the computation of the posterior and yet are flexible enough to incorporate good prior information: Gaussians for the component means, a Dirichlet for the component weights, and Wisharts for the precision matrices. In the microwave source application, background knowledge about the magnitude of the sources can be incorporated by specifying the values of the parameters of these prior distributions. This prior specification follows [13], who discuss how to specify these values in more detail.
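In one dimension the conjugate structure reduces to familiar distributions, which can be sketched with the standard library (the hyperparameters below are hypothetical, and a Gamma on a scalar precision stands in for the Wishart on a precision matrix):

```python
import random

random.seed(2)

m = 2  # hypothetical number of mixture components

# Dirichlet(delta, ..., delta) draw for the weights, via normalized Gammas
delta = 1.0
gam = [random.gammavariate(delta, 1.0) for _ in range(m)]
w = [x / sum(gam) for x in gam]

# Gaussian priors for the component means (hypothetical hyperparameters)
mu = [random.gauss(0.0, 1.0) for _ in range(m)]

# One-dimensional stand-in for the Wishart: Gamma priors on precisions
Q = [random.gammavariate(2.0, 0.5) for _ in range(m)]
```

Tightening or loosening these hyperparameters is exactly where background knowledge about source magnitudes enters the model.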

## Implementing the source separation

The posterior developed in the previous section does not lend itself to an analytical solution. MCMC techniques are one approach that lets us evaluate complicated integrals by sampling rather than by analytical or numerical methods. The main criticism of Bayesian source separation with sampling methods, MCMC in particular, is their computational load and slow convergence. Regarding speed, they cannot compete with methods such as FastICA [14, 15].

There are several approaches to speeding up the algorithm, such as the strategies suggested in [16]. In the image source separation setting, a Langevin sampling scheme has been implemented [4] as a way to obtain a faster MCMC algorithm.

In this study, the source separation model presented in Section “Model” is implemented by a variational Bayesian approach [10, 17, 18], which allows for more efficient inference on large datasets than MCMC techniques. In essence, given the data **D** and a model with parameters *θ* and latent variables **Z**, the variational Bayes method approximates the posterior distribution $p(\mathbf{Z},\theta \mid \mathbf{D})$ with a factorized approximation $q(\mathbf{Z},\theta \mid \phi) = q(\mathbf{Z} \mid \phi_Z)\, q(\theta \mid \phi_\theta)$, where $\phi$ are the variational parameters. The approximation is fitted by minimizing the Kullback–Leibler divergence between $q$ and $p$, or equivalently by maximizing a lower bound on the marginal log-likelihood of the data.
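As a toy illustration of this fitting criterion (not the article's actual updates), minimizing the closed-form KL divergence between a Gaussian approximation $q = N(m, s^2)$ and a Gaussian target $p = N(\mu, \sigma^2)$ recovers the target's parameters; the target values below are arbitrary:

```python
import math

def kl_gauss(m, s, mu, sigma):
    """KL( N(m, s^2) || N(mu, sigma^2) ) in closed form."""
    return math.log(sigma / s) + (s ** 2 + (m - mu) ** 2) / (2 * sigma ** 2) - 0.5

mu, sigma = 1.5, 0.8   # hypothetical target posterior

# Grid search over the variational parameters (m, s)
grid_m = [i * 0.1 for i in range(-20, 41)]
grid_s = [i * 0.1 for i in range(1, 31)]
kl_min, m_star, s_star = min(
    (kl_gauss(m, s, mu, sigma), m, s) for m in grid_m for s in grid_s
)
```

The minimizer lands on $(m^\star, s^\star) = (\mu, \sigma)$ with KL essentially zero; in the article the same principle is applied coordinate-wise to a much larger factorized family.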

The variational posterior distributions are Wishart densities for the precision matrices, $Q_a$; normal densities for the means, $\boldsymbol{\mu}_a$; a Dirichlet density for the mixing coefficients, $\mathbf{p}$; and a discrete distribution for the indicators, $z_j$, each of which identifies the mixture component that explains the information in pixel $j$. We further derived the variational approximation to the marginal posterior of the sources, $\mathbf{s}_j$, which turns out to be a multivariate Gaussian distribution. In the resulting expressions, MVN stands for the multivariate normal distribution and $\Psi$ denotes the digamma function. Note that $q(z_j = a)$ is the probability that component $a$ is responsible for the information in pixel $j$ of the sources, $\mathbf{s}_j$.
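The role of the digamma function in the indicator update can be sketched with a simplified one-dimensional version (an illustrative approximation, not the article's exact update): under a Dirichlet variational posterior with counts $\alpha_a$, the expected log-weight is $\Psi(\alpha_a) - \Psi(\sum_b \alpha_b)$, and $q(z_j = a)$ is a softmax over that term plus a Gaussian log-density:

```python
import math

def digamma(x):
    """Digamma via the recurrence psi(x) = psi(x+1) - 1/x plus an asymptotic series."""
    r = 0.0
    while x < 10.0:
        r -= 1.0 / x
        x += 1.0
    f = 1.0 / (x * x)
    return r + math.log(x) - 0.5 / x - f * (1.0 / 12 - f * (1.0 / 120 - f / 252))

# Hypothetical Dirichlet variational posterior over two mixture weights
alpha = [3.0, 7.0]
Elog_w = [digamma(a) - digamma(sum(alpha)) for a in alpha]

def log_norm(x, mu, prec):
    """Log-density of N(mu, 1/prec) at x (1-D sketch)."""
    return 0.5 * math.log(prec / (2 * math.pi)) - 0.5 * prec * (x - mu) ** 2

# Responsibilities for one pixel's current source estimate s
s = 0.4
mus, precs = [0.0, 1.0], [4.0, 4.0]
logits = [Elog_w[a] + log_norm(s, mus[a], precs[a]) for a in range(2)]
mx = max(logits)
unnorm = [math.exp(l - mx) for l in logits]  # stable softmax
q_z = [u / sum(unnorm) for u in unnorm]      # q(z_j = a)
```

Here the larger Dirichlet count of component 2 outweighs the slightly better likelihood fit of component 1, so `q_z[1] > q_z[0]`.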

The variational parameters $\boldsymbol{\lambda}$, $\boldsymbol{\xi}_a$, $\beta_a$, $\boldsymbol{\eta}_a$, and $V_a$, for $j = 1,\dots,J$ and $a = 1,\dots,m$, are obtained from the corresponding coordinate-ascent update equations, iterated until the lower bound converges.

Computations were carried out using Matlab.

## Examples

### Analysis of simulated data

The data were simulated with spectral indices $\kappa_s = -2.9$ and $\kappa_d = 2.0$. Noise precisions were those published by the Planck research team [21]. After exploring several values for $m$, the number of components in the GMM source model was fixed at $m = 1$, as it provided the best fit, taking into account the compromise between fit and the number of parameters in the model.

The scatter plot compares the estimated and true values of the source corresponding to *A*^{⋆}, the CMB. We see from the scatter plot and from comparison with Figure 3 that the reconstruction of the CMB is very accurate here. The same is true for the other two sources, as shown in Figure 5.

**Mean estimate of parameters for simulated data**

$$\widehat{\mu} = \begin{pmatrix} 0.020 \\ 0.014 \\ 0.008 \end{pmatrix}, \qquad \widehat{Q} = \begin{pmatrix} 4.665 & 0 & 0 \\ 0 & 0.006 & 0.003 \\ 0 & 0.003 & 0.010 \end{pmatrix} \times 10^{8}$$

### Analysis of a WMAP year 5 patch

For the WMAP data, the number of components was set to *m* = 2, following the same reasoning as in the simulation study. Informative priors were placed on the GMM parameters, based on discussions of the expected marginal properties of the sources. Table 2 shows the mean of the mixture parameters of the model. Figure 7 shows the estimated CMB. The result obtained is in agreement with previous work [3], as can be seen in Figure 8, which shows a histogram of the differences between the estimated CMB using the approach presented here and the estimated CMB obtained in [3].

**Parameter estimated mean for WMAP data**

$$\widehat{p}_1 = 0.12, \qquad \widehat{\mu}_1 = \begin{pmatrix} -0.012 \\ 0.051 \\ -0.001 \\ 0.017 \end{pmatrix}, \qquad \widehat{Q}_1 = \begin{pmatrix} 5.29 & 0 & 0 & 0 \\ 0 & 1.47 & -0.04 & -1.60 \\ 0 & -0.04 & 0.41 & 0.32 \\ 0 & -1.60 & 0.32 & 3.79 \end{pmatrix} \times 10^{5}$$

$$\widehat{p}_2 = 0.88, \qquad \widehat{\mu}_2 = \begin{pmatrix} -0.012 \\ 0.029 \\ 0.003 \\ 0.017 \end{pmatrix}, \qquad \widehat{Q}_2 = \begin{pmatrix} 1.45 & 0 & 0 & 0 \\ 0 & 0.54 & 0.08 & -0.20 \\ 0 & 0.08 & 0.28 & 0.26 \\ 0 & -0.20 & 0.26 & 1.09 \end{pmatrix} \times 10^{4}$$

The data $d_{jk}$ are compared with the standardized residuals, with one figure for each frequency $k = 1,\dots,5$.

## Conclusion

A fully Bayesian factor analysis algorithm has been presented and applied to a multi-channel image source separation problem, where dependencies between sources are modeled as a multivariate GMM. The algorithm performs very well on simulated Planck data and has been applied to data from WMAP.

In this study, we extend previous approaches [3] by allowing the source priors to be a mixture of multivariate Gaussian distributions for each pixel.

The development of this methodology is motivated by the need to bring an efficient solution to the separation of components in the microwave radiation maps to be obtained by the satellite mission Planck, which has the objective of uncovering CMB radiation. The proposed algorithm successfully incorporates a rich variety of prior information available to us in this problem, in contrast to most previous work, which assumes completely blind separation of the sources. Further, the variational approach presented here overcomes the MCMC convergence problems stated in [23] when dealing with large datasets such as those that will be provided by the satellite mission Planck.

In the analysis of simulated data, the number of components in the GMM source model turned out to be *m* = 1. This means that the sources are multivariate Gaussian *a priori*. On the other hand, for the real data, the number of components is *m* = 2. In the blind source separation problem, identifiability relies on the independence of the sources. In this study, in spite of modeling the sources as Gaussians when *m* = 1, identifiability is obtained through the prior information incorporated into the model, giving structure to the mixing matrix.

Another type of dependence is spatial correlation within a source. Spatial dependence is most conveniently modeled by a Gaussian MRF, and some preliminary work on this idea can be found in [5]. Combined with cross-source correlations, one might ultimately consider a mixture of multivariate Gaussian MRFs as a prior for the sources. Implementing the analysis with such a prior would be a significant computational challenge; we hypothesize that it will be difficult to derive a well-behaved MCMC approach. Other functional approximations, such as that of [24], offer a feasible alternative for computing the posterior distribution in this case.

Finally, although the technique was developed with the astrophysical source separation problem in mind, it is general and applicable to other source separation problems as well.

## Declarations

### Acknowledgements

Simon Wilson was supported by the STATICA project, funded by the Principal Investigator program of Science Foundation Ireland, contract number 08/IN.1/I1879. The authors acknowledge the use of the PSM, developed by the Component Separation Working Group (WG2) of the Planck Collaboration.

## Authors’ Affiliations

## References

1. Kuruoglu EE: Bayesian source separation for cosmology. *IEEE Signal Process. Mag.* 2010, 27:43-54.
2. http://lambda.gsfc.nasa.gov/product/map/current/m_products.cfm
3. Wilson SP, Kuruoglu EE, Salerno E: Fully Bayesian source separation of astrophysical images modelled by a mixture of Gaussians. *IEEE J. Sel. Topics Signal Process.* 2008, 2(5):685-696.
4. Kayabol K, Kuruoglu EE, Sanz JL, Sankur B, Salerno E, Herranz D: Adaptive Langevin sampler for separation of t-distribution modelled astrophysical maps. *IEEE Trans. Image Process.* 2010, 19(9):2357-2368.
5. Kayabol K, Kuruoglu EE, Sankur B: Bayesian separation of images modeled with MRFs using MCMC. *IEEE Trans. Image Process.* 2009, 18(5):982-994.
6. Dickinson C, Eriksen HK, Banday AJ, Jewell JB, Gorski KM, Huey G, Lawrence CR, O'Dwyer IJ, Wandelt BD: Bayesian component separation and cosmic microwave background estimation for the five-year WMAP temperature data. *Astrophys. J.* 2010, 705:1607-1623.
7. Bedini L, Herranz D, Salerno E, Baccigalupi C, Kuruoglu EE, Tonazzini A: Separation of correlated astrophysical sources using multiple-lag data covariance matrices. *EURASIP J. Appl. Signal Process.* 2005, 2005(15):2400-2412. doi:10.1155/ASP.2005.2400
8. Bonaldi A, Bedini L, Salerno E, Baccigalupi C, De Zotti G: Estimating the spectral indices of correlated astrophysical foregrounds by a second-order statistical approach. *Monthly Notices R. Astron. Soc.* 2006, 373:271-279. doi:10.1111/j.1365-2966.2006.11025.x
9. Kuruoglu EE: Dependent component analysis for cosmology. *Lecture Notes Comput. Sci.* 2010, 6365:538-545. doi:10.1007/978-3-642-15995-4_67
10. Ghahramani Z, Beal M: Variational inference for Bayesian mixtures of factor analysers. In *Advances in Neural Information Processing Systems*. Edited by Solla SA, Leen TK, Müller KR. MIT Press, Cambridge, MA; 2000.
11. Eriksen HK, Dickinson C, Lawrence CR, Baccigalupi C, Banday AJ, Gorski KM, Hansen FK, Lilje PB, Pierpaoli E, Smith KM, Vanderlinde K: CMB component separation by parameter estimation. *Astrophys. J.* 2006, 641:665-682. doi:10.1086/500499
12. Lee PM: *Bayesian Statistics: An Introduction*. Hodder Arnold, London; 2004.
13. Richardson S, Green P: On Bayesian analysis of mixtures with an unknown number of components (with discussion). *J. R. Stat. Soc. Ser. B* 1997, 59:731-792. doi:10.1111/1467-9868.00095
14. Hyvärinen A: Fast and robust fixed-point algorithms for independent component analysis. *IEEE Trans. Neural Netw.* 1999, 10(3):626-634. doi:10.1109/72.761722
15. Leach S, Cardoso JF, Baccigalupi C, Barreiro R, Betoule M, Bobin J, Bonaldi A, Delabrouille J, De Zotti G, Dickinson C, Eriksen HK, González-Nuevo J, Hansen FK, Herranz D, Le Jeune M, López-Caniego M, Martínez-González E, Massardi M, Melin JB, Miville-Deschênes MA, Patanchon G, Prunet S, Ricciardi S, Salerno E, Sanz JL, Starck JL, Stivoli F, Stolyarov V, Stompor R, Vielva P: Component separation methods for the PLANCK mission. *Astron. Astrophys.* 2008, 491(2):597-615. doi:10.1051/0004-6361:200810116
16. Gilks WR, Richardson S, Spiegelhalter DJ: *Markov Chain Monte Carlo in Practice*. CRC Press, Boca Raton, FL; 1996.
17. MacKay DJC: *Information Theory, Inference and Learning Algorithms*. Cambridge University Press, Cambridge; 2003.
18. Bishop CM: *Pattern Recognition and Machine Learning*. Springer, New York; 2006.
19. Attias H: *A Variational Bayesian Framework for Graphical Models*. MIT Press, Cambridge, MA; 2000.
20. Planck Collaboration: The pre-launch Planck sky model: a model of sky emission at submillimetre to centimetre wavelengths (in preparation).
21. http://sci.esa.int/science-e/www/object/index.cfm?fobjectid=34730%26fbodylongid=1595
22. http://map.gsfc.nasa.gov/
23. Wilson SP, Kuruoglu EE, Quirós A: Bayesian factor analysis using Gaussian mixture sources, with application to separation of the cosmic microwave background. *2nd International Workshop on Cognitive Information Processing* 2010.
24. Rue H, Martino S, Chopin N: Approximate Bayesian inference for latent Gaussian models using integrated nested Laplace approximations. *J. R. Stat. Soc. Ser. B* 2008, 71:319-392.

## Copyright

This article is published under license to BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.