5. References
Beaujean, Frederik and Caldwell, A. Initializing adaptive importance sampling with Markov chains, 2013. arXiv:1304.7808
Bruneau, Pierrick and Gelgon, Marc and Picarougne, Fabien. Parsimonious reduction of Gaussian mixture models with a variational-Bayes approach, 2010. Pattern Recognition 43, pp. 850-858. DOI:10.1016/j.patcog.2009.08.006
Bishop, Christopher M. Pattern Recognition and Machine Learning, Springer 2006. ISBN:978-0-387-31073-2
Cappé, Olivier et al. Adaptive importance sampling in general mixture classes, 2008. Statistics and Computing 18, pp. 447-459. DOI:10.1007/s11222-008-9059-x
Cornuet, J.-M. et al. Adaptive Multiple Importance Sampling, 2012. Scandinavian Journal of Statistics 39(4), pp. 798-812. DOI:10.1111/j.1467-9469.2011.00756.x
Gelman, Andrew and Rubin, Donald B. Inference from Iterative Simulation Using Multiple Sequences, 1992. Statistical Science 7(4), pp. 457-472. DOI:10.1214/ss/1177011136
Goldberger, J. and Roweis, S. Hierarchical clustering of a mixture model, 2004. Advances in Neural Information Processing Systems 17. http://machinelearning.wustl.edu/mlpapers/paper_files/NIPS2005_63.pdf
Hoogerheide, Lennart and Opschoor, Anne and van Dijk, Herman K. A class of adaptive importance sampling weighted EM algorithms for efficient and robust posterior and predictive simulation, 2012. Journal of Econometrics 171(2), pp. 101-120. DOI:10.1016/j.jeconom.2012.06.011
Haario, Heikki and Saksman, Eero and Tamminen, Johanna. An Adaptive Metropolis Algorithm, 2001. Bernoulli 7(2), pp. 223-242. jstor:3318737
Kilbinger, Martin and Wraith, Darren and Robert, Christian P. and Benabed, Karim and Cappé, Olivier et al. Bayesian model comparison in cosmology with Population Monte Carlo, 2009. arXiv:0912.1614
Liu, Jun S. and Chen, Rong. Blind Deconvolution via Sequential Imputations, 1995. Journal of the American Statistical Association 90(430), pp. 567-576. DOI:10.1080/01621459.1995.10476549