1. Titterington, D. M. (1984). Recursive parameter estimation using incomplete data, Journal of the Royal Statistical Society: Series B, 46, 257-267.
2. Yin, H. and Allinson, N. (2001a). Self-organizing mixture networks for probability density estimation, IEEE Transactions on Neural Networks, 12, 405-411.
3. Yin, H. and Allinson, N. (2001b). Bayesian self-organizing map for Gaussian mixtures, IEE Proceedings - Vision, Image, and Signal Processing, 234-240.
4. Xu, L. and Jordan, M. I. (1996). On convergence properties of the EM algorithm for Gaussian mixtures, Neural Computation, 8, 129-151.
5. Ahn, S. and Baik, S. W. (2011). Estimating parameters in multivariate normal mixtures, The Korean Communications in Statistics, 18, 357-366.
6. Kullback, S. and Leibler, R. A. (1951). On information and sufficiency, The Annals of Mathematical Statistics, 22, 79-86.
7. Ciuperca, G., Ridolfi, A. and Idier, J. (2003). Penalized maximum likelihood estimator for normal mixtures, Scandinavian Journal of Statistics, 30, 45-59.
8. Dempster, A., Laird, N. and Rubin, D. (1977). Maximum likelihood from incomplete data via the EM algorithm, Journal of the Royal Statistical Society: Series B, 39, 1-38.
9. Hathaway, R. J. (1985). A constrained formulation of maximum likelihood estimation for normal mixture distributions, Annals of Statistics, 13, 795-800.
10. Redner, R. A. and Walker, H. F. (1984). Mixture densities, maximum likelihood and the EM algorithm, SIAM Review, 26, 195-239.
11. Robbins, H. and Monro, S. (1951). A stochastic approximation method, The Annals of Mathematical Statistics, 22, 400-407.