Mixmod

Model-based cluster and discriminant analysis with the MIXMOD software: the Mixture Modeling (MIXMOD) program fits mixture models to a given data set for the purposes of density estimation, clustering, or discriminant analysis. A large variety of algorithms for estimating the mixture parameters is provided (EM, Classification EM, Stochastic EM), and these can be combined to form different strategies for reaching a sensible maximum of the likelihood (or complete-data likelihood) function. MIXMOD is currently intended for multivariate Gaussian mixtures, and fourteen different Gaussian models can be distinguished according to different assumptions on the eigenvalue decomposition of the component covariance matrices. Moreover, different information criteria for choosing a parsimonious model (for instance, the number of mixture components) are included, their suitability depending on the particular perspective (cluster analysis or discriminant analysis). Written in C++, MIXMOD is interfaced with SCILAB and MATLAB. The program, the statistical documentation and the user guide are available on the internet at the following address: http://www-math.univ-fcomte.fr/mixmod/index.php. (Source: https://www.projet-plume.org/en/relier/mixmod)
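
The fourteen Gaussian models mentioned above arise from constraining the terms of the spectral (eigenvalue) decomposition of each component covariance matrix. The following LaTeX fragment is a sketch of that parameterisation in the notation of Celeux and Govaert's parsimonious Gaussian models, on which MIXMOD builds; the exact notation in the MIXMOD documentation may differ slightly.

    % Eigenvalue (spectral) decomposition of the k-th component covariance
    % matrix: lambda_k = volume, D_k = orientation, A_k = shape.
    \[
      \Sigma_k = \lambda_k \, D_k \, A_k \, D_k^{\top},
      \qquad
      \lambda_k = |\Sigma_k|^{1/d}, \quad
      D_k \ \text{orthogonal}, \quad
      A_k \ \text{diagonal with } \det(A_k) = 1 .
    \]
    % Requiring lambda_k, D_k or A_k to be common to all components, or
    % restricting Sigma_k to diagonal or spherical form, yields the family
    % of parsimonious Gaussian models referred to above.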


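As an illustration of the parameter-estimation step such software performs, the sketch below implements a plain EM algorithm for a Gaussian mixture with unconstrained full covariance matrices in Python/NumPy. It is not MIXMOD code and does not reproduce its API: the function name em_gaussian_mixture, the random-responsibility initialisation, and the small regularisation term are assumptions made for this example, and MIXMOD's CEM/SEM variants, parsimonious covariance models, and model-selection criteria are omitted.

    import numpy as np

    def em_gaussian_mixture(X, K, n_iter=100, seed=0):
        """Fit a K-component Gaussian mixture with full covariances via EM.

        Illustrative sketch only; not MIXMOD's implementation or interface.
        """
        rng = np.random.default_rng(seed)
        n, d = X.shape

        # Initialisation: random responsibilities (one of many possible schemes).
        resp = rng.dirichlet(np.ones(K), size=n)

        for _ in range(n_iter):
            # M step: update proportions, means and covariances from responsibilities.
            nk = resp.sum(axis=0)                      # effective component sizes
            pi = nk / n                                # mixing proportions
            mu = (resp.T @ X) / nk[:, None]            # component means
            cov = np.empty((K, d, d))
            for k in range(K):
                diff = X - mu[k]
                cov[k] = (resp[:, k, None] * diff).T @ diff / nk[k]
                cov[k] += 1e-6 * np.eye(d)             # regularisation (assumed) for stability

            # E step: recompute responsibilities from the current parameters.
            log_prob = np.empty((n, K))
            for k in range(K):
                diff = X - mu[k]
                _, logdet = np.linalg.slogdet(cov[k])
                maha = np.einsum('ij,jk,ik->i', diff, np.linalg.inv(cov[k]), diff)
                log_prob[:, k] = np.log(pi[k]) - 0.5 * (d * np.log(2 * np.pi) + logdet + maha)
            log_prob -= log_prob.max(axis=1, keepdims=True)   # avoid overflow
            resp = np.exp(log_prob)
            resp /= resp.sum(axis=1, keepdims=True)

        labels = resp.argmax(axis=1)                   # MAP cluster assignment
        return pi, mu, cov, labels

    # Toy usage: two well-separated Gaussian clusters in two dimensions.
    if __name__ == "__main__":
        rng = np.random.default_rng(1)
        X = np.vstack([rng.normal(0, 1, (100, 2)), rng.normal(5, 1, (100, 2))])
        pi, mu, cov, labels = em_gaussian_mixture(X, K=2)
        print("proportions:", pi.round(2))
        print("means:", mu.round(2))
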
References in zbMATH (referenced in 20 articles)

Sorted by year.

  1. Andrews, Jeffrey L.; McNicholas, Paul D.: Variable selection for clustering and classification (2014)
  2. Biernacki, Christophe; Lourme, Alexandre: Stable and visualizable Gaussian parsimonious clustering models (2014)
  3. Browne, Ryan P.; McNicholas, Paul D.: Orthogonal Stiefel manifold optimization for eigen-decomposed covariance parameter estimation in mixture models (2014)
  4. Gollini, Isabella; Murphy, Thomas Brendan: Mixture of latent trait analyzers for model-based clustering of categorical data (2014)
  5. Galimberti, Giuliano; Soffritti, Gabriele: Using conditional independence for parsimonious model-based Gaussian clustering (2013)
  6. Lourme, A.; Biernacki, C.: Simultaneous Gaussian model-based clustering for samples of multiple origins (2013)
  7. Schwander, Olivier; Nielsen, Frank: Learning mixtures by simplifying kernel density estimators (2013)
  8. Baudry, Jean-Patrick; Maugis, Cathy; Michel, Bertrand: Slope heuristics: overview and implementation (2012)
  9. Lee, Gyemin; Scott, Clayton: EM algorithms for multivariate Gaussian mixture models with truncated and censored data (2012)
  10. Lee, Paul H.; Yu, Philip L. H.: Mixtures of weighted distance-based models for ranking data with applications in political studies (2012)
  11. Maugis, Cathy; Michel, Bertrand: A non asymptotic penalized criterion for Gaussian mixture model selection (2011)
  12. Maugis, Cathy; Michel, Bertrand: Data-driven penalty calibration: a case study for Gaussian mixture model selection (2011)
  13. Maugis, C.; Celeux, G.; Martin-Magniette, M.-L.: Variable selection in model-based discriminant analysis (2011)
  14. Scrucca, Luca: Model-based SIR for dimension reduction (2011)
  15. Biernacki, C.; Celeux, G.; Govaert, G.: Exact and Monte Carlo calculations of integrated likelihoods for the latent class model (2010)
  16. Melnykov, Volodymyr; Maitra, Ranjan: Finite mixture models and model-based clustering (2010)
  17. Maugis, Cathy; Celeux, Gilles; Martin-Magniette, Marie-Laure: Variable selection for clustering with Gaussian mixture models (2009)
  18. Maugis, C.; Celeux, G.; Martin-Magniette, M.-L.: Variable selection in model-based clustering: a general variable role modeling (2009)
  19. Schlattmann, Peter: Medical applications of finite mixture models (2009)
  20. Biernacki, Christophe; Celeux, Gilles; Govaert, Gérard; Langrognet, Florent: Model-based cluster and discriminant analysis with the MIXMOD software (2006)


Further publications can be found at: http://www.mixmod.org/article.php3?id_article=16