
Gaussian Mixture Model Selection Using Multiple Random Subsampling with Initialization

  • Conference paper
Computer Analysis of Images and Patterns (CAIP 2015)

Part of the book series: Lecture Notes in Computer Science (LNIP, volume 9256)


Abstract

Selecting the optimal number of components in a Gaussian Mixture Model (GMM) has been of interest to researchers for the last few decades. Most current approaches are based on an information criterion, introduced by Akaike (1974) and modified by many other researchers. The standard approach uses the EM algorithm, which fits model parameters to training data and determines the log-likelihood function for an increasing number of components. Penalized forms of the log-likelihood function are then used to select the number of components. The search for new or modified forms of the penalty function is the subject of ongoing efforts to improve these methods and make them more robust for various types of distributed data. Our new technique for selecting the optimal number of GMM components is based on Multiple Random Subsampling of training data with Initialization of the EM algorithm (MuRSI). The results of many experiments demonstrate the advantages of this method.
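
To make the two ideas the abstract contrasts concrete, below is a minimal Python sketch. The first function implements the standard penalized-likelihood selection described above, using scikit-learn's GaussianMixture (the abstract does not prescribe an implementation, so the library choice and all parameter values here are assumptions). The second function is only a hypothetical illustration of the general subsampling idea; the actual MuRSI procedure, in particular how the EM algorithm is initialized across subsamples, is defined in the paper itself.

    import numpy as np
    from sklearn.mixture import GaussianMixture

    def select_by_information_criterion(X, max_components=10, criterion="bic"):
        """Standard approach: fit GMMs by EM for an increasing number of
        components and pick the count minimizing a penalized log-likelihood,
        AIC (Akaike 1974) or BIC (Schwarz 1978)."""
        scores = []
        for k in range(1, max_components + 1):
            gmm = GaussianMixture(n_components=k, covariance_type="full",
                                  n_init=5, random_state=0).fit(X)
            scores.append(gmm.bic(X) if criterion == "bic" else gmm.aic(X))
        return int(np.argmin(scores)) + 1

    def select_by_subsampling(X, max_components=10, n_subsamples=20,
                              train_fraction=0.7, seed=0):
        """Hypothetical illustration of selection by multiple random
        subsampling: repeatedly split the data, fit each candidate model
        on a subsample, score it on the held-out part, and let the
        subsamples vote. This is NOT the exact MuRSI algorithm."""
        rng = np.random.default_rng(seed)
        n = len(X)
        votes = np.zeros(max_components, dtype=int)
        for _ in range(n_subsamples):
            idx = rng.permutation(n)
            split = int(train_fraction * n)
            train, held = X[idx[:split]], X[idx[split:]]
            # Mean held-out log-likelihood for each candidate component count.
            ll = [GaussianMixture(n_components=k, random_state=0)
                  .fit(train).score(held)
                  for k in range(1, max_components + 1)]
            votes[int(np.argmax(ll))] += 1
        return int(np.argmax(votes)) + 1

For example, on data drawn from a known 3-component mixture, both functions should typically return 3; on small samples the BIC variant is more conservative than AIC because its penalty grows with the sample size.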

This paper was supported by project no. P103/12/G084 of the Grant Agency of the Czech Republic.


References

  1. Akaike, H.: A new look at the statistical model identification. IEEE Transactions on Automatic Control AC-19(6), 716–723 (1974)

  2. Bulteel, K., Wilderjans, T.F., Tuerlinckx, F., Ceulemans, E.: CHull as an alternative to AIC and BIC in the context of mixtures of factor analyzers. Behavior Research Methods 45, 782–791 (2013)

  3. Chen, P., Wu, T.J., Yang, J.: A comparative study of model selection criteria for the number of signals. IET Radar, Sonar & Navigation 2(3), 180–188 (2008)

  4. Huang, T., Peng, H., Zhang, K.: Model selection for Gaussian mixture models. arXiv preprint (2013). http://arxiv.org/abs/1301.3558v1

  5. McLachlan, G., Peel, D.: Finite Mixture Models. John Wiley & Sons, New York (2000)

  6. Olivier, Ch., Jouzel, F., El Matouat, A.: Choice of the number of component clusters in mixture models by information criteria. In: Vision Interface 1999, Trois-Rivières, Canada (1999)

  7. Schwarz, G.: Estimating the dimension of a model. Annals of Statistics 6(2), 461–464 (1978)

  8. Shibata, R.: Selection of the order of an autoregressive model by Akaike's information criterion. Biometrika 63(1), 117–126 (1976)

  9. Vrieze, S.I.: Model selection and psychological theory: a discussion of the differences between the Akaike Information Criterion (AIC) and the Bayesian Information Criterion (BIC). Psychological Methods 17(2), 228–243 (2012)

  10. Xie, C., Chang, J., Liu, Y.: Estimating the number of components in Gaussian mixture models adaptively. Journal of Information & Computational Science 10(14), 4453–4460 (2013)


Author information

Correspondence to Josef V. Psutka.


Copyright information

© 2015 Springer International Publishing Switzerland

About this paper

Cite this paper

Psutka, J.V. (2015). Gaussian Mixture Model Selection Using Multiple Random Subsampling with Initialization. In: Azzopardi, G., Petkov, N. (eds) Computer Analysis of Images and Patterns. CAIP 2015. Lecture Notes in Computer Science, vol 9256. Springer, Cham. https://doi.org/10.1007/978-3-319-23192-1_57


  • DOI: https://doi.org/10.1007/978-3-319-23192-1_57

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-23191-4

  • Online ISBN: 978-3-319-23192-1

  • eBook Packages: Computer Science, Computer Science (R0)
