
2019 | Original Paper | Book Chapter

Mixture Initialization Based on Prior Data Visual Analysis

Abstract

Initialization is known to be a critical task when running a mixture estimation algorithm. Most approaches in the literature address the initialization of the expectation-maximization algorithm, which is widely used in this area. This study focuses on the initialization of recursive mixture estimation for the case of normal components, where the mentioned methods are not applicable. Its key part is the choice of the initial statistics of the normal components. Several initialization techniques based on visual analysis of prior data are discussed, and validation experiments are presented.
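To make the idea concrete, the following is a minimal sketch, not the authors' algorithm, of how initial statistics of normal components could be assembled from prior data once rough component centers have been picked by visual inspection (e.g., from a histogram or scatter plot). The function name, the `prior_strength` parameter, and the nearest-center assignment step are illustrative assumptions, not details taken from the chapter.

```python
import numpy as np

def init_normal_mixture_stats(prior_data, centers, prior_strength=10.0):
    """Build initial statistics for a normal-mixture estimator from prior data.

    prior_data     : (N, d) array of prior observations
    centers        : (K, d) array of component centers chosen by visual
                     analysis of the prior data (hypothetical input)
    prior_strength : fictitious number of observations each initial
                     statistic is worth (assumed tuning parameter)
    """
    prior_data = np.asarray(prior_data, dtype=float)
    centers = np.asarray(centers, dtype=float)
    n_comp, dim = centers.shape

    # Assign each prior observation to the nearest visually chosen center.
    dists = np.linalg.norm(prior_data[:, None, :] - centers[None, :, :], axis=2)
    labels = dists.argmin(axis=1)

    stats = []
    for k in range(n_comp):
        cluster = prior_data[labels == k]
        if len(cluster) < 2:           # too few points: fall back to global spread
            cluster = prior_data
        mean = cluster.mean(axis=0)
        cov = np.cov(cluster, rowvar=False) + 1e-6 * np.eye(dim)
        stats.append({
            "mean": mean,               # initial component mean
            "cov": cov,                 # initial component covariance
            "kappa": prior_strength,    # confidence attached to these statistics
            "weight": 1.0 / n_comp,     # flat initial component weight
        })
    return stats
```

The resulting per-component statistics (mean, covariance, confidence, weight) are the kind of quantities a recursive estimator of normal components would then update observation by observation.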


Metadata
Title
Mixture Initialization Based on Prior Data Visual Analysis
Authors
Evgenia Suzdaleva
Ivan Nagy
Copyright Year
2019
DOI
https://doi.org/10.1007/978-3-319-78931-6_3