
2017 | Original Paper | Book Chapter

Gaussian Mixture Trees for One Class Classification in Automated Visual Inspection

Authors: Matthias Richter, Thomas Längle, Jürgen Beyerer

Published in: Image Analysis and Recognition

Publisher: Springer International Publishing


Abstract

We present Gaussian mixture trees for density estimation and one-class classification. A Gaussian mixture tree is a tree in which each node is associated with a Gaussian component; each level of the tree refines the data description given by the level above. We show how this approach is applied to one-class classification and how the hierarchical structure is exploited to significantly reduce computation time, making the approach suitable for real-time systems. Experiments with synthetic data and data from a visual inspection task show that our approach compares favorably with flat Gaussian mixture models as well as one-class support vector machines in both predictive performance and computation time.
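The core idea in the abstract can be illustrated with a minimal sketch: a tree of Gaussian components where children refine their parent, with density evaluation that skips subtrees whose parent component already assigns negligible density. This is a hypothetical illustration under assumed names (`GMTNode`, `tree_density`, `one_class_predict`), not the authors' implementation; the paper's actual training and pruning rules are not reproduced here.

```python
import numpy as np

class GMTNode:
    """One node of a Gaussian mixture tree: a Gaussian component
    with an optional set of child components that refine it."""
    def __init__(self, mean, cov, weight=1.0, children=None):
        self.mean = np.asarray(mean, dtype=float)
        self.cov = np.asarray(cov, dtype=float)
        self.weight = weight
        self.children = children or []

    def pdf(self, x):
        # Multivariate normal density of this component at x.
        d = self.mean.size
        diff = x - self.mean
        inv = np.linalg.inv(self.cov)
        norm = np.sqrt((2 * np.pi) ** d * np.linalg.det(self.cov))
        return np.exp(-0.5 * diff @ inv @ diff) / norm

def tree_density(node, x, prune=0.0):
    """Evaluate the mixture density at x by descending the tree.
    Leaves contribute their own density; for inner nodes the children
    are summed instead, and a subtree is skipped (its parent's density
    used as-is) when that density already falls below `prune`."""
    p = node.weight * node.pdf(x)
    if not node.children or p < prune:
        return p
    return sum(tree_density(c, x, prune) for c in node.children)

def one_class_predict(root, x, threshold, prune=0.0):
    """Accept x as 'normal' if the estimated density exceeds a threshold."""
    return tree_density(root, x, prune) >= threshold
```

For example, a two-level tree with a broad root and two narrower children yields a higher density near the training modes than far away, so thresholding the density gives a one-class decision; the `prune` cutoff is what buys the speed-up on deep trees.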


Footnotes
1
Diameter, density, area, convex area, compactness, extent, roundness, perimeter, and convex hull perimeter.
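A few of the region features named in the footnote can be sketched from a binary object mask with plain numpy. This is a hypothetical helper (`shape_features`), not the feature extractor used in the paper; the perimeter here is a crude count of object pixels touching the background, and convex-hull features are omitted.

```python
import numpy as np

def shape_features(mask):
    """Compute area, extent, perimeter, and compactness of the single
    object in a binary mask (illustrative sketch only)."""
    mask = np.asarray(mask).astype(bool)
    area = mask.sum()
    ys, xs = np.nonzero(mask)
    h = ys.max() - ys.min() + 1
    w = xs.max() - xs.min() + 1
    extent = area / (h * w)  # object area over bounding-box area
    # Crude perimeter: object pixels with at least one background 4-neighbour.
    padded = np.pad(mask, 1)
    interior = (padded[:-2, 1:-1] & padded[2:, 1:-1] &
                padded[1:-1, :-2] & padded[1:-1, 2:])
    perimeter = (mask & ~interior).sum()
    # Compactness: 1.0 for a perfect disc, smaller for ragged shapes.
    compactness = 4 * np.pi * area / perimeter ** 2
    return {"area": int(area), "extent": float(extent),
            "perimeter": int(perimeter), "compactness": float(compactness)}
```

For a filled 5x5 square the sketch gives area 25, extent 1.0, and a boundary of 16 pixels; in practice a library such as scikit-image's `regionprops` would provide these measurements more robustly.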
 
Metadata
Title
Gaussian Mixture Trees for One Class Classification in Automated Visual Inspection
Authors
Matthias Richter
Thomas Längle
Jürgen Beyerer
Copyright Year
2017
DOI
https://doi.org/10.1007/978-3-319-59876-5_38
