2015 | Original Paper | Book Chapter

An Adaptive Radial Basis Function Kernel for Support Vector Data Description

Authors: André E. Lazzaretti, David M. J. Tax

Published in: Similarity-Based Pattern Recognition

Publisher: Springer International Publishing

Abstract

For one-class classification or novelty detection, the metric of the feature space is essential for good performance. Typically, the metric of the feature space is assumed to be relatively isotropic, or flat, meaning that a distance of 1 can be interpreted in the same way at every location and in every direction of the feature space. When this is not the case, distance thresholds fitted in one part of the feature space will be suboptimal in other parts. To avoid this, this paper proposes modifying the width parameter of the Radial Basis Function (RBF) kernel for the Support Vector Data Description (SVDD) classifier. Although numerous approaches exist to learn the metric of a feature space for (supervised) classification problems, this is harder for one-class classification, because the metric cannot be optimized to improve classification performance. Instead, we propose to consider the local pairwise distances in the training set. Results on both artificial and real datasets demonstrate the ability of the modified RBF kernel to identify local scales in the input data, extracting its general structure and improving the final classification performance on novelty detection problems.
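The per-point kernel width described in the abstract can be sketched with a simple local-scaling construction. This is an illustrative reading, not the paper's exact formulation: `local_scales` and `adaptive_rbf_kernel` are hypothetical names, and setting each point's width to the mean distance of its k nearest neighbours is one common way to derive a scale from local pairwise distances.

```python
import numpy as np

def local_scales(X, k=5):
    """Per-point scale: mean distance to the k nearest neighbours in X."""
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    D = np.sort(D, axis=1)                # column 0 is the zero self-distance
    return D[:, 1:k + 1].mean(axis=1)

def adaptive_rbf_kernel(X, Y, sx, sy):
    """RBF kernel whose width sx[i] * sy[j] adapts to the local scale
    around each pair of points, instead of a single global sigma^2."""
    D2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-D2 / (sx[:, None] * sy[None, :]))
```

With a constant scale vector this collapses to the ordinary RBF kernel; with local scales, points in dense regions get small widths and points in sparse regions get large ones, so a single SVDD distance threshold behaves comparably across the feature space.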

Footnotes
1
A chunklet is defined as a set of points where all data points have an identical but unknown class label.
 
2
This is normally the case in novelty detection problems, where the known class corresponds to one single chunklet.
 
3
The artificial dataset can be generated by drawing from a uniform random distribution in two-dimensional space and then applying a logarithmic (or even polynomial) operation.
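This footnote can be turned into a few lines of code. The exact construction used in the paper is not given here, so the following is one plausible reading: uniform 2-D samples with one coordinate warped by a logarithm, which makes the local scale of the data vary across the space.

```python
import numpy as np

rng = np.random.default_rng(42)
# Uniform samples in two dimensions, bounded away from zero so the
# logarithm below is well defined.
X = rng.uniform(0.1, 1.0, size=(200, 2))
# Warp one coordinate logarithmically: distances between nearby points
# are stretched near 0.1 and compressed near 1.0, so no single global
# RBF width fits the whole set.
X[:, 1] = np.log(X[:, 1])
```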
 
4
The selected datasets, arranged for novelty detection problems, are available at http://homepage.tudelft.nl/n9d04/occ/index.html.
 
Metadata
Title
An Adaptive Radial Basis Function Kernel for Support Vector Data Description
Authors
André E. Lazzaretti
David M. J. Tax
Copyright year
2015
DOI
https://doi.org/10.1007/978-3-319-24261-3_9