Published in: Neural Computing and Applications 8/2012

01.11.2012 | Original Article

Efficient support vector data descriptions for novelty detection

Authors: Xinjun Peng, Dong Xu


Abstract

Support vector data description (SVDD) is an attractive kernel method for many novelty detection problems. However, the decision function of SVDD is expressed as a kernel expansion, so its run-time complexity is linear in the number of support vectors (SVs). In this paper, an efficient SVDD (E-SVDD) is proposed to improve the prediction speed of SVDD. E-SVDD first finds a small set of crucial feature vectors with a partition-entropy-based kernel fuzzy c-means (KFCM) clustering technique, and then uses the images of the preimages of these feature vectors to re-express the center of SVDD. Hence, the decision function of E-SVDD contains only a few crucial kernel terms, and its run-time complexity is linear in the number of clusters. Experimental results on several benchmark datasets indicate that E-SVDD not only achieves fast prediction but also generalizes more stably than other methods.
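The speed-up idea in the abstract, re-expressing the SVDD center through a few prototype points so that the decision function needs only c kernel terms instead of one per support vector, can be sketched numerically. The sketch below is an illustration under stated assumptions, not the paper's exact algorithm: uniform expansion weights stand in for trained SVDD multipliers, plain k-means stands in for the KFCM clustering step, and the reduced weights beta are obtained by a kernel least-squares fit of the center onto the prototypes.

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf(A, B, gamma=0.5):
    # Gaussian kernel matrix k(a, b) = exp(-gamma * ||a - b||^2)
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

# Toy "training" set and expansion weights. In real SVDD the weights
# alpha come from a quadratic program; uniform weights (the kernel
# centroid) are used here purely as a stand-in.
X = rng.normal(size=(200, 2))
alpha = np.full(len(X), 1.0 / len(X))

# Reduced set: keep only c prototypes. Plain k-means is a simple
# stand-in for the paper's kernel fuzzy c-means step.
c = 10
Z = X[rng.choice(len(X), c, replace=False)]
for _ in range(25):
    labels = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1).argmin(1)
    Z = np.array([X[labels == j].mean(0) if (labels == j).any() else Z[j]
                  for j in range(c)])

# Re-express the center sum_i alpha_i phi(x_i) in span{phi(z_j)}:
# least squares in feature space gives Kzz beta = Kzx alpha.
Kzz = rbf(Z, Z)
Kzx = rbf(Z, X)
beta = np.linalg.solve(Kzz + 1e-8 * np.eye(c), Kzx @ alpha)

# Data-dependent term of the decision function, full vs reduced:
# the reduced form needs c = 10 kernel evaluations per test point
# instead of 200.
Xtest = rng.normal(size=(50, 2))
full = rbf(Xtest, X) @ alpha
reduced = rbf(Xtest, Z) @ beta

print("max pointwise deviation:", np.abs(full - reduced).max())
```

Because the fit is an orthogonal projection in the kernel feature space, the pointwise deviation between the full and reduced expansions is bounded by the projection residual norm (the RBF kernel satisfies k(x, x) = 1), which is what makes trading 200 kernel terms for 10 defensible.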


Metadata
Title
Efficient support vector data descriptions for novelty detection
Authors
Xinjun Peng
Dong Xu
Publication date
01.11.2012
Publisher
Springer-Verlag
Published in
Neural Computing and Applications / Issue 8/2012
Print ISSN: 0941-0643
Electronic ISSN: 1433-3058
DOI
https://doi.org/10.1007/s00521-011-0625-3
