
2020 | Original Paper | Book Chapter

Some Computational Considerations for Kernel-Based Support Vector Machine

Authors: Mohsen Esmaeilbeigi, Alireza Daneshkhah, Omid Chatrabgoun

Published in: Digital Twin Technologies and Smart Cities

Publisher: Springer International Publishing


Abstract

Healthcare applications of communication technologies often require data mining, in particular classification as a supervised learning task. Support vector machines (SVMs) are considered efficient supervised learning approaches for classification because of their robustness against several types of model misspecification and against outliers. Kernel-based SVMs are known to be more flexible tools for a wide range of supervised learning tasks and can efficiently handle non-linear relationships between input variables and outputs (or labels). They are more robust with respect to the aforementioned model misspecifications, and also more accurate in the sense that the root-mean-square error obtained by fitting a kernel-based SVM is considerably smaller than that obtained by fitting a standard/linear SVM. However, the choice of kernel type, and particularly of the kernel's parameters, can have a significant impact on classification accuracy and on other supervised learning tasks required in network security, the Internet of things, cybersecurity, etc. One finding of this study is that larger kernel parameter(s) encourage SVMs with more locality, and vice versa. This chapter provides results on the effect of the kernel parameter on kernel-based SVM classification. We first examine the effect of these parameters on the classification results obtained with a kernel-based SVM, and then select the optimal values of these parameters using the cross-validation (CV) technique.
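The locality effect described in the abstract can be sketched numerically. The snippet below is an illustrative sketch, not code from the chapter: the Gaussian RBF kernel K(x, y) = exp(-eps^2 * ||x - y||^2) and the shape-parameter name `eps` are assumptions chosen for the example. It evaluates the kernel between two fixed points for increasing values of the shape parameter; as `eps` grows, the off-diagonal kernel value decays toward zero, i.e. the kernel becomes more local.

```python
import numpy as np

def rbf_gram(X, Y, eps):
    """Gaussian RBF Gram matrix: K[i, j] = exp(-eps**2 * ||X[i] - Y[j]||**2)."""
    sq_dists = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-(eps ** 2) * sq_dists)

# Two points one unit apart on the real line.
X = np.array([[0.0], [1.0]])
for eps in (0.5, 2.0, 5.0):
    K = rbf_gram(X, X, eps)
    # The off-diagonal entry shrinks as eps grows: the kernel is more local.
    print(f"eps = {eps}: K(x1, x2) = {K[0, 1]:.6f}")
```

In practice, rather than fixing the shape parameter by hand, one would select it by cross-validation as the chapter proposes, e.g. by scoring a kernel-based classifier over a grid of candidate parameter values on held-out folds.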


Metadata
Title: Some Computational Considerations for Kernel-Based Support Vector Machine
Authors: Mohsen Esmaeilbeigi, Alireza Daneshkhah, Omid Chatrabgoun
Copyright year: 2020
DOI: https://doi.org/10.1007/978-3-030-18732-3_10
