
2022 | Original Paper | Book Chapter

3. Support Vector Machine Classification

Author: Yong Shi

Published in: Advances in Big Data Analytics

Publisher: Springer Nature Singapore


Abstract

Support vector machines (SVMs) have been a popular technique in data analytics. Shi et al. [1] reported a range of SVM algorithms, from leave-one-out (LOO) bound approaches to multi-class, unsupervised, semi-supervised, and robust SVMs. Following that line of research, this chapter presents five sections on advances of SVM in big data analytics. Section 3.1 has two subsections: the first outlines recent findings of the author's research team on SVMs [2], while the second covers two new decomposition algorithms for training bound-constrained SVMs [3]. Section 3.2 describes twin SVMs for classification in four subsections: the first explores the improved twin SVM [4]; the second extends the twin SVM to multi-category classification problems [5]; the third presents a robust twin SVM for pattern classification [6]; the fourth elaborates a structural twin SVM for classification [7]. Section 3.3 covers nonparallel SVMs in four subsections: the first concerns a nonparallel SVM for a classification problem with universum learning [8]; the second a divide-and-combine method for large-scale nonparallel SVMs [9]; the third explores nonparallel SVMs for pattern classification [4]; the fourth presents a multi-instance learning algorithm based on a nonparallel classifier [10]. Section 3.4 covers Laplacian SVM classifiers in two subsections: one on successive overrelaxation for the Laplacian SVM [11], the other on the Laplacian twin SVM for semi-supervised classification [12]. Finally, Sect. 3.5 discusses loss functions for SVM classification in three subsections: the first is about the ramp loss least squares SVM [13], the second about the ramp loss nonparallel SVM for pattern classification [14], and the third about a classification model using privileged information and its application [10].
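As a minimal illustration of the baseline technique the chapter builds on, a soft-margin SVM classifier can be trained with the scikit-learn library. This is a hedged sketch for orientation only, not the chapter's own code; the synthetic dataset and all parameter choices below are illustrative assumptions.

```python
# Minimal soft-margin SVM classification sketch (illustrative only;
# assumes scikit-learn is installed -- not the chapter's own method).
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Synthetic two-class data stands in for a real big-data problem.
X, y = make_classification(n_samples=200, n_features=4, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# C controls the soft-margin trade-off between margin width and
# training error; the RBF kernel gives a nonlinear decision boundary.
clf = SVC(kernel="rbf", C=1.0)
clf.fit(X_train, y_train)
acc = clf.score(X_test, y_test)
print(f"test accuracy: {acc:.2f}")
```

The twin, nonparallel, and ramp-loss variants surveyed in this chapter replace this single maximum-margin hyperplane with pairs of nonparallel hyperplanes or with non-convex loss functions, but the soft-margin trade-off governed by `C` remains the common starting point.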


References
  1. Shi, Y., Tian, Y., Kou, G., Peng, Y., Li, J.: Optimization Based Data Mining: Theory and Applications. Springer Science & Business Media, New York (2011)
  2. Tian, Y., Shi, Y., Liu, X.: Recent advances on support vector machines research. Technol. Econ. Dev. Econ. 18(1), 5–33 (2012)
  3. Niu, L., Zhou, R., Zhao, X., Shi, Y.: Two new decomposition algorithms for training bound-constrained support vector machines. Found. Comput. Decis. Sci. 40(1), 67–86 (2015)
  4. Tian, Y., Ju, X., Qi, Z., Shi, Y.: Improved twin support vector machine. Sci. China Math. 57(2), 417–432 (2014)
  5. Xie, J., Hone, K., Xie, W., Gao, X., Shi, Y., Liu, X.: Extending twin support vector machine classifier for multi-category classification problems. Intell. Data Anal. 17(4), 649–664 (2013)
  6. Qi, Z., Tian, Y., Shi, Y.: Robust twin support vector machine for pattern classification. Pattern Recogn. 46(1), 305–316 (2013)
  7. Qi, Z., Tian, Y., Shi, Y.: Structural twin support vector machine for classification. Knowl. Based Syst. 43, 74–81 (2013)
  8. Qi, Z., Tian, Y., Shi, Y.: A nonparallel support vector machine for a classification problem with universum learning. J. Comput. Appl. Math. 263, 288–298 (2014)
  9. Tian, Y., Ju, X., Shi, Y.: A divide-and-combine method for large scale nonparallel support vector machines. Neural Netw. 75, 12–21 (2016)
  10. Qi, Z., Tian, Y., Shi, Y.: A new classification model using privileged information and its application. Neurocomputing 129, 146–152 (2014)
  11. Qi, Z., Tian, Y., Shi, Y.: Successive overrelaxation for Laplacian support vector machine. IEEE Trans. Neural Netw. Learn. Syst. 26(4), 674–683 (2014)
  12. Qi, Z., Tian, Y., Shi, Y.: Laplacian twin support vector machine for semi-supervised classification. Neural Netw. 35, 46–53 (2012)
  13. Liu, D., Shi, Y., Tian, Y., Huang, X.: Ramp loss least squares support vector machine. J. Comput. Sci. 14, 61–68 (2016)
  14. Liu, D., Shi, Y., Tian, Y.: Ramp loss nonparallel support vector machine for pattern classification. Knowl. Based Syst. 85, 224–233 (2015)
  15. Deng, N., Tian, Y.: Support Vector Machines: A New Method in Data Mining. Science Press, Beijing (2004)
  16. Deng, N., Tian, Y.: Support Vector Machines: Theory, Algorithms and Development. Science Press, Beijing (2009)
  17. Deng, N., Tian, Y., Zhang, C.: Support Vector Machines: Optimization Based Theory, Algorithms, and Extensions. CRC Press, Boca Raton, FL (2012)
  18. Vapnik, V.: Statistical Learning Theory, pp. 156–160. Springer, Berlin (1998)
  19. Zhang, C., Tian, Y., Deng, N.: The new interpretation of support vector machines on statistical learning theory. Sci. China Ser. A Math. 53(1), 151–164 (2010)
  20. Suykens, J.A., Van Gestel, T., De Brabanter, J.: Least Squares Support Vector Machines. World Scientific, Singapore (2002)
  21. Shao, Y.H., Zhang, C.H., Wang, X.B., Deng, N.Y.: Improvements on twin support vector machines. IEEE Trans. Neural Netw. 22(6), 962–968 (2011)
  22. Ataman, K., Street, W.N.: Optimizing area under the ROC curve using ranking SVMs. In: Proceedings of International Conference on Knowledge Discovery in Data Mining (2005)
  23. Brefeld, U., Scheffer, T.: AUC maximizing support vector learning. In: Proceedings of the ICML 2005 Workshop on ROC Analysis in Machine Learning (2005)
  24. Goswami, A., Jin, R., Agrawal, G.: Fast and exact out-of-core k-means clustering. In: Fourth IEEE International Conference on Data Mining (ICDM'04), pp. 83–90. IEEE, New York (2004)
  25. Lin, C.F., Wang, S.D.: Fuzzy support vector machines. IEEE Trans. Neural Netw. 13(2), 464–471 (2002)
  26. Akbani, R., Kwek, S., Japkowicz, N.: Applying support vector machines to imbalanced datasets. In: European Conference on Machine Learning, pp. 39–50. Springer, New York (2004)
  27. Herbrich, R., Graepel, T., Obermayer, K.: Support vector learning for ordinal regression. In: 1999 Ninth International Conference on Artificial Neural Networks (ICANN 99). IEEE, New York (1999)
  28. Yang, Z.: Support vector ordinal regression and multi-class problems. Ph.D. thesis, China Agricultural University (2007)
  29. Yang, Z., Deng, N., Tian, Y.: A multi-class classification algorithm based on ordinal regression machine. In: International Conference on Computational Intelligence for Modelling, Control and Automation and International Conference on Intelligent Agents, Web Technologies and Internet Commerce (CIMCA-IAWTIC'06), vol. 2, pp. 810–815. IEEE, New York (2005)
  30. Joachims, T.: SVMlight: support vector machine. http://svmlight.joachims.org/, University of Dortmund (1999)
  31. Xu, L., Schuurmans, D.: Unsupervised and semi-supervised multi-class support vector machines. AAAI 40, 50 (2005)
  32. Zhao, K., Tian, Y.J., Deng, N.Y.: Unsupervised and semi-supervised two-class support vector machines. In: Sixth IEEE International Conference on Data Mining Workshops (ICDMW'06), pp. 813–817. IEEE, New York (2006)
  33. Zhao, K., Tian, Y.J., Deng, N.Y.: Unsupervised and semi-supervised Lagrangian support vector machines. In: International Conference on Computational Science, pp. 882–889. Springer, New York (2007)
  34. Angulo, C., Català, A.: K-SVCR: a multi-class support vector machine. In: European Conference on Machine Learning, pp. 31–38. Springer, New York (2000)
  35. Gao, T.: U-support vector machine and its applications. Master's thesis, China Agricultural University (2008)
  36. Goldfarb, D., Iyengar, G.: Robust convex quadratically constrained programs. Math. Program. 97(3), 495–515 (2003)
  37. Fung, G., Mangasarian, O.L., Shavlik, J.W.: Knowledge-based support vector machine classifiers. In: NIPS, pp. 521–528. Citeseer (2002)
  38. Mangasarian, O.L., Wild, E.W.: Multisurface proximal support vector machine classification via generalized eigenvalues. IEEE Trans. Pattern Anal. Mach. Intell. 28(1), 69–74 (2005)
  39. Vapnik, V., Vashist, A.: A new learning paradigm: learning using privileged information. Neural Netw. 22(5–6), 544–557 (2009)
  40. Mangasarian, O.L., Wild, E.W.: Nonlinear knowledge-based classification. IEEE Trans. Neural Netw. 19(10), 1826–1832 (2008)
  41. Frie, T.T., Cristianini, N., Campbell, C.: The kernel-adatron algorithm: a fast and simple learning procedure for support vector machines. In: Machine Learning: Proceedings of the Fifteenth International Conference (ICML'98), pp. 188–196. Citeseer (1998)
  42. Mangasarian, O.L., Musicant, D.R.: Successive overrelaxation for support vector machines. IEEE Trans. Neural Netw. 10(5), 1032–1037 (1999)
  43. Hsieh, C.J., Chang, K.W., Lin, C.J., Keerthi, S.S., Sundararajan, S.: A dual coordinate descent method for large-scale linear SVM. In: Proceedings of the 25th International Conference on Machine Learning, pp. 408–415 (2008)
  44. Joachims, T.: Training linear SVMs in linear time. In: Proceedings of the 12th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pp. 217–226 (2006)
  45. Joachims, T., Finley, T., Yu, C.N.J.: Cutting-plane training of structural SVMs. Mach. Learn. 77(1), 27–59 (2009)
  46. Joachims, T., Yu, C.N.J.: Sparse kernel SVMs via cutting-plane training. Mach. Learn. 76(2), 179–193 (2009)
  47. Bottou, L., Chapelle, O., DeCoste, D., Weston, J.: Trading convexity for scalability. In: Proceedings of the 23rd International Conference on Machine Learning (2006)
  48. Shalev-Shwartz, S., Singer, Y., Srebro, N., Cotter, A.: Pegasos: primal estimated sub-gradient solver for SVM. Math. Program. 127(1), 3–30 (2011)
  49. Yuan, G.X., Ho, C.H., Lin, C.J.: Recent advances of large-scale linear classification. Proc. IEEE 100(9), 2584–2603 (2012)
  50. Hsu, C.W., Lin, C.J.: A simple decomposition method for support vector machines. Mach. Learn. 46(1), 291–314 (2002)
  51. Joachims, T.: Making large-scale SVM learning practical. Technical report (1998)
  52. Osuna, E., Freund, R., Girosit, F.: Training support vector machines: an application to face detection. In: Proceedings of IEEE Computer Society Conference on Computer Vision and Pattern Recognition, pp. 130–136. IEEE, New York (1997)
  53. Platt, J.: Sequential minimal optimization: a fast algorithm for training support vector machines (1998)
  54. Saunders, C., Stitson, M.O., Weston, J., Bottou, L., Smola, A., et al.: Support vector machine reference manual (1998)
  55. Zanni, L., Serafini, T., Zanghirati, G., Bennett, K.P., Parrado-Hernández, E.: Parallel software for training large scale support vector machines on multiprocessor systems. J. Mach. Learn. Res. 7(54), 1467–1492 (2006)
  56. Sun, W., Yuan, Y.X.: Optimization Theory and Methods: Nonlinear Programming, vol. 1. Springer Science & Business Media, New York (2006)
  57. Mercer, J.: Functions of positive and negative type and their connection with the theory of integral equations. Philos. Trans. R. Soc. Lond. A 209, 415–446 (1909)
  58. Khemchandani, R., Chandra, S., et al.: Twin support vector machines for pattern classification. IEEE Trans. Pattern Anal. Mach. Intell. 29(5), 905–910 (2007)
  59. Chen, P.H., Fan, R.E., Lin, C.J.: A study on SMO-type decomposition methods for support vector machines. IEEE Trans. Neural Netw. 17(4), 893–908 (2006)
  60. Burges, C.J.: A tutorial on support vector machines for pattern recognition. Data Min. Knowl. Disc. 2(2), 121–167 (1998)
  61. Ward Jr., J.H.: Hierarchical grouping to optimize an objective function. J. Am. Stat. Assoc. 58(301), 236–244 (1963)
  62. Xue, H., Chen, S., Yang, Q.: Structural support vector machine. In: International Symposium on Neural Networks, pp. 501–511. Springer, New York (2008)
  63. Xue, H., Chen, S., Yang, Q.: Structural regularized support vector machine: a framework for structural large margin classifier. IEEE Trans. Neural Netw. 22(4), 573–587 (2011)
  64. Yeung, D.S., Wang, D., Ng, W.W., Tsang, E.C., Wang, X.: Structured large margin machines: sensitive to data distributions. Mach. Learn. 68(2), 171–200 (2007)
  65. Salvador, S., Chan, P.: Determining the number of clusters/segments in hierarchical clustering/segmentation algorithms. In: 16th IEEE International Conference on Tools with Artificial Intelligence, pp. 576–584. IEEE, New York (2004)
  66. Gantmacher, F.R.: Matrix Theory. Chelsea, New York (1990)
  67. Schölkopf, B., Smola, A.J., Bach, F., et al.: Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond. MIT Press, Cambridge, MA (2002)
  68. Fan, R.E., Chen, P.H., Lin, C.J., Joachims, T.: Working set selection using second order information for training support vector machines. J. Mach. Learn. Res. 6(12), 1889–1918 (2005)
  69. Chang, C.C., Lin, C.J.: LIBSVM: a library for support vector machines. ACM Trans. Intell. Syst. Technol. 2(3), 1–27 (2011)
  70. Hsieh, C.J., Si, S., Dhillon, I.: A divide-and-conquer solver for kernel support vector machines. In: International Conference on Machine Learning, pp. 566–574. PMLR, New York (2014)
  71. Mangasarian, O.L.: Nonlinear Programming. SIAM, Philadelphia, PA (1994)
  72. Shao, Y.H., Deng, N.Y.: A coordinate descent margin based-twin support vector machine for classification. Neural Netw. 25, 114–121 (2012)
  73. Peng, X.: TPMSVM: a novel twin parametric-margin support vector machine for pattern recognition. Pattern Recogn. 44(10–11), 2678–2692 (2011)
  74. Maron, O., Lozano-Pérez, T.: A framework for multiple-instance learning. In: Advances in Neural Information Processing Systems, pp. 570–576 (1998)
  75. Mangasarian, O.L., Wild, E.W.: Multiple instance classification via successive linear programming. J. Optim. Theory Appl. 137(3), 555–568 (2008)
  76. Tikhonov, A.N.: Regularization of incorrectly posed problems. Soviet Math. Dokl. 4(6), 1624–1627 (1963)
  77. Belkin, M., Niyogi, P., Sindhwani, V.: Manifold regularization: a geometric framework for learning from labeled and unlabeled examples. J. Mach. Learn. Res. 7(85), 2399–2434 (2006)
  78. Evgeniou, T., Pontil, M., Poggio, T.: Regularization networks and support vector machines. Adv. Comput. Math. 13(1), 1–50 (2000)
  79. Belkin, M., Niyogi, P.: Towards a theoretical foundation for Laplacian-based manifold methods. J. Comput. Syst. Sci. 74(8), 1289–1308 (2008)
  80. Chang, K.W., Hsieh, C.J., Lin, C.J.: Coordinate descent method for large-scale L2-loss linear support vector machines. J. Mach. Learn. Res. 9(7) (2008)
  81. Chapelle, O.: Training a support vector machine in the primal. Neural Comput. 19(5), 1155–1178 (2007)
  82. Cucker, F., Zhou, D.X.: Learning Theory: An Approximation Theory Viewpoint, vol. 24. Cambridge University Press, Cambridge (2007)
  83. Gnecco, G., Sanguineti, M.: Regularization techniques and suboptimal solutions to optimization problems in learning from data. Neural Comput. 22(3), 793–829 (2010)
  84. Melacci, S., Belkin, M.: Laplacian support vector machines trained in the primal. J. Mach. Learn. Res. 12(3), 1149–1184 (2011)
  85. Wang, L., Jia, H., Li, J.: Training robust support vector machine with smooth ramp loss in the primal space. Neurocomputing 71(13–15), 3020–3025 (2008)
  86. Steinwart, I.: Sparseness of support vector machines. J. Mach. Learn. Res. 4, 1071–1105 (2003)
  87. Weston, J., Collobert, R., Sinz, F., Bottou, L., Vapnik, V.: Inference with the universum. In: Proceedings of the 23rd International Conference on Machine Learning, pp. 1009–1016 (2006)
  88. Yuille, A.L., Rangarajan, A.: The concave-convex procedure. Neural Comput. 15(4), 915–936 (2003)
  89. Tao, P.D., et al.: The DC (difference of convex functions) programming and DCA revisited with DC models of real world nonconvex optimization problems. Ann. Oper. Res. 133(1–4), 23–46 (2005)
  90. Vapnik, V.: Estimation of Dependences Based on Empirical Data. Springer Science & Business Media, New York (2006)
  91. Qi, Z., Tian, Y., Shi, Y.: Twin support vector machine with universum data. Neural Netw. 36, 112–119 (2012)
  92. Tian, Y., Qi, Z., Ju, X., Shi, Y., Liu, X.: Nonparallel support vector machines for pattern classification. IEEE Trans. Cybern. 44(7), 1067–1079 (2013)
  93. Guan, N., Tao, D., Luo, Z., Shawe-Taylor, J.: MahNMF: Manhattan non-negative matrix factorization. arXiv preprint arXiv:1207.3438 (2012)
  94. Tao, D., Tang, X., Li, X., Wu, X.: Asymmetric bagging and random subspace for support vector machines-based relevance feedback in image retrieval. IEEE Trans. Pattern Anal. Mach. Intell. 28(7), 1088–1099 (2006)
  95. Zhou, T., Tao, D., Wu, X.: NESVM: a fast gradient method for support vector machines. In: 2010 IEEE International Conference on Data Mining, pp. 679–688. IEEE, New York (2010)
  96. Luo, Y., Tao, D., Geng, B., Xu, C., Maybank, S.J.: Manifold regularized multitask learning for semi-supervised multilabel image classification. IEEE Trans. Image Process. 22(2), 523–536 (2013)
Metadata
Title
Support Vector Machine Classification
Author
Yong Shi
Copyright year
2022
Publisher
Springer Nature Singapore
DOI
https://doi.org/10.1007/978-981-16-3607-3_3
