Published in: Neural Computing and Applications 1/2013

01.07.2013 | Original Article

Improvement of the kernel minimum squared error model for fast feature extraction

Authors: Jinghua Wang, Peng Wang, Qin Li, Jane You


Abstract

The kernel minimum squared error (KMSE) model expresses the feature extractor as a linear combination of all the training samples in the high-dimensional kernel space. To extract a feature from a sample, KMSE must therefore evaluate as many kernel functions as there are training samples, so the computational cost of KMSE-based feature extraction grows with the size of the training set. In this paper, we propose an efficient kernel minimum squared error (EKMSE) model for two-class classification. EKMSE expresses each feature extractor as a linear combination of nodes, which form a small subset of the training samples. To extract a feature from a sample, EKMSE only needs to evaluate as many kernel functions as there are nodes. Since the nodes are usually far fewer than the training samples, EKMSE is much faster than KMSE in feature extraction. EKMSE achieves the same training accuracy as the standard KMSE while avoiding the overfitting problem. We implement the EKMSE model with two algorithms. Experimental results demonstrate the feasibility of the EKMSE model.
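To make the computational difference concrete, the following is a minimal sketch (not the authors' implementation) that contrasts feature extraction with a coefficient vector over all training samples, as in KMSE, against one over a small node subset, as in an EKMSE-style model. The RBF kernel, the ridge regularization, and the fixed node selection used here are illustrative assumptions; the paper's two algorithms determine the nodes and coefficients differently.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    """Gaussian (RBF) kernel matrix between the rows of X and the rows of Y."""
    sq = np.sum(X**2, axis=1)[:, None] + np.sum(Y**2, axis=1)[None, :] - 2 * X @ Y.T
    return np.exp(-gamma * sq)

def fit_full_expansion(X_train, y, gamma=1.0, mu=1e-3):
    """KMSE-style model: one coefficient per training sample (kernel ridge-style solution)."""
    K = rbf_kernel(X_train, X_train, gamma)               # n x n kernel matrix
    alpha = np.linalg.solve(K + mu * np.eye(len(K)), y)   # weight for every training sample
    return alpha

def fit_node_expansion(X_train, y, nodes_idx, gamma=1.0, mu=1e-3):
    """EKMSE-style model: coefficients only over a small set of nodes (a subset of the
    training samples). Node selection is assumed given here; the paper's algorithms choose it."""
    X_nodes = X_train[nodes_idx]
    K_nm = rbf_kernel(X_train, X_nodes, gamma)             # n x m kernel matrix, m << n
    beta = np.linalg.solve(K_nm.T @ K_nm + mu * np.eye(len(nodes_idx)), K_nm.T @ y)
    return X_nodes, beta

def extract_feature(x, X_basis, coeffs, gamma=1.0):
    """Feature value = linear combination of kernel evaluations against the basis
    (all training samples for KMSE, only the nodes for EKMSE)."""
    k = rbf_kernel(x[None, :], X_basis, gamma).ravel()
    return float(k @ coeffs)

# Toy two-class data: feature extraction cost scales with the basis size.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (100, 5)), rng.normal(2, 1, (100, 5))])
y = np.concatenate([-np.ones(100), np.ones(100)])

alpha = fit_full_expansion(X, y)                                   # 200 kernel evaluations per new sample
X_nodes, beta = fit_node_expansion(X, y, np.arange(0, 200, 20))    # only 10 per new sample
x_new = rng.normal(1, 1, 5)
print(extract_feature(x_new, X, alpha), extract_feature(x_new, X_nodes, beta))
```

In this toy setting the full expansion needs 200 kernel evaluations to extract one feature, while the node expansion needs only 10, which is the source of the speed-up the paper targets.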


Metadata
Title
Improvement of the kernel minimum squared error model for fast feature extraction
Authors
Jinghua Wang
Peng Wang
Qin Li
Jane You
Publication date
01.07.2013
Publisher
Springer-Verlag
Published in
Neural Computing and Applications / Issue 1/2013
Print ISSN: 0941-0643
Electronic ISSN: 1433-3058
DOI
https://doi.org/10.1007/s00521-012-0813-9
