
2015 | Original Paper | Book Chapter

9. Kernel Methods

Authors: Francesco Camastra, Alessandro Vinciarelli

Published in: Machine Learning for Audio, Image and Video Analysis

Publisher: Springer London


Abstract

What the reader should know to understand this chapter:

  • Notions of calculus.
  • Chapters 5, 6, and 7.
  • Reading Appendix D is not mandatory, but it helps in understanding the chapter.


Appendices
Accessible only with authorization.
Footnotes
1. If the input dimensionality is higher than 2, the line has to be replaced with a plane or a hyperplane.
2. The number of solutions is (at least) \(\infty^1\).
3. The signum function \(sgn(u)\) is defined as follows: \(sgn(u)=1\) if \(u>0\); \(sgn(u)=-1\) if \(u<0\); \(sgn(u)=0\) if \(u=0\).
4. This convention is adopted in the rest of the chapter.
5. The term regularization constant is motivated in Sect. 9.3.6.
6. \(\theta(\beta)\) is 1 if \(\beta>0\), 0 otherwise.
7. In [102] the continuity requirement is replaced with stability.
8. \(\delta_{ij}\) is 1 if \(i=j\), 0 otherwise (see the code sketch after these footnotes).
9. \(\mathrm{MATLAB}^{\copyright}\) is a registered trademark of The MathWorks, Inc.
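Footnotes 3, 6, and 8 define small indicator functions used in the chapter's notation. The following is a minimal Python sketch of exactly those three definitions; the names sgn, theta, and kronecker_delta are our own labels, not identifiers from the chapter or from the MATLAB toolboxes cited in the references.

```python
def sgn(u: float) -> int:
    """Signum function (footnote 3): 1 if u > 0, -1 if u < 0, 0 if u == 0."""
    return (u > 0) - (u < 0)


def theta(beta: float) -> int:
    """Step function (footnote 6): 1 if beta > 0, 0 otherwise."""
    return 1 if beta > 0 else 0


def kronecker_delta(i: int, j: int) -> int:
    """Kronecker delta (footnote 8): 1 if i == j, 0 otherwise."""
    return 1 if i == j else 0


if __name__ == "__main__":
    # Spot-check the definitions against the footnotes.
    assert [sgn(-2.0), sgn(0.0), sgn(3.5)] == [-1, 0, 1]
    assert [theta(-1.0), theta(0.0), theta(0.2)] == [0, 0, 1]
    assert [kronecker_delta(1, 1), kronecker_delta(1, 2)] == [1, 0]
```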
 
References
1. M. Aizerman, E. Braverman, and L. Rozonoer. Theoretical foundations of the potential function method in pattern recognition learning. Automation and Remote Control, 25:821–837, 1964.
2. F.R. Bach and M.I. Jordan. Learning spectral clustering. Technical report, EECS Department, University of California, 2003.
3. A. Barla, E. Franceschi, F. Odone, and F. Verri. Image kernels. In Proceedings of SVM2002, pages 83–96, 2002.
4. A. Ben-Hur, D. Horn, H.T. Siegelmann, and V. Vapnik. A support vector method for clustering. In Advances in Neural Information Processing Systems, volume 12, pages 125–137, 2000.
5. A. Ben-Hur, D. Horn, H.T. Siegelmann, and V. Vapnik. Support vector clustering. Journal of Machine Learning Research, 2(2):125–137, 2001.
6. Y. Bengio, O. Delalleau, N. Le Roux, J.-F. Paiement, P. Vincent, and M. Ouimet. Learning eigenfunctions links spectral embedding and kernel PCA. Neural Computation, 16(10):2197–2219, 2004.
7. Y. Bengio, P. Vincent, and J.-F. Paiement. Spectral clustering and kernel PCA are learning eigenfunctions. Technical report, CIRANO, 2003.
8. C. Berg, J.P.R. Christensen, and P. Ressel. Harmonic Analysis on Semigroups. Springer-Verlag, 1984.
9. C.M. Bishop. Neural Networks for Pattern Recognition. Cambridge University Press, 1995.
10. M. Brand and K. Huang. A unifying theorem for spectral embedding and clustering. In Proceedings of the Ninth International Workshop on Artificial Intelligence and Statistics, 2003.
11. L.M. Bregman. The relaxation method of finding the common point of convex sets and its application to the solution of problems in convex programming. USSR Computational Mathematics and Mathematical Physics, 7:200–217, 1967.
12. F. Camastra and A. Verri. A novel kernel method for clustering. IEEE Transactions on Pattern Analysis and Machine Intelligence, 27(5):801–805, 2005.
13. N. Cancedda, E. Gaussier, C. Goutte, and J.-M. Renders. Word-sequence kernels. Journal of Machine Learning Research, 3(1):1059–1082, 2003.
14. S. Canu, Y. Grandvalet, V. Guigue, and A. Rakotomamonjy. SVM and kernel methods Matlab toolbox. Technical report, Perception Systèmes et Information, INSA de Rouen, 2005.
15. Y. Censor. Row-action methods for huge and sparse systems and their applications. SIAM Review, 23(4):444–467, 1981.
16. Y. Censor and A. Lent. An iterative row-action method for interval convex programming. Journal of Optimization Theory and Applications, 34(3):321–353, 1981.
17. P.K. Chan, M. Schlag, and J.Y. Zien. Spectral k-way ratio-cut partitioning and clustering. In Proceedings of the 1993 International Symposium on Research on Integrated Systems, pages 123–142. MIT Press, 1993.
18. J.H. Chiang. A new kernel-based fuzzy clustering approach: support vector clustering with cell growing. IEEE Transactions on Fuzzy Systems, 11(4):518–527, 2003.
19. F.R.K. Chung. Spectral Graph Theory. American Mathematical Society, 1997.
20. R. Collobert and S. Bengio. SVMTorch: Support vector machines for large-scale regression problems. Journal of Machine Learning Research, 1(2):143–160, 2001.
21. R. Collobert, S. Bengio, and J. Mariethoz. Torch: a modular machine learning software library. Technical report, IDIAP, 2002.
22. C. Cortes and V. Vapnik. Support vector networks. Machine Learning, 20(3):1–25, 1995.
23. N. Cressie. Statistics for Spatial Data. John Wiley, 1993.
24. N. Cristianini, J. Shawe-Taylor, and J.S. Kandola. Spectral kernel methods for clustering. In Advances in Neural Information Processing Systems 14, pages 649–655. MIT Press, 2001.
25. A.P. Dempster, N.M. Laird, and D.B. Rubin. Maximum likelihood from incomplete data via the EM algorithm. Journal of the Royal Statistical Society, 39(1):1–38, 1977.
26. I.S. Dhillon, Y. Guan, and B. Kulis. Kernel k-means: spectral clustering and normalized cuts. In Proceedings of the \(10^{th}\) ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pages 551–556. ACM Press, 2004.
27. I.S. Dhillon, Y. Guan, and B. Kulis. A unified view of kernel k-means, spectral clustering and graph partitioning. Technical report, UTCS, 2005.
28. I.S. Dhillon, Y. Guan, and B. Kulis. Weighted graph cuts without eigenvectors: A multilevel approach. IEEE Transactions on Pattern Analysis and Machine Intelligence, 29(11):1944–1957, 2007.
29. R.O. Duda, P.E. Hart, and D.G. Stork. Pattern Classification. John Wiley, 2001.
30. T. Evgeniou, M. Pontil, and T. Poggio. Regularization networks and support vector machines. Advances in Computational Mathematics, 13(1):1–50, 2001.
31. R.-E. Fan, P.-H. Chen, and C.-J. Lin. Working set selection using second order information for training SVM. Journal of Machine Learning Research, 6:1889–1918, 2005.
32. P. Fermat. Methodus ad disquirendam maximam et minimam. In Oeuvres de Fermat. MIT Press, 1891 (first edition 1679).
33. M. Ferris and T. Munson. Interior point method for massive support vector machines. Technical report, Computer Sciences Department, University of Wisconsin, Madison, Wisconsin, 2000.
34. M. Ferris and T. Munson. Semi-smooth support vector machines. Technical report, Computer Sciences Department, University of Wisconsin, Madison, Wisconsin, 2000.
35. M. Fiedler. Algebraic connectivity of graphs. Czechoslovak Mathematical Journal, 23(98):298–305, 1973.
36. M. Filippone, F. Camastra, F. Masulli, and S. Rovetta. A survey of spectral and kernel methods for clustering. Pattern Recognition, 41(1):176–190, 2008.
37. I. Fischer and I. Poland. New methods for spectral clustering. Technical report, IDSIA, 2004.
38. R.A. Fisher. The use of multiple measurements in taxonomic problems. Annals of Eugenics, 7(2):179–188, 1936.
39. J. Friedman. Regularized discriminant analysis. Journal of the American Statistical Association, 84(405):165–175, 1989.
40. T.T. Friess, N. Cristianini, and C. Campbell. The kernel adatron algorithm: a fast and simple learning procedure for support vector machines. In Proceedings of the \(15^{th}\) International Conference on Machine Learning, pages 188–196. Morgan Kaufmann Publishers, 1998.
41. K. Fukunaga. An Introduction to Statistical Pattern Recognition. Academic Press, 1990.
42. T. Gärtner, J.W. Lloyd, and P.A. Flach. Kernels and distances for structured data. Machine Learning, 57(3):205–232, 2004.
43. M. Girolami. Mercer kernel based clustering in feature space. IEEE Transactions on Neural Networks, 13(3):780–784, 2002.
44. F. Girosi, M. Jones, and T. Poggio. Regularization theory and neural network architectures. Neural Computation, 7(2):219–269, 1995.
45. G.H. Golub and C.F. Van Loan. Matrix Computations. The Johns Hopkins University Press, 1996.
46. T. Graepel and K. Obermayer. Fuzzy topographic kernel clustering. In Proceedings of the Fifth GI Workshop Fuzzy Neuro Systems '98, pages 90–97, 1998.
47. J. Hadamard. Sur les problèmes aux dérivées partielles et leur signification physique. Bull. Univ. Princeton, 13:49–52, 1902.
48. T. Hastie, R. Tibshirani, and J. Friedman. The Elements of Statistical Learning. Springer-Verlag, 2001.
49. R. Herbrich. Learning Kernel Classifiers: Theory and Algorithms. MIT Press, 2004.
50. R. Inokuchi and S. Miyamoto. LVQ clustering and SOM using a kernel function. In Proceedings of the IEEE International Conference on Fuzzy Systems, pages 367–373, 2004.
51. T. Joachims. Making large-scale SVM learning practical. In Advances in Kernel Methods, pages 169–184. MIT Press, 1999.
52. T. Joachims, N. Cristianini, and J. Shawe-Taylor. Composite kernels for hypertext classification. In Proceedings of the \(18^{th}\) International Conference on Machine Learning, pages 250–257. IEEE Press, 2001.
53. R. Kannan, S. Vempala, and A. Vetta. On clusterings: Good, bad and spectral. In Proceedings of the \(41^{st}\) Annual Symposium on the Foundations of Computer Science, pages 367–380. IEEE Press, 2000.
54. A. Karatzoglou, A. Smola, K. Hornik, and A. Zeileis. kernlab – an S4 package for kernel methods in R. Journal of Statistical Software, 11(9):1–20, 2004.
55. S. Keerthi, S. Shevade, C. Bhattacharyya, and K. Murthy. Improvements to Platt's SMO algorithm for SVM classifier design. Technical report, Department of CSA, Bangalore, India, 1999.
56. S. Keerthi, S. Shevade, C. Bhattacharyya, and K. Murthy. A fast iterative nearest point algorithm for support vector machine design. IEEE Transactions on Neural Networks, 11(1):124–136, 2000.
57. B.W. Kernighan and S. Lin. An efficient heuristic procedure for partitioning graphs. Bell System Technical Journal, 49(1):291–307, 1970.
58. G.A. Korn and T.M. Korn. Mathematical Handbook for Scientists and Engineers. McGraw-Hill, 1968.
59. R. Krishnapuram and J.M. Keller. A possibilistic approach to clustering. IEEE Transactions on Fuzzy Systems, 1(2):98–110, 1993.
60. R. Krishnapuram and J.M. Keller. The possibilistic c-means algorithm: insights and recommendations. IEEE Transactions on Fuzzy Systems, 4(3):385–393, 1996.
61. H.W. Kuhn and A.W. Tucker. Nonlinear programming. In Proceedings of the \(2^{nd}\) Berkeley Symposium on Mathematical Statistics and Probability, pages 367–380. University of California Press, 1951.
62. J.-L. Lagrange. Mécanique analytique. Chez La Veuve Desaint Libraire, 1788.
63. D. Lee. An improved cluster labeling method for support vector clustering. IEEE Transactions on Pattern Analysis and Machine Intelligence, 27(3):461–464, 2005.
64. C. Leslie, E. Eskin, A. Cohen, J. Weston, and A. Noble. Mismatch string kernels for discriminative protein classification. Bioinformatics, 20(4):467–476, 2004.
65. D. Luenberger. Linear and Nonlinear Programming. Addison-Wesley, 1984.
66. D. Macdonald and C. Fyfe. The kernel self-organizing map. In Fourth International Conference on Knowledge-Based Intelligent Engineering Systems and Allied Technologies, pages 317–320, 2000.
67. D.J.C. MacKay. A practical Bayesian framework for backpropagation networks. Neural Computation, 4(3):448–472, 1992.
68. O.L. Mangasarian. Linear and non-linear separation of patterns by linear programming. Operations Research, 13(3):444–452, 1965.
69. O.L. Mangasarian and D. Musicant. Lagrangian support vector regression. Technical report, Computer Sciences Department, University of Wisconsin, Madison, Wisconsin, June 2000.
70. G. Matheron. Principles of geostatistics. Economic Geology, 58:1246–1266, 1963.
71. M. Meila and J. Shi. Spectral methods for clustering. In Advances in Neural Information Processing Systems 12, pages 873–879. MIT Press, 2000.
72. S. Mika, G. Rätsch, J. Weston, B. Schölkopf, and K.R. Müller. Fisher discriminant analysis with kernels. In Proceedings of the IEEE Neural Networks for Signal Processing Workshop, pages 41–48. IEEE Press, 2001.
73. M.L. Minsky and S.A. Papert. Perceptrons. MIT Press, 1969.
74. J. Moody and C. Darken. Fast learning in networks of locally-tuned processing units. Neural Computation, 1(2):281–294, 1989.
75. R. Neal. Bayesian Learning for Neural Networks. Springer-Verlag, 1996.
76. A.Y. Ng, M.I. Jordan, and Y. Weiss. On spectral clustering: Analysis and an algorithm. In Advances in Neural Information Processing Systems 14, pages 849–856. MIT Press, 2002.
77. E. Osuna, R. Freund, and F. Girosi. An improved training algorithm for support vector machines. In Neural Networks for Signal Processing VII, Proceedings of the 1997 IEEE Workshop, pages 276–285. IEEE Press, 1997.
78. E. Osuna and F. Girosi. Reducing the run-time complexity in support vector machines. In Advances in Kernel Methods, pages 271–284. MIT Press, 1999.
79. A. Paccanaro, C. Chennubhotla, J.A. Casbon, and M.A.S. Saqi. Spectral clustering of protein sequences. In Proceedings of the International Joint Conference on Neural Networks, pages 3083–3088. IEEE Press, 2003.
80. J.C. Platt. Fast training of support vector machines using sequential minimal optimization. In Advances in Kernel Methods, pages 185–208. MIT Press, 1999.
81. J.C. Platt, N. Cristianini, and J. Shawe-Taylor. Large margin DAGs for multiclass classification. In Advances in Neural Information Processing Systems 12, pages 547–553. MIT Press, 2000.
82. T. Poggio and F. Girosi. Networks for approximation and learning. Proceedings of the IEEE, 78(9):1481–1497, 1990.
83. M. Pontil and A. Verri. Support vector machines for 3-D object recognition. IEEE Transactions on Pattern Analysis and Machine Intelligence, 20(6):637–646, 1998.
84. M.J.D. Powell. Radial basis functions for multivariable interpolation: A review. In Algorithms for Approximation, pages 143–167. Clarendon Press, 1987.
85. A.K. Qin and P.N. Suganthan. Kernel neural gas algorithms with application to cluster analysis. In Proceedings of the 17th International Conference on Pattern Recognition (ICPR 2004), pages 617–620, 2004.
86. C.E. Rasmussen and C.K.I. Williams. Gaussian Processes for Machine Learning. MIT Press, 2006.
87. K. Rose. Deterministic annealing for clustering, compression, classification, regression, and related optimization problems. Proceedings of the IEEE, 86(11):2210–2239, 1998.
88. R. Rosipal and M. Girolami. An expectation maximization approach to nonlinear component analysis. Neural Computation, 13(3):505–510, 2001.
89. V. Roth, J. Laub, M. Kawanabe, and J.M. Buhmann. Optimal cluster preserving embedding of nonmetric proximity data. IEEE Transactions on Pattern Analysis and Machine Intelligence, 25(12):1540–1551, 2003.
90. B. Schölkopf and A.J. Smola. Learning with Kernels. MIT Press, 2002.
91. B. Schölkopf, A.J. Smola, and K.R. Müller. Nonlinear component analysis as a kernel eigenvalue problem. Neural Computation, 10(5):1299–1319, 1998.
92. B. Schölkopf, A.J. Smola, and K.R. Müller. Nonlinear component analysis as a kernel eigenvalue problem. Technical report, Max Planck Institut für Biologische Kybernetik, 1998.
93. B. Schölkopf, R.C. Williamson, A.J. Smola, J. Shawe-Taylor, and J. Platt. Support vector method for novelty detection. In Advances in Neural Information Processing Systems 12, pages 526–532. MIT Press, 2000.
94. J. Shawe-Taylor and N. Cristianini. Kernel Methods for Pattern Analysis. Cambridge University Press, 2004.
95. J. Shi and J. Malik. Normalized cuts and image segmentation. IEEE Transactions on Pattern Analysis and Machine Intelligence, 22(8):888–905, 2000.
96. D.M.J. Tax and R.P.W. Duin. Support vector domain description. Pattern Recognition Letters, 20(11–13):1191–1199, 1999.
97. A.N. Tikhonov. On solving ill-posed problems and the method of regularization. Dokl. Acad. Nauk USSR, 153:501–504, 1963.
98. A.N. Tikhonov and V.Y. Arsenin. Solution of Ill-Posed Problems. W.H. Winston, 2002.
99. I. Tsochantaridis, T. Hofmann, T. Joachims, and Y. Altun. Support vector learning for interdependent and structured output spaces. In Proceedings of ICML 2004. IEEE Press, 2004.
100. C.J. Twining and C.J. Taylor. The use of kernel principal component analysis to model data distributions. Pattern Recognition, 36(1):217–227, 2003.
101. V.N. Vapnik. The Nature of Statistical Learning Theory. Springer-Verlag, 1995.
102. V.N. Vapnik. Statistical Learning Theory. John Wiley, 1998.
103. V.N. Vapnik and A.Ya. Chervonenkis. A note on one class of perceptrons. Automation and Remote Control, 25:103–109, 1964.
104. V.N. Vapnik and A. Lerner. Pattern recognition using generalized portrait method. Automation and Remote Control, 24:774–780, 1963.
105. S. Vishwanathan and A.J. Smola. Fast kernels for string and tree matching. In Advances in Neural Information Processing Systems 15, pages 569–576. MIT Press, 2003.
106. U. von Luxburg, M. Belkin, and O. Bousquet. Consistency of spectral clustering. Technical report, Max Planck Institut für Biologische Kybernetik, 2004.
107. U. von Luxburg, M. Belkin, and O. Bousquet. Limits of spectral clustering. In Advances in Neural Information Processing Systems 17. MIT Press, 2005.
108. D. Wagner and F. Wagner. Between min cut and graph bisection. In Mathematical Foundations of Computer Science 1993, pages 744–750, 1993.
109. G. Wahba. Spline Models for Observational Data. SIAM, 1990.
110. J. Weston, A. Gammerman, M. Stitson, V. Vapnik, V. Vovk, and C. Watkins. Support vector density estimation. In Advances in Kernel Methods, pages 293–306. MIT Press, 1999.
111. J. Weston and C. Watkins. Multi-class support vector machines. In Proceedings of ESANN99, pages 219–224. D. Facto Press, 1999.
112. C.K.I. Williams and D. Barber. Bayesian classification with Gaussian processes. IEEE Transactions on Pattern Analysis and Machine Intelligence, 20(12):1342–1351, 1998.
113. W.H. Wolberg and O. Mangasarian. Multisurface method of pattern separation for medical diagnosis applied to breast cytology. Proceedings of the National Academy of Sciences, U.S.A., 87:9193–9196, 1990.
114. Z.D. Wu, W.X. Xie, and J.P. Yu. Fuzzy c-means clustering algorithm based on kernel method. In Proceedings of the Fifth International Conference on Computational Intelligence and Multimedia Applications, ICCIMA 2003, pages 49–54. IEEE, 2003.
115. J. Yang, V. Estivill-Castro, and S.K. Chalup. Support vector clustering through proximity graph modelling. In Neural Information Processing 2002, ICONIP'02, pages 898–903, 2002.
116. S.X. Yu and J. Shi. Multiclass spectral clustering. In ICCV'03: Proceedings of the Ninth IEEE International Conference on Computer Vision. IEEE Computer Society, 2003.
117. D.-Q. Zhang and S.-C. Chen. Fuzzy clustering using kernel method. In The 2002 International Conference on Control and Automation, pages 162–163, 2002.
118. D.-Q. Zhang and S.-C. Chen. Kernel-based fuzzy and possibilistic c-means clustering. In Proceedings of the Fifth International Conference on Artificial Neural Networks, ICANN 2003, pages 122–125, 2003.
119. D.-Q. Zhang and S.-C. Chen. A novel kernelized fuzzy c-means algorithm with applications in image segmentation. Artificial Intelligence in Medicine, 32(1):37–50, 2004.
Metadata
Title: Kernel Methods
Authors: Francesco Camastra, Alessandro Vinciarelli
Copyright Year: 2015
Publisher: Springer London
DOI: https://doi.org/10.1007/978-1-4471-6735-8_9
