
2015 | OriginalPaper | Chapter

IKLTSA: An Incremental Kernel LTSA Method

Authors : Chao Tan, Jihong Guan, Shuigeng Zhou

Published in: Machine Learning and Data Mining in Pattern Recognition

Publisher: Springer International Publishing


Abstract

Since 2000, manifold learning methods have been studied extensively and have demonstrated excellent dimensionality-reduction performance in a number of application scenarios. However, they still have difficulty approximating real nonlinear relationships during the dimensionality reduction process, and thus cannot retain the structure of the original data well. In this paper, we propose an incremental, kernel-based version of the manifold learning algorithm LTSA, called IKLTSA (Incremental Kernel LTSA). IKLTSA exploits the advantages of the kernel method and can derive an explicit mapping from the high-dimensional data points to their low-dimensional embedding coordinates. It also reflects the intrinsic structure of the original high-dimensional data more accurately and can handle new data points incrementally. Extensive experiments on both synthetic and real-world data sets validate the effectiveness of the proposed method.
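The page gives only the abstract, so the sketch below illustrates the batch LTSA baseline (local tangent space alignment, Zhang and Zha 2005) that IKLTSA extends, not the authors' incremental kernel variant; the kernel mapping and incremental update steps are not reproduced here. The function name ltsa_embed, its parameters, and the swiss-roll demo are illustrative assumptions rather than the authors' code.

```python
# Minimal sketch of batch LTSA (local tangent space alignment).
# NOT the IKLTSA algorithm from the paper; only the baseline it builds on.
import numpy as np
from scipy.spatial import cKDTree


def ltsa_embed(X, n_components=2, n_neighbors=10):
    """Embed the rows of X into n_components dimensions with batch LTSA."""
    n = X.shape[0]
    # 1. k-nearest neighbours of every point (each point is its own nearest
    #    neighbour, so every neighbourhood contains the point itself).
    _, nbrs = cKDTree(X).query(X, k=n_neighbors)

    # 2. Build the global alignment matrix from local tangent-space bases.
    B = np.zeros((n, n))
    for i in range(n):
        idx = nbrs[i]
        Xi = X[idx] - X[idx].mean(axis=0)          # centre the neighbourhood
        # Orthonormal tangent-space basis: leading left singular vectors
        # of the centred neighbourhood matrix.
        U, _, _ = np.linalg.svd(Xi, full_matrices=False)
        G = np.hstack([np.full((n_neighbors, 1), 1.0 / np.sqrt(n_neighbors)),
                       U[:, :n_components]])
        # Accumulate the local alignment term I - G G^T.
        B[np.ix_(idx, idx)] += np.eye(n_neighbors) - G @ G.T

    # 3. Embedding = eigenvectors of B for the 2nd..(n_components+1)-th
    #    smallest eigenvalues; the very smallest (constant vector) is dropped.
    _, vecs = np.linalg.eigh(B)
    return vecs[:, 1:n_components + 1]


if __name__ == "__main__":
    from sklearn.datasets import make_swiss_roll
    X, _ = make_swiss_roll(n_samples=800, noise=0.05, random_state=0)
    Y = ltsa_embed(X, n_components=2, n_neighbors=12)
    print(Y.shape)  # (800, 2)
```

For cross-checking, scikit-learn ships a reference implementation of this baseline as LocallyLinearEmbedding(method='ltsa') in sklearn.manifold.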


Metadata
Title
IKLTSA: An Incremental Kernel LTSA Method
Authors
Chao Tan
Jihong Guan
Shuigeng Zhou
Copyright Year
2015
DOI
https://doi.org/10.1007/978-3-319-21024-7_5
