Published in: Neural Computing and Applications 8/2019

16.11.2017 | Original Article

Discriminative geodesic Gaussian process latent variable model for structure preserving dimension reduction in clustering and classification problems

Authors: Mahdi Heidari, Mohammad Hossein Moattar

Published in: Neural Computing and Applications | Issue 8/2019


Abstract

Dimension reduction is a common approach for analyzing complex high-dimensional data and enables efficient implementation of classification and decision algorithms. The Gaussian process latent variable model (GPLVM) is a widely applicable dimension reduction method, but it learns the latent space without considering class labels. Preserving the structure and topology of the data is a key factor influencing the performance of dimensionality reduction models, and a conventional measure that reflects the topological structure of data points is the geodesic distance. In this study, we propose an enriched GPLVM mapping between the low-dimensional latent space and the high-dimensional data. One contribution of the proposed approach is to compute geodesic distances under the influence of class labels and to introduce an improved GPLVM kernel based on these distances. In addition, the objective function of the model is reformulated to balance class separation against structure preservation, which improves both the discrimination power and the compactness of the embedding. The proposed approach is compared with other dimension reduction techniques, including kernel principal component analysis (KPCA), locally linear embedding (LLE), Laplacian eigenmaps, and discriminative and supervised extensions of the standard GPLVM. The experiments suggest that the proposed model yields more accurate classification and clustering than the aforementioned approaches.
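
The two ideas highlighted in the abstract, a geodesic distance computed under the influence of class labels and a kernel built from that distance, can be illustrated with a short sketch. The snippet below is a minimal illustration and not the authors' implementation: the within-class/between-class edge scaling factors alpha and beta, the k-NN graph construction, the kernel bandwidth gamma, and the RBF-style distance-substitution kernel are all assumptions made for the example, since the abstract does not give the exact formulation.

```python
# Hypothetical sketch: class-aware geodesic distances and a kernel built from them.
import numpy as np
from scipy.sparse import coo_matrix
from scipy.sparse.csgraph import shortest_path
from sklearn.neighbors import kneighbors_graph

def class_aware_geodesic_kernel(X, y, n_neighbors=8, alpha=0.5, beta=2.0, gamma=1e-2):
    """Approximate geodesic distances on a k-NN graph whose edges are rescaled
    by class labels (within-class edges shrunk, between-class edges stretched),
    then map the distances to a similarity kernel (illustrative RBF-like form)."""
    # Symmetric k-NN graph with Euclidean edge weights
    G = kneighbors_graph(X, n_neighbors=n_neighbors, mode='distance')
    G = G.maximum(G.T).tocoo()

    # Rescale each edge according to the class labels of its endpoints
    same_class = y[G.row] == y[G.col]
    weights = np.where(same_class, alpha * G.data, beta * G.data)
    G = coo_matrix((weights, (G.row, G.col)), shape=G.shape).tocsr()

    # Geodesic (shortest-path) distances on the rescaled graph
    D = shortest_path(G, method='D', directed=False)
    finite = np.isfinite(D)
    D[~finite] = D[finite].max()  # patch disconnected components

    # Distance-substitution kernel (an assumed form, not the paper's kernel)
    return np.exp(-gamma * D ** 2)
```

The resulting Gram matrix can be plugged into any kernel method that accepts a precomputed kernel, for example sklearn's KernelPCA(kernel='precomputed'), as a rough stand-in for using the modified kernel inside a GPLVM.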


Metadata
Title
Discriminative geodesic Gaussian process latent variable model for structure preserving dimension reduction in clustering and classification problems
Authors
Mahdi Heidari
Mohammad Hossein Moattar
Publication date
16.11.2017
Publisher
Springer London
Published in
Neural Computing and Applications / Issue 8/2019
Print ISSN: 0941-0643
Electronic ISSN: 1433-3058
DOI
https://doi.org/10.1007/s00521-017-3273-4
