
06-07-2024

A New Kernel Density Estimation-Based Entropic Isometric Feature Mapping for Unsupervised Metric Learning

Authors: Alaor Cervati Neto, Alexandre Luís Magalhães Levada, Michel Ferreira Cardia Haddad

Published in: Annals of Data Science

Abstract

Metric learning consists of designing adaptive distance functions that are well suited to a specific dataset. Such tailored distance functions aim to deliver superior results compared to standard distance measures in machine learning tasks. In particular, the widely adopted Euclidean distance may be severely affected by noisy data and outliers, leading to suboptimal performance. In the present work, a nonparametric isometric feature mapping (ISOMAP) method is introduced. The new algorithm is based on kernel density estimation, exploring the relative entropy between probability density functions estimated on patches of the neighbourhood graph. An entropic neighbourhood network is built, in which edges are weighted by a function of the relative entropies of the neighbouring patches instead of the Euclidean distance. A variety of datasets is considered in the analysis. The results indicate superior performance compared to cutting-edge manifold learning algorithms such as ISOMAP, uniform manifold approximation and projection (UMAP), and t-distributed stochastic neighbour embedding (t-SNE).
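To make the pipeline described in the abstract concrete, the following is a minimal Python sketch of an entropy-weighted ISOMAP of this flavour, not the authors' reference implementation. It assumes Gaussian KDEs fitted to each k-nearest-neighbour patch, a Monte Carlo estimate of the symmetrised relative entropy (KL divergence) as the edge weight, and the classical ISOMAP steps afterwards (shortest paths plus classical MDS). The function name entropic_isomap and the parameters n_neighbors, n_components, and n_mc are illustrative assumptions.

```python
# Hedged sketch: KDE/relative-entropy-weighted ISOMAP (illustrative only).
# Assumes each patch has more points than features, so the Gaussian KDE is
# well defined; high-dimensional data may need PCA first or a larger k.
import numpy as np
from scipy.stats import gaussian_kde
from scipy.sparse.csgraph import shortest_path
from sklearn.neighbors import kneighbors_graph

def entropic_isomap(X, n_neighbors=10, n_components=2, n_mc=200, seed=0):
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    knn = kneighbors_graph(X, n_neighbors, mode="connectivity")

    # Fit one KDE per patch (a point together with its k nearest neighbours).
    patches = [X[np.append(knn[i].indices, i)].T for i in range(n)]
    kdes = [gaussian_kde(p) for p in patches]

    # Monte Carlo estimate of the symmetrised KL divergence between two KDEs.
    def sym_kl(i, j):
        eps = 1e-12
        si = kdes[i].resample(n_mc, seed=int(rng.integers(1 << 31)))
        sj = kdes[j].resample(n_mc, seed=int(rng.integers(1 << 31)))
        kl_ij = np.mean(np.log(kdes[i](si) + eps) - np.log(kdes[j](si) + eps))
        kl_ji = np.mean(np.log(kdes[j](sj) + eps) - np.log(kdes[i](sj) + eps))
        return max(kl_ij + kl_ji, eps)

    # Entropic neighbourhood graph: relative-entropy-based edge weights
    # replace the usual Euclidean edge weights.
    W = np.full((n, n), np.inf)
    rows, cols = knn.nonzero()
    for i, j in zip(rows, cols):
        W[i, j] = W[j, i] = sym_kl(i, j)
    np.fill_diagonal(W, 0.0)

    # Geodesic distances over the graph (assumes a connected graph),
    # then a classical MDS embedding of the squared distances.
    D = shortest_path(W, method="D", directed=False)
    J = np.eye(n) - np.ones((n, n)) / n
    B = -0.5 * J @ (D ** 2) @ J
    vals, vecs = np.linalg.eigh(B)
    order = np.argsort(vals)[::-1][:n_components]
    return vecs[:, order] * np.sqrt(np.maximum(vals[order], 0.0))
```

On a toy manifold such as scikit-learn's make_swiss_roll, entropic_isomap(X, n_neighbors=15) would return a 2-D embedding analogous to standard ISOMAP output, differing only in the entropy-based edge weights.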


Metadata
Title
A New Kernel Density Estimation-Based Entropic Isometric Feature Mapping for Unsupervised Metric Learning
Authors
Alaor Cervati Neto
Alexandre Luís Magalhães Levada
Michel Ferreira Cardia Haddad
Publication date
06-07-2024
Publisher
Springer Berlin Heidelberg
Published in
Annals of Data Science
Print ISSN: 2198-5804
Electronic ISSN: 2198-5812
DOI
https://doi.org/10.1007/s40745-024-00548-x
