Published in: Neural Computing and Applications 3/2018

29-11-2016 | Original Article

Classification and dimensional reduction using restricted radial basis function networks

Author: Pitoyo Hartono

Published in: Neural Computing and Applications | Issue 3/2018

Abstract

One of the most efficient means of understanding complex data is to visualize them in two- or three-dimensional space. Because meaningful data are often high-dimensional, visualizing them requires dimensional reduction algorithms, whose objective is to map high-dimensional data into a low-dimensional space while preserving some of their underlying structures. For labeled data, the low-dimensional representations should embed their classifiability so that their class structures become visible. It is also beneficial if an algorithm can classify labeled inputs while simultaneously performing dimensional reduction, visually offering information about the data's structure and a rationale behind the classification. However, most currently available dimensional reduction methods are not equipped with classification features, while most classification algorithms lack transparency in rationalizing their decisions. In this paper, the restricted radial basis function network (rRBF), a recently proposed supervised neural network with a low-dimensional internal representation, is utilized for visualizing high-dimensional data while also performing classification. The primary focus of this paper is to empirically explain the classifiability and visual transparency of the rRBF.
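The rRBF itself is not described on this page, so the following is a minimal, hypothetical sketch of an rRBF-style network for readers who want a concrete picture: a two-dimensional grid of RBF hidden units whose activations are "restricted" by a SOM-like neighborhood function centered on the best-matching unit, followed by a softmax output layer trained by gradient descent. The grid size, learning rates, neighborhood schedule, and the exact coupling between the hidden-layer update and the classification error are illustrative assumptions, not the settings used in the paper.

```python
import numpy as np

class RestrictedRBF:
    """Illustrative rRBF-style network: 2-D grid of RBF units with a
    SOM-like neighborhood restriction, plus a softmax output layer.
    Hyperparameters and update details are assumptions, not the paper's."""

    def __init__(self, input_dim, n_classes, grid=(10, 10), seed=0):
        rng = np.random.default_rng(seed)
        self.n_classes = n_classes
        self.n_hidden = grid[0] * grid[1]
        # Reference vectors (RBF centers), one per grid cell.
        self.centers = rng.normal(size=(self.n_hidden, input_dim))
        # Fixed 2-D grid coordinates of the hidden units, used for the neighborhood.
        gx, gy = np.meshgrid(np.arange(grid[0]), np.arange(grid[1]), indexing="ij")
        self.coords = np.stack([gx.ravel(), gy.ravel()], axis=1).astype(float)
        # Output weights (hidden units -> classes) and biases.
        self.W = rng.normal(scale=0.1, size=(self.n_hidden, n_classes))
        self.b = np.zeros(n_classes)

    def _hidden(self, x, sigma):
        # Plain RBF response of every unit to the input.
        d2 = np.sum((self.centers - x) ** 2, axis=1)
        rbf = np.exp(-d2)
        # "Restriction": modulate responses with a neighborhood function centered
        # on the best-matching unit, so only a local patch of the map is active.
        winner = int(np.argmin(d2))
        g2 = np.sum((self.coords - self.coords[winner]) ** 2, axis=1)
        neighborhood = np.exp(-g2 / (2.0 * sigma ** 2))
        return rbf * neighborhood, winner

    def forward(self, x, sigma):
        h, winner = self._hidden(x, sigma)
        logits = h @ self.W + self.b
        p = np.exp(logits - logits.max())
        return h, winner, p / p.sum()

    def fit(self, X, y, epochs=50, lr=0.05, sigma0=3.0):
        for epoch in range(epochs):
            # Shrink the neighborhood over time (SOM-style annealing schedule).
            sigma = sigma0 * np.exp(-epoch / max(epochs - 1, 1))
            for x, t in zip(X, y):
                h, winner, p = self.forward(x, sigma)
                err = p - np.eye(self.n_classes)[t]  # softmax/cross-entropy gradient
                # Gradient step on the output layer.
                self.W -= lr * np.outer(h, err)
                self.b -= lr * err
                # SOM-style update of the reference vectors: pull units near the
                # winner toward the input (the paper's variant additionally
                # regulates this update with the classification error).
                g2 = np.sum((self.coords - self.coords[winner]) ** 2, axis=1)
                neighborhood = np.exp(-g2 / (2.0 * sigma ** 2))
                self.centers += lr * neighborhood[:, None] * (x - self.centers)

    def project(self, x, sigma=1.0):
        # 2-D embedding of a sample: the grid coordinate of its winning unit.
        _, winner = self._hidden(x, sigma)
        return self.coords[winner]
```

After training, calling project on each sample yields its two-dimensional map position for visualization, while the softmax output provides the class prediction, mirroring the dual role of classification and dimensional reduction described in the abstract.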

Metadata
Title
Classification and dimensional reduction using restricted radial basis function networks
Author
Pitoyo Hartono
Publication date
29-11-2016
Publisher
Springer London
Published in
Neural Computing and Applications / Issue 3/2018
Print ISSN: 0941-0643
Electronic ISSN: 1433-3058
DOI
https://doi.org/10.1007/s00521-016-2726-5
