Published in: Neural Computing and Applications 24/2020

18.02.2019 | WSOM 2017

An energy-based SOM model not requiring periodic boundary conditions

Author: Alexander Gepperth


Abstract

We present the Resilient Self-organizing Tissue (ReST) model, a self-organized neural model based on an infinitely often continuously differentiable (\(C^\infty\)) energy function. ReST extends earlier work on energy-based self-organizing models (Heskes and Kappen, in: IEEE international conference on neural networks, IEEE, pp 1219–1223, 1993) in several ways. First, it converts input–prototype distances into neural activities that are constrained to follow a log-normal distribution. This allows a problem-independent interpretation of neural activities, which facilitates, e.g., outlier detection and visualization. Second, since neural activities are in particular constrained to exhibit a predetermined temporal mean, the convolution contained in the energy function can be performed using so-called zero-padding with correction (ZPC) instead of periodic boundary conditions. Because periodic boundary conditions impose much stronger constraints on the prototypes, using ReST with ZPC leads to markedly lower quantization errors, especially for small map sizes. Additional experiments demonstrate the value of a \(C^\infty\) energy function, namely for novelty detection and for automatic control of SOM parameters.
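The abstract only names the idea of zero-padding with correction (ZPC); the full formulation is given in the paper itself. As a minimal sketch of how such a scheme can differ from periodic boundary conditions, the following Python fragment smooths SOM activities with a Gaussian neighbourhood kernel, once with wrap-around (periodic) boundaries and once with zero padding plus a correction term that exploits a known mean activity. All function names, the Gaussian kernel, the use of scipy, and the exact form of the correction are assumptions for illustration only, not the authors' implementation.

    import numpy as np
    from scipy.ndimage import convolve

    def neighborhood_kernel(radius, sigma):
        # Gaussian neighbourhood kernel on the 2-D SOM grid, normalised to sum to 1
        ax = np.arange(-radius, radius + 1)
        xx, yy = np.meshgrid(ax, ax)
        g = np.exp(-(xx ** 2 + yy ** 2) / (2.0 * sigma ** 2))
        return g / g.sum()

    def smooth_periodic(activities, kernel):
        # convolution with periodic boundary conditions: the map is treated as a torus
        return convolve(activities, kernel, mode="wrap")

    def smooth_zpc(activities, kernel, target_mean):
        # zero-padding with correction (hypothetical form): convolve with zeros
        # outside the map, then add back the missing kernel mass times the
        # predetermined mean activity
        padded = convolve(activities, kernel, mode="constant", cval=0.0)
        # fraction of the kernel mass that actually fell inside the map at each unit
        coverage = convolve(np.ones_like(activities), kernel, mode="constant", cval=0.0)
        return padded + (1.0 - coverage) * target_mean

    # usage: log-normally distributed activities on a 10x10 map
    acts = np.random.lognormal(mean=0.0, sigma=0.25, size=(10, 10))
    k = neighborhood_kernel(radius=3, sigma=1.0)
    smoothed = smooth_zpc(acts, k, target_mean=acts.mean())

In this reading, the correction simply substitutes the known mean activity for the contribution of neighbours that the zero padding removes near the map border, so border units are not systematically underestimated; the correction term actually used in the paper may differ.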


References
1. Abadi M, Barham P, Chen J, Chen Z, Davis A, Dean J, Devin M, Ghemawat S, Irving G, Isard M et al (2016) Tensorflow: a system for large-scale machine learning. OSDI 16:265–283
2. Acharya S, Pant AK, Gyawali PK (2015) Deep learning based large scale handwritten Devanagari character recognition. In: 2015 9th international conference on software, knowledge, information management and applications (SKIMA). IEEE
3. Bunte K, Haase S, Biehl M, Villmann T (2012) Stochastic neighbor embedding (SNE) for dimension reduction and visualization using arbitrary divergences. Neurocomputing 90:23–45 (Advances in artificial neural networks, machine learning, and computational intelligence, ESANN 2011)
4. Cohen G, Afshar S, Tapson J, Van Schaik A (2017) EMNIST: extending MNIST to handwritten letters. In: Proceedings of the international joint conference on neural networks 2017, May, pp 2921–2926
5. Cottrell M, Fort J-C, Pagès G (1998) Theoretical aspects of the SOM algorithm. Neurocomputing 21(1):119–138
6. Erwin E, Obermayer K, Schulten K (1992) Self-organizing maps: ordering, convergence properties and energy functions. Biol Cybern 67(1):47–55
7. Flexer A (2001) On the use of self-organizing maps for clustering and visualization. Intell Data Anal 5(5):373–384
8. Gepperth A, Karaoguz C (2016) A bio-inspired incremental learning architecture for applied perceptual problems. Cogn Comput 8(5):924–934
10. Graepel T, Burger M, Obermayer K (1998) Self-organizing maps: generalizations and new optimization techniques. Neurocomputing 21(1–3):173–190
11. Heskes TM, Kappen B (1993) Error potentials for self-organization. In: IEEE international conference on neural networks, IEEE, pp 1219–1223
12. Heskes T (1999) Energy functions for self-organizing maps. In: Oja E, Kaski S (eds) Kohonen maps. Elsevier, Amsterdam, pp 303–315
13. Jähne B (2005) Digital image processing, 6th edn. Springer, Berlin
14.
15. LeCun Y, Bottou L, Bengio Y, Haffner P (1998) Gradient-based learning applied to document recognition. Proc IEEE 86(11):2278–2324
16. Lefort M, Hecht T, Gepperth A (2015) Using self-organizing maps for regression: the importance of the output function. In: European symposium on artificial neural networks (ESANN)
17. Shieh S-L, Liao I-E (2012) A new approach for data clustering and visualization using self-organizing maps. Expert Syst Appl 39(15):11924–11933
18. Tolat V (1990) An analysis of Kohonen’s self-organizing maps using a system of energy functions. Biol Cybern 64(2):155–164
19. Van der Maaten L (2014) Accelerating t-SNE using tree-based algorithms. J Mach Learn Res 15(1):3221–3245
20. Van der Maaten L, Hinton G (2008) Visualizing data using \(t\)-SNE. J Mach Learn Res 9:2579–2605
21. Vesanto J (1999) SOM-based data visualization methods. Intell Data Anal 3(2):111–126
22. Xiao H, Rasul K, Vollgraf R (2017) Fashion-MNIST: a novel image dataset for benchmarking machine learning algorithms. arXiv:1708.07747
Metadata
Title
An energy-based SOM model not requiring periodic boundary conditions
Author
Alexander Gepperth
Publication date
18.02.2019
Publisher
Springer London
Published in
Neural Computing and Applications / Issue 24/2020
Print ISSN: 0941-0643
Electronic ISSN: 1433-3058
DOI
https://doi.org/10.1007/s00521-019-04028-9
