
2016 | Original Paper | Book Chapter

Solving the Vanishing Information Problem with Repeated Potential Mutual Information Maximization

Author: Ryotaro Kamimura

Published in: Neural Information Processing

Publisher: Springer International Publishing

Abstract

This paper shows how to solve the vanishing information problem in potential mutual information maximization. We previously developed an information-theoretic method called "potential learning," which aims to extract the most important features through simplified information maximization. A major problem, however, is that the potential effect diminishes considerably in the course of learning, until the potentiality can no longer be taken into account. To solve this problem, we introduce repeated information maximization: whenever the potentiality becomes ineffective, the method forces it to be assimilated into learning again, thereby enhancing the information-maximization process. The method was applied to the online news popularity data set to estimate the popularity of articles. To demonstrate its effectiveness, the number of hidden neurons was made excessively large, set to 50. The results show that potential mutual information maximization could increase mutual information even with 50 hidden neurons and lead to improved generalization performance. In addition, simplified representations could be obtained for better interpretation and generalization.
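The abstract does not give the exact formulation used in the paper. As a rough sketch only, assuming the conditional firing probabilities p(j|s) are obtained by normalizing non-negative hidden-neuron outputs over the hidden layer (a common setup in Kamimura's earlier potential-learning work), the mutual information between input patterns and hidden neurons that the method tries to increase could be computed as:

```python
import numpy as np

def hidden_mutual_information(activations):
    """Mutual information between input patterns and hidden neurons.

    activations: (S, J) array of non-negative hidden-neuron outputs,
    one row per input pattern s. Each row is normalized into conditional
    firing probabilities p(j|s); the marginal p(j) is their average
    over all patterns (assuming equiprobable inputs).
    """
    p_j_given_s = activations / activations.sum(axis=1, keepdims=True)
    p_j = p_j_given_s.mean(axis=0)        # marginal firing rates p(j)
    ratio = p_j_given_s / p_j             # p(j|s) / p(j)
    # I = (1/S) * sum_s sum_j p(j|s) log[ p(j|s) / p(j) ]
    # (small epsilon guards against log(0) for silent neurons)
    return float(np.mean(np.sum(p_j_given_s * np.log(ratio + 1e-12), axis=1)))
```

Under this definition, the information is zero when every pattern activates all hidden neurons uniformly, and reaches its maximum log J when each pattern fires a single distinct neuron; monitoring this quantity is one way the stalling described in the abstract could be detected, triggering another round of potentiality assimilation. The function name and normalization scheme here are illustrative assumptions, not the paper's stated procedure.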


Footnotes
1. The first variable "timedelta" was deleted from the experiment.
Metadata
Title
Solving the Vanishing Information Problem with Repeated Potential Mutual Information Maximization
Author
Ryotaro Kamimura
Copyright Year
2016
DOI
https://doi.org/10.1007/978-3-319-46681-1_53