
2010 | Original Paper | Book Chapter

4. Algorithms for Entropy and Correntropy Adaptation with Applications to Linear Systems

Authors: Deniz Erdogmus, Seungju Han, Abhishek Singh

Published in: Information Theoretic Learning

Publisher: Springer New York


Abstract

This chapter develops several batch and online learning algorithms for the error entropy criterion (EEC) that are counterparts to the most widely used algorithms for the mean square error criterion (MSE). The chapter assumes basic familiarity with adaptive filter design; readers new to this topic may consult a textbook such as [332] or [253] for a review of the fundamentals, although an in-depth knowledge of the field is not required. The case studies in this chapter address only the adaptation of linear systems, not because entropic costs are particularly useful for the linear model, but because the solutions for linear systems are well understood and performance comparisons can be drawn easily. The chapter also considers fast evaluation of the information potential (IP) using the fast Gauss transform and the incomplete Cholesky decomposition, and ends with an application of the error correntropy criterion (ECC) to adaptive noise cancellation.
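
To give a concrete flavor of what such algorithms look like before turning to the chapter itself, the sketch below is a minimal illustration (not taken from the text; the Gaussian kernel, window length, kernel size sigma, and step size mu are illustrative assumptions) of two representative weight updates for a linear FIR filter: a windowed stochastic gradient for the minimum error entropy (MEE) criterion, which ascends the quadratic information potential of the errors, and a correntropy-based (ECC/MCC) step whose LMS-like update is attenuated for large errors.

```python
import numpy as np

def gaussian_kernel(u, sigma):
    # Gaussian kernel used in the Parzen estimate of the error density.
    return np.exp(-u**2 / (2.0 * sigma**2)) / (np.sqrt(2.0 * np.pi) * sigma)

def mee_update(w, X, d, mu, sigma):
    """One windowed stochastic MEE step.

    X : (L, M) tap-input vectors, d : (L,) desired responses, w : (M,) weights.
    Maximizing the quadratic information potential
        V(e) = (1/L^2) * sum_i sum_j G_sigma(e_i - e_j)
    over the window is equivalent to minimizing Renyi's quadratic error entropy.
    """
    e = d - X @ w                          # errors over the window
    de = e[:, None] - e[None, :]           # pairwise differences e_i - e_j
    dx = X[:, None, :] - X[None, :, :]     # pairwise differences x_i - x_j
    # dV/dw = (1 / (sigma^2 L^2)) * sum_ij G_sigma(de) * de * (x_i - x_j)
    grad = (gaussian_kernel(de, sigma) * de)[:, :, None] * dx
    grad = grad.sum(axis=(0, 1)) / (sigma**2 * len(e)**2)
    return w + mu * grad                   # gradient ascent on V (entropy descent)

def ecc_update(w, x, d, mu, sigma):
    # Correntropy-based (ECC/MCC) step: an LMS-like update whose step is
    # scaled by exp(-e^2 / (2 sigma^2)), which suppresses large outlier errors.
    e = d - w @ x
    return w + mu * np.exp(-e**2 / (2.0 * sigma**2)) * e * x

# Tiny system-identification demo with synthetic (hypothetical) data.
rng = np.random.default_rng(0)
w_true = np.array([0.8, -0.5, 0.3])
w = np.zeros(3)
for _ in range(200):
    X = rng.standard_normal((10, 3))                 # window of L = 10 samples
    d = X @ w_true + 0.05 * rng.standard_normal(10)  # noisy plant output
    w = mee_update(w, X, d, mu=0.5, sigma=1.0)
print("estimated weights:", np.round(w, 3))
```

Note that error entropy is insensitive to the mean of the error, so in practice an output bias is adjusted separately after MEE adaptation, and the exponential weighting in the correntropy update is what makes the ECC attractive for problems with impulsive disturbances such as adaptive noise cancellation.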


References
1.
Aczél J., Daróczy Z., On measures of information and their characterizations, Mathematics in Science and Engineering, vol. 115, Academic Press, New York, 1975.
2.
Ahmad I., Lin P., A nonparametric estimation of the entropy for absolutely continuous distributions, IEEE Trans. Inf. Theory, 22:372–375, 1976.
88.
Erdogmus D., Principe J., Generalized information potential for adaptive systems training, IEEE Trans. Neural Netw., 13(5):1035–1044, 2002.
90.
Erdogmus D., Principe J., Kim S., Sanchez J., A recursive Renyi's entropy estimator, Proc. IEEE Workshop on Neural Networks for Signal Processing, Martigny, Switzerland, pp. 209–217, 2002.
116.
Golub G., Van Loan C., Matrix Computations, 3rd ed., The Johns Hopkins University Press, Baltimore, Maryland, 1996.
128.
Han S., Rao S., Erdogmus D., Principe J., A minimum error entropy algorithm with self-adjusting stepsize (MEE-SAS), Signal Process., 87:2733–2745, 2007.
129.
Han S., Principe J., A fixed-point minimum error entropy algorithm, Proc. IEEE Int. Workshop on Machine Learning for Signal Processing, Maynooth, Ireland, 2006.
130.
Han S., Rao S., Jeong K., Principe J., A normalized minimum error entropy stochastic algorithm, Proc. IEEE Int. Conf. on Acoustics, Speech and Signal Processing, Toulouse, 2006.
131.
Han S., A Family of Minimum Renyi's Error Entropy Algorithm for Information Processing, Ph.D. dissertation, University of Florida, Summer 2007.
143.
Haykin S., Adaptive Filter Theory, 4th ed., Prentice Hall, Englewood Cliffs, NJ, 2002.
192.
Kushner H., Yin G., Stochastic Approximation and Recursive Algorithms and Applications, Applications of Mathematics, vol. 35, Springer, New York, 2003.
206.
Lyapunov A., Stability of Motion, Academic Press, New York, 1966.
253.
Principe J., Euliano N., Lefebvre C., Neural and Adaptive Systems: Fundamentals through Simulations, CD-ROM textbook, John Wiley, New York, 2000.
258.
Rao Y., Erdogmus D., Rao G., Principe J., Fast error whitening algorithms for system identification and control with noisy data, Neurocomputing, 69:158–181, 2006.
302.
Singh A., Principe J., Using correntropy as a cost function in linear adaptive filters, Proc. IEEE Int. Joint Conf. on Neural Networks (IJCNN), Atlanta, 2009.
332.
Widrow B., Stearns S., Adaptive Signal Processing, Prentice Hall, Englewood Cliffs, NJ, 1985.
Metadata
Title
Algorithms for Entropy and Correntropy Adaptation with Applications to Linear Systems
Authors
Deniz Erdogmus
Seungju Han
Abhishek Singh
Copyright Year
2010
Publisher
Springer New York
DOI
https://doi.org/10.1007/978-1-4419-1570-2_4
