2010 | OriginalPaper | Chapter

4. Algorithms for Entropy and Correntropy Adaptation with Applications to Linear Systems

Authors: Deniz Erdogmus, Seungju Han, Abhishek Singh

Published in: Information Theoretic Learning

Publisher: Springer New York

Abstract

This chapter develops several batch and online learning algorithms for the error entropy criterion (EEC) that are counterparts to the most widely used algorithms for the mean square error (MSE) criterion. Because the chapter assumes familiarity with adaptive filter design, readers new to this topic may wish to consult a textbook such as [332] or [253] for a review of fundamentals; an in-depth knowledge of the field is not required, however. The case studies address only the adaptation of linear systems, not because entropic costs are particularly useful for the linear model, but because the solutions for linear systems are well understood and performance comparisons can be drawn easily. The chapter also considers fast evaluation of the information potential (IP) using the fast Gauss transform and the incomplete Cholesky decomposition, and it ends with an application of the error correntropy criterion (ECC) to adaptive noise cancellation.
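
As a rough illustration of the kind of online updates the chapter develops, the sketch below implements one stochastic-information-gradient step for the EEC and one stochastic gradient step for the correntropy (ECC/MCC) cost, both for a linear finite-impulse-response filter with a Gaussian kernel. This is a minimal sketch under stated assumptions: the function names, step size, kernel width, window length, and the toy identification loop are illustrative choices, not the chapter's own code.

import numpy as np

def gaussian_kernel(x, sigma):
    # Gaussian kernel G_sigma used by the ITL estimators.
    return np.exp(-x ** 2 / (2.0 * sigma ** 2)) / (np.sqrt(2.0 * np.pi) * sigma)

def mee_sig_update(w, x_win, d_win, x_n, d_n, mu=0.1, sigma=1.0):
    # One stochastic information gradient (SIG) step for the EEC/MEE cost:
    # ascend V = (1/L) * sum_i G_sigma(e_n - e_i) over a window of L past samples.
    # Note: V is insensitive to the error mean, so in practice the output bias
    # is set separately after adaptation.
    e_n = d_n - w @ x_n                      # current error
    e_win = d_win - x_win @ w                # errors over the window, shape (L,)
    diff = e_n - e_win                       # pairwise error differences
    k = gaussian_kernel(diff, sigma)
    grad = (k * diff) @ (x_n - x_win) / (len(e_win) * sigma ** 2)
    return w + mu * grad

def mcc_update(w, x_n, d_n, mu=0.1, sigma=1.0):
    # One stochastic gradient step for the correntropy (ECC/MCC) cost,
    # i.e. ascend E[G_sigma(e)]; large (outlier) errors are exponentially de-weighted.
    e_n = d_n - w @ x_n
    return w + mu * gaussian_kernel(e_n, sigma) * e_n * x_n / sigma ** 2

# Toy system-identification loop (hypothetical setup, for illustration only).
rng = np.random.default_rng(0)
w_true = np.array([0.5, -1.0, 0.3])
w = np.zeros(3)
L, X, D = 20, [], []
for n in range(2000):
    x = rng.normal(size=3)
    d = w_true @ x + 0.1 * rng.standard_t(df=3)   # heavy-tailed measurement noise
    if len(X) >= L:
        w = mee_sig_update(w, np.array(X[-L:]), np.array(D[-L:]), x, d)
    X.append(x)
    D.append(d)
print("estimated weights:", w)

The same loop can call mcc_update(w, x, d) instead; because both costs de-emphasize large error values, either update is expected to be more robust to the impulsive noise in this toy example than a plain LMS step.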

Literature
1. Aczél J., Daróczy Z., On Measures of Information and Their Characterizations, Mathematics in Science and Engineering, vol. 115, Academic Press, New York, 1975.
2. Ahmad I., Lin P., A nonparametric estimation of the entropy for absolutely continuous distributions, IEEE Trans. Inf. Theory, 22:372–375, 1976.
88. Erdogmus D., Principe J., Generalized information potential criterion for adaptive system training, IEEE Trans. Neural Netw., 13(5):1035–1044, 2002.
90. Erdogmus D., Principe J., Kim S., Sanchez J., A recursive Renyi's entropy estimator, Proc. IEEE Workshop on Neural Networks for Signal Processing, Martigny, Switzerland, pp. 209–217, 2002.
116. Golub G., Van Loan C., Matrix Computations, 3rd ed., The Johns Hopkins University Press, Baltimore, MD, 1996.
118.
128. Han S., Rao S., Erdogmus D., Principe J., A minimum error entropy algorithm with self adjusting stepsize (MEE-SAS), Signal Process., 87:2733–2745, 2007.
129. Han S., Principe J., A fixed-point minimum error entropy algorithm, Proc. IEEE Int. Workshop on Machine Learning for Signal Processing, Maynooth, Ireland, 2006.
130. Han S., Rao S., Jeong K., Principe J., A normalized minimum error entropy stochastic algorithm, Proc. IEEE Int. Conf. on Acoustics, Speech and Signal Processing, Toulouse, 2006.
131. Han S., A Family of Minimum Renyi's Error Entropy Algorithms for Information Processing, Ph.D. dissertation, University of Florida, Summer 2007.
143. Haykin S., Adaptive Filter Theory, 4th ed., Prentice Hall, Englewood Cliffs, NJ, 2002.
192. Kushner H., Yin G., Stochastic Approximation and Recursive Algorithms and Applications, Applications of Mathematics series, vol. 35, Springer, New York, 2003.
206. Lyapunov A., Stability of Motion, Academic Press, New York, 1966.
253. Principe J., Euliano N., Lefebvre C., Neural and Adaptive Systems: Fundamentals Through Simulations, CD-ROM textbook, John Wiley, New York, 2000.
258. Rao Y., Erdogmus D., Rao G.Y., Principe J.C., Fast error whitening algorithms for system identification and control with noisy data, Neurocomputing, 69:158–181, 2006.
302. Singh A., Principe J., Using correntropy as a cost function in linear adaptive filters, Proc. IEEE Int. Joint Conf. on Neural Networks (IJCNN), Atlanta, GA, 2009.
332. Widrow B., Stearns S., Adaptive Signal Processing, Prentice Hall, Englewood Cliffs, NJ, 1985.
Metadata
Title
Algorithms for Entropy and Correntropy Adaptation with Applications to Linear Systems
Authors
Deniz Erdogmus
Seungju Han
Abhishek Singh
Copyright Year
2010
Publisher
Springer New York
DOI
https://doi.org/10.1007/978-1-4419-1570-2_4
