

3. Adaptive Information Filtering with Error Entropy and Error Correntropy Criteria

Authors : Deniz Erdogmus, Weifeng Liu

Published in: Information Theoretic Learning

Publisher: Springer New York


Abstract

This chapter formulates a new cost function for adaptive filtering based on Renyi’s quadratic error entropy. The problem of estimating the linear system parameters \(\mathrm{\mathbf{w}} = {[{w}_{0},\ldots, {w}_{M-1}]}^{\mathrm{T}}\) in the setting of Figure 3.1, where x(n) and z(n) are random variables, can be framed as model-based inference, because it relates measured data, uncertainty, and the functional description of the system and its parameters. The desired response z(n) can be thought of as being created by an unknown transformation of the input vector \(\mathrm{\mathbf{x}} = {[x(n),\ldots, x(n - M + 1)]}^{\mathrm{T}}\). Adaptive filtering theory [143, 284] addresses this problem using the MSE criterion applied to the error signal \(e(n) = z(n) - f(\mathrm{\mathbf{w}},x(n))\):
$${J}_{w}(e(n)) = E[{(z(n) - f(\mathrm{\mathbf{w}},x(n)))}^{2}]$$
(3.1)
when the linear filter is a finite impulse response (FIR) filter whose output is
$$y(n) =\sum\limits_{k=0}^{M-1}{w}_{ k}x(n - k).$$
(3.2)
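To make the two criteria concrete, the sketch below contrasts the classical stochastic-gradient (LMS) descent of the MSE cost (3.1) for the FIR filter (3.2) with a batch update that maximizes a Parzen (kernel) estimate of the information potential of the error, \(V(e) = E[p(e)]\), which is equivalent to minimizing Renyi's quadratic error entropy \(H_2(e) = -\log V(e)\). This is a minimal illustration under assumed choices (function names, kernel bandwidth sigma, step sizes mu), not the chapter's implementation.

```python
import numpy as np

def gaussian(u, sigma):
    """Gaussian kernel with bandwidth sigma."""
    return np.exp(-u**2 / (2.0 * sigma**2)) / (np.sqrt(2.0 * np.pi) * sigma)

def information_potential(e, sigma):
    """Parzen estimate of V(e); maximizing V minimizes Renyi's
    quadratic error entropy H2(e) = -log V(e)."""
    diff = e[:, None] - e[None, :]              # all pairwise error differences
    return gaussian(diff, np.sqrt(2.0) * sigma).mean()

def lms(x, z, M, mu=0.01):
    """Stochastic gradient descent on the MSE cost (3.1) for the FIR filter (3.2)."""
    w = np.zeros(M)
    for n in range(M - 1, len(x)):
        xn = x[n - M + 1:n + 1][::-1]           # [x(n), ..., x(n-M+1)]
        e = z[n] - w @ xn                       # e(n) = z(n) - y(n)
        w += mu * e * xn                        # instantaneous MSE gradient step
    return w

def mee_batch(x, z, M, mu=0.5, sigma=1.0, epochs=50):
    """Batch gradient ascent on the information potential of the error
    (equivalently, descent on Renyi's quadratic error entropy)."""
    X = np.array([x[n - M + 1:n + 1][::-1] for n in range(M - 1, len(x))])
    d = z[M - 1:]
    N = len(d)
    w = np.zeros(M)
    for _ in range(epochs):
        e = d - X @ w
        diff = e[:, None] - e[None, :]          # e(i) - e(j)
        k = gaussian(diff, np.sqrt(2.0) * sigma)
        # pair (i, j) contributes (e_i - e_j)/(2 sigma^2) * G(e_i - e_j) * (x_i - x_j) to dV/dw
        grad = (k * diff / (2.0 * sigma**2))[:, :, None] * (X[:, None, :] - X[None, :, :])
        w += mu * grad.sum(axis=(0, 1)) / N**2
    return w
```

On a system-identification setup like Figure 3.1 one would call lms(x, z, M) and mee_batch(x, z, M) on the same data and compare the resulting weight vectors. Because the entropy cost is invariant to a constant shift of the error, the entropy-based update only shapes the error distribution; the output bias is set afterwards, for example so that the mean error over the training set is zero.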


Literature
1. Aczél J., Daróczy Z., On measures of information and their characterizations, Mathematics in Science and Engineering, vol. 115, Academic Press, New York, 1975.
2. Ahmad I., Lin P., A nonparametric estimation of the entropy for absolutely continuous distributions, IEEE Trans. Inf. Theor., 22:372–375, 1976.
4. Al-Naffouri T., Zerguine A., Bettayeb M., A unifying view of error nonlinearities in LMS adaptation, in Proc. ICASSP, vol. III, Seattle, pp. 1697–1700, May 1998.
6. Amari S., Nagaoka H., Methods of information geometry, Mathematical Monographs, vol. 191, American Mathematical Society, Providence, RI, 2000.
53. Chen B., Hu J., Pu L., Sun Z., Stochastic gradient algorithm under (h, ϕ)-entropy criterion, Circuits Syst. Signal Process., 26:941–960, 2007.
78. Douglas S., Meng H., Stochastic gradient adaptation under general error criteria, IEEE Trans. Signal Process., 42:1335–1351, 1994.
84. Edmonson W., Srinivasan K., Wang C., Principe J., A global least square algorithm for adaptive IIR filtering, IEEE Trans. Circuits Syst., 45(3):379–384, 1996.
87. Erdogmus D., Principe J., An error-entropy minimization algorithm for supervised training of nonlinear adaptive systems, IEEE Trans. Signal Process., 50(7):1780–1786, 2002.
88. Erdogmus D., Principe J., Generalized information potential for adaptive systems training, IEEE Trans. Neural Netw., 13(5):1035–1044, 2002.
104. Fox J., An R and S Companion to Applied Regression, Sage, London, 2002.
127. Hampel F. R., Ronchetti E. M., Rousseeuw P. J., Stahel W. A., Robust Statistics: The Approach Based on Influence Functions, Wiley, New York, 1985.
133. Härdle W., Applied Nonparametric Regression, Econometric Society Monographs, vol. 19, Cambridge University Press, New York, 1990.
143. Haykin S., Adaptive Filter Theory, 4th Edition, Prentice Hall, Englewood Cliffs, NJ, 2002.
165. Jenssen R., Erdogmus D., Hild II K., Principe J., Eltoft T., Information cut for clustering using a gradient descent approach, Pattern Recogn., 40:796–806, 2006.
200. Liu W., Pokharel P., Principe J., Error entropy, correntropy and M-estimation, IEEE Int. Workshop on Machine Learning for Signal Processing, 2006.
201. Liu W., Pokharel P., Principe J., Correntropy: Properties and applications in non-Gaussian signal processing, IEEE Trans. Signal Process., 55(11):5286–5298, 2007.
220. Middleton D., Statistical-physical models of electromagnetic interference, IEEE Trans. Electromagn. Compat., EMC-19(3):106–126, Aug. 1977.
223. Morejon R., An information theoretic approach to sonar automatic target recognition, Ph.D. dissertation, University of Florida, Spring 2003.
244. Pei S., Tseng C., Least mean p-power error criterion for adaptive FIR filter, IEEE J. Selected Areas Commun., 12(9):1540–1547, 1994.
275.
284. Sayed A., Fundamentals of Adaptive Filters, John Wiley & Sons, New York, 2003.
297. Sidak Z., Sen P., Hajek J., Theory of Rank Tests, Academic Press, London, 1999.
302. Singh A., Principe J., Using correntropy as a cost function in linear adaptive filters, Proc. IEEE IJCNN 2009, Atlanta, 2009.
310. Styblinski M., Tang T., Experiments in nonconvex optimization: Stochastic approximation with function smoothing and simulated annealing, Neural Netw., 3:467–483, 1990.
313. Tanrikulu O., Chambers J., Convergence and steady-state properties of the least-mean mixed norm (LMMN) adaptive algorithm, IEE Proc. Vision, Image Signal Process., 143:137–142, June 1996.
328. Walach E., Widrow B., The least mean fourth (LMF) adaptive algorithm and its family, IEEE Trans. Inf. Theor., IT-30(2):275–283, 1984.
332. Widrow B., Stearns S., Adaptive Signal Processing, Prentice Hall, Englewood Cliffs, NJ, 1985.
Metadata
Title
Adaptive Information Filtering with Error Entropy and Error Correntropy Criteria
Authors
Deniz Erdogmus
Weifeng Liu
Copyright Year
2010
Publisher
Springer New York
DOI
https://doi.org/10.1007/978-1-4419-1570-2_3
