
2014 | OriginalPaper | Chapter

41. Nonlinear Adaptive Filtering in Kernel Spaces

Authors : Badong Chen, Lin Li, Weifeng Liu, José C. Príncipe

Published in: Springer Handbook of Bio-/Neuroinformatics

Publisher: Springer Berlin Heidelberg

Abstract

Recently, a family of online kernel-learning algorithms, known as kernel adaptive filtering (KAF) algorithms, has emerged as an active area of research. KAF algorithms are developed in reproducing kernel Hilbert spaces (RKHS): the linear structure of the RKHS is used to implement well-established linear adaptive algorithms, which yields nonlinear filters in the original input space. These algorithms include the kernel least mean square (KLMS), kernel affine projection algorithms (KAPA), kernel recursive least squares (KRLS), and extended kernel recursive least squares (EX-KRLS). When the kernel is radial (such as the Gaussian kernel), they naturally build a growing radial basis function (RBF) network, in which the weights are directly related to the errors at each sample. The aim of this chapter is to give a brief introduction to kernel adaptive filters. In particular, our focus is on KLMS, the simplest KAF algorithm, which is easy to implement yet efficient. Several key aspects of the algorithm are discussed, such as self-regularization, sparsification, quantization, and mean-square convergence. Application examples are also presented, including in particular an adaptive neural decoder for spike trains.
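The growing-RBF-network view of KLMS described in the abstract can be sketched directly: each new input becomes a kernel center, and its weight is the step-size-scaled prediction error. The following is a minimal illustration, not the chapter's reference implementation; the class name, parameters (`eta`, `sigma`), and the toy target function are choices made here for the example.

```python
import numpy as np

def gaussian_kernel(x, c, sigma):
    # Radial (Gaussian) kernel between input x and center c
    return np.exp(-np.sum((x - c) ** 2) / (2.0 * sigma ** 2))

class KLMS:
    """Kernel least mean square as a growing RBF network:
    centers are the past inputs, weights are eta * e_i."""
    def __init__(self, eta=0.5, sigma=0.5):
        self.eta = eta        # step size (also controls regularization)
        self.sigma = sigma    # kernel bandwidth
        self.centers = []     # one center per processed sample
        self.weights = []     # one weight per center

    def predict(self, x):
        # Network output: weighted sum of kernel evaluations
        return sum(w * gaussian_kernel(x, c, self.sigma)
                   for w, c in zip(self.weights, self.centers))

    def update(self, x, d):
        # One online step: compute error, then grow the network
        e = d - self.predict(x)
        self.centers.append(np.asarray(x, dtype=float))
        self.weights.append(self.eta * e)
        return e

# Example: learn a static nonlinearity y = sin(x) online
rng = np.random.default_rng(0)
f = KLMS(eta=0.5, sigma=0.5)
errors = []
for x in rng.uniform(-2.0, 2.0, 300):
    errors.append(abs(f.update(np.array([x]), np.sin(x))))
```

Note that without sparsification or quantization the network grows by one unit per sample, which is exactly the memory/complexity issue the chapter's sparsification and quantized-KLMS material addresses.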


Metadata
Title
Nonlinear Adaptive Filtering in Kernel Spaces
Authors
Badong Chen
Lin Li
Weifeng Liu
José C. Príncipe
Copyright Year
2014
DOI
https://doi.org/10.1007/978-3-642-30574-0_41
