
2021 | OriginalPaper | Chapter

\(\kappa \)-Circulant Maximum Variance Bases

Authors : Christopher Bonenberger, Wolfgang Ertel, Markus Schneider

Published in: KI 2021: Advances in Artificial Intelligence

Publisher: Springer International Publishing


Abstract

This chapter examines the crucial role of data representation in machine learning, emphasizing shift-invariant feature extraction techniques. It introduces circulant maximum variance bases, a novel approach that generalizes principal component analysis (PCA) and dynamic PCA (DPCA) by requiring shift-invariance. The chapter develops the mathematical formulation of the underlying optimization problem, which admits a closed-form solution and allows a better understanding of the results. It also discusses the relationship between circulant matrices, FIR filters, and integral transforms such as Fourier analysis. The theory is illustrated with numerical results on stationary stochastic processes, demonstrating the effectiveness of the proposed framework for data-adaptive time-frequency decomposition. The chapter thus offers a comprehensive perspective on data representation in machine learning, making it a valuable read for specialists in the field.
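The connection the abstract draws between circulant matrices and Fourier analysis rests on a classical fact: every circulant matrix is diagonalized by the discrete Fourier transform, so its eigenvectors are the Fourier modes. A minimal numpy sketch of this fact (illustrative only; the matrix `C` and first column `c` below are arbitrary examples, not data from the chapter):

```python
import numpy as np

# Build a circulant matrix C from an arbitrary (symmetric) first column c.
n = 8
c = np.array([2.0, 1.0, 0.5, 0.0, 0.0, 0.0, 0.5, 1.0])
C = np.array([[c[(i - j) % n] for j in range(n)] for i in range(n)])

# Every circulant matrix is diagonalized by the DFT matrix F:
# F C F^{-1} is diagonal, so the Fourier modes are eigenvectors of C.
F = np.fft.fft(np.eye(n))
D = F @ C @ np.linalg.inv(F)
off_diag = D - np.diag(np.diag(D))
```

This is why a shift-invariance constraint naturally ties a variance-maximizing basis to time-frequency analysis.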


Footnotes
1
Matched filters are learned in a supervised setting, while here we restrict ourselves to the unsupervised case. Hence, the “matching” of the filter coefficients is according to a variance criterion (similar to PCA).
 
2
Principal component analysis is almost equivalent to the Karhunen-Loève transform (KLT) [14]. Further information regarding the relationship between PCA and KLT is given in [10].
 
3
The dot product \(\mathbf {u}^T\mathbf {x}\) serves as a measure of similarity.
 
4
The discrete circular convolution of two sequences \(\mathbf {x},\mathbf {y}\in \mathbb {R}^{D}\) is written as \(\mathbf {x}\circledast \mathbf {y}\), while the linear convolution is written as \(\mathbf {x}*\mathbf {y}\).
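The circular convolution in this footnote can be computed either via the convolution theorem (pointwise product in the DFT domain) or as multiplication by a circulant matrix; both give the same result. A short numpy sketch under example data (the vectors `x` and `y` are arbitrary illustrations, not from the chapter):

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([0.5, 0.25, 0.0, 0.25])
n = len(x)

# Circular convolution via the convolution theorem:
# pointwise multiplication of the DFTs, then inverse DFT.
conv_fft = np.real(np.fft.ifft(np.fft.fft(x) * np.fft.fft(y)))

# Same result as multiplying x by the circulant matrix
# whose first column is y.
Y = np.array([[y[(i - j) % n] for j in range(n)] for i in range(n)])
conv_mat = Y @ x
```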
 
5
Due to the constraint \(\left\Vert \mathbf {g}\right\Vert _2^2=1\) this is not trivial.
 
6
This interpretation is only valid under the assumptions mentioned in Sect. 3.3. Furthermore, the normalization of the autocorrelation (autocovariance) is to be performed as \(\mathbf {r}' = \frac{\mathbf {r}}{r_0}\), with the first component \(r_0\) of \(\mathbf {r}\) being the variance [16].
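The normalization \(\mathbf {r}' = \mathbf {r}/r_0\) mentioned here can be sketched numerically: the first component of the (biased) sample autocovariance is the sample variance, so after normalization the zero-lag value is exactly one. An illustrative numpy example with synthetic data (not from the chapter):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(512)
x = x - x.mean()  # work with the centered (zero-mean) signal

# Biased sample autocovariance at lags 0..9; r[0] is the sample variance.
N = len(x)
r = np.array([np.sum(x[: N - k] * x[k:]) / N for k in range(10)])

# Normalization as in the footnote: r' = r / r_0, so r'[0] == 1.
r_norm = r / r[0]
```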
 
7
Let \(\mathbf {Y}=\mathbf {G}_\kappa \mathbf {X}\). Maximizing \(\left\Vert \mathbf {Y}\right\Vert _F^2\) (cf. Eq. 19) means maximizing the trace of the covariance matrix \(\mathbf {S}\propto \mathbf {Y}\mathbf {Y}^T\), which in turn is a measure for the total dispersion [18].
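The identity used in this footnote, that the squared Frobenius norm of \(\mathbf {Y}\) equals the trace of \(\mathbf {Y}\mathbf {Y}^T\), is easy to verify numerically (the matrix below is a random example, not data from the chapter):

```python
import numpy as np

rng = np.random.default_rng(0)
Y = rng.standard_normal((3, 5))

# The squared Frobenius norm of Y equals the trace of Y Y^T,
# i.e. the total dispersion summed across all components.
frob_sq = np.sum(Y**2)
trace_total = np.trace(Y @ Y.T)
```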
 
Literature
1. Albawi, S., Mohammed, T.A., Al-Zawi, S.: Understanding of a convolutional neural network. In: 2017 International Conference on Engineering and Technology (ICET), pp. 1–6. IEEE (2017)
3. Bengio, Y., Courville, A., Vincent, P.: Representation learning: a review and new perspectives. IEEE Trans. Pattern Anal. Mach. Intell. 35(8), 1798–1828 (2013)
4. Bose, A., Saha, K.: Random Circulant Matrices. CRC Press (2018)
5. Casazza, P.G., Kutyniok, G., Philipp, F.: Introduction to finite frame theory. Finite Frames, pp. 1–53 (2013)
6. Chatfield, C.: The Analysis of Time Series: An Introduction. Chapman and Hall/CRC (2003)
8. Fulcher, B.D.: Feature-based time-series analysis. In: Feature Engineering for Machine Learning and Data Analytics, pp. 87–116. CRC Press (2018)
9. Garcia-Cardona, C., Wohlberg, B.: Convolutional dictionary learning: a comparative review and new algorithms. IEEE Trans. Comput. Imaging 4(3), 366–381 (2018)
10.
11. Gray, R.M.: Toeplitz and Circulant Matrices: A Review (2006)
13. Ku, W., Storer, R.H., Georgakis, C.: Disturbance detection and isolation by dynamic principal component analysis. Chemom. Intell. Lab. Syst. 30(1), 179–196 (1995)
14. Orfanidis, S.: SVD, PCA, KLT, CCA, and all that. Optimum Signal Processing, pp. 332–525 (2007)
15. Papyan, V., Romano, Y., Elad, M.: Convolutional neural networks analyzed via convolutional sparse coding. J. Mach. Learn. Res. 18(1), 2887–2938 (2017)
16. Pollock, D.S.G., Green, R.C., Nguyen, T.: Handbook of Time Series Analysis, Signal Processing, and Dynamics. Elsevier (1999)
17. Rusu, C.: On learning with shift-invariant structures. Digit. Signal Process. 99, 102654 (2020)
18. Seber, G.A.: Multivariate Observations, vol. 252. Wiley, Hoboken (2009)
19. Strang, G., Nguyen, T.: Wavelets and Filter Banks. SIAM (1996)
20. Tošić, I., Frossard, P.: Dictionary learning. IEEE Signal Process. Mag. 28(2), 27–38 (2011)
21. Unser, M.: On the approximation of the discrete Karhunen-Loeve transform for stationary processes. Signal Process. 7(3), 231–249 (1984)
22. Vaswani, N., Narayanamurthy, P.: Static and dynamic robust PCA and matrix completion: a review. Proc. IEEE 106(8), 1359–1379 (2018)
23. Vetterli, M., Kovačević, J., Goyal, V.K.: Foundations of Signal Processing. Cambridge University Press (2014)
24. Zhao, D., Lin, Z., Tang, X.: Laplacian PCA and its applications. In: 2007 IEEE 11th International Conference on Computer Vision, pp. 1–8. IEEE (2007)
Metadata
Title
\(\kappa \)-Circulant Maximum Variance Bases
Authors
Christopher Bonenberger
Wolfgang Ertel
Markus Schneider
Copyright Year
2021
DOI
https://doi.org/10.1007/978-3-030-87626-5_2
