
2021 | Original Paper | Book Chapter

\(\kappa \)-Circulant Maximum Variance Bases

Authors: Christopher Bonenberger, Wolfgang Ertel, Markus Schneider

Published in: KI 2021: Advances in Artificial Intelligence

Publisher: Springer International Publishing


Abstract

Principal component analysis (PCA), a well-known technique in machine learning and statistics, is typically applied to time-independent data, since it is based on point-wise correlations and hence ignores temporal structure. Dynamic PCA (DPCA) addresses this limitation by augmenting the data set with lagged versions of itself. In this paper, we show that both PCA and DPCA are special cases of \(\kappa \)-circulant maximum variance bases. We formulate the constrained linear optimization problem of finding such \(\kappa \)-circulant bases and present a closed-form solution that allows further interpretation and a significant speed-up for DPCA. Furthermore, we point out the relation of the proposed bases to the discrete Fourier transform, finite impulse response filters, and spectral density estimation.
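The lag augmentation behind DPCA can be sketched in a few lines of NumPy. The function `lag_augment` and all parameter choices below are our own illustration of the general idea, not the implementation used in the paper:

```python
import numpy as np

def lag_augment(x, lags):
    """Stack lagged copies of a 1-D signal x (illustrative DPCA preprocessing).

    Row i holds x delayed by i samples; trailing samples are dropped so
    all rows have equal length.
    """
    D = len(x) - lags
    return np.stack([x[i:i + D] for i in range(lags + 1)])

rng = np.random.default_rng(0)
x = np.sin(0.3 * np.arange(200)) + 0.1 * rng.standard_normal(200)

X = lag_augment(x, lags=4)             # shape (5, 196)
X = X - X.mean(axis=1, keepdims=True)  # center each row
S = X @ X.T / X.shape[1]               # covariance of the augmented data
eigvals, eigvecs = np.linalg.eigh(S)   # ordinary PCA on the augmented set

# The leading eigenvectors act as FIR-like filters on the signal.
print(eigvals[::-1][:2])               # the two largest variances
```

Applying ordinary PCA to the lag-augmented matrix is exactly the DPCA recipe the abstract refers to; the closed-form solution in the paper avoids forming `X` explicitly.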


Footnotes
1. Matched filters are learned in a supervised setting, while here we restrict ourselves to the unsupervised case. Hence, the “matching” of the filter coefficients is according to a variance criterion (similar to PCA).
 
2. Principal component analysis is almost equivalent to the Karhunen-Loève transform (KLT) [14]. Further information regarding the relationship between PCA and KLT is given in [10].
 
3. The dot product \(\mathbf {u}^T\mathbf {x}\) serves as a measure of similarity.
 
4. The discrete circular convolution of two sequences \(\mathbf {x},\mathbf {y}\in \mathbb {R}^{D}\) is written as \(\mathbf {x}\circledast \mathbf {y}\), while the linear convolution is written as \(\mathbf {x}*\mathbf {y}\).
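As an aside, the circular convolution can be computed via the DFT (convolution theorem), and it relates to the linear convolution by folding the tail back modulo \(D\). The NumPy sketch below is our own illustration of these standard identities:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([1.0, 0.0, 0.0, 1.0])
D = len(x)

# circular convolution via the DFT convolution theorem
circ = np.real(np.fft.ifft(np.fft.fft(x) * np.fft.fft(y)))

# the linear convolution has length 2D - 1 ...
lin = np.convolve(x, y)

# ... and folding its tail back modulo D recovers the circular result
folded = lin[:D].copy()
folded[:D - 1] += lin[D:]

print(circ)  # → [3. 5. 7. 5.]
```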
 
5. Due to the constraint \(\left\Vert \mathbf {g}\right\Vert _2^2=1\) this is not trivial.
 
6. This interpretation is only valid under the assumptions mentioned in Sect. 3.3. Furthermore, the normalization of the autocorrelation (autocovariance) is to be performed as \(\mathbf {r}' = \frac{\mathbf {r}}{r_0}\), with the first component \(r_0\) of \(\mathbf {r}\) being the variance [16].
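The normalization \(\mathbf {r}' = \mathbf {r}/r_0\) can be sketched as follows; this is our own NumPy illustration of the biased sample autocovariance, not code from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.standard_normal(512)
x = x - x.mean()

D = len(x)
# biased sample autocovariance for lags 0..D-1
r = np.array([np.dot(x[:D - k], x[k:]) / D for k in range(D)])
r_norm = r / r[0]   # normalize by the variance r_0

print(r_norm[0])    # 1.0 by construction
```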
 
7. Let \(\mathbf {Y}=\mathbf {G}_\kappa \mathbf {X}\). Maximizing \(\left\Vert \mathbf {Y}\right\Vert _F^2\) (cf. Eq. 19) means maximizing the trace of the covariance matrix \(\mathbf {S}\propto \mathbf {Y}\mathbf {Y}^T\), which in turn is a measure of the total dispersion [18].
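The identity \(\left\Vert \mathbf {Y}\right\Vert _F^2 = \operatorname{tr}(\mathbf {Y}\mathbf {Y}^T)\) underlying this footnote is easy to verify numerically (our own sketch):

```python
import numpy as np

rng = np.random.default_rng(2)
Y = rng.standard_normal((3, 10))

fro2 = np.linalg.norm(Y, 'fro') ** 2  # squared Frobenius norm
trace = np.trace(Y @ Y.T)             # trace of the (unscaled) scatter matrix

print(np.isclose(fro2, trace))        # → True
```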
 
References
1. Albawi, S., Mohammed, T.A., Al-Zawi, S.: Understanding of a convolutional neural network. In: 2017 International Conference on Engineering and Technology (ICET), pp. 1–6. IEEE (2017)
3. Bengio, Y., Courville, A., Vincent, P.: Representation learning: a review and new perspectives. IEEE Trans. Pattern Anal. Mach. Intell. 35(8), 1798–1828 (2013)
4. Bose, A., Saha, K.: Random Circulant Matrices. CRC Press (2018)
5. Casazza, P.G., Kutyniok, G., Philipp, F.: Introduction to finite frame theory. Finite Frames, pp. 1–53 (2013)
6. Chatfield, C.: The Analysis of Time Series: An Introduction. Chapman and Hall/CRC (2003)
8. Fulcher, B.D.: Feature-based time-series analysis. In: Feature Engineering for Machine Learning and Data Analytics, pp. 87–116. CRC Press (2018)
9. Garcia-Cardona, C., Wohlberg, B.: Convolutional dictionary learning: a comparative review and new algorithms. IEEE Trans. Comput. Imaging 4(3), 366–381 (2018)
10.
11. Gray, R.M.: Toeplitz and Circulant Matrices: A Review (2006)
13. Ku, W., Storer, R.H., Georgakis, C.: Disturbance detection and isolation by dynamic principal component analysis. Chemom. Intell. Lab. Syst. 30(1), 179–196 (1995)
14. Orfanidis, S.: SVD, PCA, KLT, CCA, and all that. Optimum Signal Processing, pp. 332–525 (2007)
15. Papyan, V., Romano, Y., Elad, M.: Convolutional neural networks analyzed via convolutional sparse coding. J. Mach. Learn. Res. 18(1), 2887–2938 (2017)
16. Pollock, D.S.G., Green, R.C., Nguyen, T.: Handbook of Time Series Analysis, Signal Processing, and Dynamics. Elsevier (1999)
17. Rusu, C.: On learning with shift-invariant structures. Digit. Signal Process. 99, 102654 (2020)
18. Seber, G.A.: Multivariate Observations, vol. 252. Wiley, Hoboken (2009)
19. Strang, G., Nguyen, T.: Wavelets and Filter Banks. SIAM (1996)
20. Tošić, I., Frossard, P.: Dictionary learning. IEEE Signal Process. Mag. 28(2), 27–38 (2011)
21. Unser, M.: On the approximation of the discrete Karhunen-Loeve transform for stationary processes. Signal Process. 7(3), 231–249 (1984)
22. Vaswani, N., Narayanamurthy, P.: Static and dynamic robust PCA and matrix completion: a review. Proc. IEEE 106(8), 1359–1379 (2018)
23. Vetterli, M., Kovačević, J., Goyal, V.K.: Foundations of Signal Processing. Cambridge University Press (2014)
24. Zhao, D., Lin, Z., Tang, X.: Laplacian PCA and its applications. In: 2007 IEEE 11th International Conference on Computer Vision, pp. 1–8. IEEE (2007)
Metadata
Title
\(\kappa \)-Circulant Maximum Variance Bases
Authors
Christopher Bonenberger
Wolfgang Ertel
Markus Schneider
Copyright year
2021
DOI
https://doi.org/10.1007/978-3-030-87626-5_2