
2016 | Original Paper | Book Chapter

5. Algebraic-Geometric Methods

Authors: René Vidal, Yi Ma, S. Shankar Sastry

Published in: Generalized Principal Component Analysis

Publisher: Springer New York


Abstract

In this chapter, we consider a generalization of PCA in which the given sample points are drawn from an unknown arrangement of subspaces of unknown and possibly different dimensions.
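The chapter develops this generalization by fitting polynomials that vanish on the union of the subspaces and then recovering each subspace from the polynomials' derivatives. A minimal numerical sketch of that pipeline for the toy case of two lines through the origin in \(\mathbb{R}^{2}\) (the sample data, the degree choice, and the normalization here are illustrative, not the book's algorithm verbatim):

```python
import numpy as np

# Points drawn from two lines through the origin in R^2:
# S1 = span{(1,0)} and S2 = span{(0,1)} (an illustrative toy arrangement).
t = np.linspace(-2, 2, 9)
X = np.vstack([np.column_stack([t, 0 * t]),   # samples on S1
               np.column_stack([0 * t, t])])  # samples on S2

# Degree-2 Veronese embedding: (x1, x2) -> (x1^2, x1*x2, x2^2).
V = np.column_stack([X[:, 0] ** 2, X[:, 0] * X[:, 1], X[:, 1] ** 2])

# Coefficients of the vanishing polynomial span the null space of V.
_, _, Vt = np.linalg.svd(V)
c = Vt[-1]
c = c / c[np.argmax(np.abs(c))]   # normalize; here c = (0, 1, 0), i.e. p = x1*x2

# The gradient of p at a sample point is normal to that point's subspace:
x = np.array([1.0, 0.0])          # a point on S1
grad = np.array([2 * c[0] * x[0] + c[1] * x[1],
                 c[1] * x[0] + 2 * c[2] * x[1]])  # (0, 1): the normal to S1
```

The null space of the embedded data matrix recovers the single vanishing polynomial \(p(x) = x_1 x_2\), and differentiating it at a sample point yields the normal of the subspace containing that point.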


Footnotes
1
For example, in 3D motion segmentation from affine cameras, it is known that the subspaces have dimension at most four (Costeira and Kanade 1998; Kanatani 2001; Vidal and Hartley 2004).
 
2
This requires that P be transversal to each \(S_{i}^{\perp }\), i.e., \(\operatorname{span}\{P,S_{i}^{\perp }\} = \mathbb{R}^{D}\) for every \(i = 1, 2,\ldots,n\). Since n is finite, this transversality condition is easily satisfied. Furthermore, the set of positions of P that violate the transversality condition is only a zero-measure closed set (Hirsch 1976).
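This condition is simple to check numerically: P is transversal to \(S_{i}^{\perp}\) exactly when a basis of P stacked with a basis of \(S_{i}^{\perp}\) has full rank D. A small check with generic (random) subspaces, using dimensions chosen only for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
D = 3                                   # ambient dimension (illustrative)
P_basis = rng.standard_normal((2, D))   # basis for a generic 2-dim P
Si_perp = rng.standard_normal((1, D))   # basis for a generic 1-dim S_i^perp

# span{P, S_i^perp} = R^D  iff  the stacked bases have rank D.
transversal = np.linalg.matrix_rank(np.vstack([P_basis, Si_perp])) == D
```

Random subspaces are transversal with probability one, consistent with the zero-measure claim above.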
 
3
This requires that all \(\pi_{P}(S_{i})\) be transversal to each other in P, which is guaranteed if we require P to be transversal to \(S_{i}^{\perp }\cap S_{i'}^{\perp }\) for \(i,i' = 1, 2,\ldots,n\). All P's that violate this condition again form only a zero-measure set.
 
4
It is essentially based on Whitney’s classical proof of the fact that every differential manifold can be embedded in a Euclidean space.
 
5
Notice that the minimum number of points needed is N ≥ n, which is linear in the number of groups. We will see in future chapters that this is no longer the case for more general clustering problems.
 
6
However, in some special cases, one can show that this will never occur. For example, when n = 2, the least-squares solution for \(\boldsymbol{c}_{n}\) is \(c_{2} = \mathrm{Var}[x]\), \(c_{1} = E[x^{2}]E[x] - E[x^{3}]\), and \(c_{0} = E[x^{3}]E[x] - E[x^{2}]^{2} \leq 0\); hence \(c_{1}^{2} - 4c_{0}c_{2} \geq 0\), and the two roots of the polynomial \(c_{0}x^{2} + c_{1}x + c_{2}\) are always real.
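A quick numerical check of the real-roots claim, using equal-weight samples from two values a and b (the helper and test data below are my own, chosen just to exercise the moment formulas):

```python
import numpy as np

def discriminant(x):
    """Form the moment-based coefficients from footnote 6 and return
    the discriminant of c0*t^2 + c1*t + c2."""
    m1, m2, m3 = np.mean(x), np.mean(x**2), np.mean(x**3)
    c2 = m2 - m1**2        # Var[x]
    c1 = m2 * m1 - m3      # E[x^2]E[x] - E[x^3]
    c0 = m3 * m1 - m2**2   # E[x^3]E[x] - E[x^2]^2
    return c1**2 - 4 * c0 * c2

# For data concentrated on two values the discriminant is never negative,
# so the quadratic always has real roots.
discs = [discriminant(np.array([a] * 50 + [b] * 50))
         for a, b in [(1.0, 3.0), (-2.0, 5.0), (0.5, -0.5)]]
```

For a half-half mixture of values a and b, one can verify by hand that the discriminant reduces to \((a-b)^{6}/16 \geq 0\), matching what the code observes.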
 
7
We will discuss in the next subsection how to automatically obtain one point per subspace from the data when we generalize this problem to clustering points on hyperplanes.
 
8
Since the subspaces \(S_{i}\) are all different from each other, we assume that the normal vectors \(\{\boldsymbol{b}_{i}\}_{i=1}^{n}\) are pairwise linearly independent.
 
9
Except when the chosen line is parallel to one of the hyperplanes, which corresponds to a zero-measure set of lines.
 
10
For example, the squared algebraic distance to \(S_{1} \cup S_{2}\) is \(p_{21}(\boldsymbol{x})^{2} + p_{22}(\boldsymbol{x})^{2} = (x_{1}^{2} + x_{2}^{2})x_{3}^{2}\).
 
11
For example, the squared algebraic distance to \(S_{1}\) is \(p_{11}(\boldsymbol{x})^{2} + p_{12}(\boldsymbol{x})^{2} = x_{1}^{2} + x_{2}^{2}\).
 
12
In particular, it requires at least \(d_{i}\) points from each subspace \(S_{i}\).
 
13
In fact, from discussions in the preceding subsection, we know that the polynomials \(g_{j},j = 1,\ldots,k_{i}\) are products of linear forms that vanish on the remaining n − 1 subspaces.
 
14
This can always be done, except when the chosen line is parallel to one of the subspaces, which corresponds to a zero-measure set of lines.
 
15
The reader is encouraged to verify these facts numerically and do the same for the examples in the rest of this section.
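Such a verification is short in practice. For the identities in footnotes 10 and 11, assuming the running example takes \(S_{1}\) to be the \(x_{3}\)-axis and \(S_{2}\) the \((x_{1},x_{2})\)-plane, so that \(p_{21} = x_{1}x_{3}\) and \(p_{22} = x_{2}x_{3}\) (my reconstruction of the example's polynomials), one can check:

```python
import numpy as np

rng = np.random.default_rng(0)

# Reconstructed vanishing polynomials for the union S1 ∪ S2
# (assumption: S1 = x3-axis, S2 = (x1,x2)-plane).
p21 = lambda x: x[0] * x[2]
p22 = lambda x: x[1] * x[2]
dist2 = lambda x: p21(x)**2 + p22(x)**2   # squared algebraic distance

# Footnote 10's identity holds at random points.
X = rng.standard_normal((100, 3))
identity_ok = all(np.isclose(dist2(x), (x[0]**2 + x[1]**2) * x[2]**2)
                  for x in X)

# The distance vanishes on points of either subspace.
zero_on_union = (dist2(np.array([0.0, 0.0, 2.5])) == 0.0
                 and dist2(np.array([1.3, -0.7, 0.0])) == 0.0)
```

The identity is exact by construction, and the distance is zero precisely on the union, as an algebraic distance should be.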
 
16
This is guaranteed by the algebraic sampling theorem in Appendix C.
 
17
To reject the N-lines solution, one can put a cap \(n_{\max}\) on the maximum number of groups; and to reject \(\mathbb{R}^{D}\) as the solution, one can simply require that the maximum dimension of every subspace be strictly less than D.
 
18
For example, the inequality \(M_{n}(D) \leq N\) imposes a constraint on the maximum possible number of groups \(n_{\max}\).
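Taking \(M_{n}(D)\) to be the dimension of the space of degree-n homogeneous polynomials in D variables, \(\binom{n+D-1}{n}\) (a standard monomial count, restated here as an assumption about the notation), the resulting cap on n can be computed directly:

```python
from math import comb

def M(n, D):
    """Number of degree-n monomials in D variables: C(n + D - 1, n)."""
    return comb(n + D - 1, n)

def n_max(N, D):
    """Largest n with M(n, D) <= N, i.e., the most groups that N sample
    points can possibly determine under this counting constraint."""
    n = 1
    while M(n + 1, D) <= N:
        n += 1
    return n
```

For instance, with N = 100 points in D = 3 dimensions, \(M_{12}(3) = 91 \leq 100 < 105 = M_{13}(3)\), so at most 12 groups are identifiable.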
 
19
Notice that to represent a d-dimensional subspace in a D-dimensional space, we need only specify a basis of d linearly independent vectors for the subspace. We may stack these vectors as rows of a \(d \times D\) matrix. Every nonsingular linear transformation of these vectors spans the same subspace. Thus, without loss of generality, we may assume that the matrix is of the normal form \([I_{d\times d},G]\) where G is a \(d \times (D - d)\) matrix consisting of the so-called Grassmannian coordinates.
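A sketch of computing this normal form and checking that it is invariant to the choice of basis (it assumes the leading \(d \times d\) block of the basis matrix is nonsingular, which holds for a generic subspace):

```python
import numpy as np

def normal_form(B):
    """Return [I_d, G] for a d x D matrix B whose rows span a subspace.
    Assumes the leading d x d block of B is nonsingular (generic case)."""
    d = B.shape[0]
    return np.linalg.solve(B[:, :d], B)   # left-multiply by the block's inverse

# Two bases of the same 2-dimensional subspace of R^4 yield identical
# Grassmannian coordinates G.
B = np.array([[1., 0., 2., 3.],
              [0., 1., 4., 5.]])
T = np.array([[2., 1.],
              [1., 1.]])                  # a nonsingular change of basis
F1, F2 = normal_form(B), normal_form(T @ B)
```

Because every nonsingular row transformation cancels in the normal form, G is a well-defined coordinate for the subspace itself.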
 
20
The space of subspace arrangements is topologically compact and closed; hence the minimum effective dimension is always achieved and therefore well defined.
 
21
Here we adopt the G-AIC criterion only to illustrate the basic ideas. In practice, depending on the problem and application, other model selection criteria may be more appropriate.
 
22
In this context, the noise is the difference between the original image and the approximate image (the signal).
 
23
That is, the dimensions of some of the subspaces estimated could be larger than the true ones.
 
24
This is exactly what we would have expected, since the recursive ASC first clusters the data into two planes. Points on the intersection of the two planes get assigned to either plane depending on the random noise. If needed, the points on the ghost line can be merged with the plane by some simple postprocessing.
 
25
That is, an arbitrary number of subspaces of arbitrary dimensions.
 
References
Belhumeur, P., Hespanha, J., & Kriegman, D. (1997). Eigenfaces vs. Fisherfaces: Recognition using class specific linear projection. IEEE Transactions on Pattern Analysis and Machine Intelligence, 19(7), 711–720.
Bochnak, J., Coste, M., & Roy, M. F. (1998). Real algebraic geometry. New York: Springer.
Boult, T., & Brown, L. (1991). Factorization-based segmentation of motions. In IEEE Workshop on Motion Understanding (pp. 179–186).
Broomhead, D. S., & Kirby, M. (2000). A new approach to dimensionality reduction: Theory and algorithms. SIAM Journal of Applied Mathematics, 60(6), 2114–2142.
Campbell, N. (1978). The influence function as an aid in outlier detection in discriminant analysis. Applied Statistics, 27(3), 251–258.
Costeira, J., & Kanade, T. (1998). A multibody factorization method for independently moving objects. International Journal of Computer Vision, 29(3), 159–179.
Eisenbud, D. (1996). Commutative algebra: With a view toward algebraic geometry. Graduate Texts in Mathematics. New York: Springer.
Fischler, M. A., & Bolles, R. C. (1981). Random sample consensus: A paradigm for model fitting with applications to image analysis and automated cartography. Communications of the ACM, 24(6), 381–395.
Gnanadesikan, R., & Kettenring, J. (1972). Robust estimates, residuals, and outlier detection with multiresponse data. Biometrics, 28(1), 81–124.
Hampel, F., Ronchetti, E., Rousseeuw, P., & Stahel, W. (1986). Robust statistics: The approach based on influence functions. New York: Wiley.
Hampel, F. R. (1974). The influence curve and its role in robust estimation. Journal of the American Statistical Association, 69, 383–393.
Huang, K., Ma, Y., & Vidal, R. (2004). Minimum effective dimension for mixtures of subspaces: A robust GPCA algorithm and its applications. In IEEE Conference on Computer Vision and Pattern Recognition (Vol. II, pp. 631–638).
Kanatani, K. (2001). Motion segmentation by subspace separation and model selection. In IEEE International Conference on Computer Vision (Vol. 2, pp. 586–591).
Kanatani, K. (2002). Evaluation and selection of models for motion segmentation. In Asian Conference on Computer Vision (pp. 7–12).
Leonardis, A., Bischof, H., & Maver, J. (2002). Multiple eigenspaces. Pattern Recognition, 35(11), 2613–2627.
Ma, Y., & Vidal, R. (2005). Identification of deterministic switched ARX systems via identification of algebraic varieties. In Hybrid Systems: Computation and Control (pp. 449–465). New York: Springer.
Ma, Y., Yang, A. Y., Derksen, H., & Fossum, R. (2008). Estimation of subspace arrangements with applications in modeling and segmenting mixed data. SIAM Review, 50(3), 413–458.
Overschee, P. V., & Moor, B. D. (1993). Subspace algorithms for the stochastic identification problem. Automatica, 29(3), 649–660.
Rao, S., Yang, A. Y., Wagner, A., & Ma, Y. (2005). Segmentation of hybrid motions via hybrid quadratic surface analysis. In IEEE International Conference on Computer Vision (pp. 2–9).
Schindler, K., & Suter, D. (2005). Two-view multibody structure-and-motion with outliers. In IEEE Conference on Computer Vision and Pattern Recognition.
Shizawa, M., & Mase, K. (1991). A unified computational theory for motion transparency and motion boundaries based on eigenenergy analysis. In IEEE Conference on Computer Vision and Pattern Recognition (pp. 289–295).
Stewart, C. V. (1999). Robust parameter estimation in computer vision. SIAM Review, 41(3), 513–537.
Taubin, G. (1991). Estimation of planar curves, surfaces, and nonplanar space curves defined by implicit equations with applications to edge and range image segmentation. IEEE Transactions on Pattern Analysis and Machine Intelligence, 13(11), 1115–1138.
Tipping, M., & Bishop, C. (1999a). Mixtures of probabilistic principal component analyzers. Neural Computation, 11(2), 443–482.
Torr, P., & Davidson, C. (2003). IMPSAC: Synthesis of importance sampling and random sample consensus. IEEE Transactions on Pattern Analysis and Machine Intelligence, 25(3), 354–364.
Torr, P. H. S. (1998). Geometric motion segmentation and model selection. Philosophical Transactions of the Royal Society of London, 356(1740), 1321–1340.
Vasilescu, M., & Terzopoulos, D. (2002). Multilinear analysis of image ensembles: TensorFaces. In European Conference on Computer Vision (pp. 447–460).
Vidal, R., & Hartley, R. (2004). Motion segmentation with missing data by PowerFactorization and Generalized PCA. In IEEE Conference on Computer Vision and Pattern Recognition (Vol. II, pp. 310–316).
Vidal, R., & Ma, Y. (2004). A unified algebraic approach to 2-D and 3-D motion segmentation. In European Conference on Computer Vision (pp. 1–15).
Vidal, R., Ma, Y., & Piazzi, J. (2004). A new GPCA algorithm for clustering subspaces by fitting, differentiating and dividing polynomials. In IEEE Conference on Computer Vision and Pattern Recognition (Vol. I, pp. 510–517).
Vidal, R., Ma, Y., & Sastry, S. (2003b). Generalized Principal Component Analysis (GPCA). In IEEE Conference on Computer Vision and Pattern Recognition (Vol. I, pp. 621–628).
Wu, Y., Zhang, Z., Huang, T., & Lin, J. (2001). Multibody grouping via orthogonal subspace decomposition. In IEEE Conference on Computer Vision and Pattern Recognition (Vol. 2, pp. 252–257).
Yang, A. Y., Rao, S. R., & Ma, Y. (2006). Robust statistical estimation and segmentation of multiple subspaces. In CVPR Workshop on 25 Years of RANSAC.
Metadata
Title
Algebraic-Geometric Methods
Authors
René Vidal
Yi Ma
S. Shankar Sastry
Copyright Year
2016
Publisher
Springer New York
DOI
https://doi.org/10.1007/978-0-387-87811-9_5
