Modified Subspace Constrained Mean Shift Algorithm

Authors: Youness Aliyari Ghassabeh, Frank Rudzicz

Published in: Journal of Classification | Issue 1/2021
Publication date: 11.02.2020

Abstract

A subspace constrained mean shift (SCMS) algorithm is a non-parametric iterative technique to estimate principal curves. Principal curves, as a nonlinear generalization of principal component analysis (PCA), are smooth curves (or surfaces) that pass through the middle of a data set and provide a compact low-dimensional representation of data. The SCMS algorithm combines the mean shift (MS) algorithm with a projection step to estimate principal curves and surfaces. The MS algorithm is a simple iterative method for locating modes of an unknown probability density function (pdf) obtained via a kernel density estimate. Modes of a pdf can be interpreted as zero-dimensional principal curves, and they can also be used for clustering the input data. The SCMS algorithm generalizes the MS algorithm to estimate higher-order principal curves and surfaces. Although both algorithms have been widely used in many real-world applications, their convergence for widely used kernels (e.g., the Gaussian kernel) has not yet been shown. In this paper, we first introduce a modified version of the MS algorithm and then combine it with different variations of the SCMS algorithm to estimate an underlying low-dimensional principal curve embedded in a high-dimensional space. The different variations of the SCMS algorithm are obtained by modifying the projection step in the original SCMS algorithm. We show that the modification of the MS algorithm guarantees its convergence and also implies the convergence of the different variations of the SCMS algorithm. The performance and effectiveness of the proposed modified versions in estimating an underlying principal curve are demonstrated through simulations on synthetic data.
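
The abstract describes the MS update (move a point to the kernel-weighted sample mean) and the SCMS projection step only at a high level. The following is a minimal sketch of one commonly used form of both steps, assuming a Gaussian kernel, a local inverse covariance defined as the negative Hessian of the log of the kernel density estimate, and projection of the mean-shift step onto the eigen-directions orthogonal to the estimated tangent. The function names, the bandwidth h, the target dimension d, and the toy circle data are illustrative and are not taken from the paper.

import numpy as np

def gaussian_weights(x, data, h):
    # Unnormalized Gaussian kernel weights of each sample relative to x.
    diffs = data - x                                      # shape (n, D)
    return np.exp(-0.5 * np.sum(diffs ** 2, axis=1) / h ** 2), diffs

def mean_shift_step(x, data, h):
    # One MS update: move x to the kernel-weighted sample mean.
    w, _ = gaussian_weights(x, data, h)
    return (w[:, None] * data).sum(axis=0) / w.sum()

def scms_step(x, data, h, d=1):
    # One SCMS update: apply the MS step only along the D - d eigen-directions
    # of the local inverse covariance with the largest eigenvalues, i.e.,
    # orthogonal to the estimated tangent of the d-dimensional principal curve.
    n, D = data.shape
    w, diffs = gaussian_weights(x, data, h)
    c = n * (2.0 * np.pi * h ** 2) ** (D / 2.0)           # KDE normalization
    p = w.sum() / c                                       # density estimate at x
    g = (w[:, None] * diffs).sum(axis=0) / (c * h ** 2)   # KDE gradient at x
    H = (diffs.T * w) @ diffs / (c * h ** 4) - np.eye(D) * w.sum() / (c * h ** 2)  # KDE Hessian at x
    sigma_inv = -H / p + np.outer(g, g) / p ** 2          # local inverse covariance
    eigvals, eigvecs = np.linalg.eigh(sigma_inv)
    V = eigvecs[:, np.argsort(eigvals)[::-1][: D - d]]    # orthogonal subspace
    return x + V @ V.T @ (mean_shift_step(x, data, h) - x)

# Toy usage: pull a noisy point on a circle toward the underlying curve.
rng = np.random.default_rng(0)
t = rng.uniform(0.0, 2.0 * np.pi, 300)
data = np.c_[np.cos(t), np.sin(t)] + 0.1 * rng.standard_normal((300, 2))
y = data[0].copy()
for _ in range(50):
    y = scms_step(y, data, h=0.3, d=1)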

Footnotes
1
It is called a principal curve for D = 1, and becomes a mode of the pdf for D = 0.
 
2
Note that for the special case of Gaussian distribution \(\hat {f}\sim N(\mu ,\boldsymbol {\Sigma })\), the local inverse covariance matrix in Eq. 6 becomes equal to the inverse covariance matrix, i.e., \(\hat {\boldsymbol {\Sigma }}^{-1}(\boldsymbol {x})=\boldsymbol {\Sigma }^{-1}\).
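A one-line check of this special case, under the assumption (not restated here) that Eq. 6 defines the local inverse covariance as the negative Hessian of the log-density, \(\hat{\boldsymbol{\Sigma}}^{-1}(\boldsymbol{x})=-\nabla^{2}\log\hat{f}(\boldsymbol{x})\):
\[
\log \hat{f}(\boldsymbol{x}) = \text{const} - \tfrac{1}{2}(\boldsymbol{x}-\boldsymbol{\mu})^{T}\boldsymbol{\Sigma}^{-1}(\boldsymbol{x}-\boldsymbol{\mu})
\;\Longrightarrow\;
\hat{\boldsymbol{\Sigma}}^{-1}(\boldsymbol{x}) = -\nabla^{2}\log \hat{f}(\boldsymbol{x}) = \boldsymbol{\Sigma}^{-1},
\]
which is constant in \(\boldsymbol{x}\), as stated.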
 
3
The equality \(\|\boldsymbol{y}_{j+1}-\boldsymbol{y}_{j}\|^{2}=0\) holds only at convergence, which Theorem 1 guarantees for the modified MS algorithm.
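
In practice, the exact equality above is replaced by a small tolerance on the squared step length. A minimal sketch of such a stopping rule follows; the iteration map step(.) is generic, and the tolerance and iteration cap are illustrative choices, not values from the paper.

import numpy as np

def iterate_until_fixed(step, y0, tol=1e-10, max_iter=1000):
    # Run an MS/SCMS-type iteration y_{j+1} = step(y_j) until the squared
    # step length ||y_{j+1} - y_j||^2 falls below a tolerance, the practical
    # counterpart of the exact equality in this footnote.
    y = np.asarray(y0, dtype=float)
    for _ in range(max_iter):
        y_next = step(y)
        if np.sum((y_next - y) ** 2) < tol:
            return y_next
        y = y_next
    return y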
 
Literatur
Zurück zum Zitat Arias-Castro, E., Mason, D., Pelletier, B. (2016). Errata: on the estimation of the gradient lines of a density and the consistency of the mean-shift algorithm. Journal of Machine Learning Research, 17, 14.MATH Arias-Castro, E., Mason, D., Pelletier, B. (2016). Errata: on the estimation of the gradient lines of a density and the consistency of the mean-shift algorithm. Journal of Machine Learning Research, 17, 14.MATH
Zurück zum Zitat Banfield, J. D., & Raftery, A. E. (1992). Ice floe identification in satellite images using mathematical morphology and clustering about principal curves. Journal of the American Statistical Association, 87, 7–16.CrossRef Banfield, J. D., & Raftery, A. E. (1992). Ice floe identification in satellite images using mathematical morphology and clustering about principal curves. Journal of the American Statistical Association, 87, 7–16.CrossRef
Zurück zum Zitat Biau, G., & Fischer, A. (2012). Parameter selection for principal curves. IEEE Trans. on Information Theory, 58, 1924–1939.MathSciNetCrossRef Biau, G., & Fischer, A. (2012). Parameter selection for principal curves. IEEE Trans. on Information Theory, 58, 1924–1939.MathSciNetCrossRef
Zurück zum Zitat Carreira-Perpiñán, M. A. (2007). Gaussian mean shift is an eM algorithm. IEEE Trans. on Pattern Analysis and Machine Intelligence, 29, 767–776.CrossRef Carreira-Perpiñán, M. A. (2007). Gaussian mean shift is an eM algorithm. IEEE Trans. on Pattern Analysis and Machine Intelligence, 29, 767–776.CrossRef
Zurück zum Zitat Chang, K. Y., & Gosh, A. (2001). A unified model for probabilistic principal surfaces. IEEE Trans. on Pattern Analysis and Machine Intelligence, 23, 22–41.CrossRef Chang, K. Y., & Gosh, A. (2001). A unified model for probabilistic principal surfaces. IEEE Trans. on Pattern Analysis and Machine Intelligence, 23, 22–41.CrossRef
Zurück zum Zitat Cheng, Y. (1995). Mean shift, mode seeking and clustering. IEEE Trans. on Pattern Analysis and Machine Intelligence, 17, 790–799.CrossRef Cheng, Y. (1995). Mean shift, mode seeking and clustering. IEEE Trans. on Pattern Analysis and Machine Intelligence, 17, 790–799.CrossRef
Zurück zum Zitat Comaniciu, D., & Meer, P. (2002). Mean shift: a robust approach toward feature space analysis. IEEE Trans. on Pattern Analysis and Machine Intelligence, 24, 603–619.CrossRef Comaniciu, D., & Meer, P. (2002). Mean shift: a robust approach toward feature space analysis. IEEE Trans. on Pattern Analysis and Machine Intelligence, 24, 603–619.CrossRef
Zurück zum Zitat Comaniciu, D., Ramesh, V., Meer, P. (2000). Real-time tracking of non-rigid objects using mean shift. In Proceedings IEEE Conference on Computer Vision and Pattern Recognition (CVPR2000) (pp. 142–149). USA: Hilton Head Island. Comaniciu, D., Ramesh, V., Meer, P. (2000). Real-time tracking of non-rigid objects using mean shift. In Proceedings IEEE Conference on Computer Vision and Pattern Recognition (CVPR2000) (pp. 142–149). USA: Hilton Head Island.
Zurück zum Zitat Delicado, P. (2001). Another look at principal curves and surfaces. Journal of Multivariate Analysis, 77, 84–116.MathSciNetCrossRef Delicado, P. (2001). Another look at principal curves and surfaces. Journal of Multivariate Analysis, 77, 84–116.MathSciNetCrossRef
Zurück zum Zitat Fashing, M., & Tomasi, C. (2005). Mean shift is a bound optimization. IEEE Trans. on Pattern Analysis and Machine Intelligence, 27, 471–474.CrossRef Fashing, M., & Tomasi, C. (2005). Mean shift is a bound optimization. IEEE Trans. on Pattern Analysis and Machine Intelligence, 27, 471–474.CrossRef
Zurück zum Zitat Fukunaga, K., & Hostetler, L. D. (1975). Estimation of the gradient of a density function, with applications in pattern recognition. IEEE Trans. on Inform. Theory, 21, 32–40.MathSciNetCrossRef Fukunaga, K., & Hostetler, L. D. (1975). Estimation of the gradient of a density function, with applications in pattern recognition. IEEE Trans. on Inform. Theory, 21, 32–40.MathSciNetCrossRef
Zurück zum Zitat Ghassabeh, Y. A. (2015). Asymptotic stability of equilibrium points of mean shift algorithm. Machine Learning, 98, 359–368.MathSciNetCrossRef Ghassabeh, Y. A. (2015). Asymptotic stability of equilibrium points of mean shift algorithm. Machine Learning, 98, 359–368.MathSciNetCrossRef
Zurück zum Zitat Ghassabeh, Y. A. (2016). A sufficient condition for the convergence of the mean shift algorithm with Gaussian kernel. Journal of Multivariate Analysis, 135, 1–10.MathSciNetCrossRef Ghassabeh, Y. A. (2016). A sufficient condition for the convergence of the mean shift algorithm with Gaussian kernel. Journal of Multivariate Analysis, 135, 1–10.MathSciNetCrossRef
Zurück zum Zitat Ghassabeh, Y. A., Linder, T., Takahara, G. (2012a). On noisy source vector quantization via a subspace constrained mean shift algorithm. In Proceedings of the 26th Biennial Symp. on Communications (pp. 107–110). Canada: Kingston. Ghassabeh, Y. A., Linder, T., Takahara, G. (2012a). On noisy source vector quantization via a subspace constrained mean shift algorithm. In Proceedings of the 26th Biennial Symp. on Communications (pp. 107–110). Canada: Kingston.
Zurück zum Zitat Ghassabeh, Y. A., Linder, T., Takahara, G. (2012b). On the convergence and applications of mean shift type algorithms. In: Proceedings of 25th IEEE Canadian Conference on Electrical & Computer Engineering (CCECE). Montreal, Canada, pp. 1-5. Ghassabeh, Y. A., Linder, T., Takahara, G. (2012b). On the convergence and applications of mean shift type algorithms. In: Proceedings of 25th IEEE Canadian Conference on Electrical & Computer Engineering (CCECE). Montreal, Canada, pp. 1-5.
Zurück zum Zitat Ghassabeh, Y. A., Linder, T., Takahara, G. (2013). On some convergence properties of the subspace constrained mean shift. Pattern Recognition, 46, 3140–3147.CrossRef Ghassabeh, Y. A., Linder, T., Takahara, G. (2013). On some convergence properties of the subspace constrained mean shift. Pattern Recognition, 46, 3140–3147.CrossRef
Zurück zum Zitat Ghassabeh, Y. A., & Rudzicz, F. (2018). Modified mean shift algorithm. IET Image Processing, 12, 2172–2177.CrossRef Ghassabeh, Y. A., & Rudzicz, F. (2018). Modified mean shift algorithm. IET Image Processing, 12, 2172–2177.CrossRef
Zurück zum Zitat Hastie, T., & Stuetzle, W. (1989). Principal curves. Journal of the American Statistical Association, 84, 502–516.MathSciNetCrossRef Hastie, T., & Stuetzle, W. (1989). Principal curves. Journal of the American Statistical Association, 84, 502–516.MathSciNetCrossRef
Zurück zum Zitat Jolliffe, I.T. (2002). Principal component analysis, 1st edn. New York: Springer-Verlag.MATH Jolliffe, I.T. (2002). Principal component analysis, 1st edn. New York: Springer-Verlag.MATH
Zurück zum Zitat Kegl, B., Krzyzak, A., Linder, T., Zeger, K. (2000). Learning and design of principal curves. IEEE Trans. on Pattern Analysis and Machine Intelligence, 22, 281–297.CrossRef Kegl, B., Krzyzak, A., Linder, T., Zeger, K. (2000). Learning and design of principal curves. IEEE Trans. on Pattern Analysis and Machine Intelligence, 22, 281–297.CrossRef
Zurück zum Zitat Li, X., Hu, Z., Wu, F. (2007). A note on the convergence of the mean shift. Pattern Recognition, 40, 1756–1762.CrossRef Li, X., Hu, Z., Wu, F. (2007). A note on the convergence of the mean shift. Pattern Recognition, 40, 1756–1762.CrossRef
Zurück zum Zitat Ozertem, U., & Erdogmus, D. (2011). Locally defined principal curves and surfaces. Journal of Machine Learning Research, 12, 1249–1286.MathSciNetMATH Ozertem, U., & Erdogmus, D. (2011). Locally defined principal curves and surfaces. Journal of Machine Learning Research, 12, 1249–1286.MathSciNetMATH
Zurück zum Zitat Silverman, B.W. (1986). Density estimation for statistics and data analysis, 1st edn. New York: Chapman and Hall.CrossRef Silverman, B.W. (1986). Density estimation for statistics and data analysis, 1st edn. New York: Chapman and Hall.CrossRef
Zurück zum Zitat Tao, W., Jin, H., Zhang, Y. (2007). Color image segmentation based on mean shift and normalized cuts. IEEE Trans. on Systems, Man, and Cybernetics Part B: Cybernetics, 37, 1382–1389.CrossRef Tao, W., Jin, H., Zhang, Y. (2007). Color image segmentation based on mean shift and normalized cuts. IEEE Trans. on Systems, Man, and Cybernetics Part B: Cybernetics, 37, 1382–1389.CrossRef
Zurück zum Zitat Tibshirani, R. (1992). Principal curves revisited. Statistics and Computation, 2, 183–190.CrossRef Tibshirani, R. (1992). Principal curves revisited. Statistics and Computation, 2, 183–190.CrossRef
Zurück zum Zitat Wand, M. P., & Jones, M. (1995). Kernel smoothing. London: Chapman and Hall.CrossRef Wand, M. P., & Jones, M. (1995). Kernel smoothing. London: Chapman and Hall.CrossRef
Zurück zum Zitat Wu, C. F. J. (1982). On the convergence properties of the eM algorithm. The Annals of Statistics, 11, 95103.MathSciNet Wu, C. F. J. (1982). On the convergence properties of the eM algorithm. The Annals of Statistics, 11, 95103.MathSciNet
Zurück zum Zitat Yuan, X. T., Hu, B. G., He, R. (2012). Agglomerative mean-shift clustering. IEEE Trans. on Knowledge and Data Engineering, 24, 209–219.CrossRef Yuan, X. T., Hu, B. G., He, R. (2012). Agglomerative mean-shift clustering. IEEE Trans. on Knowledge and Data Engineering, 24, 209–219.CrossRef
Metadata
Title
Modified Subspace Constrained Mean Shift Algorithm
Authors
Youness Aliyari Ghassabeh
Frank Rudzicz
Publication date
11.02.2020
Publisher
Springer US
Published in
Journal of Classification / Issue 1/2021
Print ISSN: 0176-4268
Electronic ISSN: 1432-1343
DOI
https://doi.org/10.1007/s00357-019-09353-1
