2014 | Original Paper | Book Chapter
Unsupervised Dimensionality Reduction for Gaussian Mixture Model
Authors: Xi Yang, Kaizhu Huang, Rui Zhang
Published in: Neural Information Processing
Publisher: Springer International Publishing
Dimensionality reduction is a fundamental yet active research topic in pattern recognition and machine learning. Meanwhile, the Gaussian Mixture Model (GMM), a well-known model, has been widely used in various applications, e.g., clustering and classification. For high-dimensional data, previous research usually performs dimensionality reduction first and then feeds the reduced features into other available models, e.g., GMM. In particular, there have been very few investigations of how dimensionality reduction could be conducted interactively and systematically together with the important GMM. In this paper, we study how unsupervised dimensionality reduction can be performed jointly with GMM, and whether such joint learning leads to improvement over the traditional unsupervised two-stage method. Specifically, we employ the Mixture of Factor Analyzers under the assumption that a common factor loading exists across all the components. Such a setting optimizes the dimensionality reduction exactly together with the parameters of the GMM. We compare the joint learning approach against separate dimensionality reduction followed by GMM on both synthetic and real data sets. Experimental results show that joint learning significantly outperforms the comparison method in terms of three supervised evaluation criteria.
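The "separate" baseline the abstract refers to can be sketched as a two-stage pipeline: an unsupervised reducer fitted first, then a GMM fitted on the reduced features. A minimal illustration, assuming PCA as the reducer and synthetic two-cluster data (the paper's joint approach instead learns a shared factor loading together with the mixture parameters, which this sketch does not implement):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Hypothetical high-dimensional data: two Gaussian clusters in 50-D.
X = np.vstack([
    rng.normal(loc=0.0, scale=1.0, size=(200, 50)),
    rng.normal(loc=3.0, scale=1.0, size=(200, 50)),
])

# Stage 1: unsupervised dimensionality reduction to q dimensions.
q = 5
Z = PCA(n_components=q).fit_transform(X)

# Stage 2: fit a GMM on the reduced representation, independently
# of how the reduction was chosen -- the decoupling the paper's
# joint learning is designed to avoid.
gmm = GaussianMixture(n_components=2, random_state=0).fit(Z)
labels = gmm.predict(Z)
```

Because the two stages are optimized independently, the reduction can discard directions that matter for the mixture fit; the joint MCFA-style formulation described above couples both objectives in a single EM procedure.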