Published in: Artificial Intelligence Review 3/2019

25.09.2017

A survey of feature selection methods for Gaussian mixture models and hidden Markov models

Authors: Stephen Adams, Peter A. Beling


Abstract

Feature selection is the process of reducing the number of collected features to a relevant subset of features and is often used to combat the curse of dimensionality. This paper provides a review of the literature on feature selection techniques specifically designed for Gaussian mixture models (GMMs) and hidden Markov models (HMMs), two common parametric latent variable models. The primary contribution of this work is the collection and grouping of feature selection methods specifically designed for GMMs and for HMMs. An additional contribution lies in outlining the connections between these two groups of feature selection methods. Often, feature selection methods for GMMs and HMMs are treated as separate topics. In this survey, we propose that methods developed for one model can be adapted to the other. Further, we find that the number of feature selection methods for GMMs exceeds the number for HMMs, and that the proportion of HMM methods that require supervised data is larger than the proportion of GMM methods that do. We conclude that further research into unsupervised feature selection methods for HMMs is needed and that established methods for GMMs could be adapted to HMMs. It should be noted that feature selection is also referred to as dimensionality reduction, variable selection, attribute selection, and variable subset reduction. In this paper, we make a distinction between dimensionality reduction and feature selection. Dimensionality reduction, which we do not consider, is any process that reduces the number of features used in a model and can include methods that transform features in order to reduce the dimensionality. Feature selection, by contrast, is a specific form of dimensionality reduction that eliminates features as inputs to the model. The key difference is that dimensionality reduction may still require collecting all of the data sources in order to transform and reduce the feature set, whereas feature selection eliminates the need to collect the irrelevant data sources.
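To make the distinction above concrete, the following Python sketch contrasts a simple wrapper-style feature selection criterion for a GMM with PCA-based dimensionality reduction. It is an illustration only, not a method from the survey: it assumes scikit-learn and NumPy, uses synthetic data, fixes the subset size at two, and the helper name best_pair_by_bic is ours.

```python
# Minimal sketch (not from the survey): wrapper-style feature selection for a
# GMM versus PCA-based dimensionality reduction. Assumes scikit-learn and
# NumPy are available; the data and best_pair_by_bic are illustrative only.
from itertools import combinations

import numpy as np
from sklearn.decomposition import PCA
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Synthetic data: features 0-1 carry two well-separated clusters,
# features 2-3 are irrelevant noise.
n = 300
informative = np.vstack([
    rng.normal(-2.0, 0.5, size=(n // 2, 2)),
    rng.normal(+2.0, 0.5, size=(n // 2, 2)),
])
noise = rng.normal(0.0, 1.0, size=(n, 2))
X = np.hstack([informative, noise])


def best_pair_by_bic(X, n_components=2):
    """Score every two-feature subset by the BIC of a GMM fit on that subset
    and return the subset with the lowest BIC (a simple wrapper criterion)."""
    best_bic, best_subset = np.inf, None
    for subset in combinations(range(X.shape[1]), 2):
        cols = list(subset)
        gmm = GaussianMixture(n_components=n_components, random_state=0)
        gmm.fit(X[:, cols])
        bic = gmm.bic(X[:, cols])
        if bic < best_bic:
            best_bic, best_subset = bic, subset
    return best_subset


# Feature selection: only the selected columns need to be collected in future.
selected = best_pair_by_bic(X)
print("Selected feature subset:", selected)  # typically (0, 1) on this data

# Dimensionality reduction: PCA mixes all four features, so every original
# data source must still be collected before the transform can be applied.
X_reduced = PCA(n_components=2).fit_transform(X)
GaussianMixture(n_components=2, random_state=0).fit(X_reduced)
```

The contrast is in what must be measured at deployment time: the wrapper keeps a subset of the original columns, while the PCA transform still consumes every column even though the model downstream sees only two dimensions.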

Metadata
Title
A survey of feature selection methods for Gaussian mixture models and hidden Markov models
Authors
Stephen Adams
Peter A. Beling
Publication date
25.09.2017
Publisher
Springer Netherlands
Published in
Artificial Intelligence Review / Issue 3/2019
Print ISSN: 0269-2821
Electronic ISSN: 1573-7462
DOI
https://doi.org/10.1007/s10462-017-9581-3
