Published in: Journal of Classification 1/2024

25.01.2024

Soft Label Guided Unsupervised Discriminative Sparse Subspace Feature Selection

Authors: Keding Chen, Yong Peng, Feiping Nie, Wanzeng Kong


Abstract

Feature selection and subspace learning are two primary methods for reducing data dimensionality and enhancing discriminability. In unsupervised learning, however, no label information is available to guide the dimensionality reduction process. To this end, we propose a soft label guided unsupervised discriminative sparse subspace feature selection (UDS\(^2\)FS) model, which offers two advantages over existing studies. First, UDS\(^2\)FS seeks a discriminative subspace that simultaneously maximizes the between-class data scatter and minimizes the within-class scatter. Second, UDS\(^2\)FS estimates the label information in the learned subspace, and this estimate serves as soft labels to guide the discriminative subspace learning process. Moreover, the \(\ell _{2,0}\)-norm is imposed to achieve row sparsity of the subspace projection matrix; this constraint is parameter-free and more stable than the \(\ell _{2,1}\)-norm. The performance of UDS\(^2\)FS is evaluated from three aspects: a synthetic data set to check its iterative optimization process, several toy data sets to visualize the feature selection effect, and benchmark data sets to examine its clustering performance. The results show that UDS\(^2\)FS is competitive in joint subspace learning and feature selection compared with related models.
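The abstract names the two building blocks of UDS\(^2\)FS: soft-label-weighted scatter matrices and a row-sparse (\(\ell _{2,0}\)-constrained) projection. The snippet below is only a minimal NumPy sketch of those two ideas, not the authors' optimization algorithm; the helper names, the LDA-style eigen step used to obtain a projection, and the toy data are illustrative assumptions.

```python
import numpy as np

def soft_label_scatter(X, F):
    """Within- and between-class scatter matrices from soft labels.

    X : (n, d) data matrix, rows are samples.
    F : (n, c) soft label matrix, each row sums to 1.
    """
    d = X.shape[1]
    mean_total = X.mean(axis=0)
    sizes = F.sum(axis=0)                      # soft class sizes, shape (c,)
    means = (F.T @ X) / sizes[:, None]         # soft class means, shape (c, d)
    # Between-class scatter: size-weighted deviation of class means from the total mean
    diff_b = means - mean_total
    Sb = (diff_b * sizes[:, None]).T @ diff_b
    # Within-class scatter: soft-weighted deviation of samples from their class means
    Sw = np.zeros((d, d))
    for k in range(F.shape[1]):
        diff_w = X - means[k]
        Sw += (diff_w * F[:, k][:, None]).T @ diff_w
    return Sw, Sb

def top_k_rows(W, k):
    """Keep the k rows of W with the largest l2 norm (an l2,0-style row-sparsity step)."""
    norms = np.linalg.norm(W, axis=1)
    selected = np.argsort(norms)[::-1][:k]
    W_sparse = np.zeros_like(W)
    W_sparse[selected] = W[selected]
    return W_sparse, np.sort(selected)

# Toy usage: random data, random soft labels, and an LDA-style surrogate projection
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 20))
F = rng.dirichlet(np.ones(3), size=100)        # soft labels over 3 clusters
Sw, Sb = soft_label_scatter(X, F)
# Eigenvectors of pinv(Sw) @ Sb span a subspace that favors large between-class
# and small within-class scatter
evals, evecs = np.linalg.eig(np.linalg.pinv(Sw) @ Sb)
W = np.real(evecs[:, np.argsort(-np.real(evals))[:2]])   # (d, m) projection
W_sparse, selected_features = top_k_rows(W, k=5)
print("selected feature indices:", selected_features)
```

Keeping exactly k nonzero rows is what makes an \(\ell _{2,0}\) constraint parameter-free in the regularization sense: k is simply the number of selected features rather than a penalty weight that must be tuned, which matches the stability argument the abstract makes relative to the \(\ell _{2,1}\)-norm.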


Metadata
Title
Soft Label Guided Unsupervised Discriminative Sparse Subspace Feature Selection
Authors
Keding Chen
Yong Peng
Feiping Nie
Wanzeng Kong
Publication date
25.01.2024
Publisher
Springer US
Published in
Journal of Classification / Issue 1/2024
Print ISSN: 0176-4268
Electronic ISSN: 1432-1343
DOI
https://doi.org/10.1007/s00357-024-09462-6
