Published: 25-01-2024
Soft Label Guided Unsupervised Discriminative Sparse Subspace Feature Selection

Authors: Keding Chen, Yong Peng, Feiping Nie, Wanzeng Kong

Published in: Journal of Classification | Issue 1/2024

Abstract

Feature selection and subspace learning are two primary methods for reducing data dimensionality and enhancing discriminability. In unsupervised learning, however, no label information is available to guide the dimensionality reduction process. To this end, we propose a soft label guided unsupervised discriminative sparse subspace feature selection (UDS\(^2\)FS) model, which offers two advantages over existing studies. On the one hand, UDS\(^2\)FS seeks a discriminative subspace that simultaneously maximizes the between-class data scatter and minimizes the within-class scatter. On the other hand, UDS\(^2\)FS estimates the data label information in the learned subspace, which then serves as soft labels to guide the discriminative subspace learning process. Moreover, the \(\ell _{2,0}\)-norm is imposed to achieve row sparsity of the subspace projection matrix, which is parameter-free and more stable than the \(\ell _{2,1}\)-norm. The performance of UDS\(^2\)FS is evaluated from three aspects: a synthetic data set to check its iterative optimization process, several toy data sets to visualize the feature selection effect, and benchmark data sets to examine its clustering performance. The results show that UDS\(^2\)FS is competitive with related models in joint subspace learning and feature selection.
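The row-sparsity idea behind the \(\ell _{2,0}\)-norm constraint can be illustrated with a minimal sketch (this is an illustration of the general principle, not the authors' optimization algorithm): an \(\ell _{2,0}\) constraint allows at most \(k\) nonzero rows in the projection matrix, so the surviving rows directly index the selected features, with no sparsity-regularization parameter to tune.

```python
import numpy as np

def project_l20(W, k):
    """Project W (d features x m subspace dims) onto the l2,0 constraint
    'at most k nonzero rows': keep the k rows with the largest l2 norms
    and zero out the rest. The kept row indices are the selected features."""
    row_norms = np.linalg.norm(W, axis=1)   # l2 norm of each row of W
    keep = np.argsort(row_norms)[-k:]       # indices of the top-k rows
    W_sparse = np.zeros_like(W)
    W_sparse[keep] = W[keep]                # only k rows remain nonzero
    return W_sparse, np.sort(keep)

# Toy example: 5 candidate features, 2-dimensional subspace, select 2 features.
rng = np.random.default_rng(0)
W = rng.normal(size=(5, 2))
W_sparse, selected = project_l20(W, k=2)
```

In contrast, an \(\ell _{2,1}\)-norm penalty only encourages rows to shrink toward zero and requires a regularization weight to control how many effectively vanish; the \(\ell _{2,0}\) constraint fixes the number of selected features exactly.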


Metadata
Title
Soft Label Guided Unsupervised Discriminative Sparse Subspace Feature Selection
Authors
Keding Chen
Yong Peng
Feiping Nie
Wanzeng Kong
Publication date
25-01-2024
Publisher
Springer US
Published in
Journal of Classification / Issue 1/2024
Print ISSN: 0176-4268
Electronic ISSN: 1432-1343
DOI
https://doi.org/10.1007/s00357-024-09462-6
