Published in: Neural Processing Letters 4/2022

05.03.2021

A Robust Cost-Sensitive Feature Selection Via Self-Paced Learning Regularization

Authors: Yangding Li, Chaoqun Ma, Yiling Tao, Zehui Hu, Zidong Su, Meiling Liu



Abstract

Feature selection is a useful and important process that is widely applied in high-dimensional data processing and artificial intelligence. Its goal is to select a relatively small and representative subset of features from the original data space so that better learning performance can be obtained. Many existing feature selection algorithms simply pursue high accuracy and ignore the costs of feature acquisition and misclassification. In this paper, a novel cost-sensitive feature selection method via self-paced learning is proposed. The \(\sigma \)-norm is introduced to constrain the model and enhance its robustness. The method is then combined with the self-paced learning framework, which controls the number of samples involved in the training process and thereby reduces the impact of noise points. The proposed method obtains better classification accuracy while maintaining the lowest total cost. Because it accounts for the acquisition and misclassification costs of the various features, it also offers better interpretability and practicability than traditional feature selection algorithms. Extensive experiments were conducted on eight data sets against six comparison algorithms, and the proposed method achieves good performance.
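The self-paced learning idea referred to in the abstract, i.e. letting the model train on "easy" (low-loss) samples first and gradually admitting harder ones so that noise points are down-weighted, can be sketched as follows. This is an illustrative toy with a hard binary sample-selection rule and a plain least-squares learner, not the authors' actual cost-sensitive model; the function names, the age parameter `lam`, and the growth factor `mu` are assumptions for the sketch.

```python
import numpy as np

def spl_sample_weights(losses, lam):
    """Hard self-paced weights: a sample participates in training
    only if its current loss is below the age parameter lam."""
    return (losses < lam).astype(float)

def self_paced_regression(X, y, lam=1.0, mu=1.3, n_rounds=5):
    """Alternate between (a) least-squares fitting on the currently
    selected samples and (b) re-selecting samples, growing lam each
    round so progressively harder samples enter the training set."""
    n, d = X.shape
    v = np.ones(n)  # start with all samples selected
    w = np.zeros(d)
    for _ in range(n_rounds):
        # weighted least squares on the selected subset (v is 0/1,
        # so multiplying rows by v simply masks excluded samples)
        Xv = X * v[:, None]
        w = np.linalg.lstsq(Xv, v * y, rcond=None)[0]
        losses = (X @ w - y) ** 2
        v = spl_sample_weights(losses, lam)
        lam *= mu  # "age" the learner: admit harder samples next round
    return w, v
```

In a full cost-sensitive formulation, the least-squares step would be replaced by the regularized, cost-weighted objective of the paper; the alternating select-then-fit structure is the part the self-paced framework contributes.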


Metadata
Title
A Robust Cost-Sensitive Feature Selection Via Self-Paced Learning Regularization
Authors
Yangding Li
Chaoqun Ma
Yiling Tao
Zehui Hu
Zidong Su
Meiling Liu
Publication date
05.03.2021
Publisher
Springer US
Published in
Neural Processing Letters / Issue 4/2022
Print ISSN: 1370-4621
Electronic ISSN: 1573-773X
DOI
https://doi.org/10.1007/s11063-021-10479-w
