
11-12-2017 | Original Article

Neighborhood attribute reduction: a multi-criterion approach

Authors: Jingzheng Li, Xibei Yang, Xiaoning Song, Jinhai Li, Pingxin Wang, Dong-Jun Yu

Published in: International Journal of Machine Learning and Cybernetics | Issue 4/2019

Abstract

Although attribute reduction defined by the neighborhood decision error rate can improve the classification performance of the neighborhood classifier by deleting redundant attributes, it does not take the variation of classification results into account. To fill this gap, a multi-criterion attribute reduction is proposed which considers both the neighborhood decision error rate and the neighborhood decision consistency. The neighborhood decision consistency measures how much the classification results vary when the attributes change. Following this novel attribute reduction, a heuristic algorithm is designed to derive a reduct that aims at a lower error rate and a higher consistency simultaneously. Experimental results on 10 UCI data sets show that the multi-criterion reduction not only improves decision consistency without significantly decreasing classification accuracy, but also yields more stable reducts. This study suggests new trends concerning criteria and constraints in attribute reduction.
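
The greedy heuristic described in the abstract can be illustrated with a short sketch. The Python code below is a minimal, hypothetical implementation and not the authors' algorithm: the delta-neighborhood majority-vote classifier, the consistency proxy (agreement with the predictions obtained from the full attribute set), the lexicographic ranking of candidate attributes, and the stopping threshold `eps` are all illustrative assumptions, as are the names `neighborhood_predict` and `greedy_multi_criterion_reduct`.

```python
import numpy as np

def neighborhood_predict(X, y, attrs, delta=0.2):
    """Leave-one-out majority-vote prediction within the delta-neighborhood
    (Euclidean distance) induced by the attribute subset `attrs`."""
    Xa = X[:, attrs]
    n = len(y)
    preds = np.empty(n, dtype=y.dtype)
    for i in range(n):
        d = np.linalg.norm(Xa - Xa[i], axis=1)
        d[i] = np.inf                                  # leave-one-out: ignore sample i itself
        nbrs = y[d <= delta]
        if nbrs.size == 0:                             # empty neighborhood: fall back to nearest sample
            nbrs = y[[np.argmin(d)]]
        vals, counts = np.unique(nbrs, return_counts=True)
        preds[i] = vals[np.argmax(counts)]
    return preds

def greedy_multi_criterion_reduct(X, y, delta=0.2, eps=1e-3):
    """Greedy forward search: repeatedly add the attribute that gives the lowest
    neighborhood decision error rate, using consistency with the full-attribute
    predictions as a tie-breaker; stop when neither criterion improves."""
    all_attrs = list(range(X.shape[1]))
    full_pred = neighborhood_predict(X, y, all_attrs, delta)
    red, best_err, best_cons = [], 1.0, 0.0
    while len(red) < len(all_attrs):
        scores = {}
        for a in set(all_attrs) - set(red):
            pred = neighborhood_predict(X, y, red + [a], delta)
            scores[a] = (float(np.mean(pred != y)),           # neighborhood decision error rate
                         float(np.mean(pred == full_pred)))   # decision-consistency proxy
        a_best = min(scores, key=lambda a: (scores[a][0], -scores[a][1]))
        err, cons = scores[a_best]
        if err < best_err - eps or (abs(err - best_err) <= eps and cons > best_cons + eps):
            red.append(a_best)
            best_err, best_cons = err, cons
        else:
            break
    return red, best_err, best_cons

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.random((100, 8))                           # 8 candidate attributes, values in [0, 1]
    y = (X[:, 0] + X[:, 3] > 1.0).astype(int)          # labels driven by attributes 0 and 3 only
    red, err, cons = greedy_multi_criterion_reduct(X, y, delta=0.2)
    print("reduct:", red, "error rate:", err, "consistency:", cons)
```

In this sketch the error rate is treated as the primary criterion and consistency as a tie-breaker, which is one simple way to combine two criteria in a greedy loop; other trade-offs, such as a weighted sum of the two scores, fit the same search structure.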

Metadata
Title
Neighborhood attribute reduction: a multi-criterion approach
Authors
Jingzheng Li
Xibei Yang
Xiaoning Song
Jinhai Li
Pingxin Wang
Dong-Jun Yu
Publication date
11-12-2017
Publisher
Springer Berlin Heidelberg
Published in
International Journal of Machine Learning and Cybernetics / Issue 4/2019
Print ISSN: 1868-8071
Electronic ISSN: 1868-808X
DOI
https://doi.org/10.1007/s13042-017-0758-5
