Published in: International Journal of Machine Learning and Cybernetics, Issue 11/2019

24.08.2019 | Original Article

Classifying imbalanced data using ensemble of reduced kernelized weighted extreme learning machine

Authors: Bhagat Singh Raghuwanshi, Sanyam Shukla


Abstract

Many real-world applications present imbalanced classification problems, in which the number of samples in one class is significantly smaller than the number of samples in another. The classes with the larger and smaller proportions of samples are called the majority and minority classes, respectively. Weighted extreme learning machine (WELM) was designed to handle the class imbalance problem. Several works, such as boosting WELM (BWELM) and ensemble WELM, extended WELM using ensemble methods. All these variants use WELM with sigmoid nodes to handle class imbalance effectively. WELM with sigmoid nodes suffers from performance fluctuation caused by the random initialization of the weights between the input and the hidden layer. Hybrid artificial bee colony optimization-based WELM extends WELM by finding optimal input-to-hidden-layer weights with the artificial bee colony optimization algorithm. The computational cost of kernelized ELM is directly proportional to the number of kernel functions. This work therefore proposes a novel ensemble that uses reduced kernelized WELM as the base classifier to solve the class imbalance problem more effectively. The proposed method uses random undersampling to design balanced training subsets, which act as the centroids of the reduced kernelized WELM classifiers. The ensemble generates its base classifiers sequentially: the majority-class samples misclassified by the first base classifier, together with all of the minority-class samples, act as the centroids of the second base classifier, i.e., the base classifiers differ in their choice of centroids for the kernel functions. The proposed method also has a lower computational cost than BWELM. It is evaluated on benchmark real-world imbalanced datasets from the KEEL dataset repository, and it was also tested on binary synthetic datasets to analyze its robustness. The experimental results, supported by statistical tests, demonstrate the superiority of the proposed algorithm over other state-of-the-art methods for class imbalance learning.
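To make the training pipeline described above concrete, the following is a minimal Python/NumPy sketch of the core idea, not the authors' code. It assumes binary labels with the minority class coded as 1, an RBF kernel, the common inverse-class-frequency (W1) WELM weighting, and a two-classifier ensemble; all function names and parameter choices are illustrative.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Pairwise RBF kernel between the rows of A and the rows of B.
    sq = np.sum(A**2, axis=1)[:, None] + np.sum(B**2, axis=1)[None, :] - 2.0 * A @ B.T
    return np.exp(-gamma * np.maximum(sq, 0.0))

def train_rkwelm(X, y, centroids, C=100.0, gamma=1.0):
    # Reduced kernelized WELM: the kernel matrix is N x Ntilde, so the
    # cost is governed by the number of centroids, not the training size.
    K = rbf_kernel(X, centroids, gamma)
    w = 1.0 / np.bincount(y)[y]          # inverse-class-frequency weights (W1 scheme)
    T = np.where(y == 1, 1.0, -1.0)      # bipolar targets, minority coded as +1
    KW = K * w[:, None]                  # rows of K scaled by the sample weights
    # beta = (I/C + K^T W K)^{-1} K^T W T, the weighted regularized solution
    return np.linalg.solve(np.eye(K.shape[1]) / C + KW.T @ K, KW.T @ T)

def predict_rkwelm(X, centroids, beta, gamma=1.0):
    return np.sign(rbf_kernel(X, centroids, gamma) @ beta)

def build_sequential_ensemble(X, y, C=100.0, gamma=1.0, seed=0):
    rng = np.random.default_rng(seed)
    min_idx = np.flatnonzero(y == 1)
    maj_idx = np.flatnonzero(y == 0)
    # Base classifier 1: random undersampling of the majority class yields
    # a balanced subset that serves as the centroid set.
    sampled = rng.choice(maj_idx, size=min_idx.size, replace=False)
    cent1 = X[np.concatenate([min_idx, sampled])]
    beta1 = train_rkwelm(X, y, cent1, C, gamma)
    # Base classifier 2: the majority samples misclassified by the first
    # classifier, plus all minority samples, become the next centroids.
    wrong_maj = maj_idx[predict_rkwelm(X[maj_idx], cent1, beta1, gamma) == 1]
    cent2 = X[np.concatenate([min_idx, wrong_maj])]
    beta2 = train_rkwelm(X, y, cent2, C, gamma)
    return [(cent1, beta1), (cent2, beta2)]

def predict_ensemble(models, X, gamma=1.0):
    # Combine base classifiers by summing their real-valued outputs.
    scores = sum(rbf_kernel(X, c, gamma) @ b for c, b in models)
    return (scores > 0).astype(int)
```

Note how the two base classifiers share the full weighted training set and differ only in their centroid sets, which is what keeps the kernel matrices small relative to a full kernelized WELM.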

References
1. Huang GB, Zhu QY, Siew CK (2006) Extreme learning machine: theory and applications. Neurocomputing 70(1–3):489–501
2. Rumelhart DE, Hinton GE, Williams RJ (1986) Learning representations by back-propagating errors. Nature 323:533–536
3. Janakiraman VM, Nguyen X, Sterniak J, Assanis D (2015) Identification of the dynamic operating envelope of HCCI engines using class imbalance learning. IEEE Trans Neural Netw Learn Syst 26(1):98–112
4. Janakiraman VM, Nguyen X, Assanis D (2016) Stochastic gradient based extreme learning machines for stable online learning of advanced combustion engines. Neurocomputing 177:304–316
5. Zong W, Huang GB, Chen Y (2013) Weighted extreme learning machine for imbalance learning. Neurocomputing 101:229–242
6. Li K, Kong X, Lu Z, Wenyin L, Yin J (2014) Boosting weighted ELM for imbalanced learning. Neurocomputing 128:15–21
7. Zhang Y, Liu B, Cai J, Zhang S (2017) Ensemble weighted extreme learning machine for imbalanced data classification based on differential evolution. Neural Comput Appl 28(1):259–267
8. Raghuwanshi BS, Shukla S (2018) Class-specific cost-sensitive boosting weighted ELM for class imbalance learning. Memet Comput 2018:1–21
9. Raghuwanshi BS, Shukla S (2018) UnderBagging based reduced kernelized weighted extreme learning machine for class imbalance learning. Eng Appl Artif Intell 74:252–270
10. Shukla S, Yadav RN (2015) Regularized weighted circular complex-valued extreme learning machine for imbalanced learning. IEEE Access 3:3048–3057
11. Raghuwanshi BS, Shukla S (2018) Class-specific extreme learning machine for handling binary class imbalance problem. Neural Netw 105:206–217
12. Raghuwanshi BS, Shukla S (2018) Class-specific kernelized extreme learning machine for binary class imbalance learning. Appl Soft Comput 73:1026–1038
13. Raghuwanshi BS, Shukla S (2019) Generalized class-specific kernelized extreme learning machine for multiclass imbalanced learning. Expert Syst Appl 121:244–255
14. Xiao W, Zhang J, Li Y, Zhang S, Yang W (2017) Class-specific cost regulation extreme learning machine for imbalanced classification. Neurocomputing 261:70–82
15. Tang X, Chen L (2018) Artificial bee colony optimization-based weighted extreme learning machine for imbalanced data learning. Clust Comput 2018:1–6
16. Raghuwanshi BS, Shukla S (2019) Class imbalance learning using UnderBagging based kernelized extreme learning machine. Neurocomputing 329:172–187
17. Iosifidis A, Gabbouj M (2015) On the kernel extreme learning machine speedup. Pattern Recognit Lett 68:205–210
18. Iosifidis A, Tefas A, Pitas I (2015) On the kernel extreme learning machine classifier. Pattern Recognit Lett 54:11–17
19. Iosifidis A, Tefas A, Pitas I (2017) Approximate kernel extreme learning machine for large scale data classification. Neurocomputing 219:210–220
20. Schapire RE (1999) A brief introduction to boosting. In: Proceedings of the 16th international joint conference on artificial intelligence, vol 2, IJCAI'99, pp 1401–1406
21. Lee YJ, Huang SY (2007) Reduced support vector machines: a statistical theory. IEEE Trans Neural Netw 18(1):1–13
22. Williams CKI, Seeger M (2001) Using the Nyström method to speed up kernel machines. In: Leen TK, Dietterich TG, Tresp V (eds) Advances in neural information processing systems, vol 13. MIT Press, pp 682–688
23. Deng W, Zheng Q, Zhang K (2013) Reduced kernel extreme learning machine. Springer, Heidelberg, pp 63–69
24. Deng WY, Ong YS, Zheng QH (2016) A fast reduced kernel extreme learning machine. Neural Netw 76:29–38
25. Polikar R (2006) Ensemble based systems in decision making. IEEE Circuits Syst Mag 6(3):21–45
27. Breiman L (1996) Bagging predictors. Mach Learn 24(2):123–140
29. Freund Y, Schapire RE (1996) Experiments with a new boosting algorithm. In: Proceedings of the thirteenth international conference on machine learning, Morgan Kaufmann, pp 148–156
30. Freund Y, Schapire RE (1997) A decision-theoretic generalization of on-line learning and an application to boosting. J Comput Syst Sci 55(1):119–139
31. Seiffert C, Khoshgoftaar TM, Hulse JV, Napolitano A (2010) RUSBoost: a hybrid approach to alleviating class imbalance. IEEE Trans Syst Man Cybern Part A Syst Hum 40(1):185–197
32. Chawla NV, Lazarevic A, Hall LO, Bowyer KW (2003) SMOTEBoost: improving prediction of the minority class in boosting. In: Lavrač N, Gamberger D, Todorovski L, Blockeel H (eds) Knowledge discovery in databases: PKDD 2003. Springer, Berlin, pp 107–119
33. Džeroski S, Ženko B (2004) Is combining classifiers with stacking better than selecting the best one? Mach Learn 54(3):255–273
34. Zhou Z (2012) Ensemble methods: foundations and algorithms. Chapman & Hall/CRC Data Mining and Knowledge Discovery Series, Taylor & Francis, Abingdon
35. Haixiang G, Yijing L, Shang J, Mingyun G, Yuanyue H, Bing G (2017) Learning from class-imbalanced data: review of methods and applications. Expert Syst Appl 73:220–239
36. He H, Garcia EA (2009) Learning from imbalanced data. IEEE Trans Knowl Data Eng 21(9):1263–1284
37. López V, Fernández A, García S, Palade V, Herrera F (2013) An insight into classification with imbalanced data: empirical results and current trends on using data intrinsic characteristics. Inf Sci 250:113–141
38. Brown I, Mues C (2012) An experimental comparison of classification algorithms for imbalanced credit scoring data sets. Expert Syst Appl 39(3):3446–3453
39. Xiao J, Xie L, He C, Jiang X (2012) Dynamic classifier ensemble model for customer classification with imbalanced class distribution. Expert Syst Appl 39(3):3668–3675
40. Krawczyk B, Galar M, Jele L, Herrera F (2016) Evolutionary undersampling boosting for imbalanced classification of breast cancer malignancy. Appl Soft Comput 38(C):714–726
41. Galar M, Fernandez A, Barrenechea E, Bustince H, Herrera F (2012) A review on ensembles for the class imbalance problem: bagging-, boosting-, and hybrid-based approaches. IEEE Trans Syst Man Cybern Part C (Appl Rev) 42(4):463–484
42. Liu XY, Wu J, Zhou ZH (2009) Exploratory undersampling for class-imbalance learning. IEEE Trans Syst Man Cybern Part B (Cybern) 39(2):539–550
43. Chawla NV, Bowyer KW, Hall LO, Kegelmeyer WP (2002) SMOTE: synthetic minority over-sampling technique. J Artif Intell Res 16(1):321–357
44. Mathew J, Pang CK, Luo M, Leong WH (2018) Classification of imbalanced data by oversampling in kernel space of support vector machines. IEEE Trans Neural Netw Learn Syst 29:1–12
45. Yang X, Song Q, Wang Y (2007) A weighted support vector machine for data classification. Int J Pattern Recognit Artif Intell 21(05):961–976
46. Dietterich TG (2000) Ensemble methods in machine learning. In: Multiple classifier systems, Springer, pp 1–15
47. Huang GB, Zhou H, Ding X, Zhang R (2012) Extreme learning machine for regression and multiclass classification. IEEE Trans Syst Man Cybern Part B (Cybern) 42(2):513–529
48. Deng W, Zheng Q, Chen L (2009) Regularized extreme learning machine. In: IEEE symposium on computational intelligence and data mining, pp 389–395
49. Zhu QY, Qin A, Suganthan P, Huang GB (2005) Evolutionary extreme learning machine. Pattern Recognit 38(10):1759–1763
50. Zhao YP (2016) Parsimonious kernel extreme learning machine in primal via Cholesky factorization. Neural Netw 80:95–109
51. Zeng Y, Xu X, Shen D, Fang Y, Xiao Z (2017) Traffic sign recognition using kernel extreme learning machines with deep perceptual features. IEEE Trans Intell Transp Syst 18(6):1647–1653
52. Courrieu P (2005) Fast computation of Moore–Penrose inverse matrices. CoRR abs/0804.4809
53. Hoerl AE, Kennard RW (2000) Ridge regression: biased estimation for nonorthogonal problems. Technometrics 42(1):80–86
54. Alcalá J, Fernández A, Luengo J, Derrac J, García S, Sánchez L, Herrera F (2011) KEEL data-mining software tool: data set repository, integration of algorithms and experimental analysis framework. J Multiple Valued Logic Soft Comput 17(2–3):255–287
56. Yu H, Yang X, Zheng S, Sun C (2018) Active learning from imbalanced data: a solution of online weighted extreme learning machine. IEEE Trans Neural Netw Learn Syst 30:1–16
57. Bradley AP (1997) The use of the area under the ROC curve in the evaluation of machine learning algorithms. Pattern Recognit 30(7):1145–1159
58. Fawcett T (2003) ROC graphs: notes and practical considerations for researchers. Tech. Rep. HPL-2003-4, HP Labs
59. Huang J, Ling CX (2005) Using AUC and accuracy in evaluating learning algorithms. IEEE Trans Knowl Data Eng 17(3):299–310
60. Buda M, Maki A, Mazurowski MA (2018) A systematic study of the class imbalance problem in convolutional neural networks. Neural Netw 106:249–259
61. Chen Z, Lin T, Xia X, Xu H, Ding S (2018) A synthetic neighborhood generation based ensemble learning for the imbalanced data classification. Appl Intell 48(8):2441–2457
62. Nanni L, Fantozzi C, Lazzarini N (2015) Coupling different methods for overcoming the class imbalance problem. Neurocomputing 158(C):48–61
63. Hu S, Liang Y, Ma L, He Y (2009) MSMOTE: improving classification performance when training data is imbalanced. In: 2009 Second international workshop on computer science and engineering, vol 2, pp 13–17
64. Demšar J (2006) Statistical comparisons of classifiers over multiple data sets. J Mach Learn Res 7:1–30
Metadata
Title
Classifying imbalanced data using ensemble of reduced kernelized weighted extreme learning machine
Authors
Bhagat Singh Raghuwanshi
Sanyam Shukla
Publication date
24.08.2019
Publisher
Springer Berlin Heidelberg
Published in
International Journal of Machine Learning and Cybernetics / Issue 11/2019
Print ISSN: 1868-8071
Electronic ISSN: 1868-808X
DOI
https://doi.org/10.1007/s13042-019-01001-9
