
2017 | OriginalPaper | Chapter

An Optimal Multi-view Ensemble Learning for High Dimensional Data Classification Using Constrained Particle Swarm Optimization

Authors: Vipin Kumar, Sonajharia Minz

Published in: Information, Communication and Computing Technology

Publisher: Springer Singapore


Abstract

Multiple views of a dataset are constructed using Feature Set Partitioning (FSP) methods for Multi-view Ensemble Learning (MEL). The way the features are partitioned influences the classification performance of MEL. The number of possible feature set partitions of a dataset equals the Bell number, which grows rapidly with the number of features, and finding an optimal partition is an NP-hard problem (shown in Fig. 1). In a high-dimensional scenario it is therefore essential to identify, among all possible feature set partitions, the partition that yields optimal MEL classification performance. To this end, an optimal multi-view ensemble learning approach using a constrained particle swarm optimization method (OMEL-C-PSO) is proposed for high-dimensional data classification. Experiments have been performed on ten high-dimensional datasets using four existing feature set partitioning methods; the results show that the OMEL-C-PSO approach is feasible and effective for high-dimensional data classification.
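
To make the two ideas in the abstract concrete, the Python sketch below (an illustrative example under assumptions, not the authors' OMEL-C-PSO implementation) computes the Bell number that counts all possible feature set partitions and evaluates MEL for one arbitrary candidate partition: the features are split into disjoint views, one base classifier is trained per view, and the per-view predictions are combined by majority vote. The synthetic dataset, decision-tree base learner, and random partition are assumptions made only for this example.

```python
# Minimal sketch of feature-set partitioning for multi-view ensemble learning.
# NOT the paper's OMEL-C-PSO method: the partition here is random, whereas the
# paper searches partitions with constrained particle swarm optimization.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier


def bell_number(n: int) -> int:
    """Bell number B(n): how many ways n features can be partitioned into views."""
    row = [1]                       # Bell triangle, row 0
    for _ in range(n - 1):
        new_row = [row[-1]]         # each row starts with the last entry of the previous row
        for value in row:
            new_row.append(new_row[-1] + value)
        row = new_row
    return row[-1]                  # last entry of row n-1 equals B(n)


def random_partition(n_features: int, n_views: int, rng: np.random.Generator):
    """Assign every feature index to exactly one of n_views disjoint views."""
    labels = rng.integers(0, n_views, size=n_features)
    return [np.flatnonzero(labels == v) for v in range(n_views)]


def mel_accuracy(views, X_tr, y_tr, X_te, y_te):
    """Train one base classifier per view and combine predictions by majority vote."""
    votes = []
    for idx in views:
        if idx.size == 0:           # guard against an empty view
            continue
        clf = DecisionTreeClassifier(random_state=0).fit(X_tr[:, idx], y_tr)
        votes.append(clf.predict(X_te[:, idx]))
    majority = (np.mean(votes, axis=0) >= 0.5).astype(int)   # binary majority vote
    return accuracy_score(y_te, majority)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X, y = make_classification(n_samples=300, n_features=40, n_informative=10,
                               random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

    # Size of the partition search space that has to be explored.
    print("Bell number for 40 features:", bell_number(40))

    # MEL accuracy for one arbitrary candidate partition into 4 views.
    views = random_partition(n_features=40, n_views=4, rng=rng)
    print("MEL accuracy for a random 4-view partition:",
          mel_accuracy(views, X_tr, y_tr, X_te, y_te))
```

In the approach the abstract describes, such candidate partitions would instead be generated and refined by constrained particle swarm optimization, with the ensemble's classification performance serving as the fitness to be optimized.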

Metadata

Title: An Optimal Multi-view Ensemble Learning for High Dimensional Data Classification Using Constrained Particle Swarm Optimization
Authors: Vipin Kumar, Sonajharia Minz
Copyright Year: 2017
Publisher: Springer Singapore
DOI: https://doi.org/10.1007/978-981-10-6544-6_33
