Published in: Arabian Journal for Science and Engineering 9/2021

11-03-2021 | Research Article - Computer Engineering and Computer Science

Optimal Selection of Features Using Artificial Electric Field Algorithm for Classification

Authors: Himansu Das, Bighnaraj Naik, H. S. Behera

Abstract

High-dimensional data may degrade the performance of a classification model because not all features are useful. Selecting the relevant optimal features is a tedious task, especially when the number of features is large. This paper proposes a new feature selection (FS) approach based on the artificial electric field algorithm (AEFA), called FSAEFA, to select the most suitable optimal features for classification. AEFA is an efficient and effective population-based optimization technique inspired by Coulomb's law of electrostatic force (CLEF). The proposed FSAEFA approach searches for the best-suited subset of features by attracting the worst individuals towards the best individuals through the attractive force defined by CLEF. The approach has been evaluated and compared with several other FS approaches on ten publicly available benchmark datasets. The experimental results indicate that the proposed method outperforms the existing methods in most cases. The significance of the proposed FS approach and its counterparts has been statistically measured and compared using the Friedman test and the Holm procedure, and the proposed FSAEFA approach is found to be more efficient than the existing FS approaches.
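
To make the wrapper idea concrete, the following minimal sketch (not the authors' implementation) shows how an AEFA-style update can drive feature selection: each individual is a point in [0, 1]^d, a feature is selected when its coordinate exceeds 0.5, the fitness of a subset is taken as the cross-validated accuracy of a k-NN classifier on the selected features, and a Coulomb-law-like attraction pulls individuals toward better-performing ones. The population size, iteration budget, Coulomb-constant decay schedule, and the k-NN wrapper are illustrative assumptions, not values from the paper.

```python
# Minimal, illustrative sketch of wrapper feature selection driven by an
# AEFA-like update (NOT the authors' exact implementation).  Assumed encoding:
# positions in [0, 1]^d, a feature is selected when its coordinate > 0.5, and
# fitness is the cross-validated accuracy of a k-NN classifier.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

def fitness(position, X, y):
    """Cross-validated accuracy of k-NN on the features encoded by `position`."""
    mask = position > 0.5
    if not mask.any():                          # an empty subset is useless
        return 0.0
    clf = KNeighborsClassifier(n_neighbors=5)
    return cross_val_score(clf, X[:, mask], y, cv=5).mean()

def fs_aefa_sketch(X, y, n_pop=20, n_iter=30, K0=1.0, alpha=5.0, seed=0):
    """Illustrative AEFA-style search; n_pop, n_iter, K0, alpha are assumptions."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    pos = rng.random((n_pop, d))                # candidate subsets in [0, 1]^d
    vel = np.zeros((n_pop, d))
    best_pos, best_fit = pos[0].copy(), -np.inf
    for t in range(n_iter):
        fit = np.array([fitness(p, X, y) for p in pos])
        if fit.max() > best_fit:
            best_fit, best_pos = fit.max(), pos[fit.argmax()].copy()
        # Charges grow with fitness, so better individuals attract more strongly.
        q = np.exp((fit - fit.min()) / (fit.max() - fit.min() + 1e-12))
        q /= q.sum()
        K = K0 * np.exp(-alpha * t / n_iter)    # Coulomb "constant" decays over time
        acc = np.zeros_like(pos)
        for i in range(n_pop):
            for j in range(n_pop):
                if i == j:
                    continue
                r = np.linalg.norm(pos[j] - pos[i]) + 1e-12
                # Coulomb-law-style attraction pulls individual i toward j.
                acc[i] += rng.random() * K * q[j] * (pos[j] - pos[i]) / r
        vel = rng.random((n_pop, d)) * vel + acc
        pos = np.clip(pos + vel, 0.0, 1.0)
    return best_pos > 0.5, best_fit

if __name__ == "__main__":
    data = load_breast_cancer()
    mask, acc = fs_aefa_sketch(data.data, data.target, n_pop=10, n_iter=10)
    print(f"selected {mask.sum()}/{mask.size} features, CV accuracy {acc:.3f}")
```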

Metadata
Title
Optimal Selection of Features Using Artificial Electric Field Algorithm for Classification
Authors
Himansu Das
Bighnaraj Naik
H. S. Behera
Publication date
11-03-2021
Publisher
Springer Berlin Heidelberg
Published in
Arabian Journal for Science and Engineering / Issue 9/2021
Print ISSN: 2193-567X
Electronic ISSN: 2191-4281
DOI
https://doi.org/10.1007/s13369-021-05486-x
