Published in: The Journal of Supercomputing 16/2023

19.05.2023

A lightweight knowledge-based PSO for SVM hyper-parameters tuning in a dynamic environment

Authors: Dhruba Jyoti Kalita, Vibhav Prakash Singh, Vinay Kumar


Abstract

Hyper-parameter optimization is a crucial task in designing kernel-based machine learning models, and hyper-parameter values can be set using various optimization algorithms. Because the objective function depends on the training data, however, the optimal hyper-parameter configuration changes over time in a dynamic environment, i.e., an environment where training data keep being added continuously. To find optimal hyper-parameter values in such an environment, the optimization algorithm must be rerun repeatedly over time, and the data dependency of the objective function drives up the average time complexity of the overall optimization process. This paper proposes a novel knowledge-based approach that uses particle swarm optimization (PSO) as the base optimization algorithm to tune the hyper-parameters of a support vector machine (SVM). The framework comprises two major modules: a knowledge transfer module and a drift detection module. The knowledge transfer module generates knowledge by running PSO and transfers it to subsequent time instances, while the drift detection module detects changes in the objective function when new data are added to the existing data. Detected drift determines how the transferred knowledge is exploited at a particular time instance, reducing the execution time of the overall optimization process. The proposed framework was evaluated on several standard datasets (Adult, DNA, Nist-Digits, Segment, Splice, Mushroom and Usps) over five consecutive time instances and achieved an average execution time of 30.72 s, compared with 36.89 s for general PSO. It also optimizes the hyper-parameters considerably faster than other existing approaches such as grid search, chained-PSO, dynamic model selection and quantized dynamic multi-PSO.
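The transfer-then-detect loop described in the abstract can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the minimal PSO implementation, the `val_error` stand-in objective (a quadratic bowl replacing the real cross-validation error of an SVM at hyper-parameters `(log C, log gamma)`), and the drift threshold are all assumptions made for the sake of a runnable example.

```python
import random

def pso(objective, bounds, seed_positions=None, n_particles=10, n_iters=30, seed=0):
    """Minimal PSO minimizer. seed_positions warm-starts part of the swarm
    with knowledge transferred from an earlier time instance."""
    rng = random.Random(seed)
    dim = len(bounds)
    pos = []
    for i in range(n_particles):
        if seed_positions is not None and i < len(seed_positions):
            pos.append(list(seed_positions[i]))   # transferred knowledge
        else:
            pos.append([rng.uniform(lo, hi) for lo, hi in bounds])
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    w, c1, c2 = 0.7, 1.5, 1.5   # inertia and acceleration coefficients
    for _ in range(n_iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                lo, hi = bounds[d]
                pos[i][d] = min(max(pos[i][d] + vel[i][d], lo), hi)
            v = objective(pos[i])
            if v < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], v
                if v < gbest_val:
                    gbest, gbest_val = pos[i][:], v
    return gbest, gbest_val

def val_error(hp, shift=0.0):
    """Hypothetical stand-in for k-fold CV error at (log C, log gamma);
    in the paper this would train an SVM on the data available so far."""
    log_c, log_gamma = hp
    return (log_c - 1.0 - shift) ** 2 + (log_gamma + 2.0) ** 2

bounds = [(-5.0, 5.0), (-5.0, 5.0)]

# Time instance t: full PSO run; its global best is the "knowledge".
gbest_t, err_t = pso(lambda hp: val_error(hp), bounds)

# Time instance t+1: new data arrive, so the objective drifts.
new_obj = lambda hp: val_error(hp, shift=0.5)
drift = abs(new_obj(gbest_t) - err_t)   # drift detection: re-score old optimum
if drift < 1.0:   # assumed threshold: small drift -> cheap warm-started run
    gbest_t1, err_t1 = pso(new_obj, bounds, seed_positions=[gbest_t], n_iters=10)
else:             # large drift -> full re-optimization from scratch
    gbest_t1, err_t1 = pso(new_obj, bounds)
```

Under this sketch, the warm-started run at t+1 uses fewer iterations than a cold start, which mirrors how reusing transferred knowledge reduces the average execution time across consecutive time instances.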


References
1. Yang L, Shami A (2020) On hyperparameter optimization of machine learning algorithms: theory and practice. Neurocomputing 415:295–316
2. Kalita DJ, Singh VP, Kumar V (2020) A survey on SVM hyper-parameters optimization techniques. In: Social networking and computational intelligence: proceedings of SCI-2018. Springer Singapore, pp 243–256
3. Eberhart R, Kennedy J (1995) Particle swarm optimization. In: Proceedings of the IEEE International Conference on Neural Networks, vol 4, pp 1942–1948
4. Vapnik V (1998) Statistical learning theory, pp 156–160
5. Cervantes J, Garcia-Lamont F, Rodríguez-Mazahua L, Lopez A (2020) A comprehensive survey on support vector machine classification: applications, challenges and trends. Neurocomputing 408:189–215
6. Feurer M, Hutter F (2019) Hyperparameter optimization. In: Automated machine learning. Springer, Cham, pp 3–33
7. Kalita DJ, Singh S (2020) SVM hyper-parameters optimization using quantized multi-PSO in dynamic environment. Soft Comput 24(2):1225–1241
8. Kalita DJ, Singh VP, Kumar V (2021) A dynamic framework for tuning SVM hyper parameters based on moth-flame optimization and knowledge-based-search. Expert Syst Appl 168:114139
9. Refaeilzadeh P, Tang L, Liu H (2009) Cross-validation. Encyclopedia of Database Systems 5:532–538
10. Chapelle O, Vapnik V, Bousquet O, Mukherjee S (2002) Choosing multiple parameters for support vector machines. Mach Learn 46(1–3):131–159
11. Ayat N, Cheriet M, Suen C (2005) Automatic model selection for the optimization of SVM kernels. Pattern Recogn 38(10):1733–1745
12. Alibrahim H, Ludwig SA (2021) Hyperparameter optimization: comparing genetic algorithm against grid search and Bayesian optimization. In: 2021 IEEE Congress on Evolutionary Computation (CEC), IEEE, pp 1551–1559
13. Huang CM, Lee YJ, Lin DK, Huang SY (2007) Model selection for support vector machines via uniform design. Comput Stat Data Anal 52(1):335–346
14. Huang Q, Mao J, Liu Y (2012) An improved grid search algorithm of SVR parameters optimization. In: 2012 IEEE 14th International Conference on Communication Technology, IEEE, pp 1022–1026
15. Chunhong Z, Licheng J (2004) Automatic parameters selection for SVM based on GA. In: Proceedings of the 5th World Congress on Intelligent Control and Automation, pp 1869–1872
16. Cohen G, Hilario M, Geissbuhler A (2004) Model selection for support vector classifiers via genetic algorithms: an application to medical decision support. In: Proceedings of the 5th International Symposium on Biological and Medical Data Analysis, pp 200–211
18. Friedrichs F, Igel C (2004) Evolutionary tuning of multiple SVM parameters. In: Proceedings of the 12th European Symposium on Artificial Neural Networks, pp 519–524
19. Lin SW, Lee ZJ, Chen SC, Tseng TY (2008) Parameter determination of support vector machine and feature selection using simulated annealing approach. Appl Soft Comput 8(4):1505–1512
20. Rojas-Domínguez A, Padierna LC, Valadez JMC, Puga-Soberanes HJ, Fraire HJ (2017) Optimal hyper-parameter tuning of SVM classifiers with application to medical diagnosis. IEEE Access 6:7164–7176
22. Zhang X, Chen X, He Z (2010) An ACO-based algorithm for parameter optimization of support vector machines. Expert Syst Appl 37(9):6618–6628
23. Dioşan L, Rogozan A, Pecuchet JP (2012) Improving classification performance of support vector machine by genetically optimising kernel shape and hyper-parameters. Appl Intell 36(2):280–294
24. Candelieri A, Giordani I, Archetti F, Barkalov K, Meyerov I, Polovinkin A, Zolotykh N (2019) Tuning hyperparameters of a SVM-based water demand forecasting system through parallel global optimization. Comput Oper Res 106:202–209
25. Wu CH, Tzeng GH, Goo YJ, Fang WC (2007) A real-valued genetic algorithm to optimize the parameters of support vector machine for predicting bankruptcy. Expert Syst Appl 32(2):397–408
26. Phan AV, Le Nguyen M, Bui LT (2017) Feature weighting and SVM parameters optimization based on genetic algorithms for classification problems. Appl Intell 46(2):455–469
27. Tharwat A, Hassanien AE (2018) Chaotic antlion algorithm for parameter optimization of support vector machine. Appl Intell 48(3):670–686
28. Rai P, Daumé H, Venkatasubramanian S (2009) Streamed learning: one-pass SVMs. In: Twenty-First International Joint Conference on Artificial Intelligence
29. Kapp MN, Sabourin R, Maupin P (2012) A dynamic model selection strategy for support vector machine classifiers. Appl Soft Comput 12(8):2550–2565
30. Li J, Chen X (2012) Online learning algorithm of direct support vector machine for regression based on matrix operation. In: Zhang T (ed) Instrumentation, measurement, circuits and systems. Springer, Berlin
31. Hitam NA, Ismail AR, Saeed F (2019) An optimized support vector machine (SVM) based on particle swarm optimization (PSO) for cryptocurrency forecasting. Procedia Comput Sci 163:427–433
32. Kalita DJ, Singh VP, Kumar V (2020) SVM hyper-parameters optimization using multi-PSO for intrusion detection. In: Social networking and computational intelligence: proceedings of SCI-2018. Springer Singapore, pp 227–241
33. Sudheer C, Maheswaran R, Panigrahi BK, Mathur S (2014) A hybrid SVM-PSO model for forecasting monthly streamflow. Neural Comput Appl 24:1381–1389
34. Black M, Hickey RJ (1999) Maintaining the performance of a learned classifier under concept drift. Intell Data Anal 3(6):453–474
35. Last M (2002) Online classification of nonstationary data streams. Intell Data Anal 6(2):129–147
36. Cohen L, Avrahami-Bakish G, Last M, Kandel A, Kipersztok O (2008) Real-time data mining of non-stationary data streams from sensor networks. Inform Fusion 9(3):344–353
Metadata
Title
A lightweight knowledge-based PSO for SVM hyper-parameters tuning in a dynamic environment
Authors
Dhruba Jyoti Kalita
Vibhav Prakash Singh
Vinay Kumar
Publication date
19.05.2023
Publisher
Springer US
Published in
The Journal of Supercomputing / Issue 16/2023
Print ISSN: 0920-8542
Electronic ISSN: 1573-0484
DOI
https://doi.org/10.1007/s11227-023-05385-y
