Published in: The Review of Socionetwork Strategies 2/2023

15-10-2023 | Article

Application of a Stochastic Schemata Exploiter for Multi-Objective Hyper-parameter Optimization of Machine Learning

Authors: Hiroya Makino, Eisuke Kita

Abstract

The Stochastic Schemata Exploiter (SSE), an evolutionary algorithm, is designed to find the optimal solution of a function. SSE extracts common schemata from sets of individuals with high fitness and generates new individuals from those schemata. For hyper-parameter optimization, the initialization method, the schema-extraction method, and the new-individual generation method, which are SSE's characteristic processes, are extended. In this paper, an SSE-based multi-objective optimization method for AutoML is proposed. AutoML gives good results in terms of model accuracy; however, if only accuracy is considered, the resulting model may become too complex, and such complex models are not always acceptable because of their long computation times. The proposed method simultaneously maximizes the accuracy of the stacking model and minimizes the model complexity. Compared with existing methods, SSE has attractive features such as fewer control parameters and faster convergence. A visualization method makes the optimization process transparent and helps users understand it.
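The core SSE mechanism described above can be sketched minimally as follows. This is a toy illustration only, not the authors' implementation: the bit-string encoding, the `extract_schema` helper, and the ranked-subset loop are assumptions made for clarity.

```python
import random

def extract_schema(individuals):
    """Extract the common schema from a set of bit-string individuals:
    positions where all individuals agree keep that bit; the rest become '*'."""
    schema = []
    for bits in zip(*individuals):
        schema.append(bits[0] if len(set(bits)) == 1 else "*")
    return "".join(schema)

def generate_from_schema(schema, rng=random):
    """Create a new individual by filling each wildcard '*' with a random bit."""
    return "".join(b if b != "*" else rng.choice("01") for b in schema)

# Toy run: schemata extracted from progressively larger top-ranked subsets
# of a fitness-sorted population, then used to generate new individuals.
ranked = ["1101", "1100", "1001"]  # assume already sorted by fitness
for k in range(2, len(ranked) + 1):
    schema = extract_schema(ranked[:k])
    print(schema, "->", generate_from_schema(schema))
```

Larger subsets agree on fewer positions, so their schemata contain more wildcards; SSE exploits this to balance exploitation (tight schemata from the best few individuals) against exploration (loose schemata from larger subsets).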


Metadata
Title
Application of a Stochastic Schemata Exploiter for Multi-Objective Hyper-parameter Optimization of Machine Learning
Authors
Hiroya Makino
Eisuke Kita
Publication date
15-10-2023
Publisher
Springer Nature Singapore
Published in
The Review of Socionetwork Strategies / Issue 2/2023
Print ISSN: 2523-3173
Electronic ISSN: 1867-3236
DOI
https://doi.org/10.1007/s12626-023-00151-1
