Published: 27 April 2024

HyperTuner: a cross-layer multi-objective hyperparameter auto-tuning framework for data analytic services

Authors: Hui Dou, Shanshan Zhu, Yiwen Zhang, Pengfei Chen, Zibin Zheng

Published in: The Journal of Supercomputing

Abstract

Hyperparameter optimization (HPO) is vital for machine learning models. Beyond model accuracy, other tuning objectives such as model training time and energy consumption also deserve attention from data analytic service providers. It is therefore essential to consider both model hyperparameters and system parameters together, i.e., to perform cross-layer multi-objective hyperparameter auto-tuning. Toward this challenging goal, this paper proposes HyperTuner, which leverages a well-designed ADUMBO algorithm to find the Pareto-optimal configuration set. Compared with vanilla Bayesian-optimization-based methods, ADUMBO selects the most promising configuration from the generated Pareto candidate set in each iteration by maximizing a novel adaptive uncertainty metric. We evaluate HyperTuner on our local distributed TensorFlow cluster, and experimental results show that it consistently finds a Pareto configuration front superior in both convergence and diversity to those of four baseline algorithms. Moreover, experiments with different training datasets, different optimization objectives, and different machine learning platforms verify that HyperTuner adapts well to various data analytic service scenarios.
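The abstract describes the core loop only at a high level: in each iteration, ADUMBO restricts attention to the non-dominated (Pareto) candidates under the surrogate model and picks the one maximizing an adaptive uncertainty metric. The exact metric is not given here, so the following is only a minimal sketch of that selection step; `pareto_front`, `select_candidate`, and the `beta`-scaled additive uncertainty score are illustrative stand-ins, not the paper's actual definitions.

```python
def pareto_front(costs):
    """Boolean non-domination mask for a list of objective vectors.

    costs: list of equal-length lists; every objective is minimized.
    A point is dominated if another point is <= in all objectives
    and strictly < in at least one of them.
    """
    n = len(costs)
    mask = [True] * n
    for i in range(n):
        for j in range(n):
            if j == i:
                continue
            if all(cj <= ci for cj, ci in zip(costs[j], costs[i])) and \
               any(cj < ci for cj, ci in zip(costs[j], costs[i])):
                mask[i] = False  # candidate i is dominated by j
                break
    return mask


def select_candidate(mu, sigma, beta=1.0):
    """Pick the index of the most promising Pareto candidate.

    mu[i] / sigma[i]: surrogate predictive mean / stddev vectors for
    candidate i, one entry per objective.  Candidates whose predicted
    means are non-dominated form the Pareto candidate set; among them,
    return the one with the largest beta-scaled total uncertainty
    (a simple placeholder for the adaptive uncertainty metric).
    """
    front = [i for i, nd in enumerate(pareto_front(mu)) if nd]
    return max(front, key=lambda i: beta * sum(sigma[i]))
```

In a full multi-objective Bayesian optimization loop, `mu` and `sigma` would come from one Gaussian-process surrogate per objective (e.g., training time and energy), and the selected configuration would be evaluated on the cluster before refitting the surrogates.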


Metadata
Publisher: Springer US
Print ISSN: 0920-8542
Electronic ISSN: 1573-0484
DOI: https://doi.org/10.1007/s11227-024-06123-8
