
24.08.2020

A comparative analysis of gradient boosting algorithms

Authors: Candice Bentéjac, Anna Csörgő, Gonzalo Martínez-Muñoz

Published in: Artificial Intelligence Review | Issue 3/2021


Abstract

The family of gradient boosting algorithms has recently been extended with several interesting proposals (i.e. XGBoost, LightGBM and CatBoost) that focus on both speed and accuracy. XGBoost is a scalable ensemble technique that has proven to be a reliable and efficient machine learning challenge solver. LightGBM is an accurate model focused on providing extremely fast training performance through selective sampling of high-gradient instances. CatBoost modifies the computation of gradients to avoid the prediction shift and thereby improve the accuracy of the model. This work proposes a practical analysis of how these novel variants of gradient boosting behave in terms of training speed, generalization performance and hyper-parameter setup. In addition, a comprehensive comparison between XGBoost, LightGBM, CatBoost, random forests and gradient boosting has been carried out using carefully tuned models as well as their default settings. The results of this comparison indicate that CatBoost obtains the best generalization accuracy and AUC on the studied datasets, although the differences are small. LightGBM is the fastest of all methods but not the most accurate. XGBoost places second both in accuracy and in training speed. Finally, an extensive analysis of the effect of hyper-parameter tuning in XGBoost, LightGBM and CatBoost is carried out using two novel proposed tools.
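Since the comparison described above hinges on running the five ensemble methods under both default and tuned settings while recording accuracy, AUC and training time, a minimal sketch of such a default-settings comparison is given below. It assumes the xgboost, lightgbm, catboost and scikit-learn Python packages and uses a built-in toy dataset purely for illustration; it shows the kind of protocol summarized in the abstract, not the authors' actual experimental setup.

```python
# Minimal sketch (not the paper's protocol): compare the five ensemble methods
# with their default settings on a toy binary classification dataset,
# reporting cross-validated accuracy, AUC and wall-clock time.
import time

from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier
from sklearn.model_selection import cross_validate
from xgboost import XGBClassifier
from lightgbm import LGBMClassifier
from catboost import CatBoostClassifier

X, y = load_breast_cancer(return_X_y=True)  # illustrative dataset only

models = {
    "Random forest": RandomForestClassifier(random_state=0),
    "Gradient boosting": GradientBoostingClassifier(random_state=0),
    "XGBoost": XGBClassifier(random_state=0),
    "LightGBM": LGBMClassifier(random_state=0),
    "CatBoost": CatBoostClassifier(random_state=0, verbose=0),
}

for name, model in models.items():
    start = time.perf_counter()
    # 10-fold cross-validation, scoring both accuracy and ROC AUC
    scores = cross_validate(model, X, y, cv=10, scoring=("accuracy", "roc_auc"))
    elapsed = time.perf_counter() - start
    print(f"{name:18s} acc={scores['test_accuracy'].mean():.4f} "
          f"auc={scores['test_roc_auc'].mean():.4f} time={elapsed:.1f}s")
```

A tuned comparison of the kind the abstract describes would additionally wrap each estimator in a hyper-parameter search (for example scikit-learn's GridSearchCV or RandomizedSearchCV) before scoring; the details of the grids and the two analysis tools are given in the full paper.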


Metadata
Title
A comparative analysis of gradient boosting algorithms
Authors
Candice Bentéjac
Anna Csörgő
Gonzalo Martínez-Muñoz
Publication date
24.08.2020
Publisher
Springer Netherlands
Published in
Artificial Intelligence Review / Issue 3/2021
Print ISSN: 0269-2821
Electronic ISSN: 1573-7462
DOI
https://doi.org/10.1007/s10462-020-09896-5
