Abstract
Ensemble models in machine learning involve combining predictions from multiple diverse models to achieve improved accuracy and stability. This chapter explores various ensemble techniques and their benefits.
The search for the best machine learning algorithm for a particular problem is an ongoing challenge. Studies have shown that no single algorithm performs best across all datasets. This has led to the concept of ensemble learning, where the predictions of multiple models are aggregated to produce a final estimate.
The effectiveness of combining diverse, independent estimates was popularized in James Surowiecki's "The Wisdom of Crowds." A classic example is Sir Francis Galton's observation that the aggregated guesses of fair-goers estimating the weight of an ox came remarkably close to the true weight, more accurate than most individual guesses.
Ensemble models are created using different approaches, such as employing multiple algorithms, varying model parameters, sampling different subsets of predictor variables, or sampling observations. The benefits of ensemble models lie in reduced variation and improved accuracy.
Reduced variation ensures reliability in predictions with different data samples, allowing for a better understanding of the model’s performance with unseen data. Improved accuracy is achieved by combining independent predictions, which helps cancel out errors, resulting in better overall predictions.
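The error-cancellation effect can be illustrated with a minimal NumPy simulation (not from the chapter): each "model" is an unbiased but noisy estimator of the same quantity, and averaging many independent estimates shrinks the error.

```python
import numpy as np

rng = np.random.default_rng(42)
true_value = 10.0

# Simulate 100 independent, unbiased but noisy "models": each one
# estimates the true value with standard deviation 2.
n_models, n_trials = 100, 1000
estimates = true_value + rng.normal(0.0, 2.0, size=(n_trials, n_models))

# Average absolute error of a single model vs. the ensemble mean.
single_model_error = np.abs(estimates[:, 0] - true_value).mean()
ensemble_error = np.abs(estimates.mean(axis=1) - true_value).mean()

print(f"avg error of one model:      {single_model_error:.3f}")
print(f"avg error of 100-model mean: {ensemble_error:.3f}")
```

Because the simulated errors are independent, the standard deviation of the 100-model average is one tenth that of a single model, which is exactly the variance-reduction argument made above.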
Bagging, Random Forests, AdaBoost, Gradient Tree Boosting, and XGBoost are discussed. These models are popular for their ability to handle different types of data and achieve state-of-the-art performance in various contexts.
The chapter includes practical examples of ensemble modeling with continuous and binary targets. One example uses a KNIME workflow to predict used car prices with ordinary least squares (OLS) regression and Gradient Boosted Trees. Another example involves predicting credit status using XGBoost.
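The chapter's price-prediction example is built as a KNIME workflow; a rough Python analogue of the same comparison, on synthetic regression data rather than the actual used-car dataset, might look like this:

```python
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the used-car data (the chapter's example
# uses real listings inside a KNIME workflow).
X, y = make_regression(
    n_samples=1000, n_features=8, n_informative=8, noise=25.0, random_state=1
)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=1)

# OLS baseline vs. a gradient-boosted tree ensemble.
ols = LinearRegression().fit(X_tr, y_tr)
gbt = GradientBoostingRegressor(random_state=1).fit(X_tr, y_tr)

ols_r2 = r2_score(y_te, ols.predict(X_te))
gbt_r2 = r2_score(y_te, gbt.predict(X_te))
print(f"OLS R^2: {ols_r2:.3f}")
print(f"GBT R^2: {gbt_r2:.3f}")
```

On this purely linear synthetic data OLS is hard to beat; on real tabular data with interactions and nonlinearities, boosted trees typically pull ahead, which is the motivation for the chapter's comparison.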