
Archive | Ensemble Learning

Why Use Ensemble Learning?

What are the Benefits of Ensemble Methods for Machine Learning? Ensembles are predictive models that combine the predictions from two or more other models. Ensemble learning methods are popular and are the go-to technique when the best possible performance on a predictive modeling project is the most important outcome. Nevertheless, they are not always the most appropriate technique […]


6 Books on Ensemble Learning

Ensemble learning involves combining the predictions from multiple machine learning models. The effect can be both improved predictive performance and lower variance of the predictions made by the model. Ensemble methods are covered in most textbooks on machine learning; nevertheless, there are books dedicated to the topic. In this post, you will discover the top […]


How to Develop a Gradient Boosting Machine Ensemble in Python

The Gradient Boosting Machine is a powerful ensemble machine learning algorithm that uses decision trees. Boosting is a general ensemble technique that involves sequentially adding models to the ensemble, where each subsequent model corrects the errors of the prior models. AdaBoost was the first algorithm to deliver on the promise of boosting. Gradient boosting is a generalization […]
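As a minimal sketch of the idea, the boosted trees can be evaluated with scikit-learn's GradientBoostingClassifier; the synthetic dataset and hyperparameter values below are illustrative assumptions, not settings from the post itself.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import cross_val_score

# Illustrative synthetic binary classification dataset
X, y = make_classification(n_samples=1000, n_features=20, random_state=1)

# Trees are added sequentially; each new tree is fit to the errors
# (gradients of the loss) of the ensemble built so far
model = GradientBoostingClassifier(n_estimators=100, max_depth=3, random_state=1)
scores = cross_val_score(model, X, y, cv=5)
print("Mean accuracy: %.3f" % scores.mean())
```

Tree depth is one of the hyperparameters explored in the post (see the box plot of tree depth vs. accuracy above).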


How to Develop an AdaBoost Ensemble in Python

Boosting is a class of ensemble machine learning algorithms that involve combining the predictions from many weak learners. A weak learner is a model that is very simple, although it has some skill on the dataset. Boosting was a theoretical concept long before a practical algorithm could be developed, and the AdaBoost (adaptive boosting) algorithm was […]
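A minimal sketch of an AdaBoost ensemble with scikit-learn follows; the dataset and the number of estimators are illustrative assumptions. By default, the weak learner is a decision stump (a decision tree of depth one).

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import cross_val_score

# Illustrative synthetic binary classification dataset
X, y = make_classification(n_samples=1000, n_features=20, random_state=1)

# Default weak learner is a depth-1 decision tree (a "stump");
# each round re-weights the training examples the ensemble got wrong
model = AdaBoostClassifier(n_estimators=50, random_state=1)
scores = cross_val_score(model, X, y, cv=5)
print("Mean accuracy: %.3f" % scores.mean())
```

The depth of the weak learner is one of the hyperparameters explored in the post (see the box plot above).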


How to Develop a Bagging Ensemble with Python

Bagging is an ensemble machine learning algorithm that combines the predictions from many decision trees. It is also easy to implement, given that it has few key hyperparameters and sensible heuristics for configuring them. Bagging performs well in general and provides the basis for a whole field of decision tree ensemble algorithms such […]
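A minimal sketch of a bagging ensemble with scikit-learn follows; the dataset and hyperparameter values are illustrative assumptions. By default, BaggingClassifier uses decision trees as the base model.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import cross_val_score

# Illustrative synthetic binary classification dataset
X, y = make_classification(n_samples=1000, n_features=20, random_state=2)

# Each base tree is fit on a bootstrap sample of the training data;
# predictions are combined by majority vote
model = BaggingClassifier(n_estimators=50, random_state=2)
scores = cross_val_score(model, X, y, cv=5)
print("Mean accuracy: %.3f" % scores.mean())
```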


How to Develop Voting Ensembles With Python

Voting is an ensemble machine learning algorithm. For regression, a voting ensemble involves making a prediction that is the average of the predictions from multiple other regression models. In classification, a hard voting ensemble involves summing the votes for crisp class labels from the other models and predicting the class with the most votes. A soft voting ensemble involves […]
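A minimal sketch of hard and soft voting ensembles with scikit-learn follows; the member models and dataset are illustrative assumptions, not the exact configuration from the post.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

# Illustrative synthetic binary classification dataset
X, y = make_classification(n_samples=1000, n_features=20, random_state=3)

members = [
    ('lr', LogisticRegression(max_iter=1000)),
    ('knn', KNeighborsClassifier()),
    ('tree', DecisionTreeClassifier(random_state=3)),
]

# Hard voting: majority vote over predicted class labels
hard = VotingClassifier(estimators=members, voting='hard').fit(X, y)
# Soft voting: argmax over the summed predicted class probabilities
soft = VotingClassifier(estimators=members, voting='soft').fit(X, y)
print(hard.score(X, y), soft.score(X, y))
```

Soft voting requires each member model to support predicted probabilities, which all three members above do.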


One-vs-Rest and One-vs-One for Multi-Class Classification

Not all classification predictive models support multi-class classification. Algorithms such as the Perceptron, Logistic Regression, and Support Vector Machines were designed for binary classification and do not natively support classification tasks with more than two classes. One approach for using binary classification algorithms for multi-class classification problems is to split the multi-class classification dataset into multiple […]
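As a minimal sketch of the two strategies, scikit-learn's OneVsRestClassifier and OneVsOneClassifier can wrap a binary classifier such as an SVM; the three-class synthetic dataset below is an illustrative assumption.

```python
from sklearn.datasets import make_classification
from sklearn.multiclass import OneVsOneClassifier, OneVsRestClassifier
from sklearn.svm import SVC

# Illustrative synthetic dataset with three classes
X, y = make_classification(n_samples=1000, n_features=20,
                           n_informative=5, n_classes=3, random_state=4)

# One-vs-Rest: one binary model per class (3 models here)
ovr = OneVsRestClassifier(SVC()).fit(X, y)
# One-vs-One: one binary model per pair of classes (3 choose 2 = 3 models here)
ovo = OneVsOneClassifier(SVC()).fit(X, y)
print(len(ovr.estimators_), len(ovo.estimators_))
```

Note that for three classes the two strategies happen to fit the same number of models; for k classes, One-vs-Rest fits k models while One-vs-One fits k*(k-1)/2.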
