
Author Archive | Jason Brownlee

Why Optimization Is Important in Machine Learning

Machine learning involves using an algorithm to learn and generalize from historical data in order to make predictions on new data. This problem can be described as approximating a function that maps examples of inputs to examples of outputs. The problem of approximating a function can be solved by framing it as function optimization. This is where […]
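
As a rough illustration of that framing (not code from the post), the sketch below fits a two-coefficient linear model to a handful of made-up points by minimizing squared error with scipy.optimize.minimize; the toy data, model form, and starting point are all assumptions for illustration.

```python
# A minimal sketch of framing function approximation as function optimization:
# find model coefficients that minimize the squared error between predicted
# and observed outputs. The data and model form are made up for illustration.
from numpy import array
from scipy.optimize import minimize

# toy historical data drawn from the (unknown) target mapping y = 2x + 1
X = array([0.0, 1.0, 2.0, 3.0, 4.0])
y = array([1.0, 3.0, 5.0, 7.0, 9.0])

# objective: squared error of a candidate linear model [slope, intercept]
def objective(coeffs):
    slope, intercept = coeffs
    predictions = slope * X + intercept
    return ((predictions - y) ** 2).sum()

# approximate the mapping by optimizing the objective from an arbitrary starting point
result = minimize(objective, x0=[0.0, 0.0])
print(result.x)  # approximately [2.0, 1.0]
```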

Continue Reading
A Gentle Introduction to Function Optimization

Function optimization is a foundational area of study, and its techniques are used in almost every quantitative field. Importantly, function optimization is central to almost all machine learning algorithms and predictive modeling projects. As such, it is critical to understand what function optimization is, the terminology used in the field, and the elements that constitute […]
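
For a concrete feel for the elements involved, here is a minimal sketch, assuming a toy one-variable objective and SciPy's minimize_scalar (neither of which comes from the post), showing an objective function, a search, and the resulting optimum.

```python
# A minimal sketch of the elements of a function optimization problem:
# an objective function, candidate solutions (inputs), and a search that
# evaluates candidates to find the input with the lowest objective value.
from scipy.optimize import minimize_scalar

# objective function: maps a candidate solution to a score to be minimized
def objective(x):
    return x ** 2.0

# perform the search for the optimum
result = minimize_scalar(objective)
print(result.x, result.fun)  # optimum near x=0.0 with objective value near 0.0
```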

Continue Reading
[Featured image: Line Plot of Objective Function With Search Starting Point and Optima]

Line Search Optimization With Python

The line search is an optimization algorithm that can be used for objective functions with one or more variables. It provides a way to use a univariate optimization algorithm, like a bisection search, on a multivariate objective function by using the search to locate the optimal step size in each dimension from a known point […]
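
Here is a minimal sketch of the idea using scipy.optimize.line_search on an assumed sum-of-squares objective, with the negative gradient as the search direction; the objective, starting point, and direction are illustrative, not taken from the post.

```python
# A minimal sketch of a line search on a multivariate objective:
# locate the step size (alpha) along a chosen direction from a known point.
from numpy import array
from scipy.optimize import line_search

# objective function and its gradient
def objective(x):
    return (x ** 2.0).sum()

def gradient(x):
    return 2.0 * x

# known starting point and a descent direction (here, the negative gradient)
point = array([3.0, 2.0])
direction = -gradient(point)

# search along the direction for a step size that sufficiently reduces the objective
result = line_search(objective, gradient, point, direction)
alpha = result[0]
print(alpha, point + alpha * direction)  # new point near the minimum at (0, 0)
```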

Continue Reading
[Featured image: Contour Plot of the Test Objective Function With RMSProp Search Results Shown]

Gradient Descent With RMSProp from Scratch

Gradient descent is an optimization algorithm that follows the negative gradient of an objective function in order to locate the minimum of the function. A limitation of gradient descent is that it uses the same step size (learning rate) for each input variable. Adaptive Gradients, or AdaGrad for short, is an extension of the gradient descent optimization algorithm […]
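
The following from-scratch sketch shows the general shape of gradient descent with RMSProp on an assumed two-variable bowl-shaped objective; the hyperparameters, starting point, and iteration count are illustrative guesses rather than the tutorial's values.

```python
# A minimal from-scratch sketch of gradient descent with RMSProp.
# RMSProp adapts a per-variable step size using a decaying average of squared gradients.
from numpy import array, sqrt

# objective function and its gradient
def objective(x):
    return (x ** 2.0).sum()

def gradient(x):
    return 2.0 * x

solution = array([3.0, -4.0])       # arbitrary starting point
step_size, rho, eps = 0.01, 0.9, 1e-8
avg_sq_grad = array([0.0, 0.0])     # running average of squared gradients
for _ in range(1000):
    grad = gradient(solution)
    # decaying average of the squared gradient for each variable
    avg_sq_grad = rho * avg_sq_grad + (1.0 - rho) * grad ** 2.0
    # per-variable step: larger where gradients have been small, smaller where large
    solution = solution - step_size * grad / (sqrt(avg_sq_grad) + eps)

print(solution, objective(solution))  # close to the minimum at (0, 0)
```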

Continue Reading
[Featured image: 3D Surface Plot of the Ackley Multimodal Function]

Dual Annealing Optimization With Python

Dual Annealing is a stochastic global optimization algorithm. It is an implementation of the generalized simulated annealing algorithm, an extension of simulated annealing. In addition, it is paired with a local search algorithm that is automatically performed at the end of the simulated annealing procedure. This combination of effective global and local search procedures provides […]
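
A minimal sketch using SciPy's dual_annealing on the Ackley multimodal function shown in the featured image above; the search bounds are an illustrative assumption.

```python
# A minimal sketch of dual annealing via scipy.optimize.dual_annealing on the
# Ackley function, which has many local optima and a global optimum at (0, 0).
from numpy import exp, sqrt, cos, pi, e
from scipy.optimize import dual_annealing

# Ackley objective function
def objective(v):
    x, y = v
    return (-20.0 * exp(-0.2 * sqrt(0.5 * (x ** 2 + y ** 2)))
            - exp(0.5 * (cos(2 * pi * x) + cos(2 * pi * y))) + e + 20)

# global annealing search paired with an automatic local search, within bounds
bounds = [(-5.0, 5.0), (-5.0, 5.0)]
result = dual_annealing(objective, bounds)
print(result.x, result.fun)  # near (0, 0) with an objective value near 0
```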

Continue Reading
Essence of Bootstrap Aggregation Ensembles

Bootstrap aggregation, or bagging, is a popular ensemble method that fits a decision tree on different bootstrap samples of the training dataset. It is simple to implement and effective on a wide range of problems, and importantly, modest extensions to the technique result in ensemble methods that are among the most powerful techniques, […]
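
A minimal sketch of bagging with scikit-learn's BaggingClassifier, whose default base model is a decision tree; the synthetic dataset and number of estimators are illustrative assumptions, not values from the post.

```python
# A minimal sketch of bootstrap aggregation: each member tree is fit on a
# different bootstrap sample of the training data and predictions are combined.
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import cross_val_score

# synthetic classification dataset
X, y = make_classification(n_samples=1000, n_features=20, random_state=1)

# bagging ensemble; the default base estimator is a decision tree
model = BaggingClassifier(n_estimators=50, random_state=1)

# evaluate the ensemble with cross-validation
scores = cross_val_score(model, X, y, cv=5)
print(scores.mean())
```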

Continue Reading
A Gentle Introduction to Ensemble Diversity for Machine Learning

Ensemble learning combines the predictions from machine learning models for classification and regression. We use ensemble methods to achieve improved predictive performance, and it is this improvement over any of the contributing models that defines whether an ensemble is good or not. A property that is present in a good ensemble is the diversity […]
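
One simple way to get a feel for diversity, as an illustrative proxy rather than necessarily the measure used in the post, is the fraction of hold-out examples on which pairs of ensemble members disagree; the models and dataset below are assumptions for illustration.

```python
# A minimal sketch of quantifying ensemble diversity as pairwise disagreement
# between members' predictions on a hold-out set.
from itertools import combinations
from numpy import mean
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.neighbors import KNeighborsClassifier

X, y = make_classification(n_samples=1000, n_features=20, random_state=1)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=1)

# fit a few different model types as ensemble members
members = [LogisticRegression(max_iter=1000), DecisionTreeClassifier(), KNeighborsClassifier()]
predictions = [m.fit(X_train, y_train).predict(X_test) for m in members]

# diversity proxy: fraction of test examples where each pair of members disagrees
for (i, a), (j, b) in combinations(enumerate(predictions), 2):
    print(i, j, mean(a != b))
```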

Continue Reading
A Gentle Introduction to Multiple-Model Machine Learning

An ensemble learning method involves combining the predictions from multiple contributing models. Nevertheless, not all techniques that make use of multiple machine learning models are ensemble learning algorithms. It is common to divide a prediction problem into subproblems. For example, some problems naturally subdivide into independent but related subproblems and a machine learning model can […]
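
As a sketch of a multiple-model technique that is not an ensemble (the dataset and the choice of MultiOutputRegressor are illustrative assumptions), a multi-output regression problem can be divided into one subproblem per output variable, with a separate model fit for each:

```python
# A minimal sketch of dividing a prediction problem into subproblems:
# one regression model is fit per output variable, and their predictions are
# reported side by side rather than combined, so this is not ensemble learning.
from sklearn.datasets import make_regression
from sklearn.linear_model import LinearRegression
from sklearn.multioutput import MultiOutputRegressor

# synthetic problem with three related output variables
X, y = make_regression(n_samples=1000, n_features=10, n_targets=3, random_state=1)

# fit one linear regression model per output (subproblem)
model = MultiOutputRegressor(LinearRegression())
model.fit(X, y)
print(model.predict(X[:1]))  # one prediction per output variable
```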

Continue Reading