Archive | Optimization

Simple Genetic Algorithm From Scratch in Python

The genetic algorithm is a stochastic global optimization algorithm. It may be one of the most popular and widely known biologically inspired algorithms, along with artificial neural networks. The algorithm is a type of evolutionary algorithm and performs an optimization procedure inspired by the biological theory of evolution by means of natural selection with a […]
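To make the idea concrete before the full tutorial, here is a minimal sketch of a genetic algorithm on a toy bitstring task; the onemax objective, tournament selection, and the population size and rate settings are illustrative assumptions, not the article's exact code:

```python
import random

def onemax(bits):
    # toy objective: number of ones in the bitstring (maximize)
    return sum(bits)

def genetic_algorithm(n_bits=20, n_pop=50, n_iter=100, r_cross=0.9, r_mut=0.05):
    pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(n_pop)]
    best = max(pop, key=onemax)
    for _ in range(n_iter):
        # tournament selection: keep the fitter of two random candidates
        parents = [max(random.sample(pop, 2), key=onemax) for _ in range(n_pop)]
        children = []
        for p1, p2 in zip(parents[::2], parents[1::2]):
            # one-point crossover with probability r_cross
            if random.random() < r_cross:
                pt = random.randint(1, n_bits - 1)
                c1, c2 = p1[:pt] + p2[pt:], p2[:pt] + p1[pt:]
            else:
                c1, c2 = p1[:], p2[:]
            # independent bit-flip mutation on each child
            for c in (c1, c2):
                for j in range(n_bits):
                    if random.random() < r_mut:
                        c[j] = 1 - c[j]
            children += [c1, c2]
        pop = children
        best = max(pop + [best], key=onemax)
    return best

print(genetic_algorithm())
```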

Continue Reading
Differential Evolution Global Optimization With Python

Differential Evolution is a global optimization algorithm. It is a type of evolutionary algorithm and is related to other evolutionary algorithms such as the genetic algorithm. Unlike the genetic algorithm, it was specifically designed to operate upon vectors of real-valued numbers instead of bitstrings. Also unlike the genetic algorithm, it uses vector operations like vector […]
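For a flavor of those vector operations, below is a small sketch of the classic DE/rand/1/bin scheme on a toy sum-of-squares objective; the function, bounds, and the F and CR settings are illustrative assumptions:

```python
import random

def sphere(x):
    # toy objective: sum of squares (minimize)
    return sum(v * v for v in x)

def differential_evolution(n_dim=5, n_pop=20, n_iter=200, F=0.5, CR=0.7):
    pop = [[random.uniform(-5, 5) for _ in range(n_dim)] for _ in range(n_pop)]
    for _ in range(n_iter):
        for i in range(n_pop):
            # pick three distinct vectors other than the target
            a, b, c = random.sample([p for j, p in enumerate(pop) if j != i], 3)
            # mutation: add the scaled difference of two vectors to a third
            mutant = [a[k] + F * (b[k] - c[k]) for k in range(n_dim)]
            # binomial crossover: mix mutant and target per dimension
            j_rand = random.randrange(n_dim)
            trial = [mutant[k] if (random.random() < CR or k == j_rand)
                     else pop[i][k] for k in range(n_dim)]
            # greedy selection: keep the trial if it is no worse
            if sphere(trial) <= sphere(pop[i]):
                pop[i] = trial
    return min(pop, key=sphere)

print(sphere(differential_evolution()))
```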

Continue Reading
Evolution Strategies From Scratch in Python

Evolution strategies is a stochastic global optimization algorithm. It is an evolutionary algorithm related to others, such as the genetic algorithm, although it is designed specifically for continuous function optimization. In this tutorial, you will discover how to implement the evolution strategies optimization algorithm. After completing this tutorial, you will know: Evolution Strategies is a […]
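As a rough illustration, here is a minimal (mu, lambda)-style evolution strategy on a toy continuous objective; it omits step-size self-adaptation, and the objective and parameter settings are illustrative assumptions:

```python
import random

def objective(x):
    # toy 2-D objective (minimize)
    return x[0] ** 2 + x[1] ** 2

def es_comma(mu=5, lam=20, n_iter=100, step=0.15):
    # (mu, lambda)-ES: each generation, mu parents spawn lam Gaussian-perturbed
    # children, and the next parents are the mu best children only
    parents = [[random.uniform(-5, 5), random.uniform(-5, 5)] for _ in range(mu)]
    for _ in range(n_iter):
        children = []
        for i in range(lam):
            p = parents[i % mu]
            children.append([v + random.gauss(0, step) for v in p])
        children.sort(key=objective)
        parents = children[:mu]
    return parents[0]

best = es_comma()
print(best, objective(best))
```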

Continue Reading
Simulated Annealing From Scratch in Python

Simulated Annealing is a stochastic global search optimization algorithm. This means that it makes use of randomness as part of the search process. This makes the algorithm appropriate for nonlinear objective functions where other local search algorithms do not operate well. Like the stochastic hill climbing local search algorithm, it modifies a single solution and […]
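The sketch below shows the core loop under those assumptions: a Gaussian step around the current solution and a Metropolis acceptance test with a decaying temperature; the toy objective and schedule are illustrative, not the article's exact code:

```python
import math
import random

def objective(x):
    # toy 1-D multimodal objective (minimize)
    return x ** 2 + 10 * math.sin(x)

def simulated_annealing(n_iter=2000, step=0.5, t0=10.0):
    curr = random.uniform(-10, 10)
    curr_eval = objective(curr)
    best, best_eval = curr, curr_eval
    for i in range(n_iter):
        cand = curr + random.gauss(0, step)
        cand_eval = objective(cand)
        if cand_eval < best_eval:
            best, best_eval = cand, cand_eval
        # temperature decays over time; the Metropolis criterion sometimes
        # accepts worse moves, letting the search escape local optima
        t = t0 / (i + 1)
        if cand_eval < curr_eval or random.random() < math.exp(-(cand_eval - curr_eval) / t):
            curr, curr_eval = cand, cand_eval
    return best, best_eval

print(simulated_annealing())
```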

Continue Reading
No Free Lunch Theorem for Machine Learning

The No Free Lunch Theorem is often invoked in optimization and machine learning, usually with little understanding of what it means or implies. The theorem states that all optimization algorithms perform equally well when their performance is averaged across all possible problems. It implies that there is no single best optimization […]
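A common formal statement, following Wolpert and Macready (1997), says that for any two algorithms a_1 and a_2, performance summed over all possible objective functions f is identical; here d_m^y denotes the sequence of m objective values observed by the search:

```latex
% For any pair of algorithms a_1, a_2 and any number of steps m,
% summing over all objective functions f gives equal performance:
\sum_{f} P\left(d_m^y \mid f, m, a_1\right)
    = \sum_{f} P\left(d_m^y \mid f, m, a_2\right)
```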

Continue Reading
A Gentle Introduction to Stochastic Optimization Algorithms

Stochastic optimization refers to the use of randomness in the objective function or in the optimization algorithm. Challenging optimization problems, such as high-dimensional nonlinear objective functions, may contain multiple local optima in which deterministic optimization algorithms may get stuck. Stochastic optimization algorithms provide an alternative approach that permits less optimal local decisions to be made […]
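As a small illustration of randomness inside the algorithm itself, here is a stochastic hill climbing sketch on a toy multimodal function; the objective, step size, and iteration budget are illustrative assumptions:

```python
import math
import random

def objective(x):
    # toy 1-D multimodal objective (minimize)
    return x ** 2 + 10 * math.sin(5 * x)

def stochastic_hill_climbing(n_iter=1000, step=0.3):
    # randomness enters through the start point and the candidate moves,
    # so repeated runs can land in different optima
    curr = random.uniform(-5, 5)
    curr_eval = objective(curr)
    for _ in range(n_iter):
        cand = curr + random.gauss(0, step)
        cand_eval = objective(cand)
        if cand_eval <= curr_eval:
            curr, curr_eval = cand, cand_eval
    return curr, curr_eval

# repeated runs give different results because the search is stochastic
for _ in range(3):
    print(stochastic_hill_climbing())
```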

Continue Reading
How to Use Optimization Algorithms to Manually Fit Regression Models

Regression models can be fit on training data using general-purpose local search optimization algorithms. Models like linear regression and logistic regression are normally trained by least squares optimization, and this is the most efficient approach to finding coefficients that minimize error for these models. Nevertheless, it is possible to use alternate optimization algorithms to fit […]
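To sketch the idea, the toy example below fits the two coefficients of a simple linear regression by hill climbing on mean squared error rather than by least squares; the synthetic data and settings are illustrative assumptions:

```python
import random

# tiny synthetic dataset: y = 2x + 1 plus a little noise
X = [i / 10 for i in range(20)]
y = [2 * x + 1 + random.gauss(0, 0.1) for x in X]

def mse(coeffs):
    # mean squared error of y = b0 + b1 * x under candidate coefficients
    b0, b1 = coeffs
    return sum((yi - (b0 + b1 * xi)) ** 2 for xi, yi in zip(X, y)) / len(X)

def hill_climb_fit(n_iter=5000, step=0.05):
    coeffs = [random.uniform(-1, 1), random.uniform(-1, 1)]
    best_eval = mse(coeffs)
    for _ in range(n_iter):
        cand = [c + random.gauss(0, step) for c in coeffs]
        cand_eval = mse(cand)
        if cand_eval <= best_eval:
            coeffs, best_eval = cand, cand_eval
    return coeffs, best_eval

print(hill_climb_fit())  # coefficients should approach [1, 2]
```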

Continue Reading
Function Optimization With SciPy

Optimization involves finding the inputs to an objective function that result in the minimum or maximum output of the function. SciPy, the open-source Python library for scientific computing, provides a suite of optimization algorithms. Many of these algorithms are used as building blocks within other algorithms, most notably machine learning algorithms in the […]
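As a minimal example of the API (assuming SciPy is installed), scipy.optimize.minimize performs a local search on a user-supplied objective from a starting point; the toy sphere function and the choice of the L-BFGS-B method here are illustrative:

```python
from scipy.optimize import minimize

def objective(x):
    # 2-D sphere function; the global minimum is at (0, 0)
    return x[0] ** 2 + x[1] ** 2

# local search from a starting point with a quasi-Newton method
result = minimize(objective, x0=[2.0, -3.0], method='L-BFGS-B')
print(result.x, result.fun)
```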

Continue Reading
Gradient Descent With Momentum from Scratch

Gradient descent is an optimization algorithm that follows the negative gradient of an objective function in order to locate the minimum of the function. A problem with gradient descent is that it can bounce around the search space on optimization problems that have large amounts of curvature or noisy gradients, and it can get stuck […]
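A minimal sketch of the update, assuming a toy one-dimensional objective with a known derivative; the learning rate and momentum settings are illustrative. Each step blends the current gradient with the previous change, which smooths progress through curved or noisy regions:

```python
def objective(x):
    # toy 1-D objective (minimize); its derivative is 2x
    return x ** 2

def derivative(x):
    return 2.0 * x

def gradient_descent_momentum(x0=5.0, n_iter=100, lr=0.1, momentum=0.8):
    x, change = x0, 0.0
    for _ in range(n_iter):
        # blend the gradient step with the previous change
        change = lr * derivative(x) + momentum * change
        x = x - change
    return x, objective(x)

print(gradient_descent_momentum())
```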

Continue Reading
Local Optimization Versus Global Optimization

Optimization refers to finding the set of inputs to an objective function that results in the maximum or minimum output of that function. It is common to describe optimization problems in terms of local vs. global optimization. It is likewise common to describe optimization algorithms or search algorithms in terms of local vs. […]
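One way to see the distinction: a local search converges to the optimum nearest its start point, while restarting it from many random points approximates a global search. The sketch below, on an illustrative multimodal function with assumed settings, shows both:

```python
import math
import random

def objective(x):
    # 1-D multimodal objective with many local minima (minimize)
    return x ** 2 + 10 * math.sin(3 * x)

def local_search(start, n_iter=1000, step=0.1):
    # simple hill climbing: only accepts improvements, so it converges
    # to whichever local minimum is nearest the start point
    best, best_eval = start, objective(start)
    for _ in range(n_iter):
        cand = best + random.gauss(0, step)
        cand_eval = objective(cand)
        if cand_eval <= best_eval:
            best, best_eval = cand, cand_eval
    return best, best_eval

# a single local search may stall in a poor local minimum...
print(local_search(random.uniform(-5, 5)))
# ...while restarting from many points approximates a global search
print(min((local_search(random.uniform(-5, 5)) for _ in range(20)),
          key=lambda r: r[1]))
```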

Continue Reading