How to Use Optimization Algorithms to Manually Fit Regression Models

Regression models are fit to training data using optimization algorithms. Models like linear regression are trained by least squares optimization, and logistic regression by maximum likelihood, and these are the most efficient approaches to finding the coefficients that minimize error for these models. Nevertheless, it is possible to use alternate optimization algorithms to fit […]
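As a minimal sketch of the idea (not the article's exact code), the slope and intercept of a simple linear regression can be found by handing a mean squared error objective to a general-purpose, derivative-free optimizer; the synthetic data and the choice of SciPy's Powell method here are illustrative assumptions.

```python
# Sketch: fit linear regression coefficients with a general-purpose optimizer
# by minimizing mean squared error (illustrative data, not the article's example).
import numpy as np
from scipy.optimize import minimize

# synthetic data: y = 2x + 1 plus a little noise
rng = np.random.default_rng(1)
x = rng.uniform(-1.0, 1.0, size=100)
y = 2.0 * x + 1.0 + rng.normal(scale=0.1, size=100)

def mse(coef):
    # coef[0] is the slope, coef[1] is the intercept
    yhat = coef[0] * x + coef[1]
    return np.mean((y - yhat) ** 2.0)

# Powell is a local search that needs only objective evaluations
result = minimize(mse, x0=rng.normal(size=2), method='Powell')
print('coefficients:', result.x, 'mse:', result.fun)
```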

Continue Reading

Function Optimization With SciPy

Optimization involves finding the inputs to an objective function that result in the minimum or maximum output of the function. SciPy, the open-source Python library for scientific computing, provides a suite of optimization algorithms. Many of these algorithms are used as building blocks in other algorithms, most notably the machine learning algorithms in the […]
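As a small, assumed example (not taken from the article), the snippet below calls scipy.optimize.minimize on a simple bowl-shaped objective, supplying the gradient explicitly to one of the suite's gradient-based local optimizers.

```python
# Assumed example: a gradient-based local optimizer from SciPy's suite
# applied to a simple bowl-shaped objective.
from scipy.optimize import minimize

def objective(x):
    return x[0] ** 2.0 + x[1] ** 2.0

def gradient(x):
    return [2.0 * x[0], 2.0 * x[1]]

# BFGS uses the supplied gradient; many other methods are available
result = minimize(objective, x0=[1.5, -0.8], jac=gradient, method='BFGS')
print('solution:', result.x, 'objective value:', result.fun)
```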

Continue Reading
Plot of the Progress of Gradient Descent With Momentum on a One Dimensional Objective Function

Gradient Descent With Momentum from Scratch

Gradient descent is an optimization algorithm that follows the negative gradient of an objective function in order to locate the minimum of the function. A problem with gradient descent is that it can bounce around the search space on optimization problems that have large amounts of curvature or noisy gradients, and it can get stuck […]
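As a rough sketch of the technique (the objective, step size, momentum, and iteration count are illustrative choices, not necessarily the tutorial's), each update combines the current gradient step with a fraction of the previous change:

```python
# Rough sketch of gradient descent with momentum on f(x) = x^2
# (step size, momentum, and iteration count are illustrative choices).
def objective(x):
    return x ** 2.0

def derivative(x):
    return 2.0 * x

x = 5.0          # starting point
change = 0.0     # accumulated previous change ("velocity")
step_size = 0.1
momentum = 0.3

for i in range(20):
    # new change = gradient step plus a fraction of the last change
    change = step_size * derivative(x) + momentum * change
    x = x - change
    print(f'>{i} x={x:.5f} f(x)={objective(x):.5f}')
```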

Continue Reading
Plot of Range of He Weight Initialization With Inputs From One to One Hundred

Weight Initialization for Deep Learning Neural Networks

Weight initialization is an important design choice when developing deep learning neural network models. Historically, weight initialization involved using small random numbers, although over the last decade more specific heuristics have been developed that use information such as the type of activation function and the number of inputs to the node. […]
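As an illustrative sketch (assuming a ReLU activation and the He heuristic referenced in the plot title; the layer sizes are made-up values), weights can be drawn from a zero-mean Gaussian whose standard deviation depends on the number of inputs to the node:

```python
# Illustrative sketch of He weight initialization for one layer,
# assuming a ReLU activation (layer sizes are made-up values).
import numpy as np

n_inputs = 10   # number of inputs to each node (fan-in)
n_nodes = 5     # number of nodes in the layer

# He initialization: zero-mean Gaussian with std = sqrt(2 / fan-in)
std = np.sqrt(2.0 / n_inputs)
weights = np.random.randn(n_inputs, n_nodes) * std
print(weights.shape, round(weights.std(), 3))
```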

Continue Reading

Local Optimization Versus Global Optimization

Optimization refers to finding the set of inputs to an objective function that results in the maximum or minimum output from the objective function. It is common to describe optimization problems in terms of local vs. global optimization. Similarly, it is also common to describe optimization algorithms or search algorithms in terms of local vs. […]
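To make the distinction concrete, here is a hedged sketch (the multimodal objective and settings are illustrative) comparing a local search started from a poor point with a global search over the same bounded region, using SciPy's minimize and differential_evolution:

```python
# Illustrative sketch: local vs. global search on a 1D multimodal function.
import numpy as np
from scipy.optimize import minimize, differential_evolution

# many local minima near the integers; global minimum at x = 0
def objective(x):
    x = np.asarray(x)
    return float(np.sum(x ** 2.0 + 10.0 - 10.0 * np.cos(2.0 * np.pi * x)))

# a local search from a poor starting point can stop at a nearby local minimum
local = minimize(objective, x0=[4.2], method='nelder-mead')

# a global search explores the whole bounded region
best = differential_evolution(objective, bounds=[(-5.0, 5.0)], seed=1)

print('local search :', local.x, local.fun)
print('global search:', best.x, best.fun)
```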

Continue Reading
3D Surface Plot of the Ackley Multimodal Function

How to Use Nelder-Mead Optimization in Python

The Nelder-Mead optimization algorithm is a widely used approach for non-differentiable objective functions. As such, it is generally referred to as a pattern search algorithm and is used as a local or global search procedure on challenging nonlinear, potentially noisy, and multimodal function optimization problems. In this tutorial, you will discover the Nelder-Mead optimization algorithm. […]
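As a quick, illustrative sketch (using a simple convex objective rather than the Ackley function shown in the post's plot), SciPy exposes Nelder-Mead through the minimize function's method argument:

```python
# Illustrative sketch of Nelder-Mead via SciPy (simple convex objective,
# not the Ackley function from the post's plot).
from scipy.optimize import minimize

def objective(x):
    # only function evaluations are needed; no gradient is required
    return x[0] ** 2.0 + x[1] ** 2.0

result = minimize(objective, x0=[1.0, 1.2], method='nelder-mead')
print('status :', result.message)
print('evals  :', result.nfev)
print('result :', result.x, result.fun)
```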

Continue Reading
Recommender Systems: An Introduction

How to Get Started With Recommender Systems

Recommender systems may be the most common type of predictive model that the average person encounters. They provide the basis for recommendations on services such as Amazon, Spotify, and YouTube. Recommender systems are a huge, daunting topic if you’re just getting started. There are myriad data preparation techniques, algorithms, and model evaluation […]

Continue Reading