Tag Archives | partial derivatives


Calculus in Action: Neural Networks

An artificial neural network is a computational model that approximates a mapping between inputs and outputs. It is inspired by the structure of the human brain, in that it is similarly composed of a network of interconnected neurons that propagate information upon receiving sets of stimuli from neighbouring neurons. Training a neural network involves a […]

Continue Reading
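
For a taste of the calculus at work, here is a minimal sketch (not code from the article itself) of a one-hidden-layer network trained by gradient descent with NumPy; the architecture, toy data, and learning rate are illustrative assumptions only.

```python
import numpy as np

# Illustrative sketch: a 1 -> 8 -> 1 network trained to approximate
# y = x^2 on [-1, 1]. All hyperparameters here are assumptions.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(64, 1))  # toy inputs
y = X ** 2                                 # target mapping

W1, b1 = rng.normal(0, 0.5, (1, 8)), np.zeros(8)
W2, b2 = rng.normal(0, 0.5, (8, 1)), np.zeros(1)
lr = 0.1

for step in range(2000):
    # Forward pass: tanh hidden layer, linear output.
    h = np.tanh(X @ W1 + b1)
    pred = h @ W2 + b2
    err = pred - y  # gradient of the MSE loss w.r.t. pred, up to a constant

    # Backward pass: the chain rule applied layer by layer.
    grad_W2 = h.T @ err / len(X)
    grad_b2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1 - h ** 2)  # tanh'(z) = 1 - tanh(z)^2
    grad_W1 = X.T @ dh / len(X)
    grad_b1 = dh.mean(axis=0)

    # Gradient descent update.
    W1 -= lr * grad_W1
    b1 -= lr * grad_b1
    W2 -= lr * grad_W2
    b2 -= lr * grad_b2

print("final MSE:", float((err ** 2).mean()))
```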

The Chain Rule of Calculus for Univariate and Multivariate Functions

The chain rule allows us to find the derivative of composite functions. It is applied extensively by the backpropagation algorithm in order to train feedforward neural networks. By applying the chain rule in an efficient manner while following a specific order of operations, the backpropagation algorithm calculates the error gradient of the loss function with […]

Continue Reading
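
For a concrete flavour of the rule, the short sketch below (with hypothetical functions, not taken from the article) applies the univariate chain rule and checks the result against a central finite difference.

```python
import numpy as np

# For h(x) = f(g(x)) with f(u) = sin(u) and g(x) = x^2 (assumed examples),
# the chain rule gives h'(x) = f'(g(x)) * g'(x) = cos(x^2) * 2x.
def h(x):
    return np.sin(x ** 2)

def h_prime(x):
    return np.cos(x ** 2) * 2 * x  # chain rule, applied analytically

x = 1.3
eps = 1e-6
numeric = (h(x + eps) - h(x - eps)) / (2 * eps)  # central finite difference
print(h_prime(x), numeric)  # the two values agree to several decimal places
```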

A Gentle Introduction to the Jacobian

In the literature, the term Jacobian is often used interchangeably to refer to either the Jacobian matrix or its determinant. Both the matrix and the determinant have useful and important applications: in machine learning, the Jacobian matrix aggregates the partial derivatives that are necessary for backpropagation; the determinant is useful in the process of changing […]

Continue Reading
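
As an illustration (the map and values are assumptions, not drawn from the article), the sketch below builds the Jacobian matrix of the polar-to-Cartesian transformation and confirms that its determinant equals r, the familiar factor that appears when changing variables in a double integral.

```python
import numpy as np

# Jacobian matrix of (r, theta) -> (x, y) = (r cos(theta), r sin(theta)).
# Its determinant is r*cos^2 + r*sin^2 = r, hence dx dy = r dr dtheta.
def jacobian(r, theta):
    return np.array([
        [np.cos(theta), -r * np.sin(theta)],  # [dx/dr, dx/dtheta]
        [np.sin(theta),  r * np.cos(theta)],  # [dy/dr, dy/dtheta]
    ])

J = jacobian(2.0, np.pi / 6)
print(J)
print(np.linalg.det(J))  # ~2.0, i.e. equal to r
```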

A Gentle Introduction To Partial Derivatives and Gradient Vectors

Partial derivatives and gradient vectors are used very often in machine learning algorithms to find the minimum or maximum of a function. Gradient vectors are used in the training of neural networks, logistic regression, and many other classification and regression problems. In this tutorial, you will discover partial derivatives and the gradient vector. After completing […]

Continue Reading
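
To make this concrete, here is a minimal sketch with a hypothetical objective function (not taken from the tutorial): its partial derivatives form the gradient vector, and repeatedly stepping against the gradient drives the function towards its minimum.

```python
import numpy as np

# Assumed objective: f(x, y) = x^2 + 3y^2, with partial derivatives
# df/dx = 2x and df/dy = 6y. The gradient points in the direction of
# steepest ascent, so stepping against it decreases f.
def grad(p):
    x, y = p
    return np.array([2 * x, 6 * y])

p = np.array([4.0, -2.0])
for _ in range(100):
    p = p - 0.1 * grad(p)  # gradient descent step with learning rate 0.1

print(p)  # approaches the minimum at (0, 0)
```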